Hey, Kelly, are you all set for today's episode?

Yeah, I've got all my equipment set up, ready to go.

What about a drink? Is that part of your podcasting gear these days?

You know, you and I usually record early enough in the day that I'm not usually drinking while podcasting. And also, you know, physics bends my brain enough without booze making it all harder.

Maybe there's a cocktail out there that makes physics easier. Maybe something with caffeine in it. I thought I saw an energy drink once called Dark Matter. I think that would work.

No, no, I'm not drinking anything called Dark Matter, because that sounds super scatological. But maybe that's the biologist in me.

I think it was carbonated. Does that help?

No, it makes it so much worse. So much worse. Thumbs down. I'll stick with coffee, just don't call it dark matter.

Hi. I'm Daniel. I'm a particle physicist and a professor at UC Irvine, and I usually do physics while caffeinated.

And I'm Kelly Weinersmith. I'm a parasitologist and adjunct at Rice University, and I'm a big fan of coffee.
And we're both parents, which means that we've spent many nights up later than we wanted to and relied on coffee to get through the next day.

So many. So many.

And welcome to the podcast Daniel and Jorge Explain the Universe, a production of iHeartRadio. My friend and usual co-host Jorge can't be with us today, so we're delighted to have one of our regular guest hosts, Kelly. Thanks very much for joining us again.

Thanks. I'm excited to be back, and I'm super excited to be back for this episode in particular.

Usually on the podcast, we dive into the mysteries of the universe, thinking about where it came from, how it works, how it might all end, what's going on inside black holes, and whether or not we could travel faster than light. Because we view the universe like one giant detective novel: the universe has left clues for us, and we are trying to unravel the rules by which it works. We are like bloodhounds sniffing out the next clue to try to help us understand what's going on out there in the universe.

That's what bloodhounds sound like, right?
I guess I don't know much about dogs.

You're the biologist, so I'm not going to fact-check your bloodhound for you.

It was spot on. It was fun.

But we have a fun series of episodes in which we dig into the science of fictional universes. We think that creativity and art are an important part of being a scientist: thinking about the way the universe might be, rather than the way the universe is. As we stand on the forefront of human ignorance, we don't know how the actual universe is, so we have to hold in our minds several possibilities. Maybe there are lots of particles out there from supersymmetry. Maybe the universe is infinite, maybe it's finite, maybe it's shaped like a donut. In order to explore all these ideas, we need someone to generate these ideas. And while theoretical physicists are quite adept at that, there's another group of folks who are also working on the front lines to expand our minds. And those, of course, are science fiction authors. Unconstrained by the rules of reality, they are allowed to create whatever universe they want for their stories to take place in.
But sometimes those universes and those ideas inspire real scientists to think in new directions.

I love the interplay between these two communities, the sci-fi authors and the physicists and space people. I cannot tell you how many times, while I was researching space settlements, The Expanse was cited as, like, a serious reference: oh, you know, space settlements might be like this, as seen in The Expanse. And it's like, oh wow, you guys are really clued into what's happening in these sci-fi novels. Anyway, clearly ideas are being generated by scientists while they're reading through these novels.

Absolutely, and it goes both directions, right? Science fiction authors take the science of their universes seriously. They want to make something plausible, something realistic, something where the reader feels like there's a real universe out there that they could experience. And physicists like me who read a lot of science fiction do get inspired by the ideas out there. There are some books that are very heavy on the physics.
Greg Egan's books, for example, are very creative in terms of the fundamental theories that underlie his universes. He wrote a book recently in which there are two time dimensions, and he worked out all the physics of how it works and came up with all these incredible realizations about how such a weird universe would work, and then wove it into this story with his characters. Really an incredible intellectual achievement.

Yeah, I'll have to read that. That sounds like a lot of fun. And the book that we're talking about today also did an amazing job of world building, helping you really imagine what it would be like to live in that world. It's a real skill.

Exactly. And so on today's episode, we're gonna be talking about the science fiction universe of The Spare Man.

So this is a really fun book by Mary Robinette Kowal, who has been a guest on our podcast before for her book The Calculating Stars, part of the Lady Astronaut series. So, Daniel, here's what I'm wondering. Drinks come up a lot in this book. Do you typically sip cocktails while you're reading science fiction?
I do enjoy cocktails with friends, usually, and sometimes in the evening after a bunch of cocktails, I will come home and read a bit of science fiction as I fall asleep. But I'll admit it just makes me fall asleep faster, so I end up reading like a quarter of a page while a little bit tipsy. Sometimes it makes it harder to stay awake. Maybe if I was drinking earlier in the day, but that's got its own downsides, depending on how early you start. How about you? Do you ever sip cocktails while listening to science talks, for example?

I enjoy sipping cocktails when someone else does the hard work of telling me what cocktails I should be making. But to be honest, I would rather just get a glass of wine. My buddy Scott Solomon and I used to interview scientists for this series called DORKS; the acronym stood for Discussing Our Research and Knowledge, Socially. It's a tortured acronym, but we got there. We had people come on and, like, dork out about the stuff they do. And every week Scott Solomon would pick a mocktail and a cocktail for us to try out.
That was sort of along with the theme, and that was really fun. But you know, usually I am sort of lazy, and I prefer to just pop open a bottle of wine when I'm reading sci-fi on my own. But the cocktails were fun when Scott was picking them.

Well, I know that the Drunk History series is very popular. I wonder if a Drunk Science series would do as well, if you find out, like, what the scientists really think after the third cocktail.

Oh my gosh, I really want to be the person who does this series. That sounds like it would be so much fun. The couple of times I've watched Drunk History, I loved it.

Well, today we're not here to pitch a Drunk Science series to Netflix, though if any Netflix execs are listening, shoot us an email. We're here to talk about a really fun book, The Spare Man, which is an unusual kind of book because it's both a science fiction novel and a murder mystery. Keeps you on the edge of your seat.
Usually murder mysteries take place here on Earth, because they rely on the reader understanding the rules and being very familiar and comfortable with the universe they take place in, so you can understand the context for the rules and the clues, so that the reader has a chance, at least, of figuring it out. So it's quite a technical achievement to put together a murder mystery in space that also makes sense and keeps you entertained.

Yeah, and one of the things that I loved about this book is that it's in the future, but it's in the sort of near future. You know, we're hearing a lot more about space tourism happening, so that's starting to be a thing that's on people's minds. And the couple in this story are in space for their honeymoon, and they're on this giant cruise ship which is able to simulate the gravity of three different places, the Moon, Mars, and Earth, which is just an incredibly complicated cruise ship moving through space. But you know, I've been on some incredibly complicated cruise ships, and so maybe this is the direction of things.
What do you think? Are we going to have one of these in the near future?

I'm not sure. I don't even like cruise ships on the water, so I'm definitely not signing up for a giant cruise to Mars with huge spinning pieces that could pull themselves apart. But you know, there are lots of people who are excited about space tourism. It definitely seems to be a thing of the near future, but it's sort of been a thing of the near future for a while. Do you think this is really the taking-off point, that it's going to be commonplace, that we're in the generation that sees space tourism as rare, and our kids are gonna be like, yawn, whatever, another suborbital flight?

You know, I don't really know. I sort of hesitate to ever make predictions, because I'm risk averse, I guess. But certainly we've been talking about space tourism for a long time, and this has been a super exciting year for space tourism. You know, Virgin Galactic had, what was it, one or two flights? They definitely had one. And then Blue Origin had a couple of flights.
William Shatner made it to space IRL. That was very exciting. And Wally Funk made it to space, finally. That was super exciting. She was one of the Mercury Thirteen, so one of the thirteen women who passed the tests to be able to, you know, do spaceflight, but ultimately didn't get their chance. So certainly the pace of it is picking up. There was space tourism before that. I think you could pay something like ten million dollars; Dennis Tito was on the International Space Station for a while. And so it's going from something that only the super crazy, mega rich could do to something that the mega rich could do. So you're, like, removing some of the descriptors. I don't know that you and I would be able to afford space tourism for quite a long time. But maybe I'm wrong. Who knows?

Are you suggesting that there are people out there who are mega rich who are not super crazy?

I don't know enough rich people to say.

All right, well, let's get back to the book.
As you say, it's set in space in the near future, and the basic setup of the book is that a couple are taking a trip to space for their honeymoon. So this is supposed to be pleasurable, right? They're on a fancy cruise ship, it's all luxurious, they're eating fancy food, and then there's a murder. And one of the cool things about the murder is that it's not fancy technology. They're not, like, zapped with a laser or stuffed in a space trash compactor. They just use a knife.

That's true for the first one, but that's all I'll say. Actually, technically the first one requires knowing something about complicated closed-loop ecologies. But yes, yes, the first murder you hear about is with an old-fashioned knife.

It's got a really cool sort of tone to it. I think it was inspired by the movie The Thin Man, and so it's got these sort of old-fashioned-y cocktails, and they're sort of on an old-fashioned cruise ship, but then the whole thing is in space. So it's a really fun mash-up of styles. I also think it's written in a really fun voice.
Mary Robinette, the author, is definitely a funny person, and you can hear it in her prose.

Yeah, I've really enjoyed it. She created a very fun environment to spend, you know, three or four hundred pages in. And not only did she do this really nice thing where you've got this cruise ship with different kinds of gravity, but she also included a bunch of other cool, sort of near-ish future tech. I thought it was interesting to see sort of her predictions for what will come online in, like, the next fifty years and what won't. What did you think about the tech that she introduced?

I thought it was really fun. She has, for example, heads-up displays, this kind of thing where you can see, for example, messages from your partner that nobody else can see. They're not, like, on a phone in front of you. They're, like, appearing virtually in your vision. I thought that was really cool. It seems to me like a very natural progression of technology. And she also has other sort of biological tech, like pain-controlling devices, which seemed to me like a really important area of research.
I hope people are reading about this and going, like, oh, that's a good idea, we should make that real.

Yeah, I do think people are working on that already, but yeah, I agree. I would love to see a lot more progress in that particular area. And I also like that she thought through, for these devices, the social implications and how they would really impact people. One of the things that made me laugh was she was talking about the heads-up display and how you could sort of search for information by, like, moving your eye to different places. And she said something about how you could tell that this person was probably searching for information because their eyes were sort of going all over the place and they looked kind of, you know, wild-eyed. I've always thought, like, oh yeah, if you had a heads-up display, you probably would look a little like you have crazy eyes sometimes.
And anyway, I thought it was interesting the way she had her people sort of engage with the technologies that they had. And she didn't go wild with the technological extrapolation. She has, like, a bunch of semi-autonomous robots that come and help people do things, but this is not an AI dystopian novel where, you know, the machines rise up and murder people on their cruise ships or anything like that. It's basically, like, a realistic extrapolation of where AI might be. You know, it's like a super Roomba rather than, you know, your car has taken over the world. And it was a really nice mix of technology and then advancements in sort of the other solutions that we have for things. So, for example, there's a service dog, and not a service dog robot, but an actual service dog, with all, you know, the things that come along with dogs, like you can lose track of them or something like that, or you have to make sure you don't step on them, whereas, like, a robot might get out of the way.
267 00:13:05,840 --> 00:13:08,320 Speaker 1: And so yeah, it was a really nice mix of 268 00:13:08,520 --> 00:13:12,040 Speaker 1: evolutions of various technologies, and then also just sort of 269 00:13:12,120 --> 00:13:14,120 Speaker 1: some of the things that we have today carried forward. 270 00:13:14,920 --> 00:13:18,120 Speaker 1: I'm a big fan of very far future science fiction, 271 00:13:18,280 --> 00:13:21,000 Speaker 1: you know, space operas like Alistair Reynolds. But I also 272 00:13:21,040 --> 00:13:24,880 Speaker 1: really enjoy this almost immediate near future science fiction because 273 00:13:24,880 --> 00:13:27,520 Speaker 1: it feels so tactle, like you can really get there, 274 00:13:27,840 --> 00:13:30,720 Speaker 1: and some of the practical issues are really understandable, you know, 275 00:13:30,800 --> 00:13:32,720 Speaker 1: you can really resonate with them. For example, a lot 276 00:13:32,760 --> 00:13:34,720 Speaker 1: of the questions and we'll dig into this in a minute, 277 00:13:34,760 --> 00:13:38,200 Speaker 1: revolve around law, like what happens when you commit murder 278 00:13:38,200 --> 00:13:41,679 Speaker 1: in space? Whose rules or what laws actually apply? All 279 00:13:41,760 --> 00:13:43,720 Speaker 1: these really funny scenes where she's on the phone with 280 00:13:43,760 --> 00:13:46,680 Speaker 1: her very angry lawyer trying to figure out like exactly 281 00:13:46,720 --> 00:13:49,080 Speaker 1: what to do. I feel like the near term sci fi. 282 00:13:49,200 --> 00:13:51,120 Speaker 1: I like the far term stuff, but the near term 283 00:13:51,120 --> 00:13:54,240 Speaker 1: stuff is almost more personally motivating because you're like, oh, 284 00:13:54,280 --> 00:13:56,600 Speaker 1: what would space law be like in fifty years, and 285 00:13:56,679 --> 00:13:58,560 Speaker 1: you know, would it build off of cruise ship law? 
286 00:13:58,640 --> 00:14:00,839 Speaker 1: Would it build off of the law for how we 287 00:14:00,960 --> 00:14:03,240 Speaker 1: run the International Space Station? And I feel like the 288 00:14:03,600 --> 00:14:05,600 Speaker 1: more near term stuff gives you a little bit more 289 00:14:05,600 --> 00:14:08,000 Speaker 1: of a sense of hope for how things could go, 290 00:14:08,120 --> 00:14:09,720 Speaker 1: or gets you a little bit more tied up because 291 00:14:09,760 --> 00:14:11,319 Speaker 1: you know, maybe some of the stuff will get figured 292 00:14:11,320 --> 00:14:13,679 Speaker 1: out in your lifetime. And so you know, for me, 293 00:14:13,720 --> 00:14:15,640 Speaker 1: that's one of the fun parts of more near term 294 00:14:15,640 --> 00:14:17,920 Speaker 1: sci fi exactly. And so we're gonna dig into the 295 00:14:18,000 --> 00:14:20,840 Speaker 1: science of this cruise ship and the law of her 296 00:14:20,960 --> 00:14:22,920 Speaker 1: space law, and then in a bid will have an 297 00:14:22,920 --> 00:14:25,480 Speaker 1: interview with the author. But first we're gonna take a 298 00:14:25,560 --> 00:14:40,920 Speaker 1: quick break. All right, we're back and we are talking 299 00:14:40,960 --> 00:14:44,040 Speaker 1: about The spare Man, a book by Mary Robinett Cole, 300 00:14:44,320 --> 00:14:46,800 Speaker 1: which is just coming out. We encourage you all to 301 00:14:46,880 --> 00:14:49,800 Speaker 1: check out a really fun murder mystery in space that 302 00:14:49,920 --> 00:14:53,240 Speaker 1: involves a few really fascinating bits of science fiction and 303 00:14:53,360 --> 00:14:56,960 Speaker 1: also some bits of maybe legal fiction. So first, let's 304 00:14:57,000 --> 00:14:59,800 Speaker 1: talk about the setting of the book. 
This takes place on, essentially, a massive cruise ship in space. Like, imagine taking a cruise to Alaska or Antarctica, except now you're in space, you're eating fancy foods, and you're looking out the window and seeing very inhospitable climes. So do you think they'll build that gigantic cruise ship on Earth and boost it up to space, or do you think they'll have to build it in space?

It's going to be a tricky problem.

It's going to be a tricky problem. And this cruise ship specifically has lots of literally moving parts. She has several huge rotating rings that can provide centrifugal gravity, one at Earth levels, one at Martian levels, so you have two rings spinning differently. It has to be really big for that to work. I don't see how you build that on Earth and launch it. For that to happen, I think you're going to have to build it in space, which means essentially you need, like, a space construction industry, right? Which means, like, you need asteroid mining and really a large industrial base in space. How far do you think we are away from that kind of thing?
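[As an aside: the physics behind those spinning rings is easy to sketch. A ring rotating at angular speed omega produces apparent gravity a = omega^2 * r at radius r, so for a chosen spin rate you can back out how big each ring has to be. The snippet below is a minimal illustration, not anything from the book; the ~2 rpm figure is a commonly cited comfort limit for avoiding motion sickness, not a number the author gives.]

```python
import math

def ring_radius(g_target: float, rpm: float) -> float:
    """Radius in meters a spinning ring needs to produce apparent
    gravity g_target (m/s^2) at a rotation rate of rpm (rev/min).
    From a = omega^2 * r, so r = a / omega^2."""
    omega = 2 * math.pi * rpm / 60.0  # angular speed in rad/s
    return g_target / omega**2

# Two rings spinning at a commonly cited ~2 rpm comfort limit:
earth_ring = ring_radius(9.81, 2.0)  # Earth-level gravity
mars_ring = ring_radius(3.71, 2.0)   # Mars-level gravity
print(f"Earth-gravity ring radius: {earth_ring:.0f} m")
print(f"Mars-gravity ring radius: {mars_ring:.0f} m")
```

[At 2 rpm the Earth-gravity ring comes out to a bit over 200 meters in radius, several times the width of today's largest ocean cruise ships, which is part of why launching such a structure whole seems implausible.]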
I think we're really far away. I 325 00:15:59,480 --> 00:16:01,680 Speaker 1: know that there are many people who disagree with me there. 326 00:16:02,080 --> 00:16:04,800 Speaker 1: But you know, construction on Earth is difficult, and 327 00:16:04,800 --> 00:16:07,840 Speaker 1: when you're constructing in space you also have the problems 328 00:16:07,880 --> 00:16:10,000 Speaker 1: of, like, you know, when you're away from the Sun, 329 00:16:10,080 --> 00:16:11,840 Speaker 1: it's hard to have a power source, and when it 330 00:16:11,880 --> 00:16:14,920 Speaker 1: gets cold, sometimes it gets so cold that you hit, 331 00:16:14,960 --> 00:16:17,680 Speaker 1: I think it's the ductile-to-brittle transition, where 332 00:16:17,680 --> 00:16:19,800 Speaker 1: when you have some metals, they go from being like 333 00:16:19,840 --> 00:16:22,480 Speaker 1: something that will sort of bend if you hit it 334 00:16:22,520 --> 00:16:25,040 Speaker 1: to something that will like break and crumble. So I 335 00:16:25,120 --> 00:16:27,080 Speaker 1: thought that one of the problems that the Titanic had 336 00:16:27,200 --> 00:16:29,400 Speaker 1: is that it was really really cold when it hit 337 00:16:29,440 --> 00:16:32,600 Speaker 1: the iceberg, and instead of just sort of like bending 338 00:16:32,680 --> 00:16:34,920 Speaker 1: in a little bit, like you know, if a baseball 339 00:16:35,000 --> 00:16:36,640 Speaker 1: hits your car, maybe you get like a little dent 340 00:16:36,920 --> 00:16:39,240 Speaker 1: in it, it transitioned to brittle, and so it just 341 00:16:39,240 --> 00:16:41,480 Speaker 1: sort of like crumbled away in that spot.
There's all 342 00:16:41,520 --> 00:16:44,160 Speaker 1: sorts of difficult things when you're working in cold environments, 343 00:16:44,240 --> 00:16:46,640 Speaker 1: and you know, if you're exposed. So, for example, the 344 00:16:46,640 --> 00:16:48,920 Speaker 1: International Space Station is exposed to the sun, it gets 345 00:16:48,960 --> 00:16:50,800 Speaker 1: really hot, and then it gets in the dark and 346 00:16:50,840 --> 00:16:53,800 Speaker 1: it's really cold. That causes a lot of expansion and contraction, and 347 00:16:53,840 --> 00:16:56,840 Speaker 1: it's difficult on things like the lubricants that you're gonna use, 348 00:16:57,280 --> 00:16:58,880 Speaker 1: and so you know, there's a lot of things that 349 00:16:58,920 --> 00:17:01,240 Speaker 1: we have to figure out that we really don't have 350 00:17:01,280 --> 00:17:03,360 Speaker 1: a lot of experience with. And you know, if you're 351 00:17:03,360 --> 00:17:06,800 Speaker 1: talking about building this giant, complicated cruise ship, I would 352 00:17:06,800 --> 00:17:09,200 Speaker 1: say we're a long way away from doing that using 353 00:17:09,200 --> 00:17:11,720 Speaker 1: materials that are just in space, but it would be 354 00:17:11,720 --> 00:17:13,960 Speaker 1: fun to see it happen. Isn't it very difficult to 355 00:17:14,000 --> 00:17:16,480 Speaker 1: make these kinds of predictions? Because often they are exponential. 356 00:17:16,520 --> 00:17:19,159 Speaker 1: You know, they're driven by market forces. But once you 357 00:17:19,280 --> 00:17:22,800 Speaker 1: create some new technology, it allows things to happen that 358 00:17:22,840 --> 00:17:25,000 Speaker 1: people didn't envision, and that allows things to happen that 359 00:17:25,040 --> 00:17:27,800 Speaker 1: people didn't envision.
I imagine there's a very broad possible 360 00:17:27,840 --> 00:17:30,280 Speaker 1: set of timelines where it could take a hundred years 361 00:17:30,359 --> 00:17:33,080 Speaker 1: or ten years or fifty years. But isn't it possible 362 00:17:33,080 --> 00:17:34,520 Speaker 1: that it could sort of happen in the next few 363 00:17:34,520 --> 00:17:36,879 Speaker 1: decades if the right things fall into place in the 364 00:17:36,960 --> 00:17:39,919 Speaker 1: right way? Yeah, if the right investment happens. It's a 365 00:17:39,960 --> 00:17:41,919 Speaker 1: little bit hard for me to imagine because it's going 366 00:17:42,000 --> 00:17:44,119 Speaker 1: to be so incredibly expensive, and that seems like it 367 00:17:44,160 --> 00:17:46,040 Speaker 1: will be a limiting factor for a while. But you know, 368 00:17:46,160 --> 00:17:49,040 Speaker 1: you've got SpaceX driving down the cost of sending stuff 369 00:17:49,040 --> 00:17:52,040 Speaker 1: into space. So maybe we can build construction equipment on 370 00:17:52,119 --> 00:17:54,000 Speaker 1: Earth and send it up into space and then get 371 00:17:54,040 --> 00:17:57,200 Speaker 1: experience with construction in space. So yeah, I mean totally. 372 00:17:57,240 --> 00:17:59,600 Speaker 1: I think if enough people decide to invest in this, 373 00:17:59,720 --> 00:18:02,800 Speaker 1: and we get some technologies that really break open our 374 00:18:02,840 --> 00:18:05,800 Speaker 1: ability to do construction in space, then you know, maybe 375 00:18:05,800 --> 00:18:08,680 Speaker 1: this could happen in the next couple decades, and then 376 00:18:08,720 --> 00:18:11,960 Speaker 1: maybe we could finally build that space-based solar power 377 00:18:12,040 --> 00:18:14,720 Speaker 1: system we've been hoping for. Yeah, check out our episode 378 00:18:14,720 --> 00:18:17,880 Speaker 1: where we discuss whether or not that's a great idea.
Surprise, surprise, 379 00:18:18,080 --> 00:18:22,600 Speaker 1: Kelly was skeptical. You were skeptical too, if I remember correctly. 380 00:18:22,880 --> 00:18:25,240 Speaker 1: You were maybe nicer about it, but you were 381 00:18:25,280 --> 00:18:27,840 Speaker 1: also skeptical. I was more hopeful. You know, I want 382 00:18:27,880 --> 00:18:30,040 Speaker 1: to see that happen. I think it sounds cool, except 383 00:18:30,040 --> 00:18:32,000 Speaker 1: for the bit where we're shooting laser beams to the 384 00:18:32,000 --> 00:18:34,679 Speaker 1: surface of the Earth from space. Yeah, they could have 385 00:18:34,720 --> 00:18:38,360 Speaker 1: some problems, geopolitical and otherwise. So another really fascinating 386 00:18:38,400 --> 00:18:40,960 Speaker 1: element of this cruise ship is the gravity. She works 387 00:18:41,000 --> 00:18:44,320 Speaker 1: really hard to make sure that there's comfortable gravity for 388 00:18:44,480 --> 00:18:47,119 Speaker 1: people on this cruise ship. I guess she imagines people 389 00:18:47,240 --> 00:18:49,000 Speaker 1: don't want to sort of float their way through the 390 00:18:49,040 --> 00:18:51,400 Speaker 1: trip to Mars. Why do you think that is? Don't 391 00:18:51,400 --> 00:18:54,240 Speaker 1: people want to go to space to experience zero gravity? 392 00:18:54,320 --> 00:18:55,959 Speaker 1: If you're in space for a long time, you can 393 00:18:56,000 --> 00:18:59,560 Speaker 1: have some medical problems related to life in microgravity.
394 00:18:59,600 --> 00:19:03,120 Speaker 1: But I remembered when we were doing the interview that 395 00:19:03,200 --> 00:19:05,240 Speaker 1: she had the trip only last eleven days. So 396 00:19:05,320 --> 00:19:07,480 Speaker 1: probably that's a short enough trip that you're not going 397 00:19:07,520 --> 00:19:09,720 Speaker 1: to get some of the, you know, like bone and 398 00:19:09,800 --> 00:19:12,280 Speaker 1: muscle degradation problems to such a degree that you have 399 00:19:12,359 --> 00:19:14,919 Speaker 1: to worry about it a lot. But when you're in 400 00:19:15,000 --> 00:19:17,640 Speaker 1: microgravity, your body sort of stretches out and your, 401 00:19:17,680 --> 00:19:20,600 Speaker 1: like, vertebrae start, you know, separating from each other. And 402 00:19:20,680 --> 00:19:26,040 Speaker 1: apparently that's a very uncomfortable feeling that astronauts complain about. Yeah, yeah, 403 00:19:26,119 --> 00:19:29,040 Speaker 1: super uncomfortable, and so having some gravity will help with 404 00:19:29,080 --> 00:19:33,040 Speaker 1: that problem. And then there's also, let's say, sanitation issues, 405 00:19:33,200 --> 00:19:35,720 Speaker 1: which are improved by having a little bit of gravity. 406 00:19:35,880 --> 00:19:38,359 Speaker 1: Are we talking about like spilling your cocktail? So spilling 407 00:19:38,359 --> 00:19:40,760 Speaker 1: your cocktail is one, but I was mostly thinking about 408 00:19:40,800 --> 00:19:44,119 Speaker 1: bathroom-related issues. So, like, you know, on the 409 00:19:44,359 --> 00:19:48,000 Speaker 1: ISS, for example, there's the complicated two-hose vacuum 410 00:19:48,080 --> 00:19:51,520 Speaker 1: system to take care of number one and number two. And even 411 00:19:51,560 --> 00:19:54,760 Speaker 1: with this vacuum system, they still get quote unquote "escapees" 412 00:19:54,760 --> 00:19:57,399 Speaker 1: from time to time.
And so you know, in 413 00:19:57,400 --> 00:20:00,680 Speaker 1: a world where everything sort of goes down, things 414 00:20:00,680 --> 00:20:03,720 Speaker 1: are a little bit less complicated when it comes to, yeah, 415 00:20:03,760 --> 00:20:06,639 Speaker 1: to sanitation issues and losing track of your stuff, and 416 00:20:06,680 --> 00:20:08,480 Speaker 1: so it's a bit more comfortable and a more 417 00:20:08,560 --> 00:20:10,760 Speaker 1: manageable thing when you have a little bit of gravity. 418 00:20:10,960 --> 00:20:13,320 Speaker 1: And this really is a solution, right? If you have 419 00:20:13,440 --> 00:20:17,400 Speaker 1: a huge rotating space station, it can provide the experience 420 00:20:17,520 --> 00:20:19,960 Speaker 1: of gravity. You see it in lots of fictional portrayals of space 421 00:20:20,000 --> 00:20:22,360 Speaker 1: travel because it really is a thing. For example, have you 422 00:20:22,359 --> 00:20:25,000 Speaker 1: ever been on the amusement park ride where they spin 423 00:20:25,040 --> 00:20:27,240 Speaker 1: you around really really fast and you get pinned to 424 00:20:27,320 --> 00:20:30,000 Speaker 1: the wall and then they drop the floor out beneath you? 425 00:20:30,000 --> 00:20:32,119 Speaker 1: You stay pinned to the wall. You do not drop 426 00:20:32,200 --> 00:20:34,600 Speaker 1: and plummet to your death, because there is the experience 427 00:20:34,640 --> 00:20:38,000 Speaker 1: of the centrifugal force pushing you against the outer edge. 428 00:20:38,160 --> 00:20:40,200 Speaker 1: If the wheel is big enough and spinning fast enough, 429 00:20:40,240 --> 00:20:42,960 Speaker 1: you could walk along that surface. So this is a 430 00:20:43,000 --> 00:20:45,600 Speaker 1: real thing. The science there is real. It is, but 431 00:20:45,640 --> 00:20:47,800 Speaker 1: it's also weird.
And one of the things that I 432 00:20:47,840 --> 00:20:50,800 Speaker 1: appreciated about The Spare Man was that she incorporated how 433 00:20:50,920 --> 00:20:53,879 Speaker 1: life in a rotating environment is weird. So you know, 434 00:20:53,960 --> 00:20:57,240 Speaker 1: there's a story where they're throwing something to the dog 435 00:20:57,800 --> 00:21:01,679 Speaker 1: in one of the two environments where they're using centrifugal force. 436 00:21:02,080 --> 00:21:03,879 Speaker 1: So you know, when you're rotating and spinning, if you 437 00:21:03,920 --> 00:21:06,960 Speaker 1: throw something, it's going to curve a little bit, and 438 00:21:07,320 --> 00:21:09,879 Speaker 1: getting used to that is probably going to take a 439 00:21:09,920 --> 00:21:12,560 Speaker 1: little bit of practice. So do you want to explain 440 00:21:12,880 --> 00:21:15,880 Speaker 1: Coriolis forces and how those work? Yeah, there's an important 441 00:21:15,880 --> 00:21:18,040 Speaker 1: wrinkle there. If you are standing on the surface of 442 00:21:18,080 --> 00:21:20,560 Speaker 1: the Earth, you only feel a force down, right, the 443 00:21:20,560 --> 00:21:23,080 Speaker 1: force from gravity. But if you're on the inside of 444 00:21:23,080 --> 00:21:26,160 Speaker 1: a merry-go-round or a huge spinning spaceship, you feel 445 00:21:26,160 --> 00:21:29,160 Speaker 1: almost exactly the same force. You feel a force down, 446 00:21:29,200 --> 00:21:32,280 Speaker 1: sort of outwards away from the center, which would press 447 00:21:32,320 --> 00:21:34,480 Speaker 1: you up against the ring of the spaceship, giving you 448 00:21:34,520 --> 00:21:39,280 Speaker 1: the experience of gravity. But the spinning also generates another force. Imagine, 449 00:21:39,280 --> 00:21:41,400 Speaker 1: for example, you're on a merry-go-round and you're 450 00:21:41,440 --> 00:21:43,480 Speaker 1: throwing a ball to your friend.
The ball is going 451 00:21:43,520 --> 00:21:44,960 Speaker 1: to travel in a straight line. From the point of 452 00:21:45,000 --> 00:21:46,800 Speaker 1: view of somebody on the ground who's not on the 453 00:21:46,800 --> 00:21:48,880 Speaker 1: merry-go-round, they'll see you throw the ball from 454 00:21:48,920 --> 00:21:50,760 Speaker 1: one spot and it'll fly through the air in a 455 00:21:50,800 --> 00:21:54,119 Speaker 1: straight line to somebody else. But because you're on the 456 00:21:54,160 --> 00:21:56,720 Speaker 1: merry-go-round and you're spinning, you're not going to 457 00:21:56,800 --> 00:21:59,440 Speaker 1: see the ball moving in a straight line. From your perspective, 458 00:21:59,440 --> 00:22:02,400 Speaker 1: it's going to look like the ball curves, because the merry-go-round 459 00:22:02,440 --> 00:22:05,040 Speaker 1: is sort of spinning under the ball as it flies. 460 00:22:05,240 --> 00:22:07,800 Speaker 1: So we call this the Coriolis effect. Essentially, it's a 461 00:22:07,840 --> 00:22:10,720 Speaker 1: force sideways. So if you're on this space station and 462 00:22:10,760 --> 00:22:13,199 Speaker 1: you do throw a ball to your dog, it's not 463 00:22:13,240 --> 00:22:15,000 Speaker 1: going to go in what feels like a straight line 464 00:22:15,359 --> 00:22:18,359 Speaker 1: to you. Again, that's just a product of this whole 465 00:22:18,440 --> 00:22:21,960 Speaker 1: thing spinning, of this artificial gravity scenario. Yeah, so if 466 00:22:22,000 --> 00:22:24,920 Speaker 1: you are playing baseball and it's Earth team versus the 467 00:22:25,000 --> 00:22:27,800 Speaker 1: rotating space station team, the rotating space station team is 468 00:22:27,800 --> 00:22:30,840 Speaker 1: going to have a massive home team advantage, because they'll 469 00:22:30,840 --> 00:22:33,680 Speaker 1: be used to the Coriolis effect and it's gonna totally 470 00:22:33,760 --> 00:22:36,919 Speaker 1: throw the Earth-based team off their game.
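[Editor's note: the curving-ball effect described above is easy to sketch numerically. This snippet is an illustration added for this writeup, not from the episode; the 5 m/s throw and 0.5 rad/s spin rate are made-up values. It traces a ball that flies perfectly straight in the inertial frame and shows how that same path looks curved from the rotating platform.]

```python
import math

def rotating_frame_path(speed, omega, t_steps, dt):
    """Positions of a ball thrown in a straight line (inertial frame),
    re-expressed in a frame rotating at omega rad/s: rotate each
    inertial-frame point back by the angle the frame has turned."""
    path = []
    for i in range(t_steps + 1):
        t = i * dt
        x, y = speed * t, 0.0  # straight line along +x in the inertial frame
        c, s = math.cos(-omega * t), math.sin(-omega * t)
        path.append((x * c - y * s, x * s + y * c))
    return path

# Thrown at 5 m/s on a platform spinning at 0.5 rad/s, tracked for 1 second
path = rotating_frame_path(speed=5.0, omega=0.5, t_steps=10, dt=0.1)
print(path[-1])  # the y-coordinate has drifted sideways: the apparent Coriolis deflection
```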
Mm, I think 471 00:22:36,960 --> 00:22:38,800 Speaker 1: that would be really fun in the far future, the 472 00:22:38,880 --> 00:22:42,320 Speaker 1: Intergalactic Baseball Championship. It also made me think about another issue, 473 00:22:42,440 --> 00:22:45,479 Speaker 1: which is that if the wheel is not really very big, 474 00:22:45,680 --> 00:22:49,280 Speaker 1: then the centrifugal force you feel changes as a function 475 00:22:49,320 --> 00:22:51,720 Speaker 1: of the distance from the center. So there's like no 476 00:22:51,800 --> 00:22:54,600 Speaker 1: force at the center and a larger force at the edge, 477 00:22:54,800 --> 00:22:57,440 Speaker 1: and if you start walking towards the center, it goes 478 00:22:57,480 --> 00:22:59,480 Speaker 1: down and down and down. So if you're just on 479 00:22:59,520 --> 00:23:01,919 Speaker 1: the ring, it's fine. But imagine you're kind of tall 480 00:23:02,200 --> 00:23:04,520 Speaker 1: or the ring is not very large; then the force 481 00:23:04,560 --> 00:23:07,240 Speaker 1: of gravity you're feeling on your head is less than 482 00:23:07,280 --> 00:23:09,960 Speaker 1: the force of gravity you're feeling on your feet. That 483 00:23:10,040 --> 00:23:12,720 Speaker 1: must be a little bit disorienting. There's so many things 484 00:23:12,800 --> 00:23:15,399 Speaker 1: about building these rotating space stations that to me are 485 00:23:15,440 --> 00:23:19,000 Speaker 1: like counterintuitive and complicated. So you know, the big wheels 486 00:23:19,040 --> 00:23:23,280 Speaker 1: are going to be crazy expensive, and the smaller wheels 487 00:23:23,359 --> 00:23:25,680 Speaker 1: are going to have problems like what you just said. 488 00:23:25,760 --> 00:23:28,160 Speaker 1: You know, your feet are gonna feel a different gravity 489 00:23:28,160 --> 00:23:31,600 Speaker 1: than your head.
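[Editor's note: the head-to-feet difference follows from the spin-gravity formula a = ω²r, where apparent gravity scales linearly with distance from the spin axis. This back-of-the-envelope sketch is an illustration added for this writeup, not from the episode; the 10 m ring and 1.8 m passenger height are made-up values.]

```python
import math

G = 9.81  # m/s^2, target apparent gravity at the floor

def spin_gravity(radius_m, omega):
    """Apparent gravity on a spinning ring: a = omega^2 * r."""
    return omega ** 2 * radius_m

# Spin a small 10 m ring just fast enough for 1 g at the floor
radius = 10.0
omega = math.sqrt(G / radius)               # rad/s (about 9.5 rpm)
floor_g = spin_gravity(radius, omega)       # 9.81 m/s^2 by construction
head_g = spin_gravity(radius - 1.8, omega)  # head of a 1.8 m tall passenger
print(floor_g, head_g)  # head level feels ~18% less gravity than the floor
```

On a 10 m ring, the head of a standing passenger is 18% closer to the axis, so it feels 18% less gravity; on a 1 km ring the same passenger's head-to-feet difference drops to under 0.2%, which is why bigger wheels are more comfortable.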
And also, when you have these smaller wheels, 490 00:23:31,720 --> 00:23:34,200 Speaker 1: you get what I think of as the washing machine effect. 491 00:23:34,280 --> 00:23:37,000 Speaker 1: So if they're unbalanced, you're going to get, like, you know, 492 00:23:37,119 --> 00:23:40,160 Speaker 1: the ka-chunk, chunk, chunk thing that happens in your 493 00:23:40,200 --> 00:23:42,359 Speaker 1: washing machine when you have too many towels on one 494 00:23:42,400 --> 00:23:44,280 Speaker 1: side and nothing on the other. Is that what they 495 00:23:44,280 --> 00:23:49,440 Speaker 1: refer to it as in the scientific literature, the chunk effect? Yes, oh, probably, probably, 496 00:23:49,520 --> 00:23:52,200 Speaker 1: but maybe not. So we read these proposals when 497 00:23:52,200 --> 00:23:54,879 Speaker 1: we were reading about rotating space stations where they were 498 00:23:54,880 --> 00:23:58,480 Speaker 1: talking about maybe having a water system that would counter 499 00:23:58,600 --> 00:24:01,400 Speaker 1: the movement of people. So if everybody was on one 500 00:24:01,480 --> 00:24:04,720 Speaker 1: side for a potluck, they'd have to move something 501 00:24:04,760 --> 00:24:06,520 Speaker 1: like a bunch of water to the other side to 502 00:24:06,600 --> 00:24:08,920 Speaker 1: counterbalance that, so you don't end up with, like, 503 00:24:09,000 --> 00:24:12,000 Speaker 1: something catastrophic happening, like a hole on the side where 504 00:24:12,040 --> 00:24:14,040 Speaker 1: you've got the potluck and everybody being ripped out 505 00:24:14,040 --> 00:24:16,120 Speaker 1: into the void of space. But that would be bad. 506 00:24:16,200 --> 00:24:17,960 Speaker 1: That would definitely be a bad way to end your 507 00:24:17,960 --> 00:24:20,400 Speaker 1: potluck. And why is this a bigger issue for 508 00:24:20,440 --> 00:24:23,159 Speaker 1: the smaller wheel?
Is it because people are just like 509 00:24:23,200 --> 00:24:25,680 Speaker 1: a larger fraction of the mass of the wheel, or 510 00:24:25,680 --> 00:24:27,360 Speaker 1: does it have to do with the sort of lever 511 00:24:27,600 --> 00:24:29,600 Speaker 1: arm? My understanding is that it has to do with 512 00:24:29,640 --> 00:24:32,120 Speaker 1: the mass. When you get big enough, the puny humans 513 00:24:32,200 --> 00:24:35,280 Speaker 1: moving around don't cause the same kind of problems. You know, 514 00:24:35,320 --> 00:24:38,119 Speaker 1: like, your washing machine doesn't have to be perfectly balanced, 515 00:24:38,359 --> 00:24:40,480 Speaker 1: but it needs to be balanced enough. And when you 516 00:24:40,480 --> 00:24:42,520 Speaker 1: get a big enough wheel, the humans moving around don't 517 00:24:42,520 --> 00:24:44,320 Speaker 1: matter as much. But tell me about this lever 518 00:24:44,440 --> 00:24:46,400 Speaker 1: arm thing; you think that might matter also? No, I think 519 00:24:46,400 --> 00:24:48,600 Speaker 1: it probably is just a mass effect. You know, if 520 00:24:48,600 --> 00:24:51,240 Speaker 1: you have like ants crawling around inside your washing machine, 521 00:24:51,400 --> 00:24:53,000 Speaker 1: they don't set it off because, as you say, they're 522 00:24:53,000 --> 00:24:55,560 Speaker 1: a tiny fraction of the mass. So essentially, you need 523 00:24:55,640 --> 00:24:58,879 Speaker 1: this thing to be so massive that humans moving around 524 00:24:59,080 --> 00:25:01,840 Speaker 1: are not going to significantly imbalance it. And this 525 00:25:01,920 --> 00:25:04,720 Speaker 1: thing has a lot of momentum, right? It's spinning, it's big, 526 00:25:04,800 --> 00:25:07,080 Speaker 1: it's massive, and so if it gets imbalanced and, like, 527 00:25:07,440 --> 00:25:10,520 Speaker 1: comes off of its axle, that would definitely be catastrophic.
528 00:25:10,520 --> 00:25:12,840 Speaker 1: That happened in the latest season of For All Mankind. 529 00:25:12,880 --> 00:25:14,200 Speaker 1: I don't know if you're a fan of that show. 530 00:25:14,359 --> 00:25:17,439 Speaker 1: Almost everybody in my life who knows about space has 531 00:25:17,480 --> 00:25:20,240 Speaker 1: told me I have to watch that series, but everything 532 00:25:20,280 --> 00:25:22,040 Speaker 1: in my life is waiting until my book is done 533 00:25:22,040 --> 00:25:25,120 Speaker 1: on November first. Then I will enjoy that series, but 534 00:25:25,560 --> 00:25:28,120 Speaker 1: until November first, there's no time for joy. Well, I'm 535 00:25:28,119 --> 00:25:29,679 Speaker 1: hoping to get a writer from that series on the 536 00:25:29,680 --> 00:25:31,879 Speaker 1: podcast to talk about the science of that. But essentially, 537 00:25:31,920 --> 00:25:34,240 Speaker 1: you're telling us that these wheels either have to be 538 00:25:34,280 --> 00:25:37,959 Speaker 1: really really big so they're stable, but then they're hugely expensive, 539 00:25:38,160 --> 00:25:40,080 Speaker 1: or they have to be smaller to be cheaper, in 540 00:25:40,119 --> 00:25:43,560 Speaker 1: which case you need very complicated technology to make sure 541 00:25:43,600 --> 00:25:46,439 Speaker 1: they don't get imbalanced. That's my understanding. Yes, it's not 542 00:25:46,480 --> 00:25:48,800 Speaker 1: an easy engineering problem either way. Right, and then if 543 00:25:48,840 --> 00:25:50,720 Speaker 1: you have a bunch of guests on your cruise ship 544 00:25:50,720 --> 00:25:53,160 Speaker 1: and they're all a little tipsy, you can't exactly make 545 00:25:53,200 --> 00:25:55,520 Speaker 1: sure they're all, like, staying on opposite sides of the 546 00:25:55,600 --> 00:25:57,639 Speaker 1: ring at the same time.
Man, what a drag it 547 00:25:57,680 --> 00:26:00,159 Speaker 1: would be to have to, like, you know, say, oh, hey, 548 00:26:00,200 --> 00:26:02,119 Speaker 1: we've got a group of ten people and we're all friends, 549 00:26:02,119 --> 00:26:03,720 Speaker 1: but we can't be in the same place at the 550 00:26:03,760 --> 00:26:06,560 Speaker 1: same time. So you know, we can only party 551 00:26:06,600 --> 00:26:08,359 Speaker 1: in groups of five, and we have to be 552 00:26:08,400 --> 00:26:11,280 Speaker 1: spread out anyway. Humans are never gonna be able 553 00:26:11,280 --> 00:26:13,159 Speaker 1: to follow rules like that. Well, I like that in 554 00:26:13,160 --> 00:26:14,880 Speaker 1: this book she really did take a lot of those 555 00:26:14,880 --> 00:26:17,080 Speaker 1: issues seriously and thought about what it would be like 556 00:26:17,240 --> 00:26:20,399 Speaker 1: to be on the ship. She also, for example, really 557 00:26:20,440 --> 00:26:24,200 Speaker 1: worked hard to make sure that realistic time lags were 558 00:26:24,240 --> 00:26:28,080 Speaker 1: a feature in the experience of these cruise shippers, you know, 559 00:26:28,119 --> 00:26:30,280 Speaker 1: when they wanted to talk to their lawyer, for example. 560 00:26:30,280 --> 00:26:32,320 Speaker 1: As they get further and further from Earth, it gets 561 00:26:32,400 --> 00:26:35,520 Speaker 1: much more expensive, and also the time lag increases, so 562 00:26:35,520 --> 00:26:37,880 Speaker 1: they have these weird conversations where they have to wait 563 00:26:37,920 --> 00:26:42,080 Speaker 1: three minutes to hear a response and then six minutes, et cetera, et cetera. Yeah, 564 00:26:42,119 --> 00:26:45,000 Speaker 1: she was really careful with detail in that.
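[Editor's note: the time lags mentioned above follow directly from the speed of light. This quick check is an illustration added for this writeup, not from the episode; the distance is Mars at roughly its closest approach.]

```python
C_KM_PER_S = 299_792.458  # speed of light in km/s

def one_way_delay_minutes(distance_km):
    """Minutes for a radio signal to cover the given distance."""
    return distance_km / C_KM_PER_S / 60

# Mars at closest approach is roughly 54.6 million km from Earth
print(round(one_way_delay_minutes(54.6e6), 1))  # about 3 minutes each way
```

So a question takes about three minutes to arrive and the answer another three minutes to come back, which matches the roughly six-minute round trips the characters endure; near Mars's farthest distance (about 400 million km) the one-way lag exceeds twenty minutes.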
It really 565 00:26:45,040 --> 00:26:47,920 Speaker 1: impressed me, the way she had the science so clearly 566 00:26:47,960 --> 00:26:50,359 Speaker 1: figured out, and she wove it really nicely into the 567 00:26:50,359 --> 00:26:52,399 Speaker 1: way the story was told. It was awesome. That was 568 00:26:52,440 --> 00:26:55,520 Speaker 1: really impressive. And because it's a space murder, there are 569 00:26:55,520 --> 00:26:58,520 Speaker 1: a lot of really interesting legal issues, like what laws 570 00:26:58,520 --> 00:27:01,199 Speaker 1: actually govern your behavior when you're in space? Who's in 571 00:27:01,320 --> 00:27:04,320 Speaker 1: charge? Is the captain of the ship basically a dictator 572 00:27:04,400 --> 00:27:07,320 Speaker 1: until they get to their destination? And you know, can 573 00:27:07,400 --> 00:27:10,000 Speaker 1: you do murder in space? Is it legal? Is it illegal? 574 00:27:10,280 --> 00:27:12,040 Speaker 1: I know that you've done a whole bunch of reading about that. 575 00:27:12,280 --> 00:27:15,440 Speaker 1: What is the basic situation? Can you murder in space? Well, 576 00:27:15,480 --> 00:27:18,199 Speaker 1: I mean, you can technically murder anywhere. Don't take that 577 00:27:18,240 --> 00:27:21,119 Speaker 1: as permission, people. That was "technically." So right, so you 578 00:27:21,359 --> 00:27:25,840 Speaker 1: shouldn't be murdering anyone anywhere. But in space we still 579 00:27:25,880 --> 00:27:28,400 Speaker 1: need to work out some of the rules. So the 580 00:27:28,640 --> 00:27:33,160 Speaker 1: Outer Space Treaty does stipulate that countries are responsible for, 581 00:27:33,400 --> 00:27:37,160 Speaker 1: like, the actions of their citizens in space, so there 582 00:27:37,359 --> 00:27:40,879 Speaker 1: is some responsibility sort of assigned in that case. But 583 00:27:40,920 --> 00:27:43,600 Speaker 1: these things can get complicated.
There's this example of the 584 00:27:43,640 --> 00:27:46,240 Speaker 1: case in the Arctic that was thought of as an 585 00:27:46,280 --> 00:27:48,760 Speaker 1: exciting case for sort of figuring this stuff out, where 586 00:27:48,840 --> 00:27:51,159 Speaker 1: they were on an ice floe in the Arctic. So 587 00:27:51,200 --> 00:27:54,800 Speaker 1: this moves; it can, like, move between the territorial waters 588 00:27:54,840 --> 00:27:59,600 Speaker 1: of multiple countries. And an American citizen accidentally killed, I 589 00:27:59,640 --> 00:28:03,720 Speaker 1: think it was, his boss after a fight about some 590 00:28:03,840 --> 00:28:07,240 Speaker 1: raisin wine which was stolen by a third member of 591 00:28:07,280 --> 00:28:09,399 Speaker 1: the crew, who was generally considered to be an all- 592 00:28:09,400 --> 00:28:12,600 Speaker 1: around kind of jerk face. And the guy who shot 593 00:28:12,640 --> 00:28:15,359 Speaker 1: his boss, it was, according to him, an accident. His 594 00:28:15,560 --> 00:28:17,720 Speaker 1: rifle went off. It wasn't supposed to go off. But 595 00:28:17,840 --> 00:28:20,440 Speaker 1: everybody involved was American, so it was pretty clear that 596 00:28:20,560 --> 00:28:23,240 Speaker 1: jurisdiction should go to America. But these things can get 597 00:28:23,280 --> 00:28:25,280 Speaker 1: more complicated. So there have been some issues on the 598 00:28:25,280 --> 00:28:29,159 Speaker 1: International Space Station, and the way those are handled is 599 00:28:29,160 --> 00:28:33,840 Speaker 1: according to this 1998 Intergovernmental Agreement, or IGA, 600 00:28:34,040 --> 00:28:37,960 Speaker 1: that essentially said, okay, so we're gonna call the 601 00:28:38,080 --> 00:28:42,640 Speaker 1: modules a sort of quasi-sovereign territory.
So Japan's Kibo 602 00:28:42,760 --> 00:28:46,760 Speaker 1: module is effectively like visiting a chunk of Japan, whereas 603 00:28:46,800 --> 00:28:50,160 Speaker 1: the individuals remain under the law of their home nations. 604 00:28:50,320 --> 00:28:53,400 Speaker 1: So this would be like an American going to visit, like, 605 00:28:53,640 --> 00:28:56,760 Speaker 1: you know, an Australian base in Antarctica. So there was 606 00:28:56,800 --> 00:28:59,200 Speaker 1: recently this problem on the International Space Station where there 607 00:28:59,280 --> 00:29:01,480 Speaker 1: was a hole in one of the Soyuzes. So 608 00:29:01,560 --> 00:29:05,120 Speaker 1: this is a spacecraft that, for example, can bring astronauts 609 00:29:05,160 --> 00:29:07,600 Speaker 1: and cosmonauts from Russia up to the ISS, 610 00:29:08,120 --> 00:29:11,040 Speaker 1: and someone had drilled a hole in it. It was 611 00:29:11,120 --> 00:29:14,640 Speaker 1: discovered because the station was very slowly depressurizing, and so 612 00:29:14,680 --> 00:29:17,400 Speaker 1: they were like, oh, what's going on? And the two 613 00:29:17,400 --> 00:29:21,560 Speaker 1: competing theories are that in Russia somebody who was building 614 00:29:21,560 --> 00:29:25,000 Speaker 1: the Soyuz accidentally made a mistake and drilled a hole, 615 00:29:25,160 --> 00:29:27,640 Speaker 1: and instead of admitting, "oh, my bad, I put a 616 00:29:27,640 --> 00:29:31,480 Speaker 1: hole in the Soyuz," they hastily patched it up, and 617 00:29:31,600 --> 00:29:33,800 Speaker 1: the patch didn't give way until it was in the 618 00:29:33,840 --> 00:29:36,360 Speaker 1: vacuum of space for a while. So that would be 619 00:29:36,440 --> 00:29:40,040 Speaker 1: Russia's fault.
But Russia claims that one of the astronauts 620 00:29:40,040 --> 00:29:42,280 Speaker 1: on the US side had a mental break after 621 00:29:42,280 --> 00:29:44,840 Speaker 1: getting some bad medical news from one of the medical 622 00:29:44,880 --> 00:29:48,080 Speaker 1: experiments in space, and that that astronaut kind of went 623 00:29:48,360 --> 00:29:50,400 Speaker 1: off the deep end and wanted to go home early, 624 00:29:50,720 --> 00:29:53,240 Speaker 1: so they drilled a hole in the Soyuz to try 625 00:29:53,280 --> 00:29:55,400 Speaker 1: to freak everyone out and get everybody to go home early. 626 00:29:55,720 --> 00:29:58,800 Speaker 1: What, drilled a hole in the space station they were 627 00:29:58,840 --> 00:30:03,200 Speaker 1: currently living in? Yeah, right, that seems ridiculous. And so 628 00:30:03,520 --> 00:30:06,040 Speaker 1: the US is like, no, clearly that's not what happened, 629 00:30:06,120 --> 00:30:08,840 Speaker 1: and Russia was like, okay, well, let us do a 630 00:30:08,960 --> 00:30:11,920 Speaker 1: lie detector test. And the US was like, one, no, 631 00:30:12,200 --> 00:30:14,680 Speaker 1: because you're being ridiculous; two, those don't even work, 632 00:30:15,160 --> 00:30:17,880 Speaker 1: like, why would we let you do that? And so anyway, 633 00:30:17,880 --> 00:30:20,360 Speaker 1: there's just been this back and forth where Russia is like, well, 634 00:30:20,440 --> 00:30:22,360 Speaker 1: it's weird that the camera that was supposed to be 635 00:30:22,360 --> 00:30:24,640 Speaker 1: watching the Soyuz at that time wasn't working. What's 636 00:30:24,640 --> 00:30:27,760 Speaker 1: going on there? And so on both sides it's, like, heating up.
637 00:30:27,800 --> 00:30:29,800 Speaker 1: And the deal is, at the end of the day, 638 00:30:29,880 --> 00:30:33,080 Speaker 1: that the rules governing the ISS say something 639 00:30:33,120 --> 00:30:36,200 Speaker 1: to the effect of, like, you know, Russia can say 640 00:30:36,240 --> 00:30:40,120 Speaker 1: to the US, we want to prosecute the astronaut under 641 00:30:40,200 --> 00:30:43,560 Speaker 1: Russian law because we think they put everybody at risk 642 00:30:43,720 --> 00:30:47,479 Speaker 1: and damaged the equipment. When that happens, the United States 643 00:30:47,480 --> 00:30:49,880 Speaker 1: gets to say, all right, well, we have ninety days 644 00:30:50,000 --> 00:30:52,080 Speaker 1: and we're going to try to show you that we're 645 00:30:52,120 --> 00:30:56,080 Speaker 1: going to prosecute the offender ourselves and deal with this appropriately. 646 00:30:56,440 --> 00:30:58,640 Speaker 1: And if nothing happens in ninety days, then Russia is 647 00:30:58,640 --> 00:31:01,240 Speaker 1: supposed to be allowed to go forward with their investigation. 648 00:31:01,440 --> 00:31:04,320 Speaker 1: But you know what actually happens in practice is that 649 00:31:04,360 --> 00:31:07,280 Speaker 1: the US is not willing to just give up their astronaut, 650 00:31:07,360 --> 00:31:10,040 Speaker 1: and so geopolitics is getting in the way. Like, I 651 00:31:10,040 --> 00:31:12,560 Speaker 1: think in any of these cases where there's any ambiguity 652 00:31:12,600 --> 00:31:14,640 Speaker 1: at all, it's not going to work out the way 653 00:31:14,640 --> 00:31:16,840 Speaker 1: it's supposed to, because no one's going to give up 654 00:31:16,840 --> 00:31:20,240 Speaker 1: their citizen to another country, let alone Russia. But aside 655 00:31:20,240 --> 00:31:24,720 Speaker 1: from the fascinating digression here into a true crime space mystery 656 00:31:25,040 --> 00:31:27,280 Speaker 1: that might have actually happened on the ISS.
657 00:31:27,400 --> 00:31:30,520 Speaker 1: That's fascinating. Are you saying that the situation is that 658 00:31:30,560 --> 00:31:33,040 Speaker 1: the I S S is multiple countries? As you move 659 00:31:33,120 --> 00:31:35,560 Speaker 1: from one module to another, you're like under the laws 660 00:31:35,560 --> 00:31:37,840 Speaker 1: of Japan or under the laws of Russia. So if 661 00:31:37,840 --> 00:31:40,320 Speaker 1: you kill somebody on the space station, it matters in 662 00:31:40,440 --> 00:31:42,840 Speaker 1: which part of the space station you did it? Yes, 663 00:31:42,880 --> 00:31:45,440 Speaker 1: to some extent it matters. And sorry about the digression. 664 00:31:45,600 --> 00:31:49,040 Speaker 1: I think it's a fascinating story. But so you know, 665 00:31:49,080 --> 00:31:51,880 Speaker 1: if you had an American who killed another American 666 00:31:52,040 --> 00:31:55,200 Speaker 1: in the Japanese module, probably the U S would deal 667 00:31:55,240 --> 00:31:57,440 Speaker 1: with that. But yes, there is some sort of territory 668 00:31:57,480 --> 00:32:00,520 Speaker 1: that muddies the waters even more and makes things even more complicated. 669 00:32:00,600 --> 00:32:02,280 Speaker 1: I guess the point is we don't have this all 670 00:32:02,320 --> 00:32:05,040 Speaker 1: worked out yet. There aren't nice, clear rules to follow. 671 00:32:05,280 --> 00:32:08,400 Speaker 1: So in the story, Mary Robinette sort of created 672 00:32:08,440 --> 00:32:10,760 Speaker 1: a system to deal with it, and it's interesting 673 00:32:10,800 --> 00:32:13,280 Speaker 1: talking to her about how she settled on what system 674 00:32:13,320 --> 00:32:15,720 Speaker 1: to use.
Yeah, and I bet space lawyers get really 675 00:32:15,720 --> 00:32:18,520 Speaker 1: excited when somebody gets murdered in a new way in 676 00:32:18,600 --> 00:32:20,880 Speaker 1: a new place that's never happened before, because like it 677 00:32:20,880 --> 00:32:24,600 Speaker 1: sets some new precedent for how to handle these things. There's 678 00:32:24,600 --> 00:32:26,800 Speaker 1: a little bit of glee in their voice, perhaps. All right, 679 00:32:26,840 --> 00:32:28,480 Speaker 1: that covers a little bit of the science and legal 680 00:32:28,480 --> 00:32:31,160 Speaker 1: issues that are raised in the fun book The Spare Man. 681 00:32:31,320 --> 00:32:32,800 Speaker 1: We're gonna take a break and when we come back, 682 00:32:32,800 --> 00:32:35,480 Speaker 1: we're gonna talk to the author herself and hear about 683 00:32:35,520 --> 00:32:50,680 Speaker 1: her writing process. All right, we're back and we're talking 684 00:32:50,680 --> 00:32:54,120 Speaker 1: about the novel The Spare Man by Mary Robinette Kowal, 685 00:32:54,560 --> 00:32:57,320 Speaker 1: and here's our fun interview with her. Well, then, it's 686 00:32:57,400 --> 00:33:00,320 Speaker 1: my great pleasure to welcome back to the podcast 687 00:33:00,400 --> 00:33:03,280 Speaker 1: Mary Robinette Kowal. She's been a guest before for her 688 00:33:03,320 --> 00:33:07,360 Speaker 1: excellent book Calculating Stars, which won Hugo and Nebula Awards. 689 00:33:07,480 --> 00:33:09,640 Speaker 1: That's a pretty big deal. And for those of you 690 00:33:09,640 --> 00:33:12,280 Speaker 1: who don't remember, she has a really fun background. Before 691 00:33:12,320 --> 00:33:15,600 Speaker 1: she started writing, she worked as a puppeteer and also 692 00:33:15,680 --> 00:33:19,200 Speaker 1: worked for Jim Henson and Sesame Street. Mary Robinette, thank 693 00:33:19,200 --> 00:33:21,480 Speaker 1: you very much for coming back to the podcast.
Thank 694 00:33:21,520 --> 00:33:24,040 Speaker 1: you so much for having me. I'm delighted to be here. Well, 695 00:33:24,120 --> 00:33:26,560 Speaker 1: last time you were here, we asked you our standard 696 00:33:26,560 --> 00:33:28,640 Speaker 1: set of questions for an author who has come to 697 00:33:28,640 --> 00:33:31,040 Speaker 1: the podcast the first time. But since you're coming back, 698 00:33:31,280 --> 00:33:33,520 Speaker 1: I have another question for you. I see that you 699 00:33:33,560 --> 00:33:37,000 Speaker 1: have served as President of the Science Fiction and Fantasy 700 00:33:37,040 --> 00:33:40,040 Speaker 1: Writers of America, and I was wondering if that gives you, 701 00:33:40,080 --> 00:33:43,200 Speaker 1: like, any broad powers. You get to like settle disputes 702 00:33:43,240 --> 00:33:46,320 Speaker 1: over deep questions like whether Han shot first? Or are 703 00:33:46,320 --> 00:33:48,840 Speaker 1: you in charge of foreign policy, letting you declare war 704 00:33:48,880 --> 00:33:52,320 Speaker 1: against other guilds, or Disney maybe? Yes, yes. In fact, 705 00:33:52,360 --> 00:33:55,600 Speaker 1: I've gone into battle on the Disney front lines. That's been, 706 00:33:55,840 --> 00:33:58,080 Speaker 1: you know, proud service that I've done there. It also 707 00:33:58,120 --> 00:34:01,560 Speaker 1: gives me the ability to herd cats and to shoot laser 708 00:34:01,640 --> 00:34:04,040 Speaker 1: beams from my eyes. I did have to give up 709 00:34:04,040 --> 00:34:06,440 Speaker 1: the laser beam power upon leaving office though. That was 710 00:34:06,480 --> 00:34:08,520 Speaker 1: a skill I miss. I missed that. That's got to 711 00:34:08,520 --> 00:34:10,560 Speaker 1: be disappointing, no doubt about it. You know, it does 712 00:34:10,640 --> 00:34:13,040 Speaker 1: make reading easier. It was very hard to read while 713 00:34:13,080 --> 00:34:15,480 Speaker 1: in office.
So one thing that makes me feel better 714 00:34:15,520 --> 00:34:18,920 Speaker 1: when I'm feeling low is a good drink. And every 715 00:34:19,000 --> 00:34:21,880 Speaker 1: chapter in the book starts with a drink recipe, and 716 00:34:21,920 --> 00:34:25,520 Speaker 1: the characters definitely enjoy their cocktails. Does that 717 00:34:25,640 --> 00:34:33,279 Speaker 1: tell us something about your experience of lockdown? Um, yes, 718 00:34:34,000 --> 00:34:39,000 Speaker 1: it does. So. It's funny because my husband is a winemaker, 719 00:34:39,239 --> 00:34:41,399 Speaker 1: so you would think that our go-to beverage would 720 00:34:41,400 --> 00:34:44,000 Speaker 1: be wine. But when we were living in Iceland, the 721 00:34:44,080 --> 00:34:47,800 Speaker 1: cost for a bottle of, you know, Scotch, it was 722 00:34:47,800 --> 00:34:51,040 Speaker 1: about the same or not much more than the cost 723 00:34:51,080 --> 00:34:53,600 Speaker 1: for a bottle of wine, but it lasted a lot longer. 724 00:34:53,640 --> 00:34:56,840 Speaker 1: So our cocktail game started there. But during lockdown we 725 00:34:56,960 --> 00:34:59,440 Speaker 1: really, really upped our cocktail game. And is that 726 00:34:59,480 --> 00:35:02,279 Speaker 1: why drinks were incorporated into the book? How did 727 00:35:02,280 --> 00:35:03,839 Speaker 1: you pick the drinks that ended up in the book? 728 00:35:04,040 --> 00:35:06,800 Speaker 1: So the book is basically The Thin Man in space, 729 00:35:07,040 --> 00:35:10,080 Speaker 1: and in the Thin Man movies, like, when we start off, 730 00:35:10,320 --> 00:35:13,040 Speaker 1: Nick Charles is like drunk from the very beginning.
I 731 00:35:13,080 --> 00:35:15,479 Speaker 1: did not want my characters to be drunk the entire time, 732 00:35:15,680 --> 00:35:17,920 Speaker 1: but it's a film that came out right after prohibition 733 00:35:18,000 --> 00:35:21,520 Speaker 1: was lifted, so cocktails are just constant and everywhere. I 734 00:35:21,600 --> 00:35:24,760 Speaker 1: wanted to have some of that vibe, but I also 735 00:35:24,800 --> 00:35:28,560 Speaker 1: wanted to incorporate some that were non-alcoholic. So there's 736 00:35:28,600 --> 00:35:31,480 Speaker 1: a mix of cocktails that are classic cocktails, like the Corpse 737 00:35:31,480 --> 00:35:35,040 Speaker 1: Reviver Number Two, Death in the Afternoon, the Obituary, and 738 00:35:35,040 --> 00:35:37,360 Speaker 1: then there are also ones where I made up the 739 00:35:37,400 --> 00:35:40,799 Speaker 1: cocktail because I needed something that was thematically linked to 740 00:35:40,840 --> 00:35:43,480 Speaker 1: the chapter, and then the non-alcoholic ones had 741 00:35:43,520 --> 00:35:47,160 Speaker 1: things like the Shirley Temple. So Frisky Business is one of mine. 742 00:35:47,480 --> 00:35:51,319 Speaker 1: Orbital Decay, also one of mine. So playing around and 743 00:35:51,360 --> 00:35:53,440 Speaker 1: trying to come up with something that was tasty and 744 00:35:53,520 --> 00:35:56,680 Speaker 1: also had a thematically appropriate name. How long does 745 00:35:56,719 --> 00:35:58,399 Speaker 1: it take to come up with a cocktail? That sounds 746 00:35:58,440 --> 00:36:01,200 Speaker 1: like a fun experience. You know, it's really difficult having 747 00:36:01,239 --> 00:36:03,640 Speaker 1: to do that kind of research, so painful. Once you 748 00:36:03,719 --> 00:36:06,719 Speaker 1: understand the theory, it's actually pretty easy.
There are a 749 00:36:06,800 --> 00:36:09,160 Speaker 1: number of cocktails where you can just swap out one 750 00:36:09,320 --> 00:36:12,239 Speaker 1: ingredient and you have a totally different one. So 751 00:36:12,440 --> 00:36:15,560 Speaker 1: the Negroni is a great example of this. It's equal 752 00:36:15,600 --> 00:36:19,600 Speaker 1: parts Campari, vermouth, and gin. But if you swap out the 753 00:36:19,680 --> 00:36:21,680 Speaker 1: gin for bourbon, you come up with something that's 754 00:36:21,680 --> 00:36:24,239 Speaker 1: called a Boulevardier. If you swap out the Campari for 755 00:36:24,320 --> 00:36:26,600 Speaker 1: something else, then you get a, you know, a different flavor, 756 00:36:26,600 --> 00:36:29,040 Speaker 1: a different profile. So a lot of what I'm doing 757 00:36:29,120 --> 00:36:34,160 Speaker 1: there are ingredient swaps or proportion flips. So I can't 758 00:36:34,200 --> 00:36:36,360 Speaker 1: remember which one it is, but one of them is 759 00:36:36,360 --> 00:36:40,280 Speaker 1: basically a Manhattan but with the proportions inverted. So usually 760 00:36:40,360 --> 00:36:44,200 Speaker 1: I would take a standard recipe, I would do a swap. 761 00:36:44,280 --> 00:36:47,000 Speaker 1: Sometimes there was something that I was going for, particularly 762 00:36:47,040 --> 00:36:50,480 Speaker 1: like the Orbital Decay. I wanted something that was akin 763 00:36:50,560 --> 00:36:53,839 Speaker 1: to an Aviation, but that was spaceflight. So my first 764 00:36:53,840 --> 00:36:56,640 Speaker 1: attempt on that was to take an Aviation and do 765 00:36:56,840 --> 00:37:00,960 Speaker 1: a bourbon swap for the gin, and that results in 766 00:37:00,960 --> 00:37:05,400 Speaker 1: a cocktail that is an incredibly foul color, just gross, 767 00:37:05,440 --> 00:37:07,600 Speaker 1: so that one I had to play with a fair 768 00:37:07,680 --> 00:37:11,320 Speaker 1: bit, and it drifted really far away from the Aviation.
It 769 00:37:11,400 --> 00:37:13,959 Speaker 1: sounds like you suffered for your art. I did. I did. 770 00:37:14,000 --> 00:37:16,920 Speaker 1: It was really hard. It was really, you know, people 771 00:37:17,000 --> 00:37:21,759 Speaker 1: don't understand creativity, and you know, the research that's involved. 772 00:37:22,480 --> 00:37:25,960 Speaker 1: So hard, so hard. So I'm really curious about your 773 00:37:26,000 --> 00:37:28,759 Speaker 1: writing process for the science of this book. In your 774 00:37:28,760 --> 00:37:31,320 Speaker 1: other books, you did really extensive research on the various 775 00:37:31,320 --> 00:37:34,560 Speaker 1: scientific scenarios. What was that process like for this book? 776 00:37:34,920 --> 00:37:39,040 Speaker 1: So there are three kind of areas of science that 777 00:37:39,080 --> 00:37:42,360 Speaker 1: are happening in this. One is the deep brain pain suppressor, 778 00:37:42,560 --> 00:37:46,800 Speaker 1: and the other is the centrifugal gravity of the ship, 779 00:37:46,920 --> 00:37:49,040 Speaker 1: and then the third is the time lag. So the 780 00:37:49,120 --> 00:37:53,440 Speaker 1: deep brain pain suppressor, which I regretted every time I 781 00:37:53,480 --> 00:37:55,920 Speaker 1: had to say it in the audiobook, is based 782 00:37:56,239 --> 00:38:00,640 Speaker 1: on my mom's. My mom has Parkinson's and she has a DBS, 783 00:38:00,719 --> 00:38:03,600 Speaker 1: which is a deep brain stimulator. So a lot of 784 00:38:03,600 --> 00:38:05,319 Speaker 1: it is based on that and what they had to 785 00:38:05,320 --> 00:38:08,560 Speaker 1: do with that, and then it's kind of extrapolated towards 786 00:38:08,960 --> 00:38:11,920 Speaker 1: what things could be possible in the future.
The problem 787 00:38:11,960 --> 00:38:15,439 Speaker 1: with doing pain suppression via deep brain in the real 788 00:38:15,520 --> 00:38:20,200 Speaker 1: world is that pain is incredibly specific person to person, 789 00:38:21,239 --> 00:38:25,759 Speaker 1: and doing clinical trials on pain, having two people that 790 00:38:25,840 --> 00:38:28,520 Speaker 1: have chronic pain for the same reasons, it's rare. You 791 00:38:28,560 --> 00:38:31,320 Speaker 1: don't want to do that to someone. So that is 792 00:38:31,400 --> 00:38:34,239 Speaker 1: lagging significantly behind, but they are aware that it's a 793 00:38:34,280 --> 00:38:38,600 Speaker 1: thing that can be done. And then just last month 794 00:38:38,680 --> 00:38:42,600 Speaker 1: or the month before, someone announced a thing where 795 00:38:42,640 --> 00:38:46,640 Speaker 1: they basically wrap a cooling sheath around the angry nerve, 796 00:38:46,719 --> 00:38:48,839 Speaker 1: for lack of a better way to say it, and 797 00:38:48,880 --> 00:38:51,840 Speaker 1: I'm using very, very layman's terms here, but that can cool 798 00:38:51,920 --> 00:38:54,520 Speaker 1: it down. And so we're actually on the verge of 799 00:38:54,560 --> 00:38:58,480 Speaker 1: that no longer being science fiction. Yeah, in the next ten 800 00:38:58,520 --> 00:39:01,000 Speaker 1: to twenty years, I expect that's technology that 801 00:39:01,040 --> 00:39:04,400 Speaker 1: will exist. For the centrifugal force and the time lag, I 802 00:39:04,560 --> 00:39:08,759 Speaker 1: partnered with rocket engineer Max Fagin, and I told him 803 00:39:08,800 --> 00:39:10,640 Speaker 1: what I wanted the ship to do, which was to 804 00:39:10,840 --> 00:39:15,279 Speaker 1: have Lunar gravity, Earth gravity, and Martian gravity.
So that 805 00:39:15,640 --> 00:39:17,640 Speaker 1: depending on where you were coming from on this 806 00:39:17,680 --> 00:39:19,720 Speaker 1: cruise ship, you could stay in your home gravity 807 00:39:19,800 --> 00:39:22,399 Speaker 1: or be in your destination gravity, and so he helped 808 00:39:22,440 --> 00:39:24,800 Speaker 1: me come up with this wackadoo thing that would work. 809 00:39:25,080 --> 00:39:26,920 Speaker 1: The only people who would build it would be a 810 00:39:27,000 --> 00:39:31,760 Speaker 1: luxury cruise liner because it is completely bonkers. It travels 811 00:39:31,800 --> 00:39:34,919 Speaker 1: under constant thrust, which gives you Lunar gravity, and then 812 00:39:35,000 --> 00:39:38,799 Speaker 1: is spinning with two rings that are progressively larger to 813 00:39:38,920 --> 00:39:43,520 Speaker 1: give you Martian and Earth gravity. As I said, completely bonkers, 814 00:39:43,560 --> 00:39:46,200 Speaker 1: but it meant that whichever level they were in, I 815 00:39:46,239 --> 00:39:49,600 Speaker 1: had to think about, okay, is the Coriolis effect happening here, 816 00:39:49,640 --> 00:39:52,239 Speaker 1: and how's that going to affect things? I fudged a 817 00:39:52,400 --> 00:39:57,839 Speaker 1: little bit on some sight lines, because the levels would 818 00:39:57,840 --> 00:39:59,920 Speaker 1: have to be, I think it was like fifty yards apart, 819 00:40:00,080 --> 00:40:02,040 Speaker 1: and I'm like, yeah, you can totally see things from 820 00:40:02,080 --> 00:40:04,640 Speaker 1: one level to the next, you know. And the answer 821 00:40:04,680 --> 00:40:07,200 Speaker 1: is not really. Not really, you wouldn't be able to.
And 822 00:40:07,239 --> 00:40:11,080 Speaker 1: then the time lag. Again, because it's constant thrust, the 823 00:40:11,200 --> 00:40:15,560 Speaker 1: round trip time lag for communications changed; like, the amount 824 00:40:15,560 --> 00:40:18,480 Speaker 1: of time in the morning and the evening was several 825 00:40:18,520 --> 00:40:21,080 Speaker 1: minutes' difference. Because it's, I think, an eleven- 826 00:40:21,160 --> 00:40:23,399 Speaker 1: day voyage that I've got them on. So Max made 827 00:40:23,440 --> 00:40:26,960 Speaker 1: the spreadsheet for me, and I had a calendar, and 828 00:40:27,040 --> 00:40:29,879 Speaker 1: so anytime an event happened where they needed to make 829 00:40:29,880 --> 00:40:32,600 Speaker 1: a call back to Earth, I had to figure out 830 00:40:32,640 --> 00:40:34,920 Speaker 1: what day it was on the voyage, what time it 831 00:40:34,960 --> 00:40:37,600 Speaker 1: was on the voyage, consult with the spreadsheet, and then 832 00:40:37,840 --> 00:40:40,919 Speaker 1: put that in for the time lag, and then make 833 00:40:40,960 --> 00:40:43,960 Speaker 1: sure that I had enough dialogue in between, you know, 834 00:40:44,080 --> 00:40:48,520 Speaker 1: like enough action happening locally, to account for how long 835 00:40:48,520 --> 00:40:49,960 Speaker 1: it was going to be for them to get the 836 00:40:50,000 --> 00:40:52,520 Speaker 1: response back. And there was one point where I moved 837 00:40:52,520 --> 00:40:56,360 Speaker 1: a scene and then had to like rewrite everything in 838 00:40:56,440 --> 00:40:59,040 Speaker 1: it because the time lag no longer worked. Well,
I 839 00:40:59,080 --> 00:41:02,840 Speaker 1: really appreciate your dedication to being consistent about the science, 840 00:41:02,880 --> 00:41:04,560 Speaker 1: but I wonder why you seem to have chosen to 841 00:41:04,680 --> 00:41:08,040 Speaker 1: keep the science of your universe basically right on the 842 00:41:08,080 --> 00:41:11,080 Speaker 1: science of our universe, which means in some cases that things 843 00:41:11,120 --> 00:41:13,600 Speaker 1: are quite tricky to engineer in the story. Why didn't you 844 00:41:13,640 --> 00:41:16,120 Speaker 1: decide to like put it in a slightly different universe where, 845 00:41:16,160 --> 00:41:18,799 Speaker 1: you know, maybe FTL communication was possible? Some of it 846 00:41:18,840 --> 00:41:21,080 Speaker 1: was that I actually wanted the complication of the time lag. 847 00:41:21,200 --> 00:41:24,399 Speaker 1: The other is that my experience from writing fantasy, where 848 00:41:24,400 --> 00:41:26,760 Speaker 1: I am making up the rules, is that it's not easier, 849 00:41:26,920 --> 00:41:28,400 Speaker 1: you know. With this I can go to Max and 850 00:41:28,440 --> 00:41:29,800 Speaker 1: I can say, hey, I need a spreadsheet, and he 851 00:41:29,840 --> 00:41:31,200 Speaker 1: can give me a spreadsheet, and then I have the 852 00:41:31,200 --> 00:41:33,839 Speaker 1: spreadsheet and I have to consult it. But that's fine. When 853 00:41:33,840 --> 00:41:35,560 Speaker 1: I'm doing my own universe, I have to make up 854 00:41:35,600 --> 00:41:38,160 Speaker 1: all of the rules and there is no book that 855 00:41:38,200 --> 00:41:40,600 Speaker 1: I can go to, so I become my own reference library, 856 00:41:40,600 --> 00:41:42,640 Speaker 1: and I have to keep track of whether or not 857 00:41:42,960 --> 00:41:45,600 Speaker 1: I have done a thing and what the ramifications of 858 00:41:45,640 --> 00:41:48,480 Speaker 1: it are. And what I find is that it is 859 00:41:48,640 --> 00:41:51,279 Speaker 1: equally as hard.
It's just that the difficult point is 860 00:41:51,400 --> 00:41:53,959 Speaker 1: placed in a different part of the process. And people 861 00:41:54,000 --> 00:41:55,840 Speaker 1: will still tell me that I'm wrong about things for 862 00:41:55,880 --> 00:41:58,839 Speaker 1: stuff that I made up. Yeah, of course they will. 863 00:41:59,120 --> 00:42:00,840 Speaker 1: How do you track all that stuff? Do you have 864 00:42:00,920 --> 00:42:03,640 Speaker 1: like a checklist, and after writing every scene you go 865 00:42:03,719 --> 00:42:05,880 Speaker 1: back and say, okay, have I thought about the Coriolis effect? 866 00:42:05,960 --> 00:42:07,560 Speaker 1: Have I thought about the time lag? Or is it 867 00:42:07,640 --> 00:42:09,799 Speaker 1: just like in your brain so much? Yeah, how do 868 00:42:09,800 --> 00:42:12,960 Speaker 1: you do that? So usually, like the first two thirds 869 00:42:13,000 --> 00:42:15,040 Speaker 1: of the book, it's just in my brain because I'm 870 00:42:15,040 --> 00:42:16,959 Speaker 1: writing it at a fast enough pace that I haven't 871 00:42:17,000 --> 00:42:19,799 Speaker 1: forgotten what I've been working on. But this book was 872 00:42:19,880 --> 00:42:23,880 Speaker 1: actually complicated enough that I made a Trello board when I 873 00:42:23,920 --> 00:42:27,320 Speaker 1: was doing the revisions on it, to remind 874 00:42:27,320 --> 00:42:30,040 Speaker 1: myself to check each chapter. It's like, okay, check your 875 00:42:30,040 --> 00:42:35,239 Speaker 1: time lag, check Coriolis, check what setting Tesla's DBPS is 876 00:42:35,719 --> 00:42:37,440 Speaker 1: on right now. Does she have it on or off? 877 00:42:37,640 --> 00:42:40,239 Speaker 1: So yeah, I did actually put up a Trello board for 878 00:42:40,280 --> 00:42:43,000 Speaker 1: myself for each chapter.
A challenge I find for science 879 00:42:43,040 --> 00:42:46,239 Speaker 1: fiction murder mysteries is that for a mystery to work, the 880 00:42:46,280 --> 00:42:48,120 Speaker 1: reader has to know the rules of the world, so 881 00:42:48,160 --> 00:42:49,880 Speaker 1: they know how to interpret the clues and have like 882 00:42:49,920 --> 00:42:52,239 Speaker 1: a fair chance at figuring out the mystery. But in 883 00:42:52,239 --> 00:42:54,640 Speaker 1: a science fiction book, often the rules are just made up, 884 00:42:54,640 --> 00:42:57,680 Speaker 1: and here you've invented several advanced technologies, and so the 885 00:42:57,719 --> 00:43:00,560 Speaker 1: reader is learning them as they go. How do you 886 00:43:00,600 --> 00:43:04,000 Speaker 1: handle filling the reader in on the rules of your universe, 887 00:43:04,040 --> 00:43:06,520 Speaker 1: the technologies and how they change life, while also like 888 00:43:06,640 --> 00:43:08,840 Speaker 1: leaving them enough bread crumbs to follow? Is that a 889 00:43:08,880 --> 00:43:11,560 Speaker 1: really difficult thing to balance? It can be. Like, all 890 00:43:11,560 --> 00:43:13,839 Speaker 1: of the things that exist in the world, 891 00:43:14,040 --> 00:43:16,040 Speaker 1: the hard part is that all of those things are 892 00:43:16,080 --> 00:43:19,160 Speaker 1: completely natural for the characters. Explaining it 893 00:43:19,200 --> 00:43:22,080 Speaker 1: to the reader, the bread crumb question is interesting, because 894 00:43:22,360 --> 00:43:24,040 Speaker 1: when I'm teaching this, what I say is you treat 895 00:43:24,040 --> 00:43:27,719 Speaker 1: it like a cheese sandwich.
So, if for your character a 896 00:43:27,880 --> 00:43:30,279 Speaker 1: cheese sandwich is just an everyday thing, and it's 897 00:43:30,280 --> 00:43:32,600 Speaker 1: not a plot point, it's just they're eating a cheese sandwich, 898 00:43:32,640 --> 00:43:34,480 Speaker 1: you just say she had a cheese sandwich and you 899 00:43:34,560 --> 00:43:37,560 Speaker 1: move on. If, you know, 900 00:43:37,560 --> 00:43:40,680 Speaker 1: your character is a chef, then how 901 00:43:40,680 --> 00:43:43,279 Speaker 1: that cheese sandwich is treated is very different. They're going 902 00:43:43,320 --> 00:43:45,680 Speaker 1: to really think about the bread as they're slicing it, 903 00:43:45,760 --> 00:43:48,680 Speaker 1: and maybe they're going to make the bread themselves from scratch. 904 00:43:48,920 --> 00:43:51,240 Speaker 1: But if your character is an alien, they've never encountered 905 00:43:51,280 --> 00:43:54,120 Speaker 1: a cheese sandwich before. You 906 00:43:54,160 --> 00:43:56,319 Speaker 1: know, the temptation that a lot of writers have 907 00:43:56,440 --> 00:43:58,960 Speaker 1: is like, okay, what's a cheese sandwich? Okay, a cheese sandwich 908 00:43:59,080 --> 00:44:02,600 Speaker 1: is made out of cheese and bread, and the alien's like, okay, 909 00:44:02,600 --> 00:44:05,200 Speaker 1: well, what's bread? And you're like, okay, so bread. 910 00:44:06,160 --> 00:44:08,680 Speaker 1: Bread is made from wheat and it's ground. And what 911 00:44:08,840 --> 00:44:11,560 Speaker 1: the alien actually needs to know is a cheese sandwich 912 00:44:11,600 --> 00:44:13,560 Speaker 1: is something that you hold in your hands and you eat. 913 00:44:13,920 --> 00:44:16,000 Speaker 1: And the temptation is to put in all of this 914 00:44:16,080 --> 00:44:19,759 Speaker 1: extraneous detail that you don't need.
So what I do 915 00:44:19,880 --> 00:44:22,680 Speaker 1: is I give the reader, and the reader is 916 00:44:22,760 --> 00:44:25,400 Speaker 1: the alien in this case, just 917 00:44:25,480 --> 00:44:27,600 Speaker 1: the information that they need in order to move through 918 00:44:27,760 --> 00:44:30,600 Speaker 1: to the next part. So a cheese sandwich is the thing 919 00:44:30,640 --> 00:44:32,000 Speaker 1: you hold in your hand and you eat, and I 920 00:44:32,040 --> 00:44:34,320 Speaker 1: try to give it to them before the cheese sandwich 921 00:44:34,360 --> 00:44:36,800 Speaker 1: is in front of them, so that when it arrives 922 00:44:36,960 --> 00:44:39,200 Speaker 1: they're like, ah, I know what to do with this thing. 923 00:44:39,400 --> 00:44:43,399 Speaker 1: But the octo devices? Yes, the octopode and the octopot, yes. 924 00:44:43,640 --> 00:44:46,359 Speaker 1: So that's the other piece. Those were doing 925 00:44:46,440 --> 00:44:50,960 Speaker 1: two things, lifting-wise, three things actually. So part of 926 00:44:50,960 --> 00:44:53,520 Speaker 1: what's going on there is the other thing, 927 00:44:54,000 --> 00:44:55,839 Speaker 1: when I say you let them know about it before 928 00:44:55,880 --> 00:44:57,359 Speaker 1: it arrives, so that they know what to do with it. 929 00:44:57,400 --> 00:44:59,680 Speaker 1: To keep people in a murder mystery from knowing for 930 00:45:00,040 --> 00:45:03,759 Speaker 1: sure and certain that something is a clue, you 931 00:45:03,800 --> 00:45:05,799 Speaker 1: need to use it in a way that is not 932 00:45:06,000 --> 00:45:07,799 Speaker 1: the way it is used as the clue. And then 933 00:45:07,960 --> 00:45:13,360 Speaker 1: also, to make something feel real and part of the world, 934 00:45:13,920 --> 00:45:17,040 Speaker 1: for world building, you want to use it in more than 935 00:45:17,080 --> 00:45:19,560 Speaker 1: one way.
So, for instance, with that cheese sandwich, if 936 00:45:19,600 --> 00:45:22,000 Speaker 1: I use that cheese sandwich once and never again, it's 937 00:45:22,080 --> 00:45:26,760 Speaker 1: like, whatever. But if cheese sandwiches appear in multiple places, 938 00:45:26,760 --> 00:45:29,400 Speaker 1: you're like, oh, cheese sandwiches are really important to this society. 939 00:45:29,400 --> 00:45:31,960 Speaker 1: It's like, we walk past a cheese sandwich kiosk; you 940 00:45:32,000 --> 00:45:34,240 Speaker 1: know, she pulled out the microplane that she normally 941 00:45:34,320 --> 00:45:35,960 Speaker 1: used on her cheese sandwich, but she could use it 942 00:45:36,000 --> 00:45:38,839 Speaker 1: to slice this apple. That kind of thing helps the 943 00:45:38,880 --> 00:45:41,440 Speaker 1: reader feel like this is an embedded part of the world. 944 00:45:41,600 --> 00:45:43,960 Speaker 1: So with the octopode and octopot, part of what I was 945 00:45:44,000 --> 00:45:47,400 Speaker 1: doing was letting the reader know how those machines worked. 946 00:45:47,760 --> 00:45:50,040 Speaker 1: The other piece that I was letting them know is 947 00:45:50,080 --> 00:45:52,359 Speaker 1: that autonomous robots are just part of the world. I 948 00:45:52,400 --> 00:45:54,600 Speaker 1: was also letting them know that I was doing some 949 00:45:54,680 --> 00:45:57,640 Speaker 1: competence porn, which is Tesla is really smart and she 950 00:45:57,719 --> 00:45:59,560 Speaker 1: knows her stuff. And the third thing that I was 951 00:45:59,600 --> 00:46:02,120 Speaker 1: doing is letting them know that this is not obvious to 952 00:46:02,160 --> 00:46:06,200 Speaker 1: everyone else, that this is a non-obvious and somewhat 953 00:46:06,200 --> 00:46:08,759 Speaker 1: complicated thing.
So there's a narrow range of people that 954 00:46:08,800 --> 00:46:11,839 Speaker 1: are going to have the same knowledge that my main 955 00:46:11,920 --> 00:46:15,040 Speaker 1: character does. So there's a number of different load- 956 00:46:15,080 --> 00:46:18,080 Speaker 1: bearing things that those two devices are doing, which is 957 00:46:18,120 --> 00:46:20,279 Speaker 1: part of why you do get more information about it. 958 00:46:20,480 --> 00:46:23,120 Speaker 1: That is a really detailed answer. The first time 959 00:46:23,120 --> 00:46:24,840 Speaker 1: I wrote fiction, I was like, oh my gosh, this 960 00:46:24,920 --> 00:46:28,319 Speaker 1: is so much harder than I had expected it to be. 961 00:46:28,920 --> 00:46:31,200 Speaker 1: And now I see that there's, you know, there's like 962 00:46:31,239 --> 00:46:34,239 Speaker 1: all this theory, and anyway, it's complicated. So now 963 00:46:34,280 --> 00:46:36,440 Speaker 1: I'm going to transition to a law question. So your 964 00:46:36,480 --> 00:46:39,759 Speaker 1: main character is like always on calls with her 965 00:46:39,880 --> 00:46:43,279 Speaker 1: very angry lawyer, who has very colorful language. And so 966 00:46:43,320 --> 00:46:46,279 Speaker 1: I'm wondering what kind of research you did for the 967 00:46:46,360 --> 00:46:48,360 Speaker 1: law-related parts of the book. So I think that 968 00:46:48,440 --> 00:46:51,919 Speaker 1: I read that you've like organized workshops on a cruise ship, 969 00:46:52,000 --> 00:46:54,960 Speaker 1: which made me wonder if you've researched cruise ship law, 970 00:46:55,200 --> 00:46:57,960 Speaker 1: or did you research space law? How did you end 971 00:46:58,040 --> 00:47:00,960 Speaker 1: up like settling on the jurisdiction? I had a 972 00:47:01,000 --> 00:47:04,279 Speaker 1: space lawyer as a consultant. So my space lawyer was 973 00:47:04,320 --> 00:47:07,840 Speaker 1: Gabriel Sweeney.
He does space law for the US government, 974 00:47:08,640 --> 00:47:12,319 Speaker 1: which is amazing. It's like basically his whole job 975 00:47:12,520 --> 00:47:17,080 Speaker 1: is thinking about space law. It's amazing. So the space 976 00:47:17,160 --> 00:47:20,560 Speaker 1: law in the book is made up; we made some decisions, and he 977 00:47:21,040 --> 00:47:23,480 Speaker 1: let me know where it just didn't make 978 00:47:23,560 --> 00:47:26,279 Speaker 1: sense as an evolution. It turns out that space law 979 00:47:26,480 --> 00:47:29,680 Speaker 1: in the real world is terrifyingly vague. You know, all 980 00:47:29,680 --> 00:47:32,640 Speaker 1: the satellites that get sent up, the international space law 981 00:47:32,719 --> 00:47:35,960 Speaker 1: is such that if anything goes wrong, the government that 982 00:47:36,040 --> 00:47:39,120 Speaker 1: they launched from is the one that's on the hook 983 00:47:39,520 --> 00:47:42,920 Speaker 1: for the money. Like, so all of the private space, 984 00:47:43,120 --> 00:47:46,520 Speaker 1: all of the commercial space flight stuff, the US government 985 00:47:47,440 --> 00:47:50,000 Speaker 1: is the one that actually has to pay other countries 986 00:47:50,040 --> 00:47:52,759 Speaker 1: if it like comes down and causes damage or whatever. 987 00:47:53,520 --> 00:47:56,080 Speaker 1: And it is not maritime law. It's a totally different 988 00:47:56,080 --> 00:47:57,680 Speaker 1: set of laws. Well, what I want to know is, 989 00:47:57,880 --> 00:48:00,239 Speaker 1: would you or would you not recommend doing a murder 990 00:48:00,239 --> 00:48:04,200 Speaker 1: in space?
I mean purely hypothetically, purely hypothetically, with the 991 00:48:04,640 --> 00:48:07,279 Speaker 1: way things are currently, that's a great place to murder 992 00:48:07,320 --> 00:48:10,239 Speaker 1: someone because there is no one who has jurisdiction over 993 00:48:10,280 --> 00:48:12,399 Speaker 1: you if you were killing someone who is not from 994 00:48:12,440 --> 00:48:16,840 Speaker 1: your country. The closest thing that we have to a comparable 995 00:48:16,920 --> 00:48:20,040 Speaker 1: murder thing is a murder on an iceberg, a research 996 00:48:20,120 --> 00:48:23,840 Speaker 1: station on an iceberg, because it was in no one's jurisdiction. 997 00:48:24,000 --> 00:48:27,200 Speaker 1: So on the topic of space tourism and the laws involved, there 998 00:48:27,320 --> 00:48:29,920 Speaker 1: is a lot of discussion these days about the ethics 999 00:48:30,040 --> 00:48:33,440 Speaker 1: of the upcoming space tourism industry and the potential for 1000 00:48:33,480 --> 00:48:36,320 Speaker 1: colonization of the Moon and Mars, and whether we 1001 00:48:36,320 --> 00:48:39,400 Speaker 1: should allow corporations to treat space as a resource or 1002 00:48:39,560 --> 00:48:41,920 Speaker 1: if it should be treated as a public good and regulated. 1003 00:48:42,000 --> 00:48:43,920 Speaker 1: I know that you've constructed it here in your book just 1004 00:48:43,960 --> 00:48:45,880 Speaker 1: as a setting fit for the murder. But do you 1005 00:48:45,920 --> 00:48:48,759 Speaker 1: think it's a good idea in our universe to build 1006 00:48:48,880 --> 00:48:51,279 Speaker 1: cruise ships to Mars if it were possible? I think 1007 00:48:51,320 --> 00:48:54,160 Speaker 1: that everybody should be able to go into space.
The 1008 00:48:54,280 --> 00:48:57,880 Speaker 1: question of resources, I think that the model that we 1009 00:48:57,960 --> 00:49:02,160 Speaker 1: currently have for going to new places is very bad 1010 00:49:04,200 --> 00:49:06,440 Speaker 1: and leads to a lot of harm. I would love 1011 00:49:06,480 --> 00:49:08,120 Speaker 1: to see us come up with different models, and I 1012 00:49:08,120 --> 00:49:09,880 Speaker 1: would love to see us come up with different models 1013 00:49:09,920 --> 00:49:12,319 Speaker 1: before we get there. That is not what's happening, 1014 00:49:12,520 --> 00:49:15,960 Speaker 1: because in order to do that, every country 1015 00:49:16,000 --> 00:49:19,760 Speaker 1: would have to kind of enact that law simultaneously, because 1016 00:49:19,760 --> 00:49:24,760 Speaker 1: otherwise the countries that don't have those laws enacted, their corporations, 1017 00:49:24,760 --> 00:49:27,680 Speaker 1: their private citizens have an advantage. And so right now 1018 00:49:27,680 --> 00:49:30,279 Speaker 1: everybody is basically doing a free for all and it's 1019 00:49:30,480 --> 00:49:34,440 Speaker 1: gonna stay that way until something goes horribly tragically wrong. 1020 00:49:34,600 --> 00:49:38,440 Speaker 1: And what that means is that, um, everybody who's like 1021 00:49:38,760 --> 00:49:42,640 Speaker 1: do first, permission later, forgiveness over permission, all of those 1022 00:49:42,680 --> 00:49:46,799 Speaker 1: places will have established toeholds. So do I think 1023 00:49:46,840 --> 00:49:52,600 Speaker 1: it's a good idea? No and yes, simultaneously, like two 1024 00:49:52,600 --> 00:49:57,000 Speaker 1: things can be true. If it was well regulated? Yes, absolutely. 1025 00:49:57,600 --> 00:49:59,600 Speaker 1: You learn so much about where you are from by 1026 00:49:59,640 --> 00:50:02,760 Speaker 1: going someplace else.
But the way that things are moving, 1027 00:50:02,960 --> 00:50:05,399 Speaker 1: we're not on a good trajectory right now. So does 1028 00:50:05,440 --> 00:50:07,839 Speaker 1: the research you've done for this book make you more 1029 00:50:07,920 --> 00:50:11,280 Speaker 1: or less likely to invest in some space tourism company, 1030 00:50:11,320 --> 00:50:14,279 Speaker 1: to buy shares, you know, Blue Origin or whatever? You 1031 00:50:14,360 --> 00:50:18,080 Speaker 1: ask that question as if I have money to invest 1032 00:50:18,120 --> 00:50:24,000 Speaker 1: in anything. I am a writer. We're speaking hypothetically, not asking 1033 00:50:24,080 --> 00:50:28,280 Speaker 1: for actual investment advice. Speaking hypothetically, there's a couple of 1034 00:50:28,320 --> 00:50:30,840 Speaker 1: really interesting companies, one of which I can't remember the 1035 00:50:30,880 --> 00:50:33,080 Speaker 1: name of it right now, but they basically do crowd 1036 00:50:33,120 --> 00:50:35,920 Speaker 1: funding for space. It's like a Kickstarter kind of thing. 1037 00:50:35,960 --> 00:50:38,239 Speaker 1: There's a... I think you can buy in for a 1038 00:50:38,280 --> 00:50:40,480 Speaker 1: hundred dollars, I think is your minimum, but you can 1039 00:50:40,560 --> 00:50:44,760 Speaker 1: support some really interesting space research as a private citizen 1040 00:50:45,040 --> 00:50:46,560 Speaker 1: for a very low cost. And if I were going to 1041 00:50:46,600 --> 00:50:48,880 Speaker 1: do it, that's probably the direction I would go, because 1042 00:50:49,160 --> 00:50:55,440 Speaker 1: these smaller places seem to be doing things, generally speaking, 1043 00:50:55,719 --> 00:50:58,439 Speaker 1: because they're curious about the science of it. But otherwise, yeah, 1044 00:50:58,520 --> 00:51:02,480 Speaker 1: I mean, if you're talking about investments, it's a longer 1045 00:51:02,520 --> 00:51:05,080 Speaker 1: investment than I think most people think.
But the trajectory 1046 00:51:05,120 --> 00:51:06,880 Speaker 1: that we're on is the same trajectory that we were 1047 00:51:06,920 --> 00:51:10,480 Speaker 1: on when air flight first happened, which is heading towards 1048 00:51:10,840 --> 00:51:13,520 Speaker 1: commercial stuff. And what I think people should be looking 1049 00:51:13,520 --> 00:51:17,840 Speaker 1: for is stuff that's doing suborbital flights, because those are the 1050 00:51:17,840 --> 00:51:19,719 Speaker 1: ones where I think people are going to see a 1051 00:51:19,880 --> 00:51:23,399 Speaker 1: return on investment and are likely to become a thing. 1052 00:51:24,680 --> 00:51:27,319 Speaker 1: Um, again, you know, we always say twenty years to Mars, 1053 00:51:27,360 --> 00:51:30,239 Speaker 1: but I think the suborbital flights, looking at where we 1054 00:51:30,280 --> 00:51:33,759 Speaker 1: are right now, that actually seems like that's coming. So, 1055 00:51:33,800 --> 00:51:37,440 Speaker 1: speaking of space tourism, would you hop on the Lindgren 1056 00:51:37,480 --> 00:51:41,719 Speaker 1: and take the cruise ship to Mars? Yes. I 1057 00:51:41,719 --> 00:51:45,439 Speaker 1: think of Mars as like a colder, irradiated, airless version 1058 00:51:45,480 --> 00:51:49,000 Speaker 1: of Antarctica. But, but that's... that sounds like a good destination. 1059 00:51:49,239 --> 00:51:51,800 Speaker 1: Well, so if a ship like the Lindgren exists, 1060 00:51:52,000 --> 00:51:53,920 Speaker 1: then that means that there's stuff at the other end. 1061 00:51:55,040 --> 00:51:59,080 Speaker 1: So it's not just going to a desolate wasteland. It's going 1062 00:51:59,120 --> 00:52:02,680 Speaker 1: to a resort in a desolate wasteland. But yeah, I would absolutely 1063 00:52:02,680 --> 00:52:05,279 Speaker 1: go on a ship like the Lindgren, which is eleven days at the 1064 00:52:05,280 --> 00:52:08,759 Speaker 1: longest transit.
Would I go now under the circumstances that 1065 00:52:08,800 --> 00:52:11,840 Speaker 1: we have? No. I'd go to the Moon under the 1066 00:52:11,920 --> 00:52:14,239 Speaker 1: circumstances we have. But I'm like, I have a cat. 1067 00:52:14,480 --> 00:52:17,239 Speaker 1: I want to be with my cat. What about the 1068 00:52:17,320 --> 00:52:20,160 Speaker 1: husband? You've mentioned a couple of times he doesn't travel, 1069 00:52:20,320 --> 00:52:22,440 Speaker 1: so again, you know. But he might go with me 1070 00:52:22,480 --> 00:52:23,960 Speaker 1: to the Moon. He might actually go with me to 1071 00:52:24,000 --> 00:52:27,040 Speaker 1: Mars, because he desperately wants to go 1072 00:52:27,120 --> 00:52:30,520 Speaker 1: to Antarctica. See Kelly, some people do want to go 1073 00:52:30,560 --> 00:52:33,560 Speaker 1: to Antarctica. Yeah, I guess I'm definitely not one of them. 1074 00:52:33,600 --> 00:52:35,200 Speaker 1: There's a bunch of people here in the physics department 1075 00:52:35,239 --> 00:52:37,680 Speaker 1: who love going to Antarctica to set up their experiments. 1076 00:52:37,719 --> 00:52:39,680 Speaker 1: Mary Robinette, I want to ask you about your 1077 00:52:39,719 --> 00:52:42,360 Speaker 1: sort of vision for the future of humanity. When you 1078 00:52:42,360 --> 00:52:44,400 Speaker 1: were last on the podcast, it was early on in 1079 00:52:44,400 --> 00:52:46,560 Speaker 1: the pandemic and you were talking about your book The 1080 00:52:46,600 --> 00:52:49,960 Speaker 1: Calculating Stars, in which humanity comes together to use science 1081 00:52:50,000 --> 00:52:52,680 Speaker 1: to solve a crisis. So now, two years later, how's 1082 00:52:52,719 --> 00:52:56,359 Speaker 1: your idealism surviving contact with reality?
Do you still think 1083 00:52:56,560 --> 00:52:58,760 Speaker 1: humans will come together in the face of an urgent 1084 00:52:58,800 --> 00:53:01,560 Speaker 1: crisis, or even like a slow-boiling one like climate change, 1085 00:53:01,680 --> 00:53:04,520 Speaker 1: or should we interpret your pivot to murder in space as 1086 00:53:04,560 --> 00:53:09,000 Speaker 1: a sign of something? Great question, and that's all 1087 00:53:09,000 --> 00:53:13,799 Speaker 1: the time we have. It actually has survived, because what 1088 00:53:13,840 --> 00:53:18,640 Speaker 1: we saw supports my central thesis, which is that humans 1089 00:53:18,640 --> 00:53:21,880 Speaker 1: do pull together, and that so much of it depends 1090 00:53:21,920 --> 00:53:24,279 Speaker 1: on who is in the leadership and the tone 1091 00:53:24,280 --> 00:53:25,799 Speaker 1: that they set. So when you look at someplace 1092 00:53:25,880 --> 00:53:28,480 Speaker 1: like New Zealand, when you look at the early days 1093 00:53:28,520 --> 00:53:30,600 Speaker 1: of the pandemic, where everyone's like, yeah, okay, yeah, no, 1094 00:53:30,640 --> 00:53:32,480 Speaker 1: we're gonna do this lockdown thing, when we thought it 1095 00:53:32,480 --> 00:53:33,960 Speaker 1: was only going to be a couple of weeks, you know, 1096 00:53:34,000 --> 00:53:37,040 Speaker 1: people did pull together. New Zealand, the leader that they 1097 00:53:37,080 --> 00:53:39,759 Speaker 1: had and the way they weathered that, was it perfect? No. 1098 00:53:40,040 --> 00:53:43,400 Speaker 1: But yeah, I think when you look globally at 1099 00:53:43,400 --> 00:53:48,920 Speaker 1: the way different governments reacted, you can see that we 1100 00:53:49,040 --> 00:53:54,200 Speaker 1: have this capacity, and you can also see how easily 1101 00:53:54,239 --> 00:53:58,000 Speaker 1: we are shaped by someone modeling what to do.
It's, 1102 00:53:58,080 --> 00:54:00,920 Speaker 1: you know, it is the bystander effect writ large, that 1103 00:54:00,960 --> 00:54:02,920 Speaker 1: no one knows what to do in this circumstance, and 1104 00:54:02,960 --> 00:54:05,000 Speaker 1: so you look for someone in a position of power 1105 00:54:05,000 --> 00:54:07,320 Speaker 1: and authority to tell you how you're supposed to behave. 1106 00:54:07,719 --> 00:54:12,920 Speaker 1: And if you're getting conflicting information, then you're gonna wind 1107 00:54:13,000 --> 00:54:18,160 Speaker 1: up with conflict, literal conflict. But if you have people 1108 00:54:18,160 --> 00:54:21,040 Speaker 1: who are steady and give you consistent information and are 1109 00:54:21,360 --> 00:54:25,799 Speaker 1: clear and transparent and trust you and you can trust them, 1110 00:54:26,080 --> 00:54:31,160 Speaker 1: then you have a totally different experience. So my optimism remains, 1111 00:54:31,560 --> 00:54:34,560 Speaker 1: in that we have the capacity, but my actual feeling 1112 00:54:34,600 --> 00:54:36,840 Speaker 1: on how things happened did lead to a murder mystery. 1113 00:54:38,520 --> 00:54:43,880 Speaker 1: Context-dependent optimism. Yes, yes. So why did you decide 1114 00:54:43,920 --> 00:54:46,759 Speaker 1: to name your main character Tesla? To be honest, for 1115 00:54:46,800 --> 00:54:49,279 Speaker 1: a second, I found myself wishing I hadn't named my 1116 00:54:49,360 --> 00:54:51,759 Speaker 1: daughter Aida and had gone with Tesla instead, because now I 1117 00:54:51,760 --> 00:54:54,040 Speaker 1: think it's a great woman's name. Why did you decide 1118 00:54:54,040 --> 00:54:57,360 Speaker 1: to go with Tesla? So this is definitely not in 1119 00:54:57,400 --> 00:55:04,800 Speaker 1: the book.
My headcanon for Tesla is that her dad 1120 00:55:06,200 --> 00:55:10,480 Speaker 1: is, or was, say, an armored superhero, maybe, that happened 1121 00:55:10,520 --> 00:55:13,560 Speaker 1: to have a thing that flew and could shoot things 1122 00:55:13,600 --> 00:55:16,680 Speaker 1: out of his hands. It's definitely not a copyrighted character. 1123 00:55:16,880 --> 00:55:19,759 Speaker 1: So that's my headcanon for her. And you know, obviously, 1124 00:55:19,960 --> 00:55:23,120 Speaker 1: because there are all sorts of reasons, I wanted a 1125 00:55:23,200 --> 00:55:26,480 Speaker 1: name that said, this is somebody who values science. The 1126 00:55:26,560 --> 00:55:28,359 Speaker 1: name that I kind of wish I had gone with, 1127 00:55:28,719 --> 00:55:32,040 Speaker 1: after the fact, is Delta, which is a 1128 00:55:32,280 --> 00:55:36,239 Speaker 1: name that Kelly Gaspari named her daughter. And I'm like, oh, 1129 00:55:36,239 --> 00:55:38,640 Speaker 1: that's such a good name for someone who's into aerospace, 1130 00:55:38,840 --> 00:55:41,560 Speaker 1: because Tesla, unfortunately, is tied up in a lot of 1131 00:55:41,600 --> 00:55:44,040 Speaker 1: people's brains with Elon Musk. But you know, it's such 1132 00:55:44,040 --> 00:55:47,480 Speaker 1: a, such an out-of-the-box-thinking kind of name. 1133 00:55:47,680 --> 00:55:50,600 Speaker 1: Delta is a great name if her middle initial is V. Also, yes, 1134 00:55:50,640 --> 00:55:55,759 Speaker 1: I think Delta's middle name does have a V. That's incredible. Well, 1135 00:55:55,800 --> 00:55:57,960 Speaker 1: thanks very much for coming on the podcast and telling 1136 00:55:58,000 --> 00:56:00,200 Speaker 1: us about your latest book, The Spare Man, and we 1137 00:56:00,280 --> 00:56:02,319 Speaker 1: encourage all of our listeners to check it out. In 1138 00:56:02,320 --> 00:56:05,240 Speaker 1: the meantime, I'm sure you're involved in lots of other projects.
1139 00:56:05,520 --> 00:56:07,759 Speaker 1: What else do you have on the horizon that all 1140 00:56:07,800 --> 00:56:10,120 Speaker 1: of your fans can look out for? Well, I am. 1141 00:56:10,320 --> 00:56:12,359 Speaker 1: I'm writing the next Lady Astronaut book, but that doesn't 1142 00:56:12,360 --> 00:56:16,520 Speaker 1: come out until... Right now, that's The Martian Contingency. So 1143 00:56:16,719 --> 00:56:20,279 Speaker 1: my life is full of very complicated calendars. But the 1144 00:56:20,320 --> 00:56:22,600 Speaker 1: thing that I'm spending a lot of time doing and 1145 00:56:22,760 --> 00:56:26,160 Speaker 1: enjoying these days is teaching on Patreon. So if any 1146 00:56:26,200 --> 00:56:28,640 Speaker 1: of your listeners are writers and want to learn some 1147 00:56:28,760 --> 00:56:31,399 Speaker 1: narrative theory and stuff, that's the thing that I would say, 1148 00:56:31,400 --> 00:56:33,680 Speaker 1: come hang out with me there. All right, Well, thanks 1149 00:56:33,760 --> 00:56:36,160 Speaker 1: very much, everybody, go check that out, and thanks again 1150 00:56:36,160 --> 00:56:39,240 Speaker 1: Mary Robinette for joining us today. Thank you so much. So, Kelly, 1151 00:56:39,280 --> 00:56:42,319 Speaker 1: did she answer your space law questions adequately? Yeah, she did. 1152 00:56:42,360 --> 00:56:45,000 Speaker 1: I really enjoyed hearing her talk about, you know, 1153 00:56:45,120 --> 00:56:47,560 Speaker 1: who's responsible, and I think she did a great job. 1154 00:56:47,680 --> 00:56:49,600 Speaker 1: I think it's amazing that when you write a murder 1155 00:56:49,680 --> 00:56:51,920 Speaker 1: mystery in space you have to not only construct the 1156 00:56:51,960 --> 00:56:54,560 Speaker 1: murder mystery, but think about the science of it and 1157 00:56:54,600 --> 00:56:57,160 Speaker 1: the legal issues of it. Like, what a big project. 1158 00:56:57,480 --> 00:57:00,360 Speaker 1: Kudos to Mary Robinette. She's so good
at that, and 1159 00:57:00,440 --> 00:57:03,319 Speaker 1: like all the different layers. It's so fun talking to 1160 00:57:03,360 --> 00:57:05,600 Speaker 1: fiction authors and hearing about the different layers that they 1161 00:57:05,600 --> 00:57:07,680 Speaker 1: have to consider when they decide they're going to introduce 1162 00:57:07,719 --> 00:57:09,720 Speaker 1: something and how it has to be like multi purpose 1163 00:57:09,800 --> 00:57:12,480 Speaker 1: for the story. It's incredibly hard. There is one question I 1164 00:57:12,520 --> 00:57:14,680 Speaker 1: forgot to ask her, though. What's that? Can't she use 1165 00:57:14,680 --> 00:57:17,160 Speaker 1: her laser vision to turn her cheese sandwich into a 1166 00:57:17,160 --> 00:57:21,960 Speaker 1: grilled cheese sandwich? You know, I think that was probably 1167 00:57:22,000 --> 00:57:24,840 Speaker 1: the most important question, and you really dropped the ball. Yeah, well, 1168 00:57:24,880 --> 00:57:26,680 Speaker 1: we'll have to have her back when she comes out 1169 00:57:26,720 --> 00:57:29,440 Speaker 1: with the sequel. Thanks everyone for tuning in. We encourage 1170 00:57:29,440 --> 00:57:31,520 Speaker 1: you to go check out The Spare Man by Mary 1171 00:57:31,600 --> 00:57:34,680 Speaker 1: Robinette Kowal, wherever good books are sold. Thanks 1172 00:57:34,760 --> 00:57:36,880 Speaker 1: Kelly for joining us. Thanks for having me. It was fun. 1173 00:57:37,000 --> 00:57:47,480 Speaker 1: Tune in next time everyone. Thanks for listening, and remember 1174 00:57:47,520 --> 00:57:50,360 Speaker 1: that Daniel and Jorge Explain the Universe is a production 1175 00:57:50,400 --> 00:57:53,919 Speaker 1: of I Heart Radio. For more podcasts from I Heart Radio, 1176 00:57:54,080 --> 00:57:57,680 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 1177 00:57:57,760 --> 00:58:04,320 Speaker 1: you listen to your favorite shows.