Daniel: Hey, Kelly, I have kind of a basic question about your science farm operation over there.

Kelly: Your questions make me nervous, but as long as it doesn't involve anything that might scare the kids, I'm willing to answer.

Daniel: All right, well, if it's a science farm, then what are you farming? Science? Do you harvest, like, raw science? Are you making organic farm-to-journal science?

Kelly: We do actually do science right here on the farm, but it's not all that we farm.

Daniel: That's right. You also have some humans growing on the farm, right?

Kelly: Okay, look, we have children, but we aren't farming them. There's no harvesting of the children going on.

Daniel: I mean, have you checked with Zach about that? I know he can be quite literal, you know.

Kelly: I'll admit I never thought to say, "Please don't harvest the children while I'm recording the podcast." Well, they're probably fine. I mean, at least fifty percent.

Daniel: Hi. I'm Daniel. I'm a particle physicist and a professor at UC Irvine, and I am not much of a gardener.

Kelly: I'm Kelly Weinersmith. I'm an adjunct professor at Rice University, and I too am not a gardener, and I once killed a cactus.

Daniel: But you run a whole farm, I hear.
Kelly: Well, Zach does the planting. I mostly just sort of mow the hay when it gets too tall, which is fun because I love the tractor. So I'm more of a destroyer than a builder.

Daniel: I see. Well, in my marriage also, the gardening responsibilities are on the other spouse. Even in the clay soil of our backyard in Southern California, my wife has managed to plant greenery which has thrived all over our backyard.

Kelly: Whoa, good for her. That's awesome. What has she planted?

Daniel: She's planted a bunch of pretty hardy stuff, mostly succulents. In our first years here in Irvine, we went back and forth between California and Geneva several times, so we abandoned all the plants. She would just plant a bunch of stuff, and then we'd come back nine months later and see what was still alive.

Kelly: Nice. Let natural selection run its course. I like it. I can get behind that.

Daniel: Exactly, though it doesn't produce very much edible stuff. Are you guys actually producing things on your farm that you can eat?

Kelly: Like, technically we are. We're trying to do that.
But what usually happens is Zach starts the garden, and then we get totally bogged down by a project and almost everything dies. So very similar, except we're not leaving the country; we're just being neglectful. But this year our loofah gourds grew, and so we have a lifetime supply of squash-based sponges, if anyone's in need of some more sponges. But that was kind of fun.

Daniel: Well, welcome to the podcast Daniel and Jorge Explain the Universe, in which we try to grow a garden of ideas in your mind, planting seeds of understanding about black holes and quantum physics, hoping to nurture your understanding of how the universe out there works and to grow your mind so that it's large enough to incorporate all of these vast cosmic ideas.

Kelly: How many of these do you have?

Daniel: I could go on and on, but my regular co-host, Jorge, can't be here today, and so I'm delighted to be chatting with one of our regular guest hosts, Kelly. Kelly, thank you very much for joining us again today.

Kelly: I'm delighted to be back. I had a ton of fun reading this book.

Daniel: That's right.
Sometimes on the podcast, we talk about the real universe, the mysteries of nature, how far back we can explore with our minds, all the way back to the beginning of the universe, or how black holes work and what's inside of them. But sometimes we think about artificial universes, universes created within our minds and explored by science fiction writers, because we think that the creativity found in science fiction is actually a vital element of science, that thinking about the ways that the universe might be is a very important way of actually doing science. And so sometimes on the podcast we will read a science fiction novel and talk about the science of that universe, along with interviewing the author of that book to get an insight into his or her mind. And so on today's episode, we'll be talking about the science fiction universe of A Half-Built Garden.
Kelly: This book is by Ruthanna Emrys, and one of the things that I love about her book is that I feel like when we talk about climate change, we're often so negative that sometimes it feels like it's not even worth trying to turn the ship around. But her book is about a near-future humanity in twenty eighty-three that is starting to turn the ship around. They figured out some ways to start recovering from climate change. That involves some new political organizations, new ways of people sort of learning to work together, but no, like, amazing tech that just pulls all of the carbon out of the atmosphere. Like, some hard work that needs to get done. But I appreciated that that was sort of like a take on climate change that was, you know, a little bit positive, like a "we can do this" sort of attitude.

Daniel: Yeah, this book takes place in about twenty eighty-three, and clearly there have been some disasters between now and then, some real climate change and probably some suffering. But the book is not like a Mad Max Beyond Thunderdome, where everything is destroyed and a few humans are scrabbling for survival in a new, harsh world.
It really describes the situation where we have adapted to it. We have come up with new political and social organizations that do allow large populations of humans to exist, and even to thrive and, like, have fun and chill out.

Kelly: Totally. And those humans sort of work together and exist in slightly different ways than what we have now. So, like, you still have corporations, and the corporations still sort of have the, like, "let's use all the resources" sort of attitude. They're not totally on board with turning the climate change ship around, which isn't too surprising. You still have nations, but the nations have a little bit less power than they used to. And now it seems like most of the activities that happen in people's lives are happening at the watershed level. So within your watershed you make decisions about, you know, working together and what you're going to do, and so you're sort of linked globally, but most of the action is happening locally. And fun fact, I believe the word for that is "glocal," which is a great word.

Daniel: Can you spell that word for us? What was that? Glocal?

Kelly: Glocal.
Like glocalization. It's when you get, like, lots of exchanges of information and stuff, but, like, at the local level it sort of is interacting in people's lives, and those people are taking what is helpful for them and sort of mixing it with their own culture to make something that's a little bit new and sort of a little bit localized.

Daniel: Well, it's fascinating to me to think about the future of sort of human organization. And, you know, you can look in the past and you can see that overall there's this trend towards sort of larger and larger and more and more global organizations. The nations that we have now are really vast compared to, you know, the city-states of two thousand years ago, for example.
But there also is this sort of periodicity, as we build up large empires and then they collapse, and then we build up new empires and then they collapse. And so it's interesting to think about an alternative where we don't just collapse into total disaster and have to scrabble our way through five hundred years of dark ages, but instead we sort of fracture, where things become a little bit more local, and we can have, like, different priorities in different areas and people organizing themselves from the bottom up. I thought this book was really clever in the way that it imagined, like, this middle ground for humanity, not total disaster.

Kelly: I thought that was a unique take on it also, and it was very cool. And so, Daniel, do you ever wonder: are you living in an era where there's going to be a break and a collapse?
Daniel: I do wonder about that all the time, and I look around at our lives and I think it's easy to imagine us looking back in fifty years and being baffled at how we lived, you know, just the sheer wealth and the opulence and the resources that we consume every day without even thinking about it very much, you know, gas and electricity and money and the food waste. It does seem like it would be easy to look back at this as sort of like the peak of the Roman Empire, just before the fall.

Kelly: I hope you're wrong, but yes, I think about that too. Sometimes I think about how I might be living in a cave talking to my grandchildren about, you know, what television was or what running water was.

Daniel: You'll be kicking a stone around the old cave to play soccer.

Kelly: Maybe the solution for us will be the same as a possible solution presented in the book, which is aliens. And I also really liked the take on aliens. They were sort of like a fresh look at aliens, and she had a very interesting take on their biology. And these aliens were friendly.
They had, I thought, a very clever way of letting us know that, but I'll sort of leave that to be unveiled to the reader when they read the book. But they have kind of a scary message. What is their message?

Daniel: Yeah, I thought this was a really cool way to sort of put a pin in the issues of climate change and how to live long into the future. These aliens in the book, when they arrive, they are very friendly, you're right, but their message is: you have to get off the planet. They think about planets as a way to, like, incubate a new species. They're like, planets are like nests. You know, you can create a new species, you can evolve, but they're not a place to live, right? Like, you basically got to move out of Mom and Dad's house eventually and go off into space. And they come with a warning, you know, to say that everybody who tries to live long-term on a planet eventually kills themselves. And they even say how they tried to help other species, but they got there too late, before those species basically exterminated themselves through climate change disasters.
And so they're coming with a warning saying: get off planet ASAP. And not only get off planet, but get off planet, and then you're gonna need to dismantle that planet for resources that you're gonna need to keep going in space. Which to me was the point where I was like, oh, man. Like, I could imagine if the message is "get off planet," you could be like, okay, but, like, you know, some of us are going to stay, because Earth can totally handle it. But it's like, no, no, no, that's not even an option, for a variety of reasons, one of which is "we need your parts," or, you know, we need the parts of the Earth to build stuff.

Kelly: And it's interesting, yeah. How do you convince the aliens that you should get to stay? And, you know, of course humans never agree on anything, and sort of seeing how the human species comes to terms with this ultimatum is an interesting problem to watch get solved throughout the course of the book.
Daniel: Yeah, and I thought it's especially fascinating because it's a question that we have now. Like, should we try to get off planet, even if aliens don't come and tell us that they think it's necessary? There are a lot of people who think that the long-term solution to human survival is to get off the planet and establish bases on Mars, or build Dyson spheres, or these kinds of things. I know that you're a skeptic, though, about survival in space and space settlement. What do you think about not just moving out into space but also dismantling this complex ecosystem that we use as, like, the foundation of our society?

Kelly: Well, you know, so in the nineteen seventies, the idea was really popular that we need to go out into space because, you know, population numbers are expanding like crazy, we're gonna run out of space, we're gonna run out of resources, we need to move into space. And, you know, the, like, war induced by famine over not having enough resources, and the millions and millions of people dying from starvation, like, those things never came to pass.
And so, I think, no doubt there have been famines and there have been problems, but I don't think that space is going to be the solution to these problems the way that a lot of people think. And also, you know, so yes, I'm a skeptic. I think we can move into space and we can have space settlements, but it's gonna take us a lot longer than I think a lot of people would probably guess or would probably expect. There's a lot of problems we still need to solve, and so I suppose I think we need to figure out the problems down here and not expect that moving people off into space is the solution, because I just don't think we can do it fast enough to make it count. Did I answer your question? I sort of went off on a tangent.

Daniel: No, absolutely you did. But now I wonder: what is Kelly's timeline for human space colonization?
You think it's going to take a while. Are we talking a hundred years, a thousand years?

Kelly: Just like with Soonish, I always hesitate to make, like, estimates for when such and such is going to happen, because it depends not just on how long it takes to make the technology, but, like, how many politicians want to fund the project to make it happen. And so, you know, your estimate for how long it's going to take depends on so many things that you can't control that you're destined to be wrong. But what I can say is I think it's a project that, if we force it to happen quickly, we might regret, because, you know, for example, we don't know if humans can have babies in space safely, and I think we want to figure that out before we, you know, move to Mars, for example, and then discover that actually there's a bunch of problems and we're, you know, not happy with how things are going. So I think it would be better if it took longer.
Let's say. But I don't know how long it would take for us to, like, meet the bare minimum standards to move out into space, but I think we should be way past that before we go ahead and start that project.

Daniel: Well, I was hoping to trick you into giving a specific number so that I could call you up and say, "See, Kelly, we're not yet in space. You were wrong."

Kelly: Nope, nope, there's no tricking me.

Daniel: But I do think you bring up a really interesting issue, which is that a lot of our technological bursts and development do come in response to a crisis, you know, or, for example, a nationalistic race, which essentially is a crisis, you know, that we see an adversary coming and then we scramble to develop the technology necessary for that. And in this book, I think it's quite interesting that the humans have, like, kind of figured it out. You know, by the time the aliens do arrive and say, "Hey, you've got to get off planet," the humans have sort of threaded the needle and figured out how to live maybe sustainably on the planet.
It's a really fun conversation they have, both internally among the humans and with the aliens, about, hey, do we actually need to get off planet, or are we doing okay?

Kelly: Yeah, and, you know, of course, it being a discussion involving humans, there's a lot of disagreement. But yeah, it's interesting to argue, like, how do we know? You know, when we get a handle on this climate change thing, and hopefully it's when, not if, how will we be able to convince ourselves that we've really got things going in a better direction and we don't have to worry anymore? And that's, yeah, that's sort of an interesting question that doesn't have a clear answer yet.

Daniel: Yeah, something else I really enjoyed about this book were the disagreements. Often when aliens arrive, they're, like, monolithic in culture and in politics and in opinion. You know, they all sort of speak with one voice. But here they disagree with each other, they have different personalities, they undermine each other. I thought that was really fascinating and probably much more realistic.

Kelly: Yeah.
I think so often in the movies that I've seen and the books that I've read, you have sort of aliens with the hive mind where they all can sort of, like... I guess it's not that different than the humans sort of coming to a consensus with their technology, but, like, you know, they even still disagree. And yeah, like you said, the aliens are, I think, very realistically portrayed, in that they don't all agree. And it's, yeah, it's very good writing.

Daniel: Yeah, often in science fiction you meet, like, some species, and there's, like, a president of the planet, and I'm like, a president of the planet? Really? Like, there's no way humans would ever, you know, elect a president who could then also act boldly, right? Like, it'd be so bogged down in, you know, disagreements across the planet. And so it's really nice for me to see aliens that, you know, don't always get along and make decisions together. I thought that was really fascinating. But I want to dig more into the science of this universe that Ruthanna Emrys has created in her novel. But first, let's take a quick break.
Daniel: Okay, we're back, and we're talking about the science fiction universe of the novel A Half-Built Garden by Ruthanna Emrys, whose background is in cognitive psychology and sociology. And she's written a really fascinating book, not just about aliens (though, of course, we love the aliens) but also about the future of human civilization: how humans come together to solve the climate crisis and reorganize their own lives.

Kelly: Yeah, and so one of the aspects of the technology that I thought was really interesting is that everybody has this mesh that they can put on their head, and they can sort of network ideas. And so, like, if you are about to make a decision that could impact the whole community, you can send that information out to the network, and people can, like, add their input and vote up or down on solutions. And if anybody has, sort of, like, research that's relevant, they can summarize it. And so the experts, I think the experts have, like, extra weight.
And it's 328 00:16:44,880 --> 00:16:48,280 Speaker 1: like having sort of like a Reddit all the time everywhere, 329 00:16:48,320 --> 00:16:51,960 Speaker 1: which strikes me as being, like, kind of overwhelming, 330 00:16:52,040 --> 00:16:54,160 Speaker 1: you know, to be honest. Like, while 331 00:16:54,200 --> 00:16:55,760 Speaker 1: I was reading it, one of my thoughts was, like, 332 00:16:55,960 --> 00:16:58,440 Speaker 1: I don't think I could handle that. Like, just when 333 00:16:58,440 --> 00:17:01,800 Speaker 1: my phone vibrates in my pocket and I'm having a conversation, 334 00:17:01,840 --> 00:17:04,520 Speaker 1: I'm distracted, and I'm like, oh, this isn't good. If, 335 00:17:04,640 --> 00:17:08,159 Speaker 1: like, my brain were constantly working through threads of information 336 00:17:08,240 --> 00:17:10,840 Speaker 1: about decisions, I think that would be overwhelming. But you know, 337 00:17:10,960 --> 00:17:12,560 Speaker 1: maybe that's something you get used to. What do 338 00:17:12,600 --> 00:17:15,080 Speaker 1: you think? I don't know. I was sort of amazed. 339 00:17:15,359 --> 00:17:17,520 Speaker 1: First of all, I love the richness of the experience 340 00:17:17,520 --> 00:17:19,720 Speaker 1: that she imagined. She really seems to have thought about 341 00:17:19,800 --> 00:17:21,919 Speaker 1: what it would be like to have Reddit in 342 00:17:22,000 --> 00:17:24,920 Speaker 1: your head all the time and to have these sort 343 00:17:24,920 --> 00:17:28,359 Speaker 1: of constant communal discussions and debates. I'm not sure I 344 00:17:28,359 --> 00:17:30,120 Speaker 1: would enjoy it either, but you know, it's really hard 345 00:17:30,160 --> 00:17:33,840 Speaker 1: to know whether you would appreciate a completely different human experience. 346 00:17:33,960 --> 00:17:36,399 Speaker 1: And the thing that made me wonder was, you know, 347 00:17:36,720 --> 00:17:39,480 Speaker 1: would people really be so well behaved?
I imagine if 348 00:17:39,560 --> 00:17:42,320 Speaker 1: a bunch of people had access to, like, injecting ideas 349 00:17:42,400 --> 00:17:45,159 Speaker 1: into my brain, it might just be dominated by, like, 350 00:17:45,359 --> 00:17:49,679 Speaker 1: the loudest, meanest voices, the bullies. Basically, it sort of 351 00:17:49,680 --> 00:17:52,560 Speaker 1: requires everybody to be civil in a way and to 352 00:17:52,640 --> 00:17:55,439 Speaker 1: agree to the rules and to be moderated. And I 353 00:17:55,480 --> 00:17:57,959 Speaker 1: was wondering because, you know, in today's society, we're 354 00:17:57,960 --> 00:18:01,119 Speaker 1: struggling with exactly that, how much speech to allow 355 00:18:01,240 --> 00:18:04,040 Speaker 1: on social media platforms and how much to moderate it. 356 00:18:04,080 --> 00:18:05,840 Speaker 1: I was sort of wondering how they figured out that 357 00:18:05,880 --> 00:18:07,679 Speaker 1: balance in a way that we could all live with 358 00:18:07,800 --> 00:18:12,080 Speaker 1: Reddit inside of our brain. Yeah, I know, that's 359 00:18:12,119 --> 00:18:13,760 Speaker 1: a good question. It would be interesting to talk to 360 00:18:13,800 --> 00:18:16,960 Speaker 1: software engineers about how they're tackling that problem right now. 361 00:18:17,040 --> 00:18:19,360 Speaker 1: I don't think it's easy, but sociologically, I think it's 362 00:18:19,359 --> 00:18:23,640 Speaker 1: really fascinating, this idea of devolving control, rather than having 363 00:18:23,640 --> 00:18:26,720 Speaker 1: it be centralized in some distant federal government, having it 364 00:18:26,800 --> 00:18:31,040 Speaker 1: be more local and community oriented, people making these decisions themselves.
365 00:18:31,320 --> 00:18:34,480 Speaker 1: And in some sense that seems empowering, because maybe you 366 00:18:34,480 --> 00:18:36,160 Speaker 1: want the people on the ground to be the ones 367 00:18:36,200 --> 00:18:39,280 Speaker 1: who are, like, taking care of the wildlife and understanding 368 00:18:39,320 --> 00:18:41,960 Speaker 1: really the water flow issues. But it could also really 369 00:18:42,040 --> 00:18:44,720 Speaker 1: lead to issues of, like, inequality. You have a bunch 370 00:18:44,760 --> 00:18:47,159 Speaker 1: of wealthy people get together and build their own school 371 00:18:47,200 --> 00:18:49,240 Speaker 1: and have their own fire department and their own police force, 372 00:18:49,280 --> 00:18:51,879 Speaker 1: and pretty soon they're living in, like, a literal bubble, 373 00:18:52,119 --> 00:18:54,080 Speaker 1: and if the wealth is concentrated there, it can be 374 00:18:54,200 --> 00:18:58,159 Speaker 1: very difficult for people without those resources to have access 375 00:18:58,200 --> 00:19:00,840 Speaker 1: to it and have opportunities, and then where you live 376 00:19:00,960 --> 00:19:03,679 Speaker 1: determines basically the course of your life. So I think 377 00:19:03,720 --> 00:19:06,560 Speaker 1: there's definitely pluses and minuses to that sort of reorganization 378 00:19:06,640 --> 00:19:10,719 Speaker 1: of society. I agree completely. Yep, this stuff is super complicated, 379 00:19:10,920 --> 00:19:13,080 Speaker 1: but in her book she takes all of this on 380 00:19:13,160 --> 00:19:15,119 Speaker 1: and she talks about the ups and the downs. The 381 00:19:15,119 --> 00:19:18,480 Speaker 1: network crashes at one point, which creates, you know, maybe 382 00:19:18,520 --> 00:19:22,080 Speaker 1: literal headaches for people trying to make decisions. So, you know, 383 00:19:22,119 --> 00:19:23,840 Speaker 1: she doesn't shy away from all of this.
She really 384 00:19:23,840 --> 00:19:26,200 Speaker 1: seems to have done a lot of research into how 385 00:19:26,240 --> 00:19:28,880 Speaker 1: this would actually operate. Tell me what your impressions were 386 00:19:28,920 --> 00:19:32,240 Speaker 1: of the biology, because her aliens were really quite inventive. 387 00:19:32,280 --> 00:19:34,760 Speaker 1: Did you find it realistic? Yeah, I mean, I really 388 00:19:34,840 --> 00:19:38,400 Speaker 1: enjoyed reading about the aliens. So I'm not usually 389 00:19:38,440 --> 00:19:41,399 Speaker 1: a great person to talk to about, like, critiquing the 390 00:19:41,480 --> 00:19:44,679 Speaker 1: science of a sci fi universe, because I'm pretty much 391 00:19:44,720 --> 00:19:47,240 Speaker 1: willing to accept anything as long as the person is 392 00:19:47,280 --> 00:19:49,720 Speaker 1: consistent with the rules that they lay out. But I 393 00:19:49,760 --> 00:19:52,800 Speaker 1: will say that she had some really interesting aliens. One 394 00:19:52,920 --> 00:19:54,880 Speaker 1: was kind of like an insect and one was kind 395 00:19:54,920 --> 00:19:58,320 Speaker 1: of like a spider, and seeing how, you know, those different 396 00:19:58,560 --> 00:20:01,560 Speaker 1: groups sense their environments in very different ways, 397 00:20:02,119 --> 00:20:03,960 Speaker 1: and, you know, figuring out how they learned to 398 00:20:04,000 --> 00:20:06,679 Speaker 1: communicate with one another and appreciate the ways that they 399 00:20:06,680 --> 00:20:09,760 Speaker 1: were different and learned how to complement one another, and 400 00:20:09,800 --> 00:20:12,240 Speaker 1: how they essentially ended up living in symbiosis 401 00:20:12,240 --> 00:20:14,160 Speaker 1: and were reaching out to humans to try to make 402 00:20:14,160 --> 00:20:16,879 Speaker 1: them another symbiotic partner. I thought that was really interesting.
403 00:20:16,960 --> 00:20:19,879 Speaker 1: And additionally, how they engineered their environments so that, you know, 404 00:20:20,080 --> 00:20:23,480 Speaker 1: both species could interact even though their, you know, their 405 00:20:23,480 --> 00:20:26,160 Speaker 1: body plans were really quite different. She thought it through 406 00:20:26,240 --> 00:20:28,360 Speaker 1: quite a bit. What did you think of the aliens? Yeah, 407 00:20:28,560 --> 00:20:32,080 Speaker 1: super creative. I had never thought about having two aliens 408 00:20:32,080 --> 00:20:35,080 Speaker 1: in symbiosis come and visit and, like, invite us to 409 00:20:35,280 --> 00:20:37,560 Speaker 1: join their little club, you know. I thought that was 410 00:20:37,600 --> 00:20:39,840 Speaker 1: a really cool way of thinking about things. I also 411 00:20:39,920 --> 00:20:43,800 Speaker 1: really enjoyed our insights into the alien culture. You know, 412 00:20:44,040 --> 00:20:45,960 Speaker 1: on one hand, it was very easy to talk to 413 00:20:45,960 --> 00:20:48,920 Speaker 1: the aliens, because by the time they arrived they'd already 414 00:20:49,160 --> 00:20:51,720 Speaker 1: heard a bunch of English in our broadcasts and trained 415 00:20:51,800 --> 00:20:53,960 Speaker 1: themselves on it, so we could just, like, chat with 416 00:20:54,040 --> 00:20:56,880 Speaker 1: them immediately. On the other hand, there were important cultural 417 00:20:56,880 --> 00:20:59,160 Speaker 1: differences, like the aliens were weirded out when people didn't 418 00:20:59,160 --> 00:21:02,800 Speaker 1: bring their children along to, you know, diplomatic meetings, because 419 00:21:02,840 --> 00:21:05,479 Speaker 1: apparently in their culture, that's a real sign of trust, 420 00:21:05,640 --> 00:21:07,679 Speaker 1: right, that you brought your children. So I thought that 421 00:21:07,720 --> 00:21:10,080 Speaker 1: was really clever.
So much in this book 422 00:21:10,160 --> 00:21:12,840 Speaker 1: was just really different from anything I've ever seen before. Yeah, 423 00:21:13,000 --> 00:21:16,080 Speaker 1: I definitely found myself thinking afterwards, like, oh, would I 424 00:21:16,119 --> 00:21:17,960 Speaker 1: want to live in that world? Like, it 425 00:21:18,040 --> 00:21:21,439 Speaker 1: certainly would have made being a mom and maintaining my 426 00:21:21,520 --> 00:21:24,840 Speaker 1: career much easier if it was just expected that my 427 00:21:24,920 --> 00:21:27,200 Speaker 1: kids would come with me everywhere. On the other hand, 428 00:21:27,680 --> 00:21:30,520 Speaker 1: I find it really hard to think sometimes when my 429 00:21:30,600 --> 00:21:32,320 Speaker 1: kids are getting upset, you know, and I have 430 00:21:32,359 --> 00:21:34,240 Speaker 1: to make a big decision. 431 00:21:34,400 --> 00:21:36,280 Speaker 1: But she has her characters deal with that kind of 432 00:21:36,280 --> 00:21:38,520 Speaker 1: stuff too, and so yeah, it was an 433 00:21:38,560 --> 00:21:41,480 Speaker 1: interesting way to imagine the world that I think would 434 00:21:41,520 --> 00:21:43,880 Speaker 1: have some big benefits but would be difficult to implement.
435 00:21:44,200 --> 00:21:46,640 Speaker 1: Something she described in her novel as sort of an 436 00:21:46,640 --> 00:21:50,640 Speaker 1: eventual end point for civilizations is not just moving out 437 00:21:50,680 --> 00:21:54,040 Speaker 1: to space, but also constructing sort of, like, mega projects, 438 00:21:54,040 --> 00:21:57,360 Speaker 1: things like Dyson spheres, which capture a large fraction or 439 00:21:57,400 --> 00:22:01,000 Speaker 1: all of the energy of a star, allowing civilizations to, 440 00:22:01,240 --> 00:22:06,040 Speaker 1: like, vaporize planets or construct other enormous technologies that 441 00:22:06,119 --> 00:22:08,720 Speaker 1: require so much energy. I thought it was really interesting 442 00:22:08,800 --> 00:22:12,080 Speaker 1: to think about whether that's actually possible, you know, whether 443 00:22:12,119 --> 00:22:15,000 Speaker 1: that's the only way to live as an interstellar species, 444 00:22:15,320 --> 00:22:17,480 Speaker 1: or whether there are other ways to do it. Yeah, 445 00:22:17,520 --> 00:22:19,359 Speaker 1: I mean, I found it to be a 446 00:22:19,480 --> 00:22:22,240 Speaker 1: very depressing prospect, the idea that you would have to, 447 00:22:22,320 --> 00:22:25,439 Speaker 1: like, grind up the Earth in order to make a 448 00:22:25,520 --> 00:22:28,800 Speaker 1: Dyson sphere to keep a subset of the species alive 449 00:22:29,520 --> 00:22:32,280 Speaker 1: right now on Earth, you know, alive in space stations 450 00:22:32,359 --> 00:22:35,680 Speaker 1: or something. I hope that's not the direction of things. 451 00:22:35,800 --> 00:22:38,800 Speaker 1: And it sounds complicated. What did you think of it? 452 00:22:38,920 --> 00:22:41,359 Speaker 1: I think that is an interesting question.
And you know, 453 00:22:41,440 --> 00:22:43,879 Speaker 1: if we were to build a Dyson sphere here in 454 00:22:44,119 --> 00:22:47,879 Speaker 1: our Solar System, I wouldn't start with grinding up the Earth, 455 00:22:48,240 --> 00:22:50,760 Speaker 1: you know, I would start with, like, Mercury. Mercury has 456 00:22:50,800 --> 00:22:52,879 Speaker 1: a lot of really heavy metals in it, and we 457 00:22:52,920 --> 00:22:55,480 Speaker 1: don't really need it for anything else. We could, like, 458 00:22:55,640 --> 00:22:58,520 Speaker 1: lose Mercury and not really notice. But it's a good 459 00:22:58,520 --> 00:23:01,240 Speaker 1: point that if you wanted to build a full Dyson sphere, 460 00:23:01,280 --> 00:23:03,920 Speaker 1: if you wanted to capture, like, all of the ten 461 00:23:04,000 --> 00:23:06,560 Speaker 1: to the twenty-six watts of energy that the Sun 462 00:23:06,600 --> 00:23:09,800 Speaker 1: puts out, you would need a lot of material. Right? 463 00:23:09,840 --> 00:23:13,000 Speaker 1: If you built, like, a sphere at the radius of 464 00:23:13,040 --> 00:23:16,119 Speaker 1: the Earth's orbit, like a radius of one AU, then the internal 465 00:23:16,200 --> 00:23:20,800 Speaker 1: surface of that sphere would be, like, mind-bogglingly large. 466 00:23:20,840 --> 00:23:24,960 Speaker 1: We're talking about five hundred million times the surface of 467 00:23:25,000 --> 00:23:27,720 Speaker 1: the Earth. So we've never built anything the size of 468 00:23:27,760 --> 00:23:30,600 Speaker 1: the Earth. Now we're talking about building something like hundreds 469 00:23:30,600 --> 00:23:33,679 Speaker 1: of millions of times the surface area of the Earth. 470 00:23:33,720 --> 00:23:36,359 Speaker 1: It's like, we're not even close to doing that. So 471 00:23:36,520 --> 00:23:39,359 Speaker 1: I think a more realistic
trajectory is that you build 472 00:23:39,720 --> 00:23:43,080 Speaker 1: a bunch of stations in space that are capable of 473 00:23:43,119 --> 00:23:46,080 Speaker 1: absorbing the power of the Sun and you use that 474 00:23:46,119 --> 00:23:48,760 Speaker 1: for your space-based infrastructure. You don't necessarily need to 475 00:23:48,800 --> 00:23:53,200 Speaker 1: go from zero to complete Dyson sphere in an afternoon. Where 476 00:23:53,359 --> 00:23:56,080 Speaker 1: would you live in the Dyson sphere? So, like, you 477 00:23:56,160 --> 00:23:59,480 Speaker 1: build the Dyson sphere, do you live on, like, the outside 478 00:23:59,480 --> 00:24:01,520 Speaker 1: part of it? Are you just floating around inside of 479 00:24:01,560 --> 00:24:04,600 Speaker 1: the sphere? It's a tricky question, right? Like, you could 480 00:24:04,640 --> 00:24:07,440 Speaker 1: imagine living on the inside of this mega project. But 481 00:24:07,480 --> 00:24:09,639 Speaker 1: there would be no gravity, right? You're not going to 482 00:24:09,680 --> 00:24:12,119 Speaker 1: be able to, like, walk around on the inside of it. 483 00:24:12,240 --> 00:24:14,040 Speaker 1: And then you might think, oh, let's spin the thing. 484 00:24:14,400 --> 00:24:18,320 Speaker 1: Right, now you have this enormous thing which is also spinning, 485 00:24:18,760 --> 00:24:21,760 Speaker 1: and it would be really unstable. You know, a Dyson 486 00:24:21,840 --> 00:24:24,800 Speaker 1: sphere that surrounds the Sun, you can stay there stably, 487 00:24:25,119 --> 00:24:27,040 Speaker 1: just sort of in orbit, but as soon as it 488 00:24:27,080 --> 00:24:30,440 Speaker 1: gets off center a little bit, now the part that's 489 00:24:30,480 --> 00:24:33,680 Speaker 1: closer to the Sun is going to feel more gravity 490 00:24:33,720 --> 00:24:36,000 Speaker 1: towards the Sun and it's going to very quickly fall 491 00:24:36,080 --> 00:24:39,480 Speaker 1: into the Sun.
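The "five hundred million times the surface of the Earth" figure from that exchange checks out with some quick arithmetic. A sketch in Python, using the standard values for 1 AU and Earth's radius (those numbers are textbook figures, not from the episode):

```python
import math

AU = 1.496e11       # Earth-Sun distance in meters
R_EARTH = 6.371e6   # Earth's mean radius in meters

# Inner surface area of a Dyson sphere with a radius of 1 AU
sphere_area = 4 * math.pi * AU**2        # ~2.8e23 m^2

# Surface area of the Earth, for comparison
earth_area = 4 * math.pi * R_EARTH**2    # ~5.1e14 m^2

ratio = sphere_area / earth_area
print(f"{ratio:.2e}")  # prints 5.51e+08
```

The 4π factors cancel, so the ratio is just (1 AU / Earth's radius) squared, about 550 million, matching the "five hundred million times" estimate in the conversation.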
And so this thing would be very unstable. 492 00:24:39,800 --> 00:24:42,240 Speaker 1: And now you're spinning it also, which makes it less stable, 493 00:24:42,359 --> 00:24:44,760 Speaker 1: and it would need to be much, much stronger. Right, 494 00:24:44,880 --> 00:24:47,960 Speaker 1: this thing would require, like, a tensile strength that exceeds 495 00:24:48,080 --> 00:24:52,159 Speaker 1: any known material that we can even imagine building it 496 00:24:52,320 --> 00:24:54,320 Speaker 1: out of. So it would be very hard to build. 497 00:24:54,400 --> 00:24:57,080 Speaker 1: It'd be very unclear, like, where you would 498 00:24:57,160 --> 00:25:00,679 Speaker 1: live on it. I think instead, much more realistic is 499 00:25:00,680 --> 00:25:03,480 Speaker 1: not to build a huge Dyson sphere that encircles the 500 00:25:03,520 --> 00:25:06,320 Speaker 1: whole star, but just to build a bunch of satellites 501 00:25:06,359 --> 00:25:09,520 Speaker 1: that, like, roughly circle the Sun, don't block it entirely, 502 00:25:09,880 --> 00:25:12,080 Speaker 1: that just gather a bunch of energy. Because the Sun 503 00:25:12,119 --> 00:25:14,520 Speaker 1: has so much energy, we don't even need all the 504 00:25:14,640 --> 00:25:16,840 Speaker 1: energy that the Sun puts out. What would we even 505 00:25:16,840 --> 00:25:19,600 Speaker 1: do with that, other than building, like, giant space lasers? 506 00:25:19,880 --> 00:25:21,560 Speaker 1: I don't want to live in a world with giant 507 00:25:21,600 --> 00:25:24,720 Speaker 1: space lasers, I don't think. Would you have to, like, 508 00:25:24,840 --> 00:25:28,360 Speaker 1: replace these satellites regularly? That would be an incredible job. 509 00:25:28,480 --> 00:25:30,439 Speaker 1: Or do you just imagine that these satellites are 510 00:25:30,440 --> 00:25:33,560 Speaker 1: gonna work forever? No, you definitely need to replace them.
511 00:25:33,600 --> 00:25:35,960 Speaker 1: And I think the most realistic plan I've ever heard 512 00:25:36,040 --> 00:25:38,480 Speaker 1: for building this kind of system is that you build 513 00:25:38,480 --> 00:25:42,680 Speaker 1: a few of them manually out of materials from, like, Mercury, 514 00:25:42,760 --> 00:25:45,359 Speaker 1: and then you build robots that make more, and you 515 00:25:45,440 --> 00:25:48,159 Speaker 1: power those robots using the system that you built. So 516 00:25:48,240 --> 00:25:51,440 Speaker 1: you sort of build a few bespoke ones yourself using 517 00:25:51,560 --> 00:25:54,719 Speaker 1: human mining and industry, and then you use that as 518 00:25:54,720 --> 00:25:58,760 Speaker 1: a launching point for your, like, automatic self-replicating robot 519 00:25:59,040 --> 00:26:01,320 Speaker 1: army that can make more of these things. And then 520 00:26:01,359 --> 00:26:04,919 Speaker 1: it basically, like, devours Mercury and turns it into a 521 00:26:04,960 --> 00:26:07,159 Speaker 1: whole network of these things. And yeah, some of them 522 00:26:07,160 --> 00:26:09,360 Speaker 1: will go offline, but you just keep building them, or 523 00:26:09,359 --> 00:26:12,639 Speaker 1: you can recycle the materials somehow, and then you beam 524 00:26:12,680 --> 00:26:14,679 Speaker 1: it to the Earth or, I guess, to 525 00:26:14,680 --> 00:26:17,879 Speaker 1: your stations, because you're living around the sphere. You and 526 00:26:17,960 --> 00:26:20,320 Speaker 1: I have talked about the prospect of getting solar power in 527 00:26:20,400 --> 00:26:22,800 Speaker 1: space and beaming it down to the Earth.
That's tricky, right, 528 00:26:22,840 --> 00:26:24,240 Speaker 1: because you've got to get it down to the Earth, 529 00:26:24,280 --> 00:26:25,760 Speaker 1: but you don't want to fry people, and you don't 530 00:26:25,760 --> 00:26:27,800 Speaker 1: want to build a giant space laser and hand it over 531 00:26:27,840 --> 00:26:31,560 Speaker 1: to politicians, for so many reasons. But instead, if you 532 00:26:31,720 --> 00:26:35,040 Speaker 1: build it in space for use in space, then, you 533 00:26:35,040 --> 00:26:37,879 Speaker 1: know, that mitigates some of those issues. You don't need 534 00:26:37,920 --> 00:26:40,080 Speaker 1: to beam it down to the planet. You can 535 00:26:40,119 --> 00:26:42,280 Speaker 1: just sort of beam it around space, I suppose. Do 536 00:26:42,280 --> 00:26:45,520 Speaker 1: you feel like there's any ethical argument against destroying a 537 00:26:45,600 --> 00:26:48,919 Speaker 1: planet just because we're not personally super interested in it 538 00:26:49,000 --> 00:26:51,760 Speaker 1: right now? I think there's definitely a question of, you know, 539 00:26:51,840 --> 00:26:54,879 Speaker 1: colonization and treating it like a resource. We don't know, 540 00:26:54,960 --> 00:26:57,840 Speaker 1: for example, whether there's life on Mercury, and we also 541 00:26:57,880 --> 00:27:00,600 Speaker 1: don't know the sort of spectrum of possible life. 542 00:27:00,880 --> 00:27:03,840 Speaker 1: Potentially there is life on Mercury that we don't even notice, 543 00:27:04,040 --> 00:27:06,399 Speaker 1: we don't even recognize, and we just, like, devour it 544 00:27:06,480 --> 00:27:09,360 Speaker 1: and turn it into our battery system, essentially, without even 545 00:27:09,359 --> 00:27:12,040 Speaker 1: being aware of it.
Or maybe life would have evolved 546 00:27:12,040 --> 00:27:14,399 Speaker 1: on Mercury in another hundred million years, just sort of 547 00:27:14,440 --> 00:27:18,000 Speaker 1: like slow-going chemistry, and we've prevented that from happening. 548 00:27:18,080 --> 00:27:20,840 Speaker 1: So there are definitely important questions about how we treat 549 00:27:20,920 --> 00:27:24,000 Speaker 1: resources in space, and also who in our society gets 550 00:27:24,000 --> 00:27:27,520 Speaker 1: access to those resources. You know, should it be corporate 551 00:27:27,560 --> 00:27:30,600 Speaker 1: barons who are launching their own satellites? Should it be 552 00:27:30,760 --> 00:27:33,600 Speaker 1: national governments? Or should it be, like, decisions made by 553 00:27:33,600 --> 00:27:36,000 Speaker 1: a bunch of local communities with Reddit in their brain? 554 00:27:36,119 --> 00:27:39,040 Speaker 1: You know, these are important questions. They are. And I've 555 00:27:39,040 --> 00:27:42,359 Speaker 1: been reading a bunch of papers by philosophers about, you know, 556 00:27:42,400 --> 00:27:44,879 Speaker 1: conservation of things in space, and, you know, you 557 00:27:44,880 --> 00:27:47,919 Speaker 1: mentioned Mercury's value if it has life on it. But 558 00:27:47,960 --> 00:27:49,920 Speaker 1: I think they would argue that, like, I mean, it's 559 00:27:49,920 --> 00:27:53,080 Speaker 1: a whole planet that, you know, has scientific value, and 560 00:27:53,119 --> 00:27:55,080 Speaker 1: even if it doesn't have scientific value, it 561 00:27:55,160 --> 00:27:57,000 Speaker 1: might be nice to look at. And should we, you know, 562 00:27:57,119 --> 00:27:59,679 Speaker 1: should we value it just because it's a giant thing 563 00:27:59,720 --> 00:28:02,000 Speaker 1: that exists? And I think that's a very popular 564 00:28:02,080 --> 00:28:05,280 Speaker 1: argument with many people. But it's interesting to think about.
Yeah, 565 00:28:05,320 --> 00:28:07,920 Speaker 1: that is interesting, in the same way that you might say, hey, 566 00:28:08,000 --> 00:28:11,280 Speaker 1: let's not demolish that mountain because of the coal inside 567 00:28:11,280 --> 00:28:12,879 Speaker 1: of it. It's kind of nice to look at and 568 00:28:12,960 --> 00:28:16,040 Speaker 1: to hike around on. We prefer it in mountain form, right? 569 00:28:16,040 --> 00:28:17,119 Speaker 1: And what are we going to do with all the 570 00:28:17,160 --> 00:28:19,760 Speaker 1: acronyms that we use to memorize the planets if there's 571 00:28:19,840 --> 00:28:22,879 Speaker 1: no M at the beginning? We're gonna have to 572 00:28:22,920 --> 00:28:24,880 Speaker 1: start over, and that's going to be tough. That's really 573 00:28:24,880 --> 00:28:27,640 Speaker 1: an ethical issue. All those children we've taught this acronym, 574 00:28:27,720 --> 00:28:30,280 Speaker 1: and now they have to start again. Yeah, not fair. 575 00:28:30,920 --> 00:28:33,600 Speaker 1: What are we doing to our children? Think of the children! 576 00:28:33,800 --> 00:28:37,000 Speaker 1: Think of the children, indeed. All right, wonderful. Well, we 577 00:28:37,080 --> 00:28:39,320 Speaker 1: have a special treat coming up for you. After the break, 578 00:28:39,360 --> 00:28:41,400 Speaker 1: we're going to talk to the author of A Half 579 00:28:41,400 --> 00:28:43,680 Speaker 1: Built Garden and hear about how she came up with 580 00:28:43,680 --> 00:28:47,520 Speaker 1: these ideas and why she is so fascinated by thinking 581 00:28:47,560 --> 00:29:03,160 Speaker 1: about them. But first we're gonna take another break. Then 582 00:29:03,240 --> 00:29:06,880 Speaker 1: it's my pleasure to welcome to the program Ruthanna Emrys.
583 00:29:07,400 --> 00:29:10,880 Speaker 1: She is a prolific author of many novels and has 584 00:29:10,960 --> 00:29:15,800 Speaker 1: been shortlisted for several awards, including being a finalist for 585 00:29:15,880 --> 00:29:19,480 Speaker 1: the Locus Award for Best First Novel, and she's here 586 00:29:19,560 --> 00:29:22,760 Speaker 1: to talk to us about her recent science fiction book, 587 00:29:23,160 --> 00:29:26,280 Speaker 1: A Half-Built Garden. Ruthanna, thank you very much for 588 00:29:26,360 --> 00:29:28,560 Speaker 1: joining us, and welcome to the program. Thank you for 589 00:29:28,600 --> 00:29:30,680 Speaker 1: having me. So first, we'd like to get to know 590 00:29:30,720 --> 00:29:32,640 Speaker 1: you a little bit before we ask you in detail 591 00:29:32,760 --> 00:29:35,400 Speaker 1: about your process of writing the book. Tell us a 592 00:29:35,440 --> 00:29:38,240 Speaker 1: little bit about your background and how you came to 593 00:29:38,520 --> 00:29:41,400 Speaker 1: write a book about aliens. How I came to write 594 00:29:41,440 --> 00:29:43,640 Speaker 1: a book about aliens? I don't know. I've been writing 595 00:29:43,680 --> 00:29:48,640 Speaker 1: about aliens honestly most of my life. It's always been 596 00:29:49,000 --> 00:29:53,960 Speaker 1: one of my favorite sub-genres of science fiction. How 597 00:29:54,000 --> 00:29:58,920 Speaker 1: I came to write a book about climate mitigation that had 598 00:29:58,960 --> 00:30:04,600 Speaker 1: aliens in it is that I've lived in Washington, D.C.
599 00:30:04,840 --> 00:30:06,880 Speaker 1: for about ten years, and one of the first things 600 00:30:06,880 --> 00:30:09,000 Speaker 1: that happened when I came here was that I got 601 00:30:09,040 --> 00:30:14,720 Speaker 1: involved in the local citizen science movement, and I got 602 00:30:14,760 --> 00:30:19,600 Speaker 1: involved with people who were running projects that were bringing ordinary 603 00:30:19,640 --> 00:30:26,280 Speaker 1: people into the process of planning science, collecting data, analyzing data, 604 00:30:26,360 --> 00:30:29,080 Speaker 1: and seeing the way that that changes the way that 605 00:30:29,160 --> 00:30:34,040 Speaker 1: people think about the world and the ecosystems around them. 606 00:30:34,240 --> 00:30:37,320 Speaker 1: So when I started to think about, well, what sort 607 00:30:37,360 --> 00:30:41,240 Speaker 1: of governance structures could be really different from what we 608 00:30:41,280 --> 00:30:46,960 Speaker 1: have now and maybe do better at dealing with the 609 00:30:47,040 --> 00:30:50,239 Speaker 1: huge existential problems that we face as a species, I 610 00:30:50,280 --> 00:30:55,880 Speaker 1: started to think about that sort of crowdsourced system. And 611 00:30:55,960 --> 00:31:01,680 Speaker 1: because I am always interested in what challenges systems, and 612 00:31:01,800 --> 00:31:04,400 Speaker 1: I was thinking about the ways that our current systems 613 00:31:04,400 --> 00:31:07,840 Speaker 1: are challenged by these problems, I started thinking about, okay, 614 00:31:07,880 --> 00:31:11,200 Speaker 1: here's a system that works much better for these problems. 615 00:31:11,320 --> 00:31:16,080 Speaker 1: What makes this system break? And the answer was aliens, 616 00:31:16,160 --> 00:31:20,040 Speaker 1: because why would it not be aliens? That's a good answer.
617 00:31:20,640 --> 00:31:22,800 Speaker 1: When I was reading the book, I found myself wondering 618 00:31:22,840 --> 00:31:26,560 Speaker 1: if you have a background in either ecology or political science, 619 00:31:26,560 --> 00:31:29,200 Speaker 1: because both of those sort of themes were done so 620 00:31:29,240 --> 00:31:32,000 Speaker 1: well in the book. And so do you have a 621 00:31:32,040 --> 00:31:35,280 Speaker 1: background in either of those topics? Thank you. My background 622 00:31:35,680 --> 00:31:39,480 Speaker 1: is in the social sciences more generally. I'm an experimental 623 00:31:39,640 --> 00:31:44,200 Speaker 1: cognitive psychologist by training, but I spend a lot of 624 00:31:44,320 --> 00:31:51,480 Speaker 1: time working with anthropologists and political scientists, and I also 625 00:31:51,880 --> 00:31:57,760 Speaker 1: spend a lot of time working with ecologists and other 626 00:31:57,880 --> 00:32:02,800 Speaker 1: people who are working in other disciplines involved in solving 627 00:32:02,880 --> 00:32:08,600 Speaker 1: climate issues. So I'm always working on the how can 628 00:32:08,680 --> 00:32:12,680 Speaker 1: humans screw this up end of things. But I love 629 00:32:13,280 --> 00:32:16,000 Speaker 1: talking with and working with the people who are working 630 00:32:16,560 --> 00:32:20,600 Speaker 1: on the how do we get carbon out of the atmosphere? 631 00:32:20,680 --> 00:32:25,480 Speaker 1: How do we improve the resilience of our systems from, 632 00:32:26,080 --> 00:32:31,040 Speaker 1: uh, energy standpoints? And then I'm coming back with, okay, 633 00:32:31,120 --> 00:32:34,280 Speaker 1: and how do we get people to actually implement the 634 00:32:34,400 --> 00:32:36,600 Speaker 1: solution now that you've come up with it.
Something I 635 00:32:36,640 --> 00:32:39,880 Speaker 1: really enjoy in science fiction novels, and in particular in yours, 636 00:32:40,040 --> 00:32:42,680 Speaker 1: is imagining other ways that we can live, other ways 637 00:32:42,680 --> 00:32:45,720 Speaker 1: that we might organize ourselves. And your book describes a 638 00:32:45,760 --> 00:32:50,440 Speaker 1: pretty novel political and social organization, the watershed. Can you 639 00:32:50,480 --> 00:32:54,240 Speaker 1: explain this concept to our listeners? So, the dandelion networks 640 00:32:54,280 --> 00:32:58,880 Speaker 1: in A Half-Built Garden are built around watersheds, 641 00:32:59,160 --> 00:33:03,600 Speaker 1: and a lot of that was trying to think about 642 00:33:03,760 --> 00:33:10,360 Speaker 1: what sort of geographic boundaries would have some basis in 643 00:33:11,360 --> 00:33:17,080 Speaker 1: shared interests and shared problem solving, and would make 644 00:33:17,120 --> 00:33:19,600 Speaker 1: people think more deeply about the world around them. They 645 00:33:19,760 --> 00:33:28,440 Speaker 1: are also a technological system. The networks are sort of mini 646 00:33:28,480 --> 00:33:35,960 Speaker 1: internets that are based around watersheds, or in some cases 647 00:33:36,160 --> 00:33:40,160 Speaker 1: around just, you know, knitting or other shared interests as well, 648 00:33:40,200 --> 00:33:43,400 Speaker 1: and people can belong to more than one of them. 649 00:33:43,600 --> 00:33:48,280 Speaker 1: But the central thing they do is decision making. So 650 00:33:48,560 --> 00:33:54,040 Speaker 1: they include both systems set up for people to provide 651 00:33:54,120 --> 00:33:59,560 Speaker 1: input into a problem, like how do we reduce the level 652 00:33:59,640 --> 00:34:03,520 Speaker 1: of runoff into the Anacostia River.
And then they 653 00:34:03,600 --> 00:34:08,240 Speaker 1: also include algorithms that, if you think about the way 654 00:34:08,360 --> 00:34:15,560 Speaker 1: that modern machine learning algorithms often unintentionally bring in biases 655 00:34:15,640 --> 00:34:20,640 Speaker 1: from their data sets, the dandelion algorithms deliberately bring in 656 00:34:20,760 --> 00:34:25,040 Speaker 1: biases that we want to have. So they include algorithms 657 00:34:25,080 --> 00:34:32,160 Speaker 1: that bias problem solving towards human rights or towards advocacy 658 00:34:32,960 --> 00:34:39,040 Speaker 1: for the local ecosystem. And so those algorithms also contribute 659 00:34:39,480 --> 00:34:43,560 Speaker 1: to solving problems and weighting solutions. One of the things 660 00:34:43,560 --> 00:34:45,719 Speaker 1: that I really enjoyed about the book, in addition to 661 00:34:45,760 --> 00:34:47,799 Speaker 1: the things I've already mentioned that I enjoyed about the book: 662 00:34:47,840 --> 00:34:50,120 Speaker 1: you talk about how, you know, there are 663 00:34:50,120 --> 00:34:53,480 Speaker 1: people who are arguing that living on Earth isn't a 664 00:34:53,560 --> 00:34:56,400 Speaker 1: viable long term solution. And I just finished reading a 665 00:34:56,400 --> 00:34:58,480 Speaker 1: bunch of books about space settlements, and it was interesting 666 00:34:58,520 --> 00:35:00,760 Speaker 1: to hear some of those arguments sort of coming back 667 00:35:00,840 --> 00:35:03,960 Speaker 1: and being heard from different characters in your book. So 668 00:35:04,239 --> 00:35:07,399 Speaker 1: what is your feeling about the future of humanity? Can 669 00:35:07,440 --> 00:35:10,399 Speaker 1: we eventually make civilization work here on Earth, or are 670 00:35:10,400 --> 00:35:12,400 Speaker 1: we going to need to move out to the stars 671 00:35:12,440 --> 00:35:14,680 Speaker 1: to solve our problems? 
You know, I kind of wrote 672 00:35:14,719 --> 00:35:17,799 Speaker 1: the book to argue with myself about that. If I 673 00:35:17,840 --> 00:35:20,760 Speaker 1: had to take a stand, I'd say, I think that 674 00:35:21,280 --> 00:35:24,359 Speaker 1: we ought to do both. I'd love to see us 675 00:35:24,440 --> 00:35:28,239 Speaker 1: going out and colonizing space. At the same time, I 676 00:35:28,239 --> 00:35:31,360 Speaker 1: think that a lot of science fiction that valorizes the 677 00:35:31,600 --> 00:35:37,040 Speaker 1: destiny of humans in the stars tends to underestimate the 678 00:35:37,200 --> 00:35:41,840 Speaker 1: value of having a complex ecosystem that you evolved to 679 00:35:41,880 --> 00:35:44,920 Speaker 1: live in, and the distance that we are from actually 680 00:35:45,560 --> 00:35:52,520 Speaker 1: being able to make other places more amenable to human 681 00:35:52,560 --> 00:35:54,960 Speaker 1: life when we're currently in the process of making this 682 00:35:55,000 --> 00:35:58,479 Speaker 1: one less amenable to human life. I probably did come 683 00:35:58,520 --> 00:36:02,160 Speaker 1: down on the side of the characters saying, you know, 684 00:36:02,280 --> 00:36:04,600 Speaker 1: we need to maybe figure out how to make this 685 00:36:04,760 --> 00:36:07,319 Speaker 1: work on easy mode in order to do it right 686 00:36:07,360 --> 00:36:09,480 Speaker 1: anywhere else. Is that the viewpoint you had when you 687 00:36:09,520 --> 00:36:12,360 Speaker 1: started writing the book, or through the process of writing 688 00:36:12,400 --> 00:36:15,400 Speaker 1: the book you sort of formed that more solid viewpoint? 689 00:36:15,640 --> 00:36:17,520 Speaker 1: Like I said, I wrote the book in part to 690 00:36:17,920 --> 00:36:24,160 Speaker 1: argue with myself. That is frequently why I write books. 
Well, 691 00:36:24,280 --> 00:36:27,800 Speaker 1: something I thought was really fascinating are these political structures 692 00:36:27,800 --> 00:36:30,799 Speaker 1: that you described with this dandelion network. It seems to 693 00:36:30,880 --> 00:36:32,960 Speaker 1: me like sort of an opposite trend of what we're 694 00:36:32,960 --> 00:36:36,799 Speaker 1: seeing today, where we're lurching towards globalization. In your book, 695 00:36:36,800 --> 00:36:39,440 Speaker 1: you have sort of these smaller, more local communities that 696 00:36:39,480 --> 00:36:42,600 Speaker 1: operate semi independently. Do you think that that's a 697 00:36:42,760 --> 00:36:46,600 Speaker 1: future for humanity, that these larger national governments and international 698 00:36:46,640 --> 00:36:48,960 Speaker 1: corporations are going to break up in favor of more 699 00:36:48,960 --> 00:36:51,600 Speaker 1: local solutions? I think it depends on the direction that 700 00:36:51,640 --> 00:36:55,520 Speaker 1: we choose to go, but I really see trends 701 00:36:55,640 --> 00:37:00,440 Speaker 1: in both directions in the modern world. We have things 702 00:37:00,600 --> 00:37:05,520 Speaker 1: that push towards greater globalization, but we also, you know, 703 00:37:05,640 --> 00:37:08,799 Speaker 1: over the course of this last couple of weeks, I've 704 00:37:08,880 --> 00:37:16,520 Speaker 1: been anxiously watching Twitter break down, and started up a Mastodon account 705 00:37:16,600 --> 00:37:19,640 Speaker 1: just to make sure that I still had something. And 706 00:37:20,280 --> 00:37:26,960 Speaker 1: there's something much more granular and localized about the Mastodon instances. 
707 00:37:27,120 --> 00:37:30,640 Speaker 1: And you know, people talk about that as both a strength 708 00:37:30,719 --> 00:37:35,279 Speaker 1: and a weakness, just as the globalization of Twitter has 709 00:37:35,320 --> 00:37:39,239 Speaker 1: been a great strength and also turns out to make 710 00:37:39,280 --> 00:37:44,160 Speaker 1: it kind of brittle. I also see a lot of 711 00:37:44,360 --> 00:37:50,520 Speaker 1: the best, quietest work towards sustainability and resilience happening at 712 00:37:50,520 --> 00:37:54,600 Speaker 1: the local level, you know, in towns and cities where 713 00:37:55,239 --> 00:38:02,480 Speaker 1: people really have concretely shared needs that let them negotiate 714 00:38:02,800 --> 00:38:06,520 Speaker 1: politics locally in a way that can be more challenging 715 00:38:06,600 --> 00:38:09,120 Speaker 1: at higher scales. Do you think we're gonna need like 716 00:38:09,200 --> 00:38:13,040 Speaker 1: a major political realignment, like having little watershed governments, before 717 00:38:13,080 --> 00:38:15,160 Speaker 1: we can actually start to address some of these bigger 718 00:38:15,200 --> 00:38:18,520 Speaker 1: problems like climate change? I don't think it's the only way. 719 00:38:18,560 --> 00:38:23,399 Speaker 1: As I said, I live in the DC area. I'm 720 00:38:24,360 --> 00:38:32,279 Speaker 1: a Beltway person, and I have a fondness for the 721 00:38:32,320 --> 00:38:36,200 Speaker 1: executive branch agencies and the hard work that people do 722 00:38:36,360 --> 00:38:42,120 Speaker 1: in them. 
The NASA people who are running around trying 723 00:38:42,160 --> 00:38:45,880 Speaker 1: desperately to be relevant in the book are kind of 724 00:38:45,880 --> 00:38:48,799 Speaker 1: a love letter to all the people that I have 725 00:38:49,200 --> 00:38:54,600 Speaker 1: seen around here working utterly thankless jobs and trying desperately 726 00:38:54,760 --> 00:38:59,840 Speaker 1: to solve problems while people denigrate them, you know, 727 00:39:00,160 --> 00:39:04,600 Speaker 1: next door. And I hope that we figure out how 728 00:39:04,640 --> 00:39:07,400 Speaker 1: to solve problems with the nation states we currently have, 729 00:39:07,719 --> 00:39:12,279 Speaker 1: because it's honestly easier to do things with systems that 730 00:39:12,320 --> 00:39:17,440 Speaker 1: you've already got in place. But I also think that 731 00:39:18,120 --> 00:39:27,080 Speaker 1: having subsidiarity and overlapping systems provides some really important ability 732 00:39:27,200 --> 00:39:30,560 Speaker 1: to address problems in different ways and at different levels. 733 00:39:30,680 --> 00:39:32,680 Speaker 1: I'd like to hear more of your thoughts about how 734 00:39:32,800 --> 00:39:35,680 Speaker 1: technology plays a role in allowing that to happen. I mean, 735 00:39:35,719 --> 00:39:37,920 Speaker 1: I know that in the early days of the Internet, we 736 00:39:37,960 --> 00:39:40,600 Speaker 1: all imagined that the internet would be a powerful force 737 00:39:40,640 --> 00:39:43,479 Speaker 1: for direct democracy, and now we see there's another side 738 00:39:43,480 --> 00:39:46,480 Speaker 1: to it, that it can also amplify hate speech and connect 739 00:39:46,640 --> 00:39:49,520 Speaker 1: pockets of extremism. And in your book, it was fascinating 740 00:39:49,520 --> 00:39:52,400 Speaker 1: how the networks and the discussion seemed to be the 741 00:39:52,440 --> 00:39:56,520 Speaker 1: core of this, like, communal, bottom up style government. 
It 742 00:39:56,640 --> 00:39:59,359 Speaker 1: was almost utopian at the same time as being 743 00:39:59,360 --> 00:40:02,000 Speaker 1: a little bit dystopian, because we're facing this crisis. Do 744 00:40:02,040 --> 00:40:04,680 Speaker 1: you think that Twitter or Mastodon or these other social 745 00:40:04,719 --> 00:40:07,839 Speaker 1: networks are going to be sort of a framework for 746 00:40:07,960 --> 00:40:11,319 Speaker 1: reimagining our priorities and government strategies? I mean, I think 747 00:40:11,360 --> 00:40:15,760 Speaker 1: they have been. You know, Twitter has changed the way 748 00:40:15,920 --> 00:40:20,120 Speaker 1: that we do some types of governance. I have a 749 00:40:20,160 --> 00:40:26,560 Speaker 1: friend who is currently completely freaking out because Twitter has 750 00:40:26,640 --> 00:40:34,799 Speaker 1: been the backbone of vastly improved disaster response over the 751 00:40:34,920 --> 00:40:40,400 Speaker 1: last decade, and she's fairly certain that when a new 752 00:40:40,560 --> 00:40:43,879 Speaker 1: natural disaster happens in the next few months, people are 753 00:40:43,920 --> 00:40:48,239 Speaker 1: going to die because Twitter is broken, and we 754 00:40:48,320 --> 00:40:52,840 Speaker 1: will be losing infrastructure that we were depending on to, 755 00:40:53,560 --> 00:40:58,400 Speaker 1: you know, put people in touch with resources and to 756 00:40:59,360 --> 00:41:05,399 Speaker 1: get help quickly where it's needed. Most technologies can 757 00:41:05,440 --> 00:41:08,680 Speaker 1: be used in many different ways, but they also have 758 00:41:08,840 --> 00:41:14,400 Speaker 1: affordances that make some things easier and some things harder. 
Twitter 759 00:41:14,480 --> 00:41:20,719 Speaker 1: unfortunately makes some good things easier and some bad things easier, 760 00:41:21,200 --> 00:41:25,680 Speaker 1: and I think that as we design new technologies, we 761 00:41:25,800 --> 00:41:31,040 Speaker 1: want to think very deeply about what affordances we're building 762 00:41:31,080 --> 00:41:37,800 Speaker 1: in, and try actively to prevent or mitigate the worst 763 00:41:38,120 --> 00:41:41,479 Speaker 1: of the negative ones. I also tend to think about 764 00:41:41,600 --> 00:41:46,279 Speaker 1: technology in a way that I think a lot of 765 00:41:46,400 --> 00:41:50,640 Speaker 1: people don't. It's not just the circuits, it's the social structures. 766 00:41:50,840 --> 00:41:57,279 Speaker 1: And for the dandelion networks, there are algorithms involved, but 767 00:41:57,320 --> 00:42:02,160 Speaker 1: there are also new modes of social organization and new 768 00:42:02,239 --> 00:42:07,640 Speaker 1: ways of teaching people to expect and be incentivized by 769 00:42:07,880 --> 00:42:13,440 Speaker 1: certain types of engagements. And then I'm also very fond 770 00:42:14,360 --> 00:42:19,840 Speaker 1: of the whole idea of humans as natural cyborgs, that 771 00:42:20,000 --> 00:42:26,520 Speaker 1: built into our neurology is the expectation of tool use. 772 00:42:27,280 --> 00:42:31,680 Speaker 1: So literally, when you pick up a stick, you change 773 00:42:31,719 --> 00:42:37,839 Speaker 1: the way you represent space around you, in your occipital 774 00:42:38,200 --> 00:42:41,840 Speaker 1: lobe where you normally represent space, because the distinction that 775 00:42:41,920 --> 00:42:44,799 Speaker 1: you actually make in representing space is places that I 776 00:42:44,800 --> 00:42:47,720 Speaker 1: can reach and manipulate and places that I can't reach 777 00:42:47,760 --> 00:42:53,000 Speaker 1: and manipulate. 
So every time we take on a new technology, 778 00:42:53,080 --> 00:42:58,080 Speaker 1: it changes our representation of ourselves and of our ability 779 00:42:58,280 --> 00:43:01,799 Speaker 1: to impact the world. And that was something else I 780 00:43:01,880 --> 00:43:06,759 Speaker 1: was thinking of with the dandelion networks, deliberately designing 781 00:43:06,920 --> 00:43:13,160 Speaker 1: something to create that cyborgness in a way that was 782 00:43:13,280 --> 00:43:15,600 Speaker 1: good for the world and good for the people who 783 00:43:15,680 --> 00:43:19,000 Speaker 1: use it. I really appreciate considering technology in the 784 00:43:19,040 --> 00:43:21,160 Speaker 1: long term from both the perspectives of how it can 785 00:43:21,200 --> 00:43:22,920 Speaker 1: go well and how it can go poorly. I 786 00:43:22,960 --> 00:43:24,760 Speaker 1: had a project that I did once where I interviewed 787 00:43:24,760 --> 00:43:28,040 Speaker 1: a bunch of people working on emerging technologies, and I 788 00:43:28,080 --> 00:43:30,480 Speaker 1: was really surprised by how many of them didn't have 789 00:43:30,520 --> 00:43:33,040 Speaker 1: an answer to, well, explain to me all the 790 00:43:33,040 --> 00:43:35,480 Speaker 1: ways your technology could be bad. And I can't tell 791 00:43:35,480 --> 00:43:37,200 Speaker 1: if they just didn't want me to know, or if 792 00:43:37,200 --> 00:43:39,120 Speaker 1: they really hadn't thought it through. But I do feel 793 00:43:39,160 --> 00:43:40,680 Speaker 1: like in general, we tend to be a little bit 794 00:43:40,760 --> 00:43:43,279 Speaker 1: rosier about technology, and we try not to think about 795 00:43:43,280 --> 00:43:45,279 Speaker 1: the negative implications until they sort of hit us in 796 00:43:45,320 --> 00:43:48,200 Speaker 1: the face. When people do have answers, they go in 797 00:43:48,400 --> 00:43:51,680 Speaker 1: very interesting directions. 
So I was involved in, I think, 798 00:43:51,960 --> 00:43:56,200 Speaker 1: a similar project many years ago now, about 799 00:43:56,440 --> 00:44:01,759 Speaker 1: fifteen years ago. I got involved with people who 800 00:44:02,040 --> 00:44:09,400 Speaker 1: were trying to do foresight work around nanotechnology and to 801 00:44:09,440 --> 00:44:14,880 Speaker 1: come up with policy recommendations in advance of actual capability. 802 00:44:15,200 --> 00:44:19,000 Speaker 1: And after a while in these rooms you would find 803 00:44:19,080 --> 00:44:23,440 Speaker 1: that everyone wanted to think about the gray goo problem, 804 00:44:23,440 --> 00:44:29,800 Speaker 1: which is, you know, very speculative nanotechnology that reproduces itself, 805 00:44:30,600 --> 00:44:33,839 Speaker 1: optimizes for paperclips, and turns the entire planet into 806 00:44:33,920 --> 00:44:42,160 Speaker 1: paperclips. And zero people wanted to think about 807 00:44:42,640 --> 00:44:47,759 Speaker 1: inhaling nanoparticles, which was in fact an actual problem with 808 00:44:47,880 --> 00:44:52,560 Speaker 1: actual nanotechnology at the time, or, you know, what 809 00:44:52,760 --> 00:44:58,560 Speaker 1: happens if there is a bug in your paperclip 810 00:44:58,640 --> 00:45:02,960 Speaker 1: optimizer, which there will be, and how does it break down? 811 00:45:03,760 --> 00:45:06,439 Speaker 1: People love the big dramatic futures, and I love the big 812 00:45:06,480 --> 00:45:09,640 Speaker 1: dramatic futures too, I'm still a science fiction writer, but it 813 00:45:09,719 --> 00:45:13,120 Speaker 1: also got very interesting to me psychologically, the types of 814 00:45:13,239 --> 00:45:16,120 Speaker 1: futures that people wanted to think about and the types 815 00:45:16,680 --> 00:45:20,080 Speaker 1: that they found uncomfortable to think about. I have a 816 00:45:20,160 --> 00:45:22,440 Speaker 1: question about the sort of emotional side of it. 
In 817 00:45:22,480 --> 00:45:25,560 Speaker 1: your story, humans and aliens have, like, really big and 818 00:45:25,600 --> 00:45:29,680 Speaker 1: important cultural differences, but they can also successfully empathize with 819 00:45:29,719 --> 00:45:32,800 Speaker 1: each other, and in some cases understand each other's social 820 00:45:32,840 --> 00:45:35,480 Speaker 1: and political issues. There's even a thread where we get 821 00:45:35,480 --> 00:45:38,520 Speaker 1: a sense that one character develops romantic feelings for an alien. 822 00:45:38,680 --> 00:45:41,120 Speaker 1: Do you think that's something we can expect to happen 823 00:45:41,200 --> 00:45:44,200 Speaker 1: in our universe when we meet aliens, or is it more 824 00:45:44,280 --> 00:45:47,960 Speaker 1: just that the hard realistic take, where aliens are 825 00:45:48,000 --> 00:45:51,920 Speaker 1: incomprehensible emotionally, doesn't make for a very satisfying science fiction story? 826 00:45:52,120 --> 00:45:54,960 Speaker 1: I mean, certainly that's part of why I choose to 827 00:45:54,960 --> 00:45:59,520 Speaker 1: write aliens who are somewhat comprehensible emotionally. But I think 828 00:45:59,680 --> 00:46:03,160 Speaker 1: it's, you know, it depends on the species. If you 829 00:46:03,320 --> 00:46:08,520 Speaker 1: look at the more intelligent of the other species with whom 830 00:46:08,680 --> 00:46:14,719 Speaker 1: we share our planet currently, humans get along better with 831 00:46:14,880 --> 00:46:19,560 Speaker 1: some of them than others, and when we get along 832 00:46:19,800 --> 00:46:25,440 Speaker 1: with them, we have very weird and unexpected places of 833 00:46:25,480 --> 00:46:30,920 Speaker 1: breakdown in communication. 
And the things that humans can agree 834 00:46:30,960 --> 00:46:35,000 Speaker 1: on with dolphins are very different from the things that 835 00:46:35,080 --> 00:46:38,440 Speaker 1: humans can agree on with a parrot, and the relationships 836 00:46:38,520 --> 00:46:42,520 Speaker 1: that we have with them are also very different. And 837 00:46:42,520 --> 00:46:45,920 Speaker 1: that's true even for something where we could learn 838 00:46:46,480 --> 00:46:50,160 Speaker 1: to speak each other's languages better than humans and parrots do. 839 00:46:50,640 --> 00:46:53,279 Speaker 1: So then if aliens do arrive, do you think that 840 00:46:53,360 --> 00:46:56,600 Speaker 1: we should send cognitive psychologists to go talk to them first? 841 00:46:56,600 --> 00:47:02,560 Speaker 1: Are you volunteering? Yes, yes, I'm totally volunteering. Oh my gosh, 842 00:47:02,800 --> 00:47:06,160 Speaker 1: most people I ask that question backpedal rapidly, so I'm glad 843 00:47:06,160 --> 00:47:08,359 Speaker 1: for your enthusiasm. So I have sort of a light 844 00:47:08,480 --> 00:47:11,960 Speaker 1: question here. What alien in either, you know, literature 845 00:47:12,000 --> 00:47:15,920 Speaker 1: or movies or TV is the best done alien, or 846 00:47:15,960 --> 00:47:20,200 Speaker 1: the best written alien that you've come across? And what 847 00:47:20,239 --> 00:47:22,880 Speaker 1: was your thought process as you went through and, like, 848 00:47:23,040 --> 00:47:26,040 Speaker 1: designed your aliens for the book? It depends on how 849 00:47:26,120 --> 00:47:28,479 Speaker 1: you define best. I know I've said it depends a lot. 850 00:47:28,600 --> 00:47:33,200 Speaker 1: I am very annoyingly consistent there. 
I really love 851 00:47:33,760 --> 00:47:39,760 Speaker 1: the aliens in Mary Doria Russell's The Sparrow, and the mix 852 00:47:40,160 --> 00:47:47,640 Speaker 1: of, you know, understanding and horrible misunderstanding that happens there, 853 00:47:47,960 --> 00:47:53,400 Speaker 1: and the interesting relationships between the different species there. And 854 00:47:53,440 --> 00:47:56,080 Speaker 1: I'm sure that that was part of my own influence 855 00:47:56,120 --> 00:48:00,160 Speaker 1: in wanting to write two species that have a relationship and 856 00:48:00,200 --> 00:48:04,319 Speaker 1: then come to first contact with humans together. I was 857 00:48:04,360 --> 00:48:07,920 Speaker 1: also thinking about, my first two books are in fact 858 00:48:08,160 --> 00:48:13,920 Speaker 1: deconstructive Lovecraftian historical fantasy, and they use aliens 859 00:48:13,960 --> 00:48:17,759 Speaker 1: that Lovecraft made up. And Lovecraft had many serious 860 00:48:17,760 --> 00:48:21,160 Speaker 1: issues as a person, some of which my books are 861 00:48:21,200 --> 00:48:27,080 Speaker 1: about arguing with. But he was really good at coming 862 00:48:27,239 --> 00:48:35,279 Speaker 1: up with not even remotely humanoid aliens, and then, you know, 863 00:48:35,640 --> 00:48:38,799 Speaker 1: having whole sidebars of, look at all the biology 864 00:48:38,920 --> 00:48:41,960 Speaker 1: that I just made up, isn't this fun, oh 865 00:48:42,120 --> 00:48:46,120 Speaker 1: look, fungi. And so when I went to create my 866 00:48:46,200 --> 00:48:49,560 Speaker 1: own aliens, I did set myself the bar that 867 00:48:49,600 --> 00:48:54,200 Speaker 1: they have to be at least as interesting in terms 868 00:48:54,239 --> 00:48:56,480 Speaker 1: of body plans. 
And the aliens that I got to 869 00:48:56,560 --> 00:48:59,680 Speaker 1: borrow for my last books, actually. I really enjoyed, you know, 870 00:49:00,239 --> 00:49:03,480 Speaker 1: all the biology that you incorporated into the aliens' lives. 871 00:49:03,960 --> 00:49:06,720 Speaker 1: They were very interesting aliens to think about. 872 00:49:06,840 --> 00:49:10,080 Speaker 1: Speaking of aliens, why do you think we haven't been 873 00:49:10,160 --> 00:49:12,840 Speaker 1: visited yet? You know, what's your personal answer to the 874 00:49:12,880 --> 00:49:16,560 Speaker 1: Fermi paradox? Given how old the galaxy is and how 875 00:49:16,960 --> 00:49:20,400 Speaker 1: common rocky planets seem to be, why have we not 876 00:49:20,480 --> 00:49:24,920 Speaker 1: yet been visited by aliens? We've been looking for a century. 877 00:49:25,040 --> 00:49:29,080 Speaker 1: That is a minuscule amount of time of us looking. 878 00:49:29,160 --> 00:49:32,040 Speaker 1: The fact that we haven't found aliens yet is very 879 00:49:32,160 --> 00:49:37,040 Speaker 1: much like my kid looking for her stuffed animal for 880 00:49:37,160 --> 00:49:40,400 Speaker 1: two seconds and then claiming that she can't find her stuffed animal. 881 00:49:44,680 --> 00:49:49,120 Speaker 1: You make us sound very immature as a species. At 882 00:49:49,200 --> 00:49:51,760 Speaker 1: least I hope we're immature as a species. If we're mature 883 00:49:51,840 --> 00:49:54,560 Speaker 1: as a species, then I have a whole new answer 884 00:49:54,640 --> 00:49:59,960 Speaker 1: for the Fermi paradox. Yeah, I could even argue that 885 00:50:00,040 --> 00:50:04,239 Speaker 1: any of the answers, from, you know, we 886 00:50:04,320 --> 00:50:10,360 Speaker 1: missed them, to everyone killed themselves via climate change, could 887 00:50:10,760 --> 00:50:14,480 Speaker 1: be accurate. But I also feel like we just haven't 888 00:50:14,560 --> 00:50:19,279 Speaker 1: been looking that long. 
And also, I do feel, and 889 00:50:19,320 --> 00:50:20,640 Speaker 1: this is one of the things I was arguing with 890 00:50:20,719 --> 00:50:25,120 Speaker 1: myself about in the book, like a lot of the 891 00:50:25,200 --> 00:50:28,400 Speaker 1: why haven't we found the people who colonized the galaxy 892 00:50:28,600 --> 00:50:31,319 Speaker 1: already? A lot of the answer to that is, I 893 00:50:31,360 --> 00:50:34,280 Speaker 1: think, the sort of mindset that it takes to try 894 00:50:34,320 --> 00:50:37,160 Speaker 1: and grow endlessly is the sort of mindset that it 895 00:50:37,280 --> 00:50:42,000 Speaker 1: takes to kill yourself off with climate change. And I did 896 00:50:42,040 --> 00:50:44,200 Speaker 1: ask myself, while I was writing it, what would 897 00:50:44,200 --> 00:50:49,160 Speaker 1: it take for someone to build a Dyson sphere and still 898 00:50:49,200 --> 00:50:54,280 Speaker 1: be worth talking to? Because I personally think that most 899 00:50:54,920 --> 00:50:58,840 Speaker 1: species you can imagine building Dyson spheres, you hope they stay 900 00:50:59,080 --> 00:51:06,680 Speaker 1: very far away from your solar system. Agreed. So, speaking 901 00:51:06,680 --> 00:51:10,600 Speaker 1: of far off tech, what tech that's either existing in 902 00:51:10,640 --> 00:51:12,960 Speaker 1: your book or other sci fi books would you like 903 00:51:13,120 --> 00:51:15,799 Speaker 1: to see made real? Most of all, I really like 904 00:51:16,040 --> 00:51:20,440 Speaker 1: the part of the networks that involves making it easier 905 00:51:20,440 --> 00:51:24,600 Speaker 1: and more organic for people to sense the details of 906 00:51:24,680 --> 00:51:29,880 Speaker 1: the biome around them. 
So the sort of augmented reality 907 00:51:29,880 --> 00:51:33,040 Speaker 1: where it's not going to block your ability to hear 908 00:51:33,080 --> 00:51:35,080 Speaker 1: birdsong when you go out for a walk, but 909 00:51:35,320 --> 00:51:38,279 Speaker 1: you can also, you know, dive into the health of 910 00:51:38,320 --> 00:51:41,280 Speaker 1: the trees, or find out what kind of a bird it 911 00:51:41,440 --> 00:51:44,000 Speaker 1: is, if you're, you know, not someone who already has 912 00:51:44,080 --> 00:51:49,919 Speaker 1: that memorized. I'm also very fond of the sensory substitution 913 00:51:50,400 --> 00:51:55,680 Speaker 1: stuff that exists that I gave Judy. I just like 914 00:51:55,760 --> 00:51:58,560 Speaker 1: the idea of being able to have more senses. That 915 00:51:58,600 --> 00:52:00,879 Speaker 1: would be awesome. I'd like to be able to see 916 00:52:00,920 --> 00:52:03,000 Speaker 1: the universe in all sorts of new ways. I think 917 00:52:03,040 --> 00:52:06,280 Speaker 1: that would really fundamentally change our view of it. Wonderful. Well, 918 00:52:06,320 --> 00:52:08,640 Speaker 1: thanks very much for telling us about the process of 919 00:52:08,680 --> 00:52:10,360 Speaker 1: writing your book and giving us a little bit of 920 00:52:10,400 --> 00:52:13,960 Speaker 1: insight into how you think about aliens and humans and 921 00:52:14,080 --> 00:52:16,919 Speaker 1: the prospects of their interactions. It's been a pleasure. Thank 922 00:52:16,960 --> 00:52:18,600 Speaker 1: you for having me. And can you tell our 923 00:52:18,600 --> 00:52:21,080 Speaker 1: listeners about any upcoming projects of yours? If they've enjoyed 924 00:52:21,120 --> 00:52:22,799 Speaker 1: your book as much as we have, what can they 925 00:52:22,800 --> 00:52:27,200 Speaker 1: look out for? I don't have any upcoming publications at 926 00:52:27,200 --> 00:52:30,000 Speaker 1: the moment. 
I have a novella that is sitting with 927 00:52:30,200 --> 00:52:34,160 Speaker 1: a couple of publishers, so I hope there will be 928 00:52:34,200 --> 00:52:38,359 Speaker 1: a publication date on a couple of things soon. You 929 00:52:38,400 --> 00:52:41,960 Speaker 1: can also find me on a regular basis at the 930 00:52:42,040 --> 00:52:45,960 Speaker 1: Reading the Weird column on Tor dot com, where 931 00:52:45,960 --> 00:52:51,680 Speaker 1: Anne Pillsworth and I do commentary on two hundred 932 00:52:51,760 --> 00:52:59,040 Speaker 1: years of weird fiction with equal parts glee and criticism. 933 00:52:59,040 --> 00:53:02,200 Speaker 1: Nothing being published just now, hoping to have more stuff 934 00:53:02,200 --> 00:53:04,719 Speaker 1: out soon. Well, best of luck, and very nice to 935 00:53:04,760 --> 00:53:07,319 Speaker 1: chat with you. Yeah, thanks for being on the show. 936 00:53:15,160 --> 00:53:17,960 Speaker 1: Thanks for listening, and remember that Daniel and Jorge Explain 937 00:53:18,040 --> 00:53:20,960 Speaker 1: The Universe is a production of iHeartRadio. For 938 00:53:21,040 --> 00:53:23,959 Speaker 1: more podcasts from iHeartRadio, visit the 939 00:53:24,080 --> 00:53:27,640 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 940 00:53:27,719 --> 00:53:28,440 Speaker 1: favorite shows.