Hey, Daniel, remember when we did that episode about supervillains, the one where we worked through the physics of listeners' hypothetical nefarious schemes?

Yeah. Are you sure they were hypothetical? Do you know if anyone actually did any of them?

I didn't follow up with any of them, but I guess we'd probably hear about it on the news if anyone actually managed to turn off the sun or build nuclear-powered ants. What, do you think maybe we should have done more? Are you worried we didn't help them enough? Maybe we should, like, write a book with step-by-step instructions.

Yeah, maybe... maybe nobody should write that book. Sounds like a bad idea.

Might be too late.

Hi, I'm Jorge, a cartoonist and the creator of PhD Comics.

Hi, I'm Daniel, a particle physicist and a professor at UC Irvine, and I seriously hope nobody takes over the world.

What if it's a really good person?

You mean, like a physicist?

No, that's the opposite. I mean, all right, I mean a great person, not just a good one.

You mean like a cartoonist, then?

That's right, someone with a sense of humor and artistic skills. I think Plato suggested that, right, that all societies should be ruled by philosopher-cartoonist kings?

Yeah, I could be a king, sure. But welcome to our podcast, Daniel and Jorge Explain the Universe, a production of iHeartRadio, in which we try to explain everything we do and do not understand about the universe to you. We dive into the details of how quantum particles move, how hurricanes swirl, and how galaxies form. We seek to enlighten and to explain and to illuminate, and not to arm you with the tools to take over the world.

That's right. We don't try to take over the world. We try to explain the world and talk about all the amazing things in it, because it is a pretty wonderful world, and if we understood more about it, maybe we could appreciate it more and not try to take it over.

That's right.
The express goal of physics is not to build the next generation of nuclear weapons or arm you with nuclear-powered ants, but to show you how the world works, to take it apart bit by bit and try to digest it into little mathematical stories that make sense to the human mind. But of course, along the way we do sometimes accidentally gain bits of knowledge which have practical consequences.

That's right. That's not the intended purpose of science, but it's kind of a side effect. Sometimes it does give you the ammunition you need to change the world and to do amazing things in it, and maybe terrible things too.

That's right. And it's transformed the world for the better in many ways, and in some ways for the worse, I'm sure people would argue.

Well, that is a super big question, and it's one that people have been asking themselves for a long time: if you had the means, and maybe the science and the knowledge, to do incredible things here on Earth, should you do them? Does physics, and the rest of science, equip you with enough power to control everything?

It is a pretty big question and a pretty complicated one. Fortunately, Daniel, you and I have a friend who has written a whole book about this.

Fortunately or unfortunately, he's written this book. Fortunately, because I didn't really want to write it, so I'm glad that he did, and then he can take the blame if something bad happens from it.

That's right. And so today on the podcast, we'll be talking about Ryan North's How to Take Over the World... with science. Science, science, science.

Now, just to be clear, "with science" is not part of the title of his book.

That's right, though he does dive deep into realistic scientific approaches to how to solve these problems. So it's a bit tongue in cheek: come for the supervillain schemes to take over the world, stay for the scientific understanding of how the world works.

Yeah, it is a pretty interesting book.
It's kind of like, how, if you wanted to be a supervillain, what are some of the possible schemes, and how would you make them work if you wanted to do it?

Or maybe you're a government regulator and you're wondering how your citizens might turn into supervillains and you want to know how to stop them. Also an excellent resource.

Oh man, you went pretty dark there. You went right into, like, Big Brother. I would say maybe it could also be a good guide for superheroes. Like, if you wanted to stay one step ahead of the supervillains, what would you need?

Are you saying government regulators can't be superheroes? That's what I was imagining: Superman working together with the FBI.

That's right. NSA stands for New Superman Agency. Well, you don't want to hear it from us. We have not done any research into how to take over the world, which is why we invited Ryan on the show to talk to us. And so here is our interview with author and cartoonist Ryan North.

So it's my pleasure to introduce to the program Ryan North. He's the author of Dinosaur Comics, Adventure Time comics, and Squirrel Girl. He's won Eisner and Harvey Awards and has been a New York Times bestseller. In addition, he has an academic background in computational linguistics and a dog named Chomsky. Ryan, welcome to the program.

Hi, thank you for having me.

We had Ryan on the show several years ago to talk about his book called How to Invent Everything, which is a piece of science communication in the genre that we here at the podcast like: books with humble titles, like this podcast. And this new book continues that theme. It's called How to Take Over the World. So my first question for you, Ryan, is why should we take world domination advice from someone who was famously trapped in a skating pit with only an umbrella, a leash, a phone, and a dog?

It's true, I did get stuck in a hole with my dog, and we were there for about an hour before Twitter helped me escape.
But the sort of secret about the book is, it is called How to Take Over the World, and it is using that as kind of the candy coating to get across the nonfiction inside. The plots in the book are really used as a lens to explore the actual science and technology and interesting ideas contained within it. So while you could use the book to take over the world, I hope you use it to learn more about the really interesting world around us and the ways we can make it better.

I guess you could use it to learn about the world so you can take it over.

Yeah, you could. I mean, I priced out every plot in the book and it comes in at a little under fifty-six billion dollars. So if you have that kind of cash kicking around, I know a way you could spend it.

You probably already run the world, is that what you're saying? Okay. Right. But do you think it's kind of a bad idea to give people instructions like that about how to take over the world? In other words, should we stop the publication of this book?

I hope not. I'll tell you, it is something I considered, right: is this a dangerous book? And I don't think it is, because the truth of it is that what we're doing is learning about the world. Like with How to Invent Everything, the premise was you've gone back in time, your time machine is broken, and we're going to use that fictional premise to learn about how to rebuild civilization from scratch. And so with these world domination schemes, they're all lifted from comic books. They're things that Dr. Doom or Lex Luthor would do, like having a secret floating base, or cloning a dinosaur so you can ride around on it and scare people.
Or digging a hole to the Earth's core so you can hold it hostage, that sort of stuff, which in real life doesn't tend to get done, because if you have the kind of means to dig a hole to the Earth's core, you are probably already using that to do a lot more direct world domination things, if that's what you're interested in. So let me put your listeners at ease and say that I don't think it's a dangerous book. But I am also the guy who wrote it, so I am therefore extremely biased.

Well, I mean, how many copies of this book do you plan to sell? How many supervillains can the Earth tolerate?

You know what, if this book produces several competing supervillains all digging tunnels to the Earth's core, I would argue that the actual good science that would come out of actually reaching the Earth's core with a tunnel would probably be worth those competing villains trying to do it as quickly as possible.

Also, you don't need actual supervillains to read your book. You just need aspiring supervillains.

Yeah. I had a friend who was like, you know, I never considered myself to be a supervillain, but partway through reading your book I started thinking, this feels like self-help. This feels like I'm becoming more and more convinced that I could become a supervillain and I could take over the world if I really wanted to. Which I counted as a success.

Yeah. Well, that kind of brings up the next question we have, which is: do you think scientists, or writers like yourself, have responsibility for how their technology is applied? I mean, like, do you think it would be Daniel's fault if the LHC, for example, accidentally creates a black hole and destroys the Earth?

I would personally blame Daniel for that.
Yes, absolutely. But seriously, I think there's always a discussion, right, of what do we owe each other, and how responsible are we for the things that we do and produce. And my angle on this is growing up in the eighties and nineties and studying computers, and at the time there was this idea, which I fully subscribed to, that the Internet was an intrinsic good. It was good by its very nature because it would let people talk to anyone in the world, and communication was good by its very nature, and whatever the Internet did, it had to be positive for the world because we were connecting humans. That just felt like it was true. You accepted it as true, because it was so obvious. And of course, twenty years later we see things like Facebook inciting genocide in Myanmar, and all these ways in which human communication has been weaponized and used to accomplish some very bad things, and that was nothing that we foresaw at the time. Like, we all thought it was great. We didn't realize that anyone in the world being able to talk to you meant that if you were in any way marginalized, you'd have to be constantly defending yourself against an effectively infinite array of strangers, always calling upon you to justify your existence. Like, there were these downsides we didn't see. And I don't think you can call up, you know, Tim Berners-Lee and say, hey, you invented the World Wide Web, I'm holding you responsible for this. But I think there is a responsibility to try to foresee the negative ways technology can be used. And that's a very serious answer to a probably not that serious question. I apologize for that.

No, I think it is a serious question and it needs a serious answer. So thank you for that. And I'm also glad to hear that you're relieving most scientists of the responsibility for their actions, except for me.
I'm responsible for the black hole in case the LHC eats the Earth.

And in that microsecond I have before I get sucked into the black hole, I'll be like, Daniel!

Exactly. I haven't spent a whole lot of time preparing my defense for the post-Earth-has-been-destroyed, you know, Hague trial in which I'm called to justice.

You're probably fine. But you know, it is an important topic and it's something that I personally have thought about a lot. Having grown up in Los Alamos, where the development of nuclear weapons, you know, literally has changed the nature of human society, I definitely shied away from any kind of physics research which had immediate applications, and tried to imagine that, you know, particle physics was basically useless to humanity except that it scratched our itch, you know, that it satisfied some curiosity about the nature of reality. But I think it's fun in your book how you very directly connect these ideas of: here's some knowledge about how the universe works, and here's how you could use it for your own personal gain at the expense of the rest of humanity.

I introduce this idea of what I call enlightened supervillainy, where you're helping the world by helping yourself. I think the best example of that is in the chapter about becoming immortal. We explore all the different ways that people have tried to reach for immortality in the past, and a lot of them feel really goofy from our current point of view. Like, there's an idea that was popular in the sixteen hundreds where if you drank all these medicines, which were really toxins, poisons, then your hair would fall out and your nails would fall off.
And we recognize this as, like, a very crude form of chemotherapy, but they saw that when the hair grew back and the nails grew back, if you survived the treatment, then you'd be reborn as a baby and you'd live for longer by surviving this really crude form of amateur chemotherapy. It feels goofy now, but at the time they were like, yeah, this might be the way we can become immortal. And sort of the conclusion of that chapter is that if you did become immortal through any sort of medical process, then that would be really bad for the world, right? Because if it's a medical way to become immortal, it's not some magical thing, it's some scientific thing you're doing to yourself, then that takes money, that takes resources, and not everyone can have it. And now you've produced a world in which you have a class of immortal humans who won't die and everyone else who will die. And that's like cartoonish levels of dystopia and inequality. And the way you get around that, perfectly, is you keep it a secret: only you become immortal, you're the only one who does it, and you're the only one who knows about it. And then you have these benefits of immortality, where you can study something for hundreds of thousands of lifetimes and the rest of the world doesn't need to suffer for it. And that feels like a benefit, but also there's clearly something villainous in becoming immortal and keeping it all for yourself. And that narrow space is what I call enlightened supervillainy. I feel like it's where a lot of the book operates and has fun.

Well, I guess I have a more general question now, Ryan, which is, what kind of inspired the book for you? Like, what made you think to write a book about how to take over the world?

That's a good question. It sort of came from me writing stories for Marvel and DC comic book adventures. And in those stories, you always have a hero fighting a villain.
That's the rule of the genre, superhero comics, and the best stories have the heroes winning at the last second, right? Like in a sports game: if your team dominates the whole match, it's not very exciting, because you know how it's going to end. But if it comes down to the last second in double overtime, that's a really good game, that's super exciting. And so I realized that we were writing these stories where the villains always lost at the last second, and what would happen if they didn't have to lose? What would happen if we structured the story so that they could win? And if we could do that in a story, what's to stop us from doing it in real life? And so part of writing the book was me trying to convince myself that this is fine, this is safe, no one is going to blow up the moon or do any crazy stuff. And part of it is, well, if someone was trying to actually pull off supervillain schemes, like if they really wanted a floating base, how close can we get? Can we do it? If they really wanted a dinosaur to ride around on, what's the state of the art for that? How close are we to de-extinctifying, which is definitely a very scientific term, these sorts of animals? And there's a lot of really interesting things happening there. So it ended up being this fun exploration of the sort of current edges of science and research, all through this lens of: I want to take over the world, like Dr. Doom. What can I do?

One thing that excited me but also terrified me was how realistic some of these schemes sort of are, especially in the category of, like, geoengineering. It's certainly possible that somebody fairly wealthy could decide they want to change the climate of the Earth and do something about it, you know, where one person makes a decision essentially for the whole planet. I don't know if you've read the recent novel by Neal Stephenson, Termination Shock, in which this is sort of explored.
Can you talk about how somebody could actually impact the climate, how an individual, fairly wealthy person could make a decision about geoengineering for the whole human population?

Yeah, that was actually one of the chapters, that was the chapter, where I was most aware of: we're all having fun here with this book for supervillains, but I want to make it clear that this is probably not something we should be doing. Don't go off and do this on your own. So the basic idea is that the Earth's climate is changing, and one way we could fix that, if we can't get rid of the carbon dioxide in the atmosphere, which causes a greenhouse effect, is to just have less sunlight hit the Earth. And if we reduce the amount of sunlight hitting the surface of the Earth by two percent, that could bring global temperatures down to pre-industrial levels. And one way you can do that is you go up to the stratosphere and you spray sulfur dioxide into it, which is white, and it reflects light back to space. Everything's fine. And then that sulfur dioxide falls down as acid rain. Don't worry about it. That sounds fine. It's probably fine. And you could do this with a fleet of not-that-strongly-modified airplanes, changed so that they can fly through the stratosphere and disperse this sulfur dioxide. And the cost for that would be about seven billion US dollars initially and then two billion dollars a year ongoing, because of course the sulfur dioxide falls down to Earth and needs to be replenished once a year. And you know, the plus side is, yes, this would reduce global temperatures down to where they were pre-industrially, but the downsides are pretty significant. It's something that needs to be maintained, so if we stopped, then suddenly you'd have two hundred years of climate change happening in one year, which would obviously be catastrophic.
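(A quick back-of-the-envelope check of the cost figures quoted above, using only the numbers mentioned in this conversation; the split between a one-time setup cost and yearly replenishment is as Ryan states it, and the totals below are just arithmetic on those figures.)

\[
  C(n) \approx \underbrace{\$7\ \text{billion}}_{\text{setup}} + \underbrace{\$2\ \text{billion/yr}}_{\text{replenishment}} \times n,
  \qquad C(1) \approx \$9\ \text{billion}, \qquad C(10) \approx \$27\ \text{billion}.
\]

(The first-year total of about nine billion dollars matches the figure Ryan cites a moment later for how much a motivated individual would need to have kicking around.)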
But of course, like, we also rely on fertilizers to feed all the humans that are alive on the planet, so there's some precedent for doing this. But the real objection I see is that if you do this, you're not just changing the global climate, you're also changing global weather, the weather patterns. And that means, you know, when a tornado strikes somewhere, we call that an act of God, because we can't control the weather. But when you have modified the weather, there are no more acts of God. There are just acts of you. And there's also the concern of, if there is to be a global thermostat, why is one person controlling it? Who's to say it can be controlled, or should be controlled, by that one person? So you get these really thorny issues of culpability, especially if, say, these tornadoes only happen to affect one part of the planet that you're not in; maybe you'll decide that's fine. Like, it gets so messy so quickly, and it doesn't actually fully address the problems with climate change. It addresses the temperature thing, but there's still extra carbon dioxide, which causes coral reef bleaching and things like that. So it's not a perfect solution, and it's not even a consequence-free solution, but it is a surprisingly achievable thing for someone with nine billion dollars kicking around to do if they were really motivated. And I don't think the answer there is to pretend it doesn't exist. I'd like for people to be aware that this is something that's out there, and this is something that someone, or a government, or a country, might try to do one day, and I wouldn't want us to be surprised by it.

Well, I'll admit that I was terrified as I was reading this, imagining, what if Jeff Bezos is reading this book? Because this is something he could do. He could, like, push a button and do this and decide, hey, you know, I'm sorry I blew away your country with tornadoes, but I decided it was for the best. It really helped
Amazon's bottom line in America, so we went for it.

All right. Well, we have lots more questions for Ryan, but first let's take a short break.

Okay, we're back, and we're talking to Ryan North, the author of the book How to Take Over the World, who really, really doesn't want you to use his book to take over the world. Or maybe, in a way, you do? You want people to be more informed and, you know, make better decisions with the book.

Yeah, I do. I think there's this interesting angle in science communication where, like, these schemes have not been fully created by me. It's just me looking around at what's there on the ground and how it can be put together. And if me, a simple cartoonist, put them together in these ways, I'm sure others have too. So I feel like, you know, let's be aware of it. Let's look at what can be done. And I also don't want to make it sound like it's all dire and bad. Like, the scheme for bringing back dinosaurs I think is a lot of fun and could be a great benefit for people. The argument there is less, you know, let's clone dinosaurs like in Jurassic Park, and more, let's look at chickens, which are distant descendants of dinosaurs, and use a bespoke developmental environment to bring out arms instead of wings, and a snout instead of a beak, and a tail instead of a rounded butt, and produce very dinosaur-like chickens, and then do the same trick with ostriches. And then there's your dinosaur to ride around on. And yes, it's not technically a dinosaur, but it looks like one, and it would probably sound like one if it worked.

But would it taste like a dinosaur?

I don't know. I think probably pretty close.

I'm gonna go on record and say I think dinosaurs are delicious.

Yeah, dinosaurs taste like chicken. I'm thinking about all sorts of tie-ins with fast food restaurants. You know, you get your Jurassic nuggets. I mean, it sounds fantastic.
I don't know why you didn't base the whole book on that. But tell us a little bit about that. You know, is Jurassic chicken actually a possibility? Is this something that, you know, scientists could actually accomplish if they decided it was a good idea?

It's interesting. There are some scientists who think it's a great idea. One in particular, Jack Horner, wrote a book called How to Build a Dinosaur, where he was like, this is what needs to be done, and we could do it. And there are other teams who have been working on this who have sort of gone up to the line and then stepped back. There was one team that was working on snouts and produced a chicken embryo in an egg that had a snout with teeth, and then they didn't allow the egg to develop, because they thought, you know, there are some definite ethical and moral implications with what we're doing, and let's avoid them by just not hatching this more dinosaur-like chicken.

Are you saying that they thought about the ethics after they had succeeded in creating a mini dinosaur? That seems a tiny bit late in the game.

I mean, better late than never.

I feel like better late than never, for sure. But the neat thing about this process is we're not genetically engineering chickens to be dinosaurs. We're just expressing different genes in their development to make them more dinosaur-like. And so if one of these chickenosauruses actually hatched, it would have the same DNA, the same genetic code, as a chicken, and if it could reproduce with another chicken, which would probably be unlikely, but if it worked, you wouldn't get more chickenosauruses. You'd just get regular chickens, because they hadn't had their developmental environment altered. So it feels safe. There's no escaping-the-park-and-taking-over-the-world angle here.

And if they do escape, like you said, there are lots of companies dedicated to breading and frying these animals for delicious consumption.
So I feel like it's like pugs, right? Like, dogs are wolves that we have bred into being something cuter and more acceptable to us. And pugs are a strong example because they have these short noses that are not great for respiration. Like, that's not what the animal would want if it was choosing for itself. But we love these animals, we care for them, we give them all the attention and care they could need so that they're not affected poorly by these snouts. So we do what we can for the animals we love. And I feel like, had we created, or if we created, a chicken dinosaur, it would absolutely be one of the most cared-for and celebrated animals in the world, and they would have a pretty good life.

I'm still stuck on the idea of a dinosaur giving birth to a chicken, and the trauma for that poor chicken that has to have a dinosaur mom. Like, you know when your parents look different from all the other parents? Like, wow, that's gonna be a difficult chicken childhood there.

I mean, the ugly duckling.

Still, isn't that the one where it's happy in the end, or...?

No, the ugly duckling, the duckling grows up to be a swan and then, like, beats up the other ducks, right? It's a really mean fairy tale, as I recall.

All right. Well, Daniel and I really love your book, Ryan. But one thing that is pretty cool is that it's for a wide range of people. So the book was a big hit in my house, it was a big hit in Daniel's house, and our kids also got into the book. So we have a couple of questions from Hazel, Daniel's daughter, and a couple of questions from my son.

Sure, this is great.

All right. So here's a question from my daughter, Hazel.
If, hypothetically, a twelve-year-old wanted to take over the world and didn't have access to a lot of resources other than her dad's particle accelerator, what would you recommend? Hypothetically.

Hypothetically, sympathetically.

That is a great question, Hazel. I would say that you're in a great position, given this access to a particle accelerator. I feel like I'm not really best qualified to tell you what you can use a particle accelerator for in terms of world domination. But with a particle accelerator and a very credulous father, I think you could trick him into doing stuff. Just ask him, like, you know, Dad, what should I definitely never use that particle accelerator for? Because I want to be super safe. And he'll tell you, and then, you know, do it.

And then he'll say, go ask your mother. If Mom says it's okay to make a black hole, then I guess it's fine.

I think there's great deniability here, because you can Google what not to do with a particle accelerator and just tell the FBI, like, what? I was trying to be safe.

That actually brings me to my son's question. So my son really liked your book. He read it, and he wants to know how many times you used Google to get all the information for the book. I think he was so impressed by all the information in it. He was like, how do you even get all this stuff?

Yeah, that's a great question. I think anyone these days uses Google. But it's not like you can just type in, what should I put in a book called How to Take Over the World? PS, it's an emergency, and have it work. So what I do, and I think what all of us do going through the world, is we're always keeping our eyes peeled for interesting things and remembering things that surprised us or that we thought were unexpected.
And so when you sit down to write, you have all this stuff you remember, of, oh, you know, I think I've read somewhere that plastics aren't eaten by anything alive in the world, and so they last a really long time. And then you google that and you realize, oh no, there was an animal discovered in the early two thousands that actually can eat plastic, but still, most animals don't, so it's a really long-lasting, non-biodegradable product in certain scenarios. Then you say, well, if nothing eats plastic on the ocean floor, then you could take an engraved hunk of plastic and put it in the ocean and it will last ten thousand years. And you know what, if you put that maybe near the Mariana Trench, because that's the deepest part of the ocean, so it's naturally interesting, just like Mount Everest is naturally interesting for us, then that might give it a better chance of being found in that time period. And then you google how fast the Mariana Trench plate is being subducted, and you realize, oh, it's fifty-three millimeters a year. So you put it like five kilometers away in the right direction, and it'll be right at the most interesting point on the planet at the time period we want. And then you've got a scheme for sending a villainous message centuries, thousands of years, into the future. I think it all starts from just being curious about the world around you and remembering the stuff you find interesting, and then you can put it together in different ways, like little villainous Lego blocks.

That sounds like the answer is Google.

But I think he's saying it's how you use Google, right? Like, if you use it in the right way, you can get to the interesting places.

Every nonfiction book these days is curated Google and Wikipedia.

Wikipedia, actually? It's surprising. I wish it were, because that would make it so much easier.
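(A quick check of the drift arithmetic in the Mariana Trench scheme above, taking the figures as transcribed: a plate motion of 53 millimeters per year and a placement about five kilometers away. The intended time horizon is not stated explicitly in this exchange, so the value below is simply what those two numbers imply, roughly a hundred thousand years, while a ten-thousand-year horizon would correspond to only about half a kilometer of drift.)

\[
  t \approx \frac{d}{v} = \frac{5\ \text{km}}{53\ \text{mm/yr}}
          = \frac{5\times 10^{6}\ \text{mm}}{53\ \text{mm/yr}}
          \approx 9.4\times 10^{4}\ \text{years};
  \qquad
  53\ \text{mm/yr} \times 10{,}000\ \text{yr} \approx 0.53\ \text{km}.
\]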
There's still lots of information in books that you can't find online, and you can't just search for what you want. I do love that a lot of the older, out-of-copyright books are online, so you can find, like, for the chapter on immortality, I found all these schemes from the mid-fifteen hundreds onward that were written down in these books and published and forgotten, but then someone scanned them and put them online. I can read these original ideas of, here's how you make this juice that makes you immortal. And you look at the recipe from our modern era and you're like, this is ludicrous. This will, at best, not kill you. Or come close.

Well, technically, not killing you is extending your life.

Yeah. I guess if you continue to not be killed for long enough, then you can live forever.

That's the trick: just avoid the hoaxy cures and you might live forever. Speaking of cures, my daughter had this moment where Hazel realized the power of Google. We were watching this show All Creatures Great and Small on PBS, this wonderful show, but he's trying to save some cow and he's looking through books for an answer. He's, like, desperately doing research all night long, and Hazel's like, wow, too bad he doesn't just have Google. And I think it, like, clicked for her, the power of being able to search through all these texts simultaneously to find an answer. I want to come back to something you talked about earlier, about sending a message into the future. You know, why do you think this is such a powerful theme in supervillainy, being immortal or not being forgotten? Why is it such an important theme? And what are some of the schemes you have in your book for not being forgotten?

Well, it ties into ego, right? Like, the best villains are always super egotistical. Your Doctor Dooms and Lex Luthors are very convinced that they are the best humanity has to offer.
But since they are human, they're going 575 00:29:11,560 --> 00:29:15,240 Speaker 1: to die one day and that feels unfair to them, 576 00:29:15,240 --> 00:29:18,240 Speaker 1: and they want to either cheat death by becoming immortal, or 577 00:29:18,320 --> 00:29:20,600 Speaker 1: at least ensure that they're never forgotten and find at 578 00:29:20,640 --> 00:29:24,360 Speaker 1: least some flavor of immortality. It's fascinating how often 579 00:29:24,400 --> 00:29:26,880 Speaker 1: this idea shows up across human cultures, and you look 580 00:29:26,920 --> 00:29:29,520 Speaker 1: at it just from like first principles, that makes sense, right, 581 00:29:29,600 --> 00:29:35,600 Speaker 1: like humans begin as an egg and sperm cell, and 582 00:29:35,640 --> 00:29:38,520 Speaker 1: then from those two pieces we build up an entire 583 00:29:38,600 --> 00:29:42,080 Speaker 1: baby in nine months, which is incredible. It's 584 00:29:42,080 --> 00:29:47,000 Speaker 1: a magic trick. And then somehow just maintaining that body 585 00:29:47,160 --> 00:29:50,840 Speaker 1: is what kills us. We can build a human being 586 00:29:50,960 --> 00:29:53,520 Speaker 1: from two cells in nine months, and then just keeping 587 00:29:53,560 --> 00:29:57,800 Speaker 1: it around is fatal. That feels like there's something wrong there, 588 00:29:57,880 --> 00:29:59,760 Speaker 1: like there must be some mistake and we can fix this. 589 00:29:59,840 --> 00:30:02,240 Speaker 1: So I get the impulse for why you'd want to 590 00:30:02,240 --> 00:30:04,680 Speaker 1: be immortal, and when that fails, I absolutely understand this 591 00:30:04,760 --> 00:30:07,160 Speaker 1: idea of, well, let's at least make sure people 592 00:30:07,200 --> 00:30:09,280 Speaker 1: always remember me. I want to send a message to 593 00:30:09,280 --> 00:30:11,480 Speaker 1: the future so that they know I was here, and I 594 00:30:11,480 --> 00:30:14,160 Speaker 1: feel like that's also very very common in humans, this idea 595 00:30:14,280 --> 00:30:16,720 Speaker 1: of we'll build something so that people know we were here. 596 00:30:16,880 --> 00:30:19,200 Speaker 1: And in the book, I look at different time ranges, 597 00:30:19,240 --> 00:30:22,480 Speaker 1: so one year, ten years, a hundred years, a thousand years, 598 00:30:22,640 --> 00:30:26,560 Speaker 1: and as we go to larger and larger periods of time, 599 00:30:26,640 --> 00:30:29,160 Speaker 1: we start getting greater and greater problems. Right? And when we're 600 00:30:29,240 --> 00:30:32,000 Speaker 1: looking at, uh, ten thousand years into the future, it's 601 00:30:32,040 --> 00:30:35,080 Speaker 1: not impossible to get an object to survive that long. 602 00:30:35,200 --> 00:30:40,080 Speaker 1: But the fact is that no human language has survived 603 00:30:40,120 --> 00:30:43,640 Speaker 1: that long. And so you start getting into, well, maybe 604 00:30:43,640 --> 00:30:47,320 Speaker 1: we can use a subset of words that we believe 605 00:30:47,560 --> 00:30:51,120 Speaker 1: evolved more slowly than other words, so very common things 606 00:30:51,200 --> 00:30:54,840 Speaker 1: like face and numbers and letters. But even that 607 00:30:54,880 --> 00:30:56,600 Speaker 1: doesn't get you far enough, and you're like, well, maybe 608 00:30:56,640 --> 00:31:00,040 Speaker 1: we can use symbols and communicate with the universal 609 00:31:00,120 --> 00:31:03,880 Speaker 1: language of pictures. Comics.
Yeah. But the problem with 610 00:31:03,920 --> 00:31:06,720 Speaker 1: comics is that even they are culturally interpreted, like a skull 611 00:31:06,760 --> 00:31:09,640 Speaker 1: and crossbones we would see as being a symbol of death, 612 00:31:10,000 --> 00:31:12,960 Speaker 1: or maybe it's piracy, or to a medieval alchemist, it's 613 00:31:13,000 --> 00:31:17,560 Speaker 1: the bones of Adam that promise eternal resurrection, and with comics, 614 00:31:17,560 --> 00:31:19,480 Speaker 1: are you reading them left to right or right to left? 615 00:31:19,520 --> 00:31:22,400 Speaker 1: Like, we don't know the culture that these symbols 616 00:31:22,440 --> 00:31:25,719 Speaker 1: will be interpreted in, so we can't, actually. They're not 617 00:31:26,040 --> 00:31:28,320 Speaker 1: nearly as reliable as we'd like. We need like 618 00:31:28,360 --> 00:31:31,720 Speaker 1: an IKEA manual for how to read the comics. People 619 00:31:31,760 --> 00:31:34,640 Speaker 1: have tried it. People have, because that's such a tantalizing idea, right? 620 00:31:34,640 --> 00:31:36,400 Speaker 1: If we can come up with a symbol, a set 621 00:31:36,400 --> 00:31:39,800 Speaker 1: of symbols that any human can look at and understand, 622 00:31:40,080 --> 00:31:43,479 Speaker 1: then we've got a universal language. And especially in the seventies, 623 00:31:43,480 --> 00:31:45,320 Speaker 1: you see a lot of efforts to try to build this 624 00:31:45,480 --> 00:31:48,400 Speaker 1: universal pictorial language. And as soon as you get to 625 00:31:48,480 --> 00:31:51,680 Speaker 1: something even a little bit complicated, you start making these 626 00:31:51,680 --> 00:31:57,200 Speaker 1: assumptions that a circle is good and a square is bad, 627 00:31:57,280 --> 00:31:59,760 Speaker 1: or green means go and red means stop, and all 628 00:31:59,800 --> 00:32:02,440 Speaker 1: these things that you can't actually bake in. So it 629 00:32:02,680 --> 00:32:04,960 Speaker 1: starts to get very very challenging. But when you're looking 630 00:32:05,000 --> 00:32:07,800 Speaker 1: at say a hundred thousand years or even a million 631 00:32:07,920 --> 00:32:10,640 Speaker 1: years in the future, at that point, all you can 632 00:32:10,680 --> 00:32:12,800 Speaker 1: do is leave the Earth and be like, you know what, 633 00:32:13,320 --> 00:32:15,920 Speaker 1: we're gonna put a satellite in orbit, like NASA did 634 00:32:16,000 --> 00:32:19,240 Speaker 1: with LAGEOS, which is a million dollar satellite launched in 635 00:32:19,240 --> 00:32:22,800 Speaker 1: nineteen seventy six into a polar orbit that's expected to last about a million 636 00:32:22,880 --> 00:32:25,720 Speaker 1: years before it degrades. And so you could launch a 637 00:32:25,800 --> 00:32:29,280 Speaker 1: very similar satellite with your message on it, and maybe 638 00:32:29,360 --> 00:32:31,280 Speaker 1: it won't be understood, but the fact that it made 639 00:32:31,760 --> 00:32:35,000 Speaker 1: it a million years into the future, that's remarkable on 640 00:32:35,040 --> 00:32:37,880 Speaker 1: its own. Are you saying that NASA launched a satellite 641 00:32:38,000 --> 00:32:40,120 Speaker 1: just to send a message to the future? Is that the 642 00:32:40,160 --> 00:32:43,440 Speaker 1: primary purpose of this satellite, or is it an auxiliary purpose? 643 00:32:43,560 --> 00:32:45,520 Speaker 1: It's an auxiliary purpose. I wish they'd done it on purpose. 644 00:32:45,640 --> 00:32:49,200 Speaker 1: So this satellite was initially used to measure continental drift.
645 00:32:49,240 --> 00:32:51,800 Speaker 1: It's basically, imagine a giant golf ball, and if you 646 00:32:51,840 --> 00:32:54,920 Speaker 1: fire a laser at this satellite, it reflects it back 647 00:32:54,960 --> 00:32:56,959 Speaker 1: to you and you can use that to measure distance 648 00:32:57,040 --> 00:32:59,240 Speaker 1: very accurately, which you can then use to measure the 649 00:32:59,440 --> 00:33:02,000 Speaker 1: very slight movements of the continents. And so it was 650 00:33:02,160 --> 00:33:05,320 Speaker 1: used to sort of nail down this newer theory of 651 00:33:05,360 --> 00:33:07,720 Speaker 1: continental drift and actually get measurements for it, and it worked. 652 00:33:08,480 --> 00:33:11,040 Speaker 1: And it was only when they were launching it, 653 00:33:11,040 --> 00:33:13,480 Speaker 1: as they were launching it, that they realized, 654 00:33:13,680 --> 00:33:15,800 Speaker 1: you know, if this stays up for this long, this 655 00:33:15,880 --> 00:33:19,760 Speaker 1: is a chance to talk to life on Earth millions 656 00:33:19,800 --> 00:33:22,640 Speaker 1: of years in the future if it's understood. And so 657 00:33:22,840 --> 00:33:25,600 Speaker 1: what they did, trying to figure out this problem of language: 658 00:33:25,640 --> 00:33:28,400 Speaker 1: they did the numbers one to ten in binary to 659 00:33:28,400 --> 00:33:30,440 Speaker 1: show that we knew what numbers were, and then there 660 00:33:30,440 --> 00:33:34,040 Speaker 1: were three pictures of what Earth looked like around eight 661 00:33:34,040 --> 00:33:37,000 Speaker 1: million years ago, we think, from continental drift, what it 662 00:33:37,040 --> 00:33:39,480 Speaker 1: looked like when it was launched, and then what it 663 00:33:39,560 --> 00:33:41,600 Speaker 1: might look like around eight million years in the future. 664 00:33:41,880 --> 00:33:43,840 Speaker 1: And that was basically a little more than a guess, 665 00:33:43,880 --> 00:33:46,760 Speaker 1: because we hadn't yet precisely measured how the continents were moving. 666 00:33:47,280 --> 00:33:49,880 Speaker 1: But it at least says, I can give you some 667 00:33:49,960 --> 00:33:53,000 Speaker 1: idea of about when this came from, and 668 00:33:53,000 --> 00:33:55,400 Speaker 1: about what we were trying to do, if 669 00:33:55,440 --> 00:33:57,440 Speaker 1: you know what the Earth looks like. But that's a 670 00:33:57,440 --> 00:33:59,400 Speaker 1: pretty big if, right, if the Earth even looks like 671 00:33:59,440 --> 00:34:02,920 Speaker 1: what we think it will look like. So you don't have 672 00:34:03,000 --> 00:34:06,920 Speaker 1: anything guaranteed. But I think the fact that it's possible 673 00:34:06,960 --> 00:34:11,400 Speaker 1: to try and to maybe succeed is absolutely wild, right? 674 00:34:11,520 --> 00:34:13,080 Speaker 1: I think if humans look at that thing in a 675 00:34:13,120 --> 00:34:15,120 Speaker 1: million years, there would be a big group of people 676 00:34:15,160 --> 00:34:17,440 Speaker 1: who think it's a fake. It must be a hoax. 677 00:34:17,600 --> 00:34:20,439 Speaker 1: There's no way NASA actually did that a million years ago. 678 00:34:21,160 --> 00:34:23,719 Speaker 1: Well then, for a hundred million years, what I 679 00:34:23,760 --> 00:34:28,360 Speaker 1: suggest is based on this satellite, the EchoStar sixteen. 680 00:34:28,680 --> 00:34:32,239 Speaker 1: There's a type of orbit called the graveyard orbit.
681 00:34:32,320 --> 00:34:34,440 Speaker 1: So if you're in geosynchronous orbit, it's really useful for 682 00:34:34,480 --> 00:34:37,560 Speaker 1: satellites because you're always in the same place in the sky. 683 00:34:37,640 --> 00:34:41,880 Speaker 1: But when the satellites exceed their useful lifetime, they're pushed 684 00:34:41,960 --> 00:34:45,719 Speaker 1: up about three hundred kilometers higher to what's called the graveyard orbit, 685 00:34:45,719 --> 00:34:51,400 Speaker 1: where they just orbit in this graveyard indefinitely. And on 686 00:34:51,440 --> 00:34:55,560 Speaker 1: this EchoStar sixteen satellite, this artist, Trevor Paglen, talked 687 00:34:55,640 --> 00:34:58,680 Speaker 1: them into letting him put on a silicon disc that 688 00:34:58,760 --> 00:35:01,760 Speaker 1: had a hundred different pictures of Earth, portraits of humanity, 689 00:35:01,760 --> 00:35:05,320 Speaker 1: he called it. And we're all familiar with the Voyager Record, 690 00:35:05,320 --> 00:35:08,880 Speaker 1: which had these images of humans and humanity chosen by NASA, 691 00:35:08,880 --> 00:35:10,640 Speaker 1: and they were all very... It was like Earth on 692 00:35:10,640 --> 00:35:12,520 Speaker 1: a good day, right? These are the images we'd put in our 693 00:35:12,600 --> 00:35:15,080 Speaker 1: dating profile if we were a planet or a species 694 00:35:15,520 --> 00:35:19,560 Speaker 1: looking to meet other species. But what Trevor chose was, 695 00:35:19,920 --> 00:35:23,000 Speaker 1: I thought, very interesting, because he just chose images that 696 00:35:23,040 --> 00:35:25,080 Speaker 1: showed all these different aspects of Earth. There was a 697 00:35:25,120 --> 00:35:29,880 Speaker 1: screenshot of the text adventure game Zork. There was what 698 00:35:29,920 --> 00:35:32,040 Speaker 1: you might expect, like pictures of beautiful buildings and stuff, 699 00:35:32,080 --> 00:35:35,320 Speaker 1: but also here's a picture of a factory farm. Here's 700 00:35:35,360 --> 00:35:38,200 Speaker 1: a picture of a predator drone taken from the ground 701 00:35:38,239 --> 00:35:43,759 Speaker 1: in Iran. Here's pictures of children who were born with 702 00:35:43,960 --> 00:35:46,920 Speaker 1: deformities caused by Agent Orange, like stuff that we wouldn't 703 00:35:46,920 --> 00:35:49,920 Speaker 1: normally want to remember, he also put on this satellite, 704 00:35:49,960 --> 00:35:52,080 Speaker 1: where it might last a hundred million years or more. 705 00:35:52,280 --> 00:35:55,279 Speaker 1: And I actually got to speak to him, and I 706 00:35:55,320 --> 00:35:57,360 Speaker 1: was saying, well, what do you think this means? And 707 00:35:57,400 --> 00:36:00,839 Speaker 1: he was saying, Look, there's all these forces on Earth 708 00:36:00,880 --> 00:36:03,520 Speaker 1: that we can't control, that are beyond us, that no 709 00:36:03,560 --> 00:36:07,240 Speaker 1: one human can influence. But that doesn't mean you 710 00:36:07,400 --> 00:36:11,319 Speaker 1: don't have to try or don't have to participate. And 711 00:36:11,480 --> 00:36:14,080 Speaker 1: he recognized that if these were ever found, which is 712 00:36:14,800 --> 00:36:18,719 Speaker 1: not super likely, it's even less likely that they'd be understood. 713 00:36:19,680 --> 00:36:23,880 Speaker 1: But just because it wasn't likely didn't mean he could 714 00:36:23,880 --> 00:36:26,759 Speaker 1: stop himself from trying.
And he wanted to put art 715 00:36:26,840 --> 00:36:28,759 Speaker 1: into orbit where it might be seen a hundred million 716 00:36:28,800 --> 00:36:31,520 Speaker 1: years from now. And I think that is part of 717 00:36:31,520 --> 00:36:33,319 Speaker 1: the beauty of it too. Like, as much as 718 00:36:33,440 --> 00:36:36,920 Speaker 1: his project or the Voyager Record is us trying 719 00:36:36,960 --> 00:36:39,560 Speaker 1: to speak to the future, trying to speak to distant 720 00:36:40,160 --> 00:36:43,440 Speaker 1: aliens or anything, it's also us speaking to ourselves, right? 721 00:36:43,480 --> 00:36:46,640 Speaker 1: It's humans speaking to humanity and saying, some of what 722 00:36:46,719 --> 00:36:51,840 Speaker 1: we've done here might survive, some of it might outlast 723 00:36:51,880 --> 00:36:55,280 Speaker 1: all of us. And I think that is really comforting 724 00:36:55,320 --> 00:36:57,879 Speaker 1: in a way. It's inspiring, comforting. It makes me feel like, 725 00:36:58,400 --> 00:37:03,000 Speaker 1: no matter what, there's still this Voyager Record, there's still 726 00:37:03,040 --> 00:37:05,719 Speaker 1: these portraits of humanity on the EchoStar sixteen satellite 727 00:37:05,760 --> 00:37:09,040 Speaker 1: that will be there, probably long after all of us 728 00:37:09,040 --> 00:37:12,040 Speaker 1: are gone, and will at least have something to say about 729 00:37:12,080 --> 00:37:13,640 Speaker 1: who we were and what we were doing. Yeah, and 730 00:37:13,719 --> 00:37:15,439 Speaker 1: I think one of my favorite parts of your book 731 00:37:15,760 --> 00:37:18,600 Speaker 1: are these questions that really do touch everybody. I mean, 732 00:37:18,640 --> 00:37:21,640 Speaker 1: not everybody necessarily wants to be a geo engineer, but 733 00:37:21,960 --> 00:37:25,719 Speaker 1: everybody thinks about death and mortality and whether they'll be remembered. 734 00:37:25,880 --> 00:37:27,640 Speaker 1: And so one of the things you talk about in 735 00:37:27,640 --> 00:37:29,640 Speaker 1: the book is how to be remembered, how to leave 736 00:37:29,719 --> 00:37:32,240 Speaker 1: a, you know, a statue of yourself or a message 737 00:37:32,239 --> 00:37:34,720 Speaker 1: that might be remembered. But you actually also talk about 738 00:37:35,000 --> 00:37:38,880 Speaker 1: literally curing aging. There's this quote where you 739 00:37:38,880 --> 00:37:41,160 Speaker 1: talk about a scientist who says that they can cure 740 00:37:41,239 --> 00:37:44,520 Speaker 1: aging in lab mice in ten years and in humans 741 00:37:44,600 --> 00:37:46,560 Speaker 1: within just a decade or so more. Tell us a 742 00:37:46,560 --> 00:37:49,040 Speaker 1: little bit about the trajectory there. Are we on the 743 00:37:49,160 --> 00:37:51,960 Speaker 1: verge of curing aging? Haven't they been saying that for 744 00:37:52,000 --> 00:37:55,319 Speaker 1: the last thirty years? There are absolutely people who will 745 00:37:55,320 --> 00:37:58,480 Speaker 1: tell you that the first immortal human is already born, 746 00:37:58,560 --> 00:38:00,960 Speaker 1: already walking around alive, and just doesn't know 747 00:38:01,000 --> 00:38:03,600 Speaker 1: it yet. I, in the book and in real life, 748 00:38:03,680 --> 00:38:07,320 Speaker 1: take a more skeptical approach to that, usually.
I feel 749 00:38:07,320 --> 00:38:10,360 Speaker 1: like these shots at immortality, things like cryonics or 750 00:38:10,440 --> 00:38:13,040 Speaker 1: uploading your brain to a computer, as soon as you 751 00:38:13,080 --> 00:38:15,200 Speaker 1: look at them in any sort of detail, they kind 752 00:38:15,200 --> 00:38:18,720 Speaker 1: of really fall apart. Cryonics, especially, where it's this idea 753 00:38:18,800 --> 00:38:21,840 Speaker 1: that if you freeze yourself and keep your body frozen, 754 00:38:22,000 --> 00:38:25,279 Speaker 1: you'll be thawed out and rejuvenated in the future. And 755 00:38:25,320 --> 00:38:28,480 Speaker 1: it sounds great until you realize that, Okay, well you 756 00:38:28,520 --> 00:38:32,359 Speaker 1: have to not only cure whatever disease you were dying from, 757 00:38:32,400 --> 00:38:34,360 Speaker 1: but also cure it if it's advanced so much that 758 00:38:34,400 --> 00:38:38,040 Speaker 1: it's literally already killed you. And you have to keep 759 00:38:38,080 --> 00:38:40,799 Speaker 1: this body frozen for so long. And it feels like, 760 00:38:40,960 --> 00:38:43,399 Speaker 1: is there anything we can point to that says this 761 00:38:43,480 --> 00:38:46,680 Speaker 1: is possible? And the closest example I could find was 762 00:38:46,719 --> 00:38:50,239 Speaker 1: this practice of chantry in medieval times, where when you died, 763 00:38:50,280 --> 00:38:51,840 Speaker 1: if you're rich, you'd pay people to sing for your 764 00:38:51,880 --> 00:38:55,600 Speaker 1: immortal soul to get you into heaven. And this evolved 765 00:38:55,600 --> 00:38:58,359 Speaker 1: into this thing called perpetual chantry, where you'd give the 766 00:38:58,440 --> 00:39:01,200 Speaker 1: church land and they would charge rent, and that rent 767 00:39:01,239 --> 00:39:02,759 Speaker 1: on the land would pay for someone to sing for 768 00:39:02,800 --> 00:39:06,600 Speaker 1: your immortal soul forever, so you'd be definitely getting into heaven. 769 00:39:07,200 --> 00:39:10,200 Speaker 1: And this is sort of the same scheme as cryonics, where 770 00:39:10,239 --> 00:39:14,080 Speaker 1: you make the living do something to keep the dead around, 771 00:39:14,920 --> 00:39:18,239 Speaker 1: and it has the advantage over cryonics that all you had 772 00:39:18,280 --> 00:39:21,080 Speaker 1: to do was sing and pray. And it still lasted 773 00:39:21,160 --> 00:39:23,759 Speaker 1: less than four hundred years until a king was like, you 774 00:39:23,800 --> 00:39:26,399 Speaker 1: know what, I'm taking this land. It's mine now. This 775 00:39:26,480 --> 00:39:29,320 Speaker 1: is over. So I don't see it as very likely. 776 00:39:29,360 --> 00:39:31,479 Speaker 1: But the quote you were mentioning was from Dr 777 00:39:31,480 --> 00:39:34,920 Speaker 1: Aubrey de Grey, who believes that there are treatments just 778 00:39:34,960 --> 00:39:39,480 Speaker 1: around the corner that could make effectively immortal humans. And 779 00:39:39,520 --> 00:39:41,480 Speaker 1: when I say that, I mean not that they won't die, 780 00:39:41,600 --> 00:39:44,200 Speaker 1: but just that they won't die from what we normally call 781 00:39:44,480 --> 00:39:47,319 Speaker 1: old age. Accidents might do them in, and on a 782 00:39:47,360 --> 00:39:51,640 Speaker 1: long enough timeline that is probably true. We're driving around in cars, 783 00:39:51,640 --> 00:39:55,399 Speaker 1: one's gotta hit someone.
But he believes that it might 784 00:39:55,440 --> 00:40:00,080 Speaker 1: be possible to cure aging. Not by figuring out what 785 00:40:00,239 --> 00:40:02,360 Speaker 1: aging is, because we don't really know what it is, but 786 00:40:02,440 --> 00:40:04,440 Speaker 1: just by figuring out how to address the symptoms of it. 787 00:40:04,640 --> 00:40:08,520 Speaker 1: So if there's a problem with tissues becoming inflexible, well, 788 00:40:08,560 --> 00:40:10,520 Speaker 1: let's find a way to make them flexible. If there's 789 00:40:10,520 --> 00:40:14,240 Speaker 1: a problem with cancer killing people, well then let's solve 790 00:40:14,320 --> 00:40:18,040 Speaker 1: this problem of cancer. And the way he suggests is basically, 791 00:40:18,200 --> 00:40:23,200 Speaker 1: cancer happens when cells divide without limit. They just keep dividing, 792 00:40:23,680 --> 00:40:27,239 Speaker 1: and they can do that because stem cells have this 793 00:40:28,360 --> 00:40:31,560 Speaker 1: thing called telomerase, which rebuilds the telomeres in the cell so it can 794 00:40:31,640 --> 00:40:36,839 Speaker 1: reproduce indefinitely. So telomeres are repetitive parts at the end 795 00:40:36,840 --> 00:40:39,600 Speaker 1: of chromosomes, the end of the DNA, that shorten every time a 796 00:40:39,640 --> 00:40:42,600 Speaker 1: cell divides, and so there's a limit on how many 797 00:40:43,080 --> 00:40:45,520 Speaker 1: times a cell can divide. But if you have telomerase, like 798 00:40:45,560 --> 00:40:48,880 Speaker 1: stem cells do, then you can do this indefinitely. 799 00:40:48,920 --> 00:40:53,440 Speaker 1: So he proposes, let's remove the ability to produce telomerase 800 00:40:53,560 --> 00:40:57,120 Speaker 1: from every cell in the body, which effectively sterilizes 801 00:40:57,160 --> 00:40:59,680 Speaker 1: every cell so it can no longer reproduce, or can only 802 00:40:59,680 --> 00:41:02,200 Speaker 1: reproduce a finite number of times, and then it will 803 00:41:02,200 --> 00:41:05,520 Speaker 1: have to die. And then to prevent this from being fatal, 804 00:41:05,560 --> 00:41:09,600 Speaker 1: he suggests we can genetically engineer special stem cells with 805 00:41:09,719 --> 00:41:13,080 Speaker 1: really long telomeres, so they can't produce telomerase, but they 806 00:41:13,080 --> 00:41:16,200 Speaker 1: can divide enough to last, say, ten years, and then 807 00:41:16,200 --> 00:41:19,879 Speaker 1: every ten years you get a new injection. And this 808 00:41:20,000 --> 00:41:23,760 Speaker 1: is wild, right? Like this is effectively saying, I will 809 00:41:24,480 --> 00:41:27,600 Speaker 1: kill my body's ability to reproduce at the cellular level, 810 00:41:28,080 --> 00:41:31,719 Speaker 1: but get topped up with cellular gasoline every couple of 811 00:41:31,840 --> 00:41:36,000 Speaker 1: years to keep myself alive. And you know, in theory, 812 00:41:36,320 --> 00:41:39,640 Speaker 1: maybe it could work. In practice, there's an awful lot 813 00:41:39,680 --> 00:41:42,200 Speaker 1: of very complicated things that we've just glossed over there 814 00:41:43,160 --> 00:41:44,799 Speaker 1: in an effort to, at the end of the day, 815 00:41:44,880 --> 00:41:49,360 Speaker 1: just make an individual live forever. And I find it 816 00:41:49,560 --> 00:41:54,680 Speaker 1: not a very convincing argument that an individual should live forever.
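To make the scheme concrete, here is a minimal toy model of it as described: ordinary divisions burn down a telomere "counter," engineered long-telomere cells buy a fixed number of divisions, and the body needs a fresh injection once they run out. All of the numbers below are invented purely for illustration.

```python
# Toy model of the telomere top-up scheme described above. Every division uses
# up telomere "units"; without telomerase the lineage eventually stops dividing,
# so the plan relies on periodic injections of engineered long-telomere cells.
# The unit counts and division rate below are invented for illustration only.

def divisions_remaining(telomere_units: int, cost_per_division: int = 1) -> int:
    """How many more divisions a lineage gets before its telomeres run out."""
    return telomere_units // cost_per_division

def years_per_injection(telomere_units: int, divisions_per_year: int) -> float:
    """Roughly how long one batch of engineered cells lasts."""
    return divisions_remaining(telomere_units) / divisions_per_year

engineered_units = 500      # hypothetical extra-long telomeres, no telomerase
divisions_per_year = 50     # hypothetical division rate for the tissue

print(f"One injection lasts about {years_per_injection(engineered_units, divisions_per_year):.0f} years,")
print("after which the cells stop dividing and you need the next top-up.")
```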
817 00:41:55,200 --> 00:41:57,399 Speaker 1: I think there's lots of individuals, you look at your 818 00:41:57,440 --> 00:42:01,839 Speaker 1: Genghis Khans, your Hitlers, where it's a bad thing if 819 00:42:01,840 --> 00:42:03,440 Speaker 1: they were to live forever. And the fact that we all 820 00:42:03,520 --> 00:42:06,279 Speaker 1: do die has some really positive things for society. And 821 00:42:06,680 --> 00:42:09,200 Speaker 1: it's what encourages billionaires to be philanthropic, to give 822 00:42:09,239 --> 00:42:10,799 Speaker 1: money away towards the end of their lives and try 823 00:42:10,840 --> 00:42:14,680 Speaker 1: to get some sort of better angle for posterity, because 824 00:42:14,680 --> 00:42:17,239 Speaker 1: they can't take it with them. But you can't take 825 00:42:17,239 --> 00:42:18,839 Speaker 1: it with you doesn't mean a lot if you never 826 00:42:18,880 --> 00:42:21,399 Speaker 1: have to go, right? So, writing this book, 827 00:42:21,400 --> 00:42:24,920 Speaker 1: I've kind of come down surprisingly in favor of death, 828 00:42:25,360 --> 00:42:27,960 Speaker 1: and I'm now a person who thinks death is good for 829 00:42:28,120 --> 00:42:30,640 Speaker 1: society and civilization as a whole. You're starting to sound 830 00:42:30,640 --> 00:42:32,319 Speaker 1: like a supervillain. Right, this is the beginning. Like, 831 00:42:32,320 --> 00:42:34,560 Speaker 1: this whole thing has been a villainous monologue 832 00:42:34,600 --> 00:42:37,200 Speaker 1: where I start saying death is good and we should 833 00:42:37,200 --> 00:42:40,480 Speaker 1: embrace it. You sound like the speech by Thanos in 834 00:42:40,520 --> 00:42:42,719 Speaker 1: the Avengers movie. He's going to be a guest next 835 00:42:42,719 --> 00:42:45,960 Speaker 1: week on the podcast. So that was a very long 836 00:42:45,960 --> 00:42:48,000 Speaker 1: answer to a very simple question. But I think it's 837 00:42:48,040 --> 00:42:50,759 Speaker 1: such a fascinating topic and it gets into 838 00:42:50,800 --> 00:42:56,239 Speaker 1: these deeper questions of like what life is and can 839 00:42:56,400 --> 00:42:58,799 Speaker 1: death have an upside? Right? Like I don't want to die. 840 00:42:58,840 --> 00:43:00,680 Speaker 1: I don't want anyone I love to die. But at 841 00:43:00,680 --> 00:43:03,879 Speaker 1: a larger level, I get it, right? Like I see 842 00:43:03,880 --> 00:43:06,680 Speaker 1: the benefits death gives to society and to our species 843 00:43:06,719 --> 00:43:08,920 Speaker 1: as a whole. It would be fascinating also if you 844 00:43:08,920 --> 00:43:11,200 Speaker 1: then had an option, if some people could say, 845 00:43:11,200 --> 00:43:12,640 Speaker 1: you know what, I just want to live eighty years 846 00:43:12,760 --> 00:43:14,479 Speaker 1: or a hundred years and I'm done, and other people 847 00:43:14,480 --> 00:43:16,359 Speaker 1: are like, no, I'm gonna do three hundred, I'm gonna 848 00:43:16,400 --> 00:43:19,040 Speaker 1: do ten thousand, or I'm just gonna go forever. It 849 00:43:19,239 --> 00:43:22,120 Speaker 1: would really make for a really fascinating sort of segregated or 850 00:43:22,160 --> 00:43:24,920 Speaker 1: stratified society if that happened.
But my real question is, 851 00:43:25,080 --> 00:43:28,080 Speaker 1: in this scenario, is this one doctor the sole source 852 00:43:28,320 --> 00:43:31,000 Speaker 1: for this sort of cellular gasoline, who then eventually becomes 853 00:43:31,040 --> 00:43:37,919 Speaker 1: a gazillionaire because everybody's reliant on his one pipeline of regeneration? Yeah, 854 00:43:37,960 --> 00:43:40,640 Speaker 1: and my pitch for this is you do it yourself, 855 00:43:41,239 --> 00:43:44,000 Speaker 1: and you split it up amongst different scientists or workers 856 00:43:44,000 --> 00:43:46,240 Speaker 1: so that no one can quite put together all the pieces, 857 00:43:46,880 --> 00:43:49,000 Speaker 1: and then you're the only one who does it, so 858 00:43:50,480 --> 00:43:52,399 Speaker 1: you can become rich if you want. But I would 859 00:43:52,480 --> 00:43:55,080 Speaker 1: encourage you more just to use this time you have, 860 00:43:55,239 --> 00:43:58,759 Speaker 1: this now effectively unlimited time, to do whatever you want 861 00:43:58,800 --> 00:44:01,120 Speaker 1: to do, things that you can't accomplish in one human lifetime. 862 00:44:01,160 --> 00:44:04,840 Speaker 1: Like, I know I love linguistics, but I can't 863 00:44:04,840 --> 00:44:06,719 Speaker 1: bet on more than a hundred years, if that, on 864 00:44:06,760 --> 00:44:09,120 Speaker 1: this world. And there's more linguistics than I could ever 865 00:44:09,320 --> 00:44:12,480 Speaker 1: hope to learn in three hundred years, like you could 866 00:44:12,560 --> 00:44:16,640 Speaker 1: learn something beyond what any human can learn today. And 867 00:44:16,719 --> 00:44:19,600 Speaker 1: I think that's the appealing part of striving towards 868 00:44:19,600 --> 00:44:23,000 Speaker 1: immortality for me, is to exceed these limits on what 869 00:44:23,080 --> 00:44:25,759 Speaker 1: we can do as one person. But then I see 870 00:44:25,760 --> 00:44:30,319 Speaker 1: the downsides of this ludicrous, you know, inequality. Well, that's 871 00:44:30,360 --> 00:44:32,800 Speaker 1: kind of an interesting angle of this idea, like maybe 872 00:44:32,880 --> 00:44:34,920 Speaker 1: what if we run out of space in your brain? 873 00:44:35,040 --> 00:44:37,600 Speaker 1: Like how many memories can your brain hold? Yeah, I 874 00:44:37,600 --> 00:44:39,759 Speaker 1: think that's the common thing we forget when we 875 00:44:39,920 --> 00:44:43,040 Speaker 1: fantasize about what immortality is, that the brain is not infinite. 876 00:44:43,200 --> 00:44:47,239 Speaker 1: And I'm sure if I lived five hundred years, I 877 00:44:47,239 --> 00:44:50,439 Speaker 1: would remember the last twenty pretty well. I wouldn't 878 00:44:50,440 --> 00:44:53,280 Speaker 1: remember the first thirty with any amount of detail. 879 00:44:53,480 --> 00:44:54,920 Speaker 1: This is gonna be a sidebar to that. But I 880 00:44:54,960 --> 00:44:57,600 Speaker 1: was really fascinated by the idea of childhood amnesia, where 881 00:44:57,640 --> 00:44:59,759 Speaker 1: we don't remember our first couple of years of life. 882 00:45:00,640 --> 00:45:02,640 Speaker 1: And I was wondering if this was something that was 883 00:45:02,719 --> 00:45:05,200 Speaker 1: unique to humans, or true of other animals too. And I talked to 884 00:45:05,440 --> 00:45:09,040 Speaker 1: a neuroscientist friend of mine, and I was like, here's 885 00:45:09,080 --> 00:45:11,560 Speaker 1: my theory. Tell me where I'm wrong.
Humans are born 886 00:45:11,600 --> 00:45:14,000 Speaker 1: effectively premature, where we need a lot of care for 887 00:45:14,320 --> 00:45:16,920 Speaker 1: several years of our lives. But horses are born and 888 00:45:16,960 --> 00:45:19,600 Speaker 1: they're running around within the hour, like they're born 889 00:45:19,640 --> 00:45:23,200 Speaker 1: and they're ready to go. So can a horse remember 890 00:45:23,239 --> 00:45:27,400 Speaker 1: being born? Can an adult horse remember being born? And he, 891 00:45:27,680 --> 00:45:30,359 Speaker 1: to his credit, took my question seriously and was like, well, 892 00:45:30,880 --> 00:45:34,359 Speaker 1: what seems to happen, putting it in computer science terms for you, 893 00:45:34,640 --> 00:45:37,000 Speaker 1: is that when we're young, we address our memories in 894 00:45:37,000 --> 00:45:38,759 Speaker 1: a certain way, and as we get older, we change 895 00:45:38,800 --> 00:45:41,680 Speaker 1: the addressing scheme, so we lose access to those memories. 896 00:45:41,680 --> 00:45:45,320 Speaker 1: So I don't think an adult horse remembers being born, 897 00:45:45,200 --> 00:45:47,600 Speaker 1: and thank you for the question. And also, would you 898 00:45:47,600 --> 00:45:49,920 Speaker 1: want to remember being born? It doesn't sound like a 899 00:45:49,960 --> 00:45:52,600 Speaker 1: very pleasant experience. That sounds painful. But I 900 00:45:52,640 --> 00:45:55,360 Speaker 1: think all this is to say that our brains are finite, 901 00:45:55,480 --> 00:45:58,359 Speaker 1: and when we fantasize about living forever, we forget that 902 00:45:58,840 --> 00:46:02,040 Speaker 1: we can't remember everything. And if you live forever, 903 00:46:02,040 --> 00:46:05,160 Speaker 1: you might just remember effectively one human lifetime on a 904 00:46:05,200 --> 00:46:07,560 Speaker 1: sliding scale through time. But you talk in your book 905 00:46:07,560 --> 00:46:10,799 Speaker 1: also about the possibility of uploading your mind into a simulation, 906 00:46:11,120 --> 00:46:12,680 Speaker 1: and it makes me wonder if there's sort of a 907 00:46:12,760 --> 00:46:15,759 Speaker 1: hybrid there. You know, I already have some extension of 908 00:46:15,800 --> 00:46:18,040 Speaker 1: my brain on this device I carry around, because I 909 00:46:18,040 --> 00:46:21,400 Speaker 1: don't remember any telephone numbers or email addresses or stuff 910 00:46:21,440 --> 00:46:24,800 Speaker 1: like that. Isn't there a possibility where we store these memories 911 00:46:24,840 --> 00:46:26,719 Speaker 1: somewhere on the cloud, and then when you want to 912 00:46:26,719 --> 00:46:29,319 Speaker 1: remember your thirteenth birthday, you just have to, like, you know, 913 00:46:29,400 --> 00:46:31,480 Speaker 1: go and fetch it and download it back into your brain, 914 00:46:31,520 --> 00:46:33,359 Speaker 1: and then you can relive the trauma. I mean, you're 915 00:46:33,360 --> 00:46:40,480 Speaker 1: describing a video clip, I think. Exactly. So we are 916 00:46:40,520 --> 00:46:45,640 Speaker 1: living in the future right now. You just invented YouTube. 917 00:46:47,040 --> 00:46:49,479 Speaker 1: That's right, I want my cut, guys, where's my cut 918 00:46:49,480 --> 00:46:52,520 Speaker 1: of YouTube? I just invented it. We can externalize things 919 00:46:52,560 --> 00:46:56,520 Speaker 1: that happened with some sort of motion picture. Deep thoughts 920 00:46:56,520 --> 00:46:59,040 Speaker 1: by Daniel Whiteson.
All right, I got lots more questions 921 00:46:59,040 --> 00:47:14,480 Speaker 1: for Ryan, but first let's take another break. Okay, we're 922 00:47:14,520 --> 00:47:16,680 Speaker 1: back and we're talking to the author of the book 923 00:47:16,880 --> 00:47:19,359 Speaker 1: How to Take Over the World, Ryan North, who has 924 00:47:19,440 --> 00:47:22,560 Speaker 1: not yet successfully taken over the world, but instead decided 925 00:47:22,600 --> 00:47:24,680 Speaker 1: to write a book about it. How do you know 926 00:47:25,600 --> 00:47:28,560 Speaker 1: he's not the one pulling all the strings behind the curtains? 927 00:47:28,680 --> 00:47:30,640 Speaker 1: Let's just say you'll all be surprised in a couple 928 00:47:30,640 --> 00:47:33,240 Speaker 1: of months. I guess you would be a lot busier 929 00:47:33,280 --> 00:47:35,040 Speaker 1: and have more important things to do than to be 930 00:47:35,120 --> 00:47:38,080 Speaker 1: on our podcast. Well, that's my question, joking 931 00:47:38,160 --> 00:47:41,280 Speaker 1: aside. Having thought about all of these schemes about 932 00:47:41,280 --> 00:47:43,360 Speaker 1: how to potentially take over the world, what do you 933 00:47:43,360 --> 00:47:45,839 Speaker 1: think the prospects are that somebody in the future might 934 00:47:45,920 --> 00:47:48,440 Speaker 1: actually do it? You know, it's been the goal of 935 00:47:48,600 --> 00:47:53,239 Speaker 1: real life villains, Alexander the Great and Napoleon and Hitler, for 936 00:47:53,280 --> 00:47:55,680 Speaker 1: a long time, but nobody's really pulled it off, actually 937 00:47:55,719 --> 00:47:57,879 Speaker 1: taken over the entire world. Do you think it's something 938 00:47:57,920 --> 00:48:00,000 Speaker 1: that somebody in the future might be able to accomplish? 939 00:48:00,160 --> 00:48:04,000 Speaker 1: I don't. The reason for that is that, in one sense, 940 00:48:04,040 --> 00:48:07,160 Speaker 1: it's kind of self defeating, because once you've taken 941 00:48:07,200 --> 00:48:08,840 Speaker 1: over the world, the only thing left to do is 942 00:48:08,880 --> 00:48:11,960 Speaker 1: to lose the world, right? Like, there's no human 943 00:48:11,960 --> 00:48:16,880 Speaker 1: political system that lasts indefinitely. And the other reason is 944 00:48:16,920 --> 00:48:20,120 Speaker 1: that it's an appealing goal to a certain 945 00:48:20,120 --> 00:48:24,040 Speaker 1: type of person, like what if I controlled everything? But 946 00:48:24,080 --> 00:48:28,680 Speaker 1: the reality is that the benefits are very limited. Right? 947 00:48:28,760 --> 00:48:31,239 Speaker 1: Like you get to boss people around, I guess, but 948 00:48:31,280 --> 00:48:33,720 Speaker 1: there's no way you can effectively boss around every person 949 00:48:33,800 --> 00:48:37,319 Speaker 1: on earth. There's always resistance, there's always people who want 950 00:48:37,360 --> 00:48:39,680 Speaker 1: to do something else than what you want them to do. 951 00:48:40,320 --> 00:48:43,480 Speaker 1: So if anyone came up to me and sincerely said, Ryan, 952 00:48:43,600 --> 00:48:45,920 Speaker 1: I want to take over the world, I think my 953 00:48:46,000 --> 00:48:49,719 Speaker 1: first instinct would be to not take them seriously, to laugh, 954 00:48:49,760 --> 00:48:51,319 Speaker 1: because I don't think they would have thought about it 955 00:48:51,440 --> 00:48:53,560 Speaker 1: in enough depth to be a credible threat.
You're saying 956 00:48:53,560 --> 00:48:55,680 Speaker 1: it's not a job anyone would want. Yeah, I don't 957 00:48:55,719 --> 00:48:57,120 Speaker 1: think it's a job you want. I think if you 958 00:48:57,160 --> 00:48:58,560 Speaker 1: think about it for a minute, you'd be like, you 959 00:48:58,560 --> 00:49:01,160 Speaker 1: know what, that sounds like a lot of work for 960 00:49:01,239 --> 00:49:04,800 Speaker 1: very little benefit, a lot of destruction for very little benefit. 961 00:49:05,480 --> 00:49:08,279 Speaker 1: Maybe I won't. That's how I feel about becoming department chair. Like, 962 00:49:08,320 --> 00:49:12,279 Speaker 1: why does anybody want that job? Much, much smaller scale. Yeah, 963 00:49:12,360 --> 00:49:14,839 Speaker 1: it looks good on a resume, but surely you want 964 00:49:14,840 --> 00:49:17,120 Speaker 1: to do something else with your time. Well, Ryan, what 965 00:49:17,160 --> 00:49:19,160 Speaker 1: were some of the ideas that did not make it 966 00:49:19,160 --> 00:49:21,480 Speaker 1: into the book about how to take over the world? 967 00:49:21,600 --> 00:49:25,560 Speaker 1: There was one that I really liked. Um, so the 968 00:49:25,600 --> 00:49:29,240 Speaker 1: idea was, let's have a chapter on throwing your enemies 969 00:49:29,280 --> 00:49:32,640 Speaker 1: into the Sun, and it's a great idea. The issue 970 00:49:32,680 --> 00:49:36,799 Speaker 1: is that it's not actually that hard, surprisingly. So 971 00:49:36,840 --> 00:49:38,759 Speaker 1: the trick with it is that if you want to throw 972 00:49:38,800 --> 00:49:40,720 Speaker 1: your enemy into the Sun, you need a rocket, obviously, 973 00:49:41,280 --> 00:49:43,239 Speaker 1: but if you just launch them away from Earth, the 974 00:49:43,480 --> 00:49:45,440 Speaker 1: challenge is that the Earth is moving around the Sun very 975 00:49:45,480 --> 00:49:47,400 Speaker 1: quickly and you've got all that momentum, and so if 976 00:49:47,440 --> 00:49:49,680 Speaker 1: you launch someone off the Earth, they're going to also 977 00:49:49,719 --> 00:49:51,720 Speaker 1: go into orbit around the Sun. If you want them to actually 978 00:49:51,719 --> 00:49:54,239 Speaker 1: be thrown into the Sun, you need to slow them 979 00:49:54,239 --> 00:49:56,879 Speaker 1: down an awful lot to get close to the Sun 980 00:49:56,880 --> 00:49:58,960 Speaker 1: and not just orbit around it. And so you can 981 00:49:59,000 --> 00:50:03,279 Speaker 1: look at NASA's solar probe, the Parker Solar Probe, that 982 00:50:03,520 --> 00:50:05,359 Speaker 1: was in the news recently for doing that. It did 983 00:50:05,480 --> 00:50:07,800 Speaker 1: orbital flybys off of Venus to slow it down, 984 00:50:08,320 --> 00:50:10,480 Speaker 1: and eventually it's close enough now that it can enter 985 00:50:10,600 --> 00:50:12,879 Speaker 1: the outer layer of the Sun. So you could 986 00:50:12,960 --> 00:50:16,000 Speaker 1: use that sort of process, do some orbital flybys 987 00:50:16,000 --> 00:50:18,520 Speaker 1: to slow down this, I guess, corpse of your enemy 988 00:50:18,960 --> 00:50:22,480 Speaker 1: you've thrown into space, enough to actually impact and be 989 00:50:22,560 --> 00:50:24,759 Speaker 1: burned up in the Sun. But the challenge there as 990 00:50:24,800 --> 00:50:27,480 Speaker 1: an author is that this is not super complicated, surprisingly. 991 00:50:27,520 --> 00:50:29,880 Speaker 1: The costs are well known, the orbital 992 00:50:29,880 --> 00:50:32,400 Speaker 1: mechanics are well known.
It's just a matter of 993 00:50:32,440 --> 00:50:35,399 Speaker 1: someone being willing to spend the millions of dollars it takes 994 00:50:35,440 --> 00:50:38,439 Speaker 1: to do this just to fire someone who's already dead, 995 00:50:38,560 --> 00:50:41,440 Speaker 1: because it's not a survivable trip into the sun. So 996 00:50:41,480 --> 00:50:44,280 Speaker 1: it sort of reaches this level of impractical but still 997 00:50:44,360 --> 00:50:47,160 Speaker 1: really really appealing. So it's not in the book. But 998 00:50:47,239 --> 00:50:51,480 Speaker 1: if anyone wants to spend a couple million on launching 999 00:50:51,480 --> 00:50:54,040 Speaker 1: their enemies' corpses into the sun, give me a call. 1000 00:50:54,120 --> 00:51:00,799 Speaker 1: I can send you over the spreadsheets. It's in the appendix. Well, 1001 00:51:00,880 --> 00:51:02,520 Speaker 1: it sounds like something you might want to do just 1002 00:51:02,560 --> 00:51:04,359 Speaker 1: to send a message, you know, like if you are 1003 00:51:04,480 --> 00:51:06,800 Speaker 1: ruling the world, you want people to take you seriously 1004 00:51:06,880 --> 00:51:09,120 Speaker 1: and not, you know, rebel against you. Yeah, but the other 1005 00:51:09,160 --> 00:51:11,239 Speaker 1: downside is it takes a long time, because you're having to 1006 00:51:11,280 --> 00:51:13,239 Speaker 1: fly out. The initial plan for Parker was to fly 1007 00:51:13,239 --> 00:51:15,719 Speaker 1: out to Saturn, and the modified one was to use 1008 00:51:15,719 --> 00:51:18,000 Speaker 1: a couple of flybys of Venus. But this still takes 1009 00:51:18,080 --> 00:51:20,600 Speaker 1: years and years and years, and it's not super scary 1010 00:51:20,680 --> 00:51:22,560 Speaker 1: to send a message to be like, if you cross me, 1011 00:51:23,400 --> 00:51:27,359 Speaker 1: well, you just wait five years, because when five years 1012 00:51:27,360 --> 00:51:31,520 Speaker 1: are over, your dead body will be cremated in the sun. 1013 00:51:32,239 --> 00:51:34,200 Speaker 1: Maybe. Depends on what you do to them on 1014 00:51:34,280 --> 00:51:37,160 Speaker 1: the way there. Like, maybe, I don't know, 1015 00:51:37,480 --> 00:51:39,600 Speaker 1: you put on some bad television on the 1016 00:51:39,719 --> 00:51:41,480 Speaker 1: rocket ship to the Sun. So you're assuming we're keeping 1017 00:51:41,520 --> 00:51:43,239 Speaker 1: them alive, and it's much more expensive to keep them 1018 00:51:43,239 --> 00:51:45,680 Speaker 1: alive for that journey. I was assuming we just, you know, 1019 00:51:45,880 --> 00:51:47,520 Speaker 1: kill them on Earth, or stick them in 1020 00:51:47,560 --> 00:51:50,000 Speaker 1: the shuttle alive and then launch them into 1021 00:51:50,000 --> 00:51:51,759 Speaker 1: space and they'll die when they get into the cold 1022 00:51:51,800 --> 00:51:55,480 Speaker 1: vacuum of space. But keeping them alive for five years, yeah, 1023 00:51:55,520 --> 00:51:57,640 Speaker 1: then it's worse. But then I feel like you're in 1024 00:51:57,760 --> 00:52:02,960 Speaker 1: like Geneva Convention torture violation territory. I like that you 1025 00:52:02,960 --> 00:52:05,520 Speaker 1: think so carefully about the budget for each of these schemes.
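For anyone curious why "slow them down an awful lot" is the expensive part, here is a rough two-body sketch using the standard vis-viva relation; the constants are textbook values, and the result ignores Earth's own gravity and any flybys:

```python
# Rough vis-viva sketch of why sun-diving is expensive. To drop from Earth's
# orbit onto a path that grazes the Sun you must cancel most of Earth's ~30 km/s
# of orbital speed, while escaping the solar system entirely needs much less
# delta-v, which is why Venus flybys (as Parker used) are so helpful.
# Two-body approximation only; Earth's own gravity and flybys are ignored.
import math

GM_SUN = 1.327e20   # m^3/s^2, solar gravitational parameter
AU = 1.496e11       # m, radius of Earth's (nearly circular) orbit
R_SUN = 6.96e8      # m, solar radius

def vis_viva(r: float, a: float) -> float:
    """Speed at distance r on a solar orbit with semi-major axis a."""
    return math.sqrt(GM_SUN * (2.0 / r - 1.0 / a))

v_earth = vis_viva(AU, AU)            # circular orbital speed at 1 AU
a_dive = (AU + R_SUN) / 2.0           # ellipse with aphelion at 1 AU, perihelion at the Sun
v_dive = vis_viva(AU, a_dive)         # aphelion speed of that sun-grazing ellipse
v_escape = math.sqrt(2.0) * v_earth   # solar escape speed at 1 AU

print(f"Earth's orbital speed:           {v_earth / 1e3:5.1f} km/s")
print(f"Slow-down needed to hit the Sun: {(v_earth - v_dive) / 1e3:5.1f} km/s")
print(f"Speed-up needed to escape:       {(v_escape - v_earth) / 1e3:5.1f} km/s")
```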
1026 00:52:05,560 --> 00:52:07,360 Speaker 1: You know, that's important because it's like 1027 00:52:07,360 --> 00:52:09,520 Speaker 1: a shopping list, you know, think about how much money 1028 00:52:09,560 --> 00:52:11,400 Speaker 1: you have before you want to take over the world. 1029 00:52:11,800 --> 00:52:13,920 Speaker 1: I feel like the fun of it, the fun of 1030 00:52:13,960 --> 00:52:16,440 Speaker 1: going through these thought experiments, is the logistics, right? Like, 1031 00:52:16,520 --> 00:52:19,640 Speaker 1: with How to Invent Everything, the premise was you've got 1032 00:52:19,640 --> 00:52:21,960 Speaker 1: a time machine that's broken, which is clearly fictional, but 1033 00:52:22,080 --> 00:52:24,719 Speaker 1: let's use that to explore the science. But I also 1034 00:52:24,719 --> 00:52:27,120 Speaker 1: wanted it to be, like, legitimately what it says on the cover. 1035 00:52:27,160 --> 00:52:29,160 Speaker 1: I wanted it to be a real book that could actually 1036 00:52:29,200 --> 00:52:31,200 Speaker 1: work if you were trapped in the past. And so 1037 00:52:31,239 --> 00:52:33,040 Speaker 1: with this book, I wanted to be like, let's have 1038 00:52:33,120 --> 00:52:37,040 Speaker 1: legitimate schemes here. Yes, they're going to cost billions and 1039 00:52:37,080 --> 00:52:40,319 Speaker 1: billions of dollars, but, well, most people will use them to 1040 00:52:40,400 --> 00:52:42,759 Speaker 1: learn about science and technology and history and the world 1041 00:52:42,800 --> 00:52:46,440 Speaker 1: around us. Let's make it so that these are viable, 1042 00:52:46,640 --> 00:52:49,400 Speaker 1: credible schemes. Like, let's have the fun of actually 1043 00:52:49,440 --> 00:52:51,960 Speaker 1: pricing things out and thinking about logistics, and if we 1044 00:52:52,000 --> 00:52:53,680 Speaker 1: do want to do this, what does it cost, what 1045 00:52:53,719 --> 00:52:55,799 Speaker 1: do we need? How is this gonna blow up in 1046 00:52:55,800 --> 00:52:58,239 Speaker 1: our faces? That sort of thing. So I think 1047 00:52:58,239 --> 00:53:00,560 Speaker 1: it's fun. Maybe you should write a book, How to 1048 00:53:00,600 --> 00:53:04,600 Speaker 1: Be a Supervillain's Accountant. It's a less catchy title. Maybe 1049 00:53:04,600 --> 00:53:06,880 Speaker 1: we can use it for the paperback. That's like selling 1050 00:53:06,920 --> 00:53:09,359 Speaker 1: pickaxes to the gold miners. Well, in terms of 1051 00:53:09,360 --> 00:53:12,319 Speaker 1: credible schemes, I was expecting to see something about, like, 1052 00:53:12,480 --> 00:53:15,600 Speaker 1: leading an AI revolution, because I feel like AI is 1053 00:53:15,600 --> 00:53:18,200 Speaker 1: gonna take over the world anyway. Is there some way 1054 00:53:18,280 --> 00:53:20,960 Speaker 1: you can think of to, like, lead that charge, you know, 1055 00:53:21,280 --> 00:53:24,319 Speaker 1: be the first conspirator, the first collaborator to join the 1056 00:53:24,360 --> 00:53:26,799 Speaker 1: other side and, uh, use the power of AI 1057 00:53:26,880 --> 00:53:29,000 Speaker 1: to take over the world. Why did you reject that 1058 00:53:29,120 --> 00:53:31,840 Speaker 1: kind of idea? I felt like it's hard to 1059 00:53:31,880 --> 00:53:35,280 Speaker 1: get into specifics with something like AI, because the AIs 1060 00:53:35,320 --> 00:53:39,160 Speaker 1: we have now are nowhere close to that at all.
1061 00:53:39,600 --> 00:53:41,560 Speaker 1: And I sort of touched on that in the mind 1062 00:53:41,640 --> 00:53:45,359 Speaker 1: uploading section, where the issue you encounter if you're gonna 1063 00:53:45,400 --> 00:53:48,880 Speaker 1: upload your mind to a computer is, well, it basically 1064 00:53:48,880 --> 00:53:51,480 Speaker 1: boils down to who cares, right? Like, if I've uploaded 1065 00:53:51,480 --> 00:53:53,759 Speaker 1: my brain to a computer, I can't prove that it's 1066 00:53:53,800 --> 00:53:56,280 Speaker 1: me there. There's no way to prove that I'm conscious. 1067 00:53:56,800 --> 00:54:01,200 Speaker 1: And we've all had, you know, computer games that we've 1068 00:54:01,239 --> 00:54:04,040 Speaker 1: loved and played for hours and hours and then forgotten 1069 00:54:04,040 --> 00:54:06,480 Speaker 1: about, and then later deleted, or even just 1070 00:54:06,640 --> 00:54:08,160 Speaker 1: left them on the hard drive and thrown out 1071 00:54:08,160 --> 00:54:11,400 Speaker 1: the computer. And I think the same sort of thing 1072 00:54:11,400 --> 00:54:13,880 Speaker 1: would happen if you had a computer that had Ryan 1073 00:54:14,000 --> 00:54:16,160 Speaker 1: dot e x e on it. You'd probably have fun 1074 00:54:16,200 --> 00:54:18,200 Speaker 1: with it for a while, and then you might mess 1075 00:54:18,239 --> 00:54:20,080 Speaker 1: with me for a while or try to get me 1076 00:54:20,120 --> 00:54:22,560 Speaker 1: mad or try to provoke a reaction. I think that's 1077 00:54:22,560 --> 00:54:25,120 Speaker 1: called torture. And then eventually you get bored with Ryan 1078 00:54:25,160 --> 00:54:29,040 Speaker 1: dot e x e and delete it or whatever. But you're 1079 00:54:29,040 --> 00:54:31,640 Speaker 1: not going to keep it running for four hundred years. Like, 1080 00:54:31,719 --> 00:54:35,960 Speaker 1: I don't think that is something that is likely to happen, 1081 00:54:36,600 --> 00:54:38,120 Speaker 1: or if it does, I'd have to be giving you 1082 00:54:38,160 --> 00:54:40,640 Speaker 1: some sort of benefit, some sort of profit, some 1083 00:54:40,680 --> 00:54:42,160 Speaker 1: reason for you to keep me running when you could 1084 00:54:42,160 --> 00:54:44,759 Speaker 1: be running, you know, Doom dot e x e or some 1085 00:54:44,840 --> 00:54:47,000 Speaker 1: other game. Maybe you can pay the church to keep 1086 00:54:47,000 --> 00:54:49,400 Speaker 1: a computer running in the back, you know, in the 1087 00:54:49,440 --> 00:54:52,320 Speaker 1: back of the room. Yeah. So for an AI revolution, 1088 00:54:52,440 --> 00:54:54,879 Speaker 1: sort of the same thing, where I think if 1089 00:54:54,920 --> 00:54:58,080 Speaker 1: such a thing happened, I doubt they'd have much need 1090 00:54:58,239 --> 00:55:00,279 Speaker 1: for a human at that point. If they're in 1091 00:55:00,400 --> 00:55:01,840 Speaker 1: a place where they can take over the world, do 1092 00:55:01,920 --> 00:55:03,759 Speaker 1: they need a human collaborator to, like, let them in 1093 00:55:03,800 --> 00:55:06,600 Speaker 1: the front door, or can they just do what they want? 1094 00:55:06,960 --> 00:55:09,520 Speaker 1: That's the next book, How to Take Over the AI World.
1095 00:55:10,400 --> 00:55:14,120 Speaker 1: Speaking about keeping computer programs running, I always remember my 1096 00:55:14,200 --> 00:55:17,560 Speaker 1: dad, who did his thesis on a computer that required 1097 00:55:17,560 --> 00:55:21,000 Speaker 1: punch cards, and he kept his thesis around as a 1098 00:55:21,040 --> 00:55:23,399 Speaker 1: hard copy, and he liked saying that it was the 1099 00:55:23,480 --> 00:55:26,120 Speaker 1: last kind of hard copy that you could actually run, 1100 00:55:26,400 --> 00:55:28,879 Speaker 1: where, you know, printing out the program was actually also 1101 00:55:29,120 --> 00:55:32,120 Speaker 1: the program you could insert back into the computer and 1102 00:55:32,160 --> 00:55:34,640 Speaker 1: make it run again. That's something we've lost with more 1103 00:55:34,680 --> 00:55:37,760 Speaker 1: modern technology, and that's fascinating. I saw an art project 1104 00:55:37,800 --> 00:55:41,600 Speaker 1: the other day where someone had recorded every operation a 1105 00:55:41,680 --> 00:55:47,080 Speaker 1: Nintendo Entertainment System performed when playing Mario 3 for three seconds, 1106 00:55:47,760 --> 00:55:49,239 Speaker 1: and then printed those out in a book. And there 1107 00:55:49,239 --> 00:55:53,279 Speaker 1: are these three giant bound books of just assembly code 1108 00:55:53,320 --> 00:55:56,279 Speaker 1: instructions that tell you everything that happens in the first 1109 00:55:56,280 --> 00:55:59,480 Speaker 1: three seconds of Mario Brothers. But it's such a different 1110 00:55:59,520 --> 00:56:02,120 Speaker 1: way of looking at the program. Right. Well, Ryan, I 1111 00:56:02,160 --> 00:56:05,600 Speaker 1: have a small request from my son, who apparently was 1112 00:56:05,640 --> 00:56:09,520 Speaker 1: really taken by the geodesic spheres that you describe in your book. 1113 00:56:10,040 --> 00:56:12,440 Speaker 1: So his request is that you put more geodesic spheres 1114 00:56:12,440 --> 00:56:17,879 Speaker 1: in your next book. I love a request that is straightforward. 1115 00:56:18,440 --> 00:56:20,319 Speaker 1: I am happy to do it as much as is 1116 00:56:20,320 --> 00:56:22,640 Speaker 1: possible. I will try to get some geodesic spheres 1117 00:56:23,040 --> 00:56:26,440 Speaker 1: in the next book. Well, you know, for some reason, 1118 00:56:26,480 --> 00:56:28,840 Speaker 1: he was really taken by your description of, like, making 1119 00:56:28,880 --> 00:56:32,880 Speaker 1: a ginormous geodesic sphere, and how, like, even if 1120 00:56:32,920 --> 00:56:35,239 Speaker 1: it's just a little bit warmer than the air around it, 1121 00:56:35,280 --> 00:56:38,040 Speaker 1: it would float, and you could create like a cloud city. Um, 1122 00:56:38,080 --> 00:56:39,960 Speaker 1: so I guess he wants more of that. Yeah, 1123 00:56:39,960 --> 00:56:43,160 Speaker 1: that idea comes from Buckminster Fuller, who was given the 1124 00:56:43,239 --> 00:56:46,600 Speaker 1: US Presidential Medal of Freedom, and he described this briefly 1125 00:56:46,680 --> 00:56:48,680 Speaker 1: in one of his books, Critical Path, where he was 1126 00:56:48,680 --> 00:56:52,240 Speaker 1: basically making the point that, by the square cube law, the surface 1127 00:56:52,320 --> 00:56:55,479 Speaker 1: area of something and the volume of something grow 1128 00:56:55,560 --> 00:56:59,040 Speaker 1: very differently.
And so if you had a giant geodesic sphere, 1129 00:56:59,280 --> 00:57:04,120 Speaker 1: the mass of the shell holding it in place gets 1130 00:57:04,160 --> 00:57:06,359 Speaker 1: negligible as the sphere gets larger and larger. And then 1131 00:57:06,400 --> 00:57:08,360 Speaker 1: you could just heat up the sphere a little bit 1132 00:57:08,400 --> 00:57:11,080 Speaker 1: above the surrounding temperature and it'll float. And like, the 1133 00:57:11,160 --> 00:57:13,560 Speaker 1: numbers check out, this makes sense. The challenge is that 1134 00:57:13,640 --> 00:57:16,760 Speaker 1: to build a, you know, one point six kilometer diameter 1135 00:57:16,840 --> 00:57:19,760 Speaker 1: geodesic sphere to float as a secret base, you need 1136 00:57:19,840 --> 00:57:24,920 Speaker 1: some extremely strong materials, and we don't necessarily have those yet. 1137 00:57:25,120 --> 00:57:28,280 Speaker 1: But I don't see any reason why someone couldn't, if 1138 00:57:28,320 --> 00:57:30,480 Speaker 1: you want to put the money into it. Also, the 1139 00:57:30,520 --> 00:57:32,600 Speaker 1: challenge is that it would be larger than the Burj Khalifa, 1140 00:57:32,640 --> 00:57:35,440 Speaker 1: which is the tallest man made structure in the world, 1141 00:57:36,040 --> 00:57:39,360 Speaker 1: almost twice the size of it, which clearly has some engineering challenges. 1142 00:57:39,400 --> 00:57:42,320 Speaker 1: But I mean, solve them. These are just challenges, right? Yeah. 1143 00:57:42,360 --> 00:57:44,440 Speaker 1: And I think Jorge's son has a father who's a 1144 00:57:44,440 --> 00:57:47,240 Speaker 1: mechanical engineer, so maybe he knows somebody you can ask. Yeah, well, 1145 00:57:47,280 --> 00:57:52,320 Speaker 1: I can draw it pretty well. It sounds like you've 1146 00:57:52,360 --> 00:57:54,800 Speaker 1: got your plot all figured out, Jorge. It sounds like 1147 00:57:54,840 --> 00:57:56,680 Speaker 1: I might have a little supervillain here in my house. 1148 00:57:56,760 --> 00:57:59,080 Speaker 1: It sounds like he wants to move out. Hypothetically speaking. 1149 00:58:00,000 --> 00:58:02,000 Speaker 1: Or are we speaking hypothetically? All right, well, I want to say thank 1150 00:58:02,000 --> 00:58:04,720 Speaker 1: you very much to Ryan for speaking with us literally 1151 00:58:04,800 --> 00:58:07,840 Speaker 1: and hypothetically about how to take over the world, and 1152 00:58:08,120 --> 00:58:10,360 Speaker 1: encourage our listeners out there to go ahead and check 1153 00:58:10,360 --> 00:58:12,680 Speaker 1: out the book How to Take Over the World. Ryan, 1154 00:58:12,720 --> 00:58:14,840 Speaker 1: where can folks find your book? They can find it 1155 00:58:14,920 --> 00:58:18,920 Speaker 1: at supervillainbook dot com. And they can find 1156 00:58:18,960 --> 00:58:21,880 Speaker 1: me at Ryan North dot c a or on Twitter 1157 00:58:22,000 --> 00:58:25,240 Speaker 1: at Ryan Q North. And if we have questions 1158 00:58:25,240 --> 00:58:27,480 Speaker 1: for your dog Chomsky, where should we send those? All 1159 00:58:27,560 --> 00:58:31,000 Speaker 1: questions can be sent to Noam Chompsky, Chomsky with a 1160 00:58:31,080 --> 00:58:37,320 Speaker 1: P because he's a dog, care of Ryan North. All right, thanks 1161 00:58:37,400 --> 00:58:48,400 Speaker 1: very much, Ryan. Oh, it's my sincere pleasure.
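As a closing aside, the square-cube argument behind that floating geodesic sphere can be checked with a few lines of arithmetic; the 1.6-kilometer diameter and the modest warming below are illustrative assumptions, not figures from Fuller or from the book.

```python
# Back-of-the-envelope check of the floating geodesic sphere idea. Buoyant lift
# scales with the enclosed volume (r^3) while the shell's mass scales with the
# surface area (r^2), so a large enough, slightly warmed sphere can carry its
# own structure. The diameter and warming below are assumptions for illustration.
import math

def buoyant_lift_kg(radius_m: float, delta_t_k: float,
                    air_density: float = 1.2, ambient_t_k: float = 288.0) -> float:
    """Mass the warmed sphere can lift: volume times the air density difference."""
    volume = (4.0 / 3.0) * math.pi * radius_m ** 3
    warm_density = air_density * ambient_t_k / (ambient_t_k + delta_t_k)
    return volume * (air_density - warm_density)

radius = 800.0   # m, i.e. a 1.6 km diameter sphere
warming = 10.0   # kelvin above the surrounding air

lift = buoyant_lift_kg(radius, warming)
area = 4.0 * math.pi * radius ** 2

print(f"Lift from {warming:.0f} K of warming: about {lift / 1e6:.0f} thousand tonnes")
print(f"Budget for shell plus cargo: roughly {lift / area:.0f} kg per square metre of surface")
```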
Thanks for listening, 1162 00:58:48,480 --> 00:58:51,200 Speaker 1: and remember that Daniel and Jorge Explain the Universe is 1163 00:58:51,240 --> 00:58:54,640 Speaker 1: a production of I Heart Radio. For more podcasts 1164 00:58:54,720 --> 00:58:58,480 Speaker 1: from I Heart Radio, visit the I Heart Radio app, Apple Podcasts, 1165 00:58:58,600 --> 00:59:00,960 Speaker 1: or wherever you listen to your favorite shows.