Hey, everybody, Josh here. Um, we wanted to include a note before this episode, which is about existential risks, um, threats that are big enough to actually wipe humanity out of existence. Well, we recorded this episode just before the pandemic, which explains the weird lack of mention of COVID when we're talking about viruses, and when this pandemic came along, we thought perhaps a wait-and-see approach might be best before just willy-nilly releasing an episode about the end of the world. So we decided to release this now, still in the thick of things, not just because the world hasn't ended, but because one of the few good things that's come out of this terrible time is the way that we've all kind of come together and given a lot of thought about how we can look out for each other. And that's exactly what thinking about existential risks is all about. So we thought there would be no better time than right now to talk about them. We hope this explains things, uh, and that you realize we're not releasing this glibly in any way. Instead, we hope that it makes you reflective about what it means to be human and why humanity is worth fighting for.

Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks.

Hey, and welcome to the podcast. I'm Josh Clark, and there's Charles W. "Chuck" Bryant over there, and there's guest producer Dave C. sitting in yet again, at least the second time. I believe he's already picked up that he knows not to speak. He's nodding, the custom established by Jerry. Um, but yeah, he did nod, didn't he? So yeah, I guess it is twice that Dave's been sitting in. What if he just heard "two times" from the other side of the room, and you're like... didn't have the heart to tell him not to do that? Right? I think he would, um, he would catch the drift from, like, the record scratching that just, like, materialized out of nowhere.
Many people know that we have someone on permanent standby by a record player just in case we do something like that, and that person is Tommy Chong. Hi, Tommy. Do I smell bong water, man? Bet he reeks of it. Yeah? Probably. So, I mean, hats off to him for sticking to his bit, you know. Cheech was like, hey, hey, I want a good long spot on Nash Bridges, so I'll say whatever you want me to. Like, I'm just into gummies now. Tommy Chong, like, tripled down. Yeah, and at least he sold the bongs, didn't he? And the, uh, pee test beaters. Pee test beaters? Okay, I suddenly couldn't think of how to say something like that, a way to, um, defeat a urine test. Oh, well, listen to you, fancy. I would say, I don't know, I know that the street guys call it pee test beaters, but Pee Test Beaters is a band name about as good as, say, like, Diarrhea Planet. Actually, I think Diarrhea Planet's got it beat. But still.

All right, so, Chuck, um, we're talking today about a topic that is near and dear to my heart: existential risks. That's right. Which, I don't know if you've gathered that or not, but I really, really am into this topic all around. Um, as a matter of fact, I did a ten-part series on it called The End of the World with Josh Clark, available everywhere you get podcasts right now. Um, I managed to smash that down. That's kind of what this is, it's a condensed version. And forever, like, I wanted to just SYSK-ify the topic of existential risks, do it with you. I wanted to do it with you. This was going to be a live show at one point. It was. Um, I think even before that, I was like, hey, you want to do an episode on this? You're like, this is pretty dark stuff. We're doing it now.
Now, the only time I said that was when you actually sent me the document for the live show and I went, I don't know about a live version of this. So I guess... I guess that must have been before The End of the World, then, huh? This was like eight years ago. Well, I'm glad you turned down the live show, because it may have lived and died there. So, um, one of the... You've made all those End of the World big bucks, right? Exactly, man, I'm rolling in it. My mattress is stuffed with them. Um, so, uh... And, you know, bucks aren't always just the only way of qualifying or quantifying the success of something. You know, there's also Academy Awards, right? Oscars, and that's it. Peabodys. Big-money or public awards ceremonies. Okay, granted.

Um, the other reason I wanted to do this episode is because one of the people who was a participant in, an interviewee in, The End of the World with Josh Clark, a guy named Dr. Toby Ord, um, recently published a book called The Precipice, and it is like a really in-depth look at existential risks and the ones we face and, you know, what's coming down the pike and what we can do about them and why. Right, exactly. Cheers and jeers. Right, exactly. Um, and it's a... it's a really good book, and it's written just for, like, the average person to pick up and be like, I hadn't heard about this, and then reach the end of it and say, I'm terrified, but I'm also hopeful. And the other reason I wanted to do this episode, to let everybody know about Dr. Ord's book, or Toby's book (it's impossible to call him Dr. Ord, he's just a really likable guy), um, is because he actually turned the tone of The End of the World around almost single-handedly. It was really grim, remember, before I interviewed him. Really? And... and also, you remember, I started, like, listening to The Cure a lot. Um, just got real dark there for a little while.
Which is funny, that The Cure is my conception of, like, really dark. Anyway, um, death metal guys out there laughing, right? So, but talking to him, he just kind of steered the ship a little bit, and by the end of it, because of his influence, The End of the World actually is a pretty hopeful series. So my hat's off to the guy for doing that, but also for writing this book, The Precipice. Hats off, sir.

So, um, we should probably kind of describe what existential risks are. Um, I know that, you know, in this document it's described many, many times. But the reason it's described many, many times is because there's, like, a lot of nuance to it. And the reason there's a lot of nuance to it is because we kind of tend to walk around thinking that we understand existential risks based on our experience with previous risks. But the problem with existential risks is they're actually new to us, and they're not like other risks, because they're just so big, and if something happens, one of these existential catastrophes befalls us, that's it. There's no second chance, there's no do-over, and we're not used to risks like that. That's right. Uh, nobody is, because we are all people, right? And the thought of all of human beings being gone, um, or at least, uh, not being able to live as regular humans live and enjoy life, like, and not live as Matrix batteries, because, you know, technically in The Matrix those are people. Yeah, but that's... that's no way to live, the people in the pods. Yeah, that's what I'm saying, I wouldn't want to live that way. But that's another version of existential risk: it's not necessarily that everyone's dead, but you could become just a Matrix battery and not flourish or move forward as a people. Right, exactly. So, um...
But with existential risks in general, like, the general idea of them is that, like, if you are walking along and you suddenly get hit by a car, like, you no longer exist, but the rest of humanity continues on existing. Uh, correct. With existential risks, it's like the car that comes along and hits not a human, but all humans. So it's a risk to humanity itself. And that's just kind of different, because all of the other risks that we've ever run across, um, either give us the luxury of time or proximity, meaning that we have enough time to adapt our behavior to it, to survive it and continue on as a species, or there's not enough of us in one place to be affected by this, um, risk that took out, say, one person or a billion people. Like, if all of Europe went away, that is not an x-risk. No. And so people might say, um, it would be really sad, and I mean, up to, you know, 99 percent of the people alive on Earth, if they all died somehow, it would still possibly not be an existential risk, because that one percent living could conceivably rebuild civilization. That's right. We're talking about giving the world back to Mother Nature and just seeing what happens.

Do you remember that, um, series? I think it was a book to start, The World Without Us. Mm, oh, so... I think I know that. There was a big deal when it came out, and then they made, like, maybe a Science Channel or Nat Geo series about it, where this guy describes, like, how our infrastructure will start to crumble, like, if humans just vanished tomorrow, how the Earth would reclaim, nature would reclaim everything we've done and undo it, you know, after a month, after a year, after ten years. I've heard of that. It's really cool stuff. Yeah. There's a... Bonnie "Prince" Billy, my idol, has a song called "It's Far from Over," and that's sort of a Bonnie "Prince" Billy look at the fact that, hey, even if all humans leave, it's not over.
Yeah, like, new animals are gonna... new creatures are going to be born, the Earth continues. Yeah. Uh, and he also has a line, though, about, like, but you better teach your kids to swim. That's a great line. Yeah, it's good stuff. Did I ever tell you I saw that guy do karaoke with his wife once? Oh really? You know our friend Toby? His wedding. Yeah. I would have not been able to be at that wedding, because you would have just been such a fanboy. I don't know what I would do. I would... I would... it would have ruined my time. It really would, because I would second-guess everything I did and said. I mean, I even talked to the guy once backstage, and that ruined my day. It really did, because you spent the rest of the time just thinking about how it went. It was actually fine. He was a very, very, very nice guy, and we talked about Athens and stuff. But that's who I just went to see in D.C., Philly, and New York. Nice. Went on a little... followed him around the tour for a few days. Did he sing that song about the world going on, or life going on? He did.

So, um, so let's just cover a couple of things that people might think are existential risks that actually aren't. Okay. Yeah, I mean, I think a lot of people might think, um, sure, some global pandemic that could wipe out humanity. There could very well be a global pandemic that could kill a lot of people, but it's probably not going to kill every living human, right? It would be a catastrophe, sure, but not an x-risk. Yeah, I mean, because humans have antibodies that we develop, and so people who survive that flu have antibodies that they pass on to the next generation, and so that disease kind of dies out before it kills everybody off. And the preppers, at the very least, they'll be fine. They'd be safe. Um, what about calamities like a mudslide or something like that? You can't mudslide the Earth. You can't.
And that's a really good point. This is what I figured out in researching this, after doing The End of the World, after talking to all these people. It took researching this article for me to figure this out: that it's time and proximity that are the two things that we use to survive, and that if you take away time and proximity, we're in trouble. And so mudslides are a really good example of proximity, where a mudslide can come down a mountain and take out an entire village of people. Yes, and it's really sad and really scary to think of. I mean, we saw it with our own eyes. We stood in a field that was now, what, like eight or nine feet higher than it used to be. Yeah, and you could see the track. This is in Guatemala, when we went down to visit our friends at CoEd. Um, there was, like... the trees were much sparser, you could see the track of the mudslide. They were like, the people are still down there. It was a horrible tragedy, and it happened in a matter of seconds. It just wiped out a village. But we all don't live under one mountain, and so if a bunch of people are taken out, the rest of us still go on. So there's the time and there's the proximity.

Yeah. I think a lot of people in the eighties might have thought, because of movies like WarGames and movies like The Day After, that global thermonuclear war would be an x-risk, and as bad as that would be, it wouldn't kill every single human being. Uh, no. No, they don't think so. They started out thinking this. Like, as a matter of fact, nuclear war was one of the first things that we identified as a possible existential risk. And if you kind of talk about the history of the field, for the first, like, several decades, that was like the focus, the entire focus of existential risks.
Like, Bertrand Russell and Einstein wrote a manifesto about how we really need to be careful with these nukes because we're gonna wipe ourselves out. Carl Sagan... you remember our amazing Nuclear Winter episode? That was from, you know, studying existential risks. And then in the nineties a guy named John Leslie came along and said, hey, there's way more than just nuclear war that we could wipe ourselves out with, and some of it is taking the form of this technology that's coming down the pike. And that was taken up by one of my personal heroes, a guy named Nick Bostrom. Yeah, he's a philosopher out of Oxford, and he is one of the founders of this field. And he's the one that said, or one of the ones that said, you know, there's a lot of potential existential risks, and nuclear war's peanuts. Bring it on. And I don't know if Bostrom specifically believes this, he probably does, that we would be able to recover from a nuclear war. That's the idea, that you rebuild as a society after whatever zombie apocalypse or nuclear war happened. Yeah, and again, say it killed off people. To us, that would seem like an unimaginable tragedy because we lived through it. But if you zoom back out and look at the lifespan of humanity, not just the humans alive today, but all of humanity, like, it would be a very horrible period in human history, but one we could rebuild from over, say, ten thousand years, to get back to the point where we were before the nuclear war. And so ultimately it's probably not an existential risk. Yeah, it's tough. This is a tough topic for people, because I think people have a hard time with that long of a view of things. And then whenever you hear, uh, the big math comparisons of, you know, how long people have been around and how old the Earth is and that stuff, it kind of hits home.
But it's tough for people that live, you know, eighty years to think about. Well, ten thousand years, we'll be fine. And even, like, um, I mean, when I was researching this, she brought this up a lot, like, where do we stop caring about people that are our descendants? You know, we care about our children or our grandchildren, and that's about it. I just care about my daughter, that's about it. That's where it is? With the grandchildren? You have grandchildren yet? Yeah, but wait till they come along. Everything I've ever heard is that being a grandparent is even better than being a parent. And I know some grandparents. Okay, let's say I'm not dead before my daughter eventually has a kid, if she wants to. I would care about that grandchild. But after that, forget it. Yeah, my kids' kids' kids, who cares? Granted, that's about... that's about where it would end. Like, I care about people, and humanity as a whole. I think that's what you gotta do. You can't think about, like, your eventual descendants, I think. You just got to think about people, right? Yeah, to really help people you don't know. Now, it's kind of requisite, to start caring about existential risks, to start thinking about people, not just... Well, let's talk about it. So Toby Ord made a really good point in his book The Precipice, right, that you care about people on the other side of the world that you've never met. Yeah, that's what I'm saying, like, that happens every day. Right. So what's the difference between people who live on the other side of the world that you will never meet and people who live in a different time that you will never meet? Why would you care any less about these people, human beings that you'll never meet, whether they live on the other side of the world at the same time, or in the same place you do, but at a different time?
I think a few... I mean, I'm not speaking for me, but I think if I were to step inside the brain of someone who thinks that, they would think, like, it's a little bit of a self... um, it's a bit of an ego thing, because, you know, like, oh, I'm helping someone else, so that does something for you in the moment. Like, someone right now on the other side of the world that maybe I've sponsored is doing good because of me, and I get a little kick out of it, from Sally Struthers. Yeah, that does something. It helped put food on her plate. Is she still with us? I think so. I think so too. But I'd feel really bad if... I certainly haven't heard any news of her death. People would talk about that, and the record scratch would have just happened. Uh, so I think that is something too. And I think there are also sort of a certain amount of people that are just, um, that just believe you're worm dirt, there is no benefit in the afterlife as far as good deeds and things, so, like, once you're gone, it's just, who cares, because it doesn't matter, there's no consciousness. Yeah, well, that's... I mean, if you were at all, like, piqued by that stuff, I would say definitely read The Precipice, because, like, one of the best things that Toby does, and he does a lot of stuff really well, is describe why it matters, because, I mean, he's a philosopher after all. Um, so he says, like, this is why it matters: not only does it matter because you're keeping things going for the future generation, you're also continuing on with what the previous generations built. Like, who are you to just be like, oh, we're just gonna drop the ball? No, I agree, that's a very self-centered way to look at things. Totally. But I think you're right. I think there are a lot of people who look at it that way.
So, you want to take a break? Yeah, we can take a break now, and maybe we can dive into Mr. Bostrom's, or Doctor, I imagine, Bostrom's, uh, five different types. Are there five? No, there's just a few. Okay, a few different types of existential... We can make up a couple to add to them. Let's not.

All right, Chuck. So, uh, one of the things you said earlier is that existential risks, the way we think of them typically, is, um, that something happens and humanity is wiped out and we all die and there's no more humans forever and ever. That's an existential risk. That's one kind, really, and that's the easiest one to grasp, which is extinction. Yeah, and that kind of speaks for itself. Just like dinosaurs are no longer here, that would be us. Yes. And I think that's one of those other things too. It's kind of like how people walk around like, yeah, I know I'm going to die someday. But if you sat them down and you were like, do you really understand that you're going to die someday?, they might start to panic a little bit, you know, and realize, I haven't actually confronted that, I just know that I'm going to die. Or if you knew the date, that'd be weird. It would be like a Justin Timberlake movie. Would that make things better or worse for humanity? I would say better, probably, right? I think it'd be a mixed bag. I think some people would be able to do nothing but focus on that and think about all the time they're wasting, and other people would be like, I'm gonna make the absolute most out of this. Well, I guess there are a couple of ways you can go, and it probably depends on when your date is. If you found out your date was a ripe old age, you might be like, well, I'm just going to try and lead the best life I can. That's great. If you find out you live fast and die hard at seven... die harder... uh, you might die harder. You might be like, screw it, or you might really ramp up your good works.
It depends what kind of person you are, probably. And more and more I'm realizing it depends on how you were raised, too. You know, like, we definitely are responsible for carrying ourselves as adults. Like, you can't just say, well, I wasn't raised very well, or I was raised this way, so whatever. Like, you have a responsibility for yourself and who you are as an adult. Sure, but I really feel like the way that you're raised really sets the stage and puts you on a path that can be difficult to get off of, because it's so hard to see, for sure, you know, because that's just normal to you, because that's what your family was. Yeah, that's a good point.

So anyway, extinction is just one of the ways, one of the types of existential risks that we face. A bad one. Permanent stagnation is another one, and that's the one we kind of mentioned, um, danced around a little bit, and that's, like, some people are around, not every human died in whatever happened, but, um, whatever is left is not enough to either repopulate the world or to progress humanity in any meaningful way, to rebuild civilization back to where it was, and it would be that way permanently. Which is kind of in itself tough to imagine too. Just like the genuine extinction of humanity is tough to imagine, the idea of, well, there's still plenty of humans running around, how are we never going to get back to that place? And that may be the most depressing one, I think. I think the next one is the most depressing, but that's pretty depressing. But one example that's been given for that is, like, let's say we say, um, all right, this climate change, we need to do something about that. So we undertake a geoengineering project that isn't fully thought out, and we end up causing, like, a runaway greenhouse gas effect, and there's just nothing we can do to reverse course, and so we ultimately wreck the Earth.
That would be a good example of permanent stagnation. That's right. This is... this next one. So yes, agreed, permanent stagnation is pretty bad. I wouldn't want to live under that. But at least you can run around and, like, um, do what you want. I think the total lack of personal liberty in the flawed realization one is what gets me. Yeah, they all get me. Uh, flawed realization is the next one, and that's, um, that's sort of like the Matrix example, which is that there's technology that we invented that eventually makes us their little batteries in pods, right, basically. Or there's just, um, someone is in charge, whether it's a group or some individual or something like that. It's basically a permanent dictatorship that we will never be able to get out from under, because this technology we've developed, yeah, is being used against us, and it's so good at keeping tabs on everybody and squashing dissent before it grows, there's just nothing anybody could ever do to overthrow it. And so it's a permanent dictatorship where, um, we're not doing anything productive, we're not advancing. Say, um, say it's like a religious dictatorship or something like that, all anybody does is go to church and support the church or whatever, and that's that. And so what Dr. Bostrom figured out is that there are fates as bad as death. There are possible outcomes for the human race that are as bad as extinction but that still leave people alive, even, like, in kind of a futuristic kind of thing, like where the flawed realization one goes, um, but that you wouldn't want to live the lives that those humans live, and so humanity has lost its chance of ever achieving its true... its true potential. That's right. And those qualify as existential risks as well. That's right. Do you want to live in the Matrix? No, not at all. Or in a post-apocalyptic, um, altered Earth.
Yeah, the Matrix, basically. Like Thundarr the Barbarian, that's what I imagine with the permanent stagnation. So, uh, there are a couple of big categories for existential risks, and they are either nature-made or man-made, um. The nature ones, we've, uh, you know, there's always been the threat that a big enough, um, object hitting planet Earth could do it, right? Like, that's always been around. It's not like that's some sort of new realization, but it's just pretty rare. It's so rare that it's not likely. Right. All of the natural ones are pretty, pretty rare compared to the human-made ones. Yeah, like, I don't think science wakes up every day and worries about a comet or an asteroid or a meteor. No, and it's definitely worth saying that the better we get at scanning the heavens, the safer we are, eventually, when we can do something about it. If we see this coming, what do we do, just hit the gas and move the Earth over a bit? Right. Um, and there was nothing we could do about any of these anyway, so maybe that's also why science doesn't wake up worrying. Right. Yeah.

So you've got near-Earth objects, you've got celestial stuff like collapsing stars that produce gamma ray bursts, and then even back here on Earth, like, a supervolcanic eruption could conceivably put out enough soot that it blocks photosynthesis and, you know, that sends us into essentially a nuclear winter too. That would be bad. But like you're saying, these are very rare, and there's not a lot we can do about them now. Instead, the focus of people who think about existential risks, um, and there are, like, a pretty decent handful of people who are dedicated to this now, um, they say that the anthropogenic, or the human-made ones, these are the ones we really need to mitigate, because they're human-made, so they're under our control, and, um, that means we can do something about them, more than, say, a comet. Yeah.
Yeah, but that's a... it's a bit of a, um, double-edged sword, because you think, oh, well, since we could stop this stuff, that's really comforting to know. But we're not. Right. Like, we're headed down a bad path in some of these areas, for sure. So because we are creating these risks and not thinking about these things, in a lot of cases they're actually worse, even though we could possibly control them. It definitely makes it more ironic too, right? So, um, there are a few that have been identified, and there's probably more that we haven't figured out yet or that haven't been invented yet. But one of the big ones, just, um, I think almost across the board, the one that existential risk analysts worry about the most, is AI, artificial intelligence. Yeah, and this is the most frustrating one, because it seems like it would be the easiest one to, uh, not stop in its tracks, but to divert along a safer path. Um, the problem with that is that people who have dedicated themselves to figuring out how to make that safer path are coming back and saying, this is way harder than we thought it was going to be, to make the safer path. Yeah? Really? Yeah. And so at the same time, while people recognize that there needs to be a safe path for AI to follow, this other path that it's on now, which is known as the unsafe path, that's the one that's making people money. So everybody's just going down the unsafe path while these other people are trying to figure out the safer one. Because, um, the computer in WarGames would say, maybe the best option is to not play the game, and that's... if there is no safe option, then maybe AI should not happen, or we need to, and this is almost heresy to say, we need to put the brakes on AI development so that we can figure out the safer way and then move forward.
But we should probably explain what we're talking about with "safe" in the first place, right? Yeah, I mean, we're talking about creating super intelligent AI that basically is so smart that it starts to self-learn, um, and is beyond our control, and it's not thinking, ah, wait a minute, one of the things I'm programmed to do is make sure we take care of humans. And it doesn't necessarily mean that some AI is going to become super intelligent and say, I want to destroy all humans. That's actually probably not going to be the case. It will be that this super intelligent AI is carrying out whatever it was programmed to do, and would disregard humans. Exactly. And so if our goal of staying alive and thriving, um, comes in conflict with whatever this AI's goal is, whatever it was designed to do, we would lose, because it's smarter than us. By definition, it's smarter than us. It's out of... it's out of our control. And probably one of the first things it would do when it became super intelligent is figure out how to prevent us from turning it off. Well, yeah, that's the fail-safe, the all-important fail-safe that the AI could just disable. Exactly right. You can't just, like, sneak up behind it with a screwdriver or something like that. And then you get shot, and the robot's like... in a robot voice. So that's called, um, designing friendly or aligned AI, and people who are, like, some of the smartest people in the field of AI research have stopped figuring out how to build AI and have started to figure out how to build friendly AI. Yeah. Aligned as in aligned with our goals and needs and desires. And Nick Bostrom actually has a really great, um, thought experiment about it called the paperclip problem. Yeah. Um, and it's... you can hear it on The End of the World. Nice, I like that. Driving listeners. The next one is nanotech. Um.
And nanotech is, I mean, it's something that's very much within the realm of possibility, as is AI, actually. It's not... that's not super far-fetched either, a super intelligent AI. Yeah, it's definitely possible. Yeah, and that's the same with nanotechnology we're talking about. And I've seen this everywhere, from, um, little tiny robots that will just be dispersed and clean your house, um, to, like, the atomic level, where they can, like, reprogram our body from the inside, to the little tiny robots that can clean your car. Yeah. Those are... those are the three. Those are three things, so, um, two of them are cool. One of the things about these nanobots is that because they're so small, they'll be able to manipulate matter on, like, the atomic level, which, like, the usefulness of that is mind-bottling. To send them in, and they're gonna be networked, so we'll be able to program them to do whatever and control them. Right. Um, the problem is, if they're networked and they're under our control, if they fall under the control of somebody else, or, say, a super intelligent AI, then we would have a problem, because they can rearrange matter on the atomic level, so who knows what they would start rearranging that we wouldn't want them to rearrange.

It's like that Gene Simmons sci-fi movie in the eighties. Uh, I want to say it was Looker? No. I always confuse those two. The other one... this is Runaway. Runaway, I think. One inevitably followed the other on HBO. They had to have been a double feature, because they could not be more linked in my mind. Same here, you know. I remember Albert Finney was in one. I think he was in Looker. He was. And Gene Simmons was in Runaway as the bad guy, of course, and he did a great job, and Tom Selleck was the good guy. Tom Selleck, yeah. But the idea in that movie was not nanobots.
They were, but they were little insect-like robots; they just weren't nano-sized, right. And so the reason that these could be so dangerous is not their size, but that there are just so many of them. And while they're not big and they can't, like, punch you in the face or stick you in the neck with a needle or something like the Runaway robots, they can do all sorts of stuff to you molecularly, and you would not want that to happen. Yeah, this is pretty bad. There's an engineer out of MIT named Eric Drexler. He is a big, big name in molecular nanotech. If he's listening right now, right up to when you said his name, he was just sitting there saying, please don't mention me. No, because he's tried to back off from his gray goo hypothesis. So yeah, this is the idea: there are so many of these nanobots that can harvest their own energy, that can self-replicate like little bunny rabbits, that there would be a point where there was runaway growth such that the entire world would look like gray goo, because it's covered with nanobots. Yeah, and since they can harvest energy from the environment, they would eat the world; they'd wreck the world, basically. That's scary, you're right. So he took so much flak for saying this, because apparently it scared people enough back in the eighties that nanotechnology was kind of frozen for a little bit. Yeah, and so everybody went after Drexler. And so he's backed off from it, saying, like, this would be a design flaw; it wouldn't just naturally happen with nanobots. You'd have to design them to harvest energy themselves and to self-replicate, so just don't do that. And the thing is, yes, he took a lot of flak for it, but it was also a contribution to the world. He pointed out two big flaws that could happen that are now just like a sci-fi trope.
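To make the runaway growth part a little more concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is a made-up assumption for illustration, not anything Drexler claimed: a replicator population that starts at one gram and doubles once an hour, compared against a rough order-of-magnitude figure for Earth's total biomass.

# Toy model of unchecked self-replication. All figures are illustrative assumptions.
start_mass_g = 1.0            # assume we start with 1 gram of replicators
earth_biomass_g = 5.5e17      # rough order of magnitude for Earth's biomass, in grams

mass = start_mass_g
doublings = 0
while mass < earth_biomass_g:
    mass *= 2                 # assume one doubling per hour, with nothing slowing it down
    doublings += 1

print(doublings, "doublings, roughly", doublings / 24, "days")
# With these made-up numbers: 59 doublings, about two and a half days.

The point isn't the specific figures; it's that exponential replication makes the timescale days rather than centuries, which is why the answer Drexler landed on is simply not to design replicators that can do this.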
But when he thought about them, they weren't self-evident or obvious. Yeah. I mean, I feel bad we even said his name, but it's worth saying. Clyde Drexler, right? The Glide. That's right. Biotechnology is another pretty scary field. There are great people doing great research with infectious disease. Part of that, though, involves developing new bacteria and new viruses, new strains that are even worse than the pre-existing ones, as part of the research. And that can be a little scary too, because it's not just the stuff of movies. There are accidents, events that happen, protocols that aren't followed, and this stuff can, or could, get out of a lab. Yeah, and it's not one of those "could get out of a lab" things. It has gotten out of labs. It happens, I don't want to say routinely, but it's happened so many times that when you look at the track record of the biotech industry, it's just like, how are we not all dead right now? It's crazy. Lost broken arrows, lost nuclear warheads. Exactly, but with little, tiny, horrible viruses. And then you factor in that terrible track record with them actually altering viruses and bacteria to make them more deadly. They do those two things: reduce the time that we have to get over them, right, so they make them more deadly, and then reduce the proximity needed, to make them more easily spread, more contagious, so they spread more quickly and kill more quickly as well. Then you have potentially an existential risk on your hands. For sure. We've talked in here a lot about the Large Hadron Collider. We're talking about physics experiments as, I guess, the last example that we're going to talk about. Yeah, and I should point out that physics experiments do not show up anywhere in Toby Ord's Precipice book. Okay, this one is kind of my pet. Yeah. I mean, there are plenty of people who agree that this is a possibility, but a lot of existential risk
theorists are like, I don't know. Well, you'll explain it better than me. But the idea is that we're doing all these experiments, like the Large Hadron Collider, to try and figure out stuff we don't understand, which is great, but we don't exactly know where that all could lead. Yeah, because we don't understand it enough, you can't say this is totally safe. And so if you read some physics papers, and this isn't, like, Rupert Sheldrake morphic fields kind of stuff, right, it's actual physicists who have said, well, actually, using this version of string theory, it's possible that this could be created in a large hadron collider, or more likely a more powerful collider that's going to be built in the next fifty years or something like that. Super large, sure. The Duper, I think, is the nickname for it. Oh man, I hope that doesn't end up being the nickname, the Duper. Right. Yeah, I guess so. But it also has a little, kind of, you know, I don't know, I like it. All right, so they're saying that a few things could be created accidentally within one of these colliders when they smash the particles together. A microscopic black hole. And my favorite, the low-energy vacuum bubble, which is a little tiny version of our universe that's more stable, like a more stable version, a lower-energy version, and if it were allowed to grow, it would grow at the speed of light. It would overwhelm our universe and be the new version of the universe. Yeah. That's like when you buy the baby alligator or the baby constrictor python you think is so cute, right, and then it grows up and eats the universe. Screwed. The problem is, this new version of the universe is set up in a way that's different than our version, and so all the matter, including us, that's arranged just so for this version of the universe would be disintegrated in this new version. So it's like the Snap.
But can you imagine if all of a sudden a new universe grew out of a Large Hadron Collider accidentally and at the speed of light just ruined this universe forever, if we just accidentally did this with a physics experiment? I find that endlessly fascinating and also hilarious, just the idea. I think the world will end ironically somehow. It's entirely possible. So maybe before we take a break, let's talk a little bit about climate change, because a lot of people might think climate change is an existential threat. You know, it's terrible and we need to do all we can, but even the worst-case models probably don't mean an end to humanity as a whole. It means we're living much further inland than we thought we ever would, and maybe in much tighter quarters than we ever thought we might be in, and a lot of people might be gone, but it's probably not going to wipe out every human being. Yeah, it'll probably end up being akin to that same line of thinking, the same path as a catastrophic nuclear war, which I guess you could just say nuclear war, since catastrophic is kind of built into the idea, but we would be able to adapt and rebuild. It's possible that our worst-case scenarios are actually better than what will actually happen. So just like with a total nuclear war, it's possible that it could be bad enough that it could be an existential risk. It's possible climate change could end up being bad enough that it's an existential risk. But from current understanding, they're probably not existential risks. Right. All right, well, that's a hopeful place to leave for another break, and we're gonna come back and finish up with why all of this is important. It should be pretty obvious, but we'll summarize it. Stop, you know, stop, you know. Okay, Chuck. One thing about existential risks that people like to say is, well, let's just not do anything.
And it turns out, from people like Nick Bostrom and Toby Ord and other people around the world who are thinking about this kind of stuff, if we don't do anything, we probably are going to accidentally wipe ourselves out. Like, doing nothing is not a safe option. Yeah. But Bostrom is one who has developed a concept, a hypothetical one, called technological maturity, which would be great, and that is sometime in the future where we have invented all these things, but we have done so safely and we have complete mastery over it all. There won't be those accidents, there won't be the gray goo, there won't be the AI that's not aligned. Yeah, because we'll know how to use all this stuff safely, right. Like you said, right, we're not mature in that way right now. No. Actually, we're at a place that Carl Sagan called our technological adolescence, where we're becoming powerful, but we're also not wise. That's the point we're at right now, technological adolescence, where we're starting to invent the stuff that actually can wipe humanity out of existence. But before we reach technological maturity, where we have safe mastery and have that kind of wisdom to use all this stuff, that's probably the most dangerous period in the history of humanity. And we're entering it right now. And if we don't figure out how to take on these existential risks, we probably won't survive from technological adolescence all the way to technological maturity. We will wipe ourselves out one way or another. Because this is really important to remember: all it takes is one, one existential catastrophe. Not all of these have to take place; it doesn't have to be some combination, just one. Just one bug with a high enough mortality rate has to get out of a lab. Just one accidental physics experiment has to slip up. Just one AI has to become superintelligent and take over the world. Like, just one of those things happening, and then that's it.
And again, the problem with existential risks that makes them different is we don't get a second chance. One of them befalls us, and that's that. That's right. It depends on who you talk to, if you want to get into maybe just a projection on our chances as a whole as humans. Toby Ord right now is at, what, a one in six chance over the next hundred years. Yeah, he always follows that with Russian roulette. Other people say about ten percent. There are some different cosmologists; there's one named Lord Martin Rees who puts it at... yeah. He actually is a member of the Centre for the Study of Existential Risk. And we didn't mention before, Bostrom founded something called the Future of Humanity Institute, which is pretty great. FHI. And then there's one more place that I want to shout out. It's called the Future of Life Institute. It was founded by Max Tegmark and Jaan Tallinn, co-founder of, I think, Skype. Oh really? I think so. All right, well, you should probably also shout out the Church of Scientology. No, no, no, no, genius. Yeah, that's the one I was thinking about. Well, they get confused a lot. This is a pretty cool little thing you did here with how long, because I was kind of talking before about the long view of things and how long humans have been around. So I think your rope analogy is pretty spot on here. So that's J. L. Schellenberg's rope analogy. Well, I didn't think you wrote it; I wish it were mine. I'm just admitting that you included it. So what we were talking about, like you were saying, is it's hard to take that long view. But if you step back and look at how long humans have been around: Homo sapiens have been on Earth about two hundred thousand years, and it seems like a very long time. It does. And even modern humans like us have been around for about fifty thousand years; it seems like a very long time as well.
But if you think about how much longer the human race, humanity, could continue on to exist as a species, that's nothing. It's virtually insignificant. And J. L. Schellenberg puts it like this: let's say humanity has a billion-year lifespan, and you translate that billion years into a twenty-foot rope. We haven't even shown up yet as the first eighth-of-an-inch mark on that twenty-foot rope. Our species would have to live another three hundred thousand years from the point where we've already lived, yes, we would have to live five hundred thousand years total, just to show up as that first eighth of an inch on that twenty-foot-long rope. That's how long humanity might have ahead of us. And that's actually kind of a conservative estimate. Some people say once we reach technological maturity, we're fine, we're not going to go extinct, because we'll be able to use all that technology, like having AI track all those near-Earth objects and say, well, this one is a little close for comfort, I'm gonna send some nanobots out to disassemble it. We will remove ourselves from the risk of ever going extinct when we hit technological maturity. So a billion years is definitely doable for us. Yeah, and why we care about it is because it's happening right now. I mean, there is already AI that is unaligned. We've already talked about the biotech in labs; accidents have already happened and happen all the time. And there are experiments going on with physics where we think we know what we're doing, but accidents happen, and an accident that you can't recover from, you know, there's no whoops, let me try that again. Right, exactly, because we're all toast. So this is why you have to care about it. And luckily... well, I wish there were more people that cared about it.
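For what it's worth, the arithmetic behind the rope analogy checks out. Here is a minimal sketch, using only the figures already mentioned above (a billion-year lifespan, a twenty-foot rope, and roughly two hundred thousand years of Homo sapiens so far):

# Schellenberg's rope analogy: map a 1-billion-year lifespan onto a 20-foot rope.
rope_inches = 20 * 12                  # 240 inches of rope
lifespan_years = 1_000_000_000         # the assumed billion-year run for humanity
years_so_far = 200_000                 # roughly how long Homo sapiens has existed

years_per_inch = lifespan_years / rope_inches      # about 4.2 million years per inch
first_eighth_inch_years = years_per_inch / 8       # about 520,000 years

print(round(first_eighth_inch_years))                   # ~520,833: the "five hundred thousand years"
print(round(first_eighth_inch_years - years_so_far))    # ~320,833: the "another three hundred thousand years"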
Well, it's becoming more of a thing, and if you talk to Toby Ord, he's like, it's just like, say, the environmental movement was, you know, the moral push, and we're starting to see some results from that now, but back when it started in the sixties and seventies, nobody had ever heard of that. Yeah, I mean, it took decades. He's saying that's about what we're doing now with existential risks. People are going to start to realize, like, oh man, this is for real, and we should do something about it, because we could live a billion years if we manage to survive the next hundred. Which makes you and me, Chuck, and like all of us alive right now, in one of the most unique positions any humans have ever been in. We have the entire future of the human race basically resting in our hands, because we're the ones who happened to be alive when humanity entered its technological adolescence. Yeah, and it's a tougher one than "save the planet," because that's such a tangible thing when you talk about pollution, and it's very easy to put on a TV screen or in a classroom, and it's not so easily dismissed, because you can see it in front of your eyeballs and understand it. This is a lot tougher education-wise, because people hear something about nanobots and gray goo or AI and just think, come on, man, that's the stuff of movies. Yeah. And I mean, it's sad that we couldn't dig into it further, because when you really do start to break it all down and understand it, it's like, no, this totally is for real and it makes sense; this is entirely possible and maybe even likely. Yeah, and not the hardest thing to understand. It's not like you have to understand nanotechnology to understand its threat. Right, exactly. That's well put.
The other thing about all this is that not everybody is on board with it. Even people who hear about this kind of stuff are like, no, you know, this is pie in the sky. It's overblown. Or the opposite of the sky: it's a cake in the ground. Is that the opposite? Dark sky territory? It's a turkey drumstick in the earth. Okay, that's kind of the opposite of a pie. Okay. I think I may have just come up with a colloquialism. I think so. So, some people aren't convinced. Some people say, no, AI is nowhere near being even close to human-level intelligent, let alone superintelligent, so, like, why spend money, because it's expensive, right? Well, and other people are like, yeah, if you start diverting, you know, research into figuring out how to make AI friendly, I can tell you China and India aren't going to do that, and so they're going to leapfrog ahead of us and we're going to be toast competitively. So there's a cost to it: an opportunity cost and an actual cost. So there are a lot of people... it's basically the same arguments from people who argue against mitigating climate change. Yeah, the same thing, kind of. So the answer is, uh, terraforming? Terraforming. Well, that's not the answer. The answer is one of those, right? Studying terraforming is, right. The answer is to study this stuff and figure out what to do about it. But it wouldn't hurt to learn how to live on Mars, right, or just off of Earth. Because in the exact same way that, like, a whole village is at risk when it's under a mountain and a mudslide comes down, if we all live on Earth and something happens to life on Earth, that's it for humanity. But if there's, like, a thriving population of humans who don't live on Earth, who live off of Earth, if something happens on Earth, humanity continues on. So learning to live off of Earth is a good step in the right direction. That's plan A point one.
Sure, it's tied for first; it's something we should be doing at the same time as studying and learning to mitigate existential risks. Yeah, and I think it's got to be multi-pronged, because the threats are multi-pronged. Absolutely. And there's one other thing that I really think you've got to get across. Like we said, if, say, the US starts to invest all of its resources into figuring out how to make friendly AI, but India and China continue on down the path, it's not gonna work. And the same goes if every country in the world said, no, we're going to figure out friendly AI, but just one dedicated itself to continuing on this path; the progress of the rest of the countries in the world would be totally negated by that one. Yeah, so it's got to be a global effort. It has to be a species-wide effort, not just with AI, but with all of these, understanding all of them and mitigating them together. Yeah, that could be a problem. So, thank you very much for doing this episode with me. Although, are you talking to Dave? No. Well, Dave too, we appreciate you too, Dave, but big ups to you, Charles, because Jerry was like, I'm not sitting in that room, I'm not listening to Clark blather on about existential risk for an hour. So, one more time: Toby Ord's The Precipice is available everywhere you buy books. You can get The End of the World with Josh Clark wherever you get podcasts. If this kind of thing floated your boat, check out the Future of Humanity Institute and the Future of Life Institute, and they have a podcast hosted by Ariel Conn. She had me on back in December of twenty eighteen as part of a group that was talking about existential hope. So you can go listen to that too, if you're like, this is a downer, I want to think about the bright side; there's that whole Future of Life Institute podcast. So what about you?
Are you, like, convinced of this whole thing, that this is an actual thing we need to be worrying about and thinking of? Really? No, I mean, I think that, sure, there are people who should be thinking about this stuff, and that's great. As far as, like, me, what can I do? Well, and then I ran into that: there's not a great answer for that. It's more like, start telling other people is the best thing the average person can do. Hey, man, we just did that in a big way. We did, didn't we? Like, to a lot of people. Now we can go to sleep. Okay, you got anything else? I got nothing else. All right, well, then, since Chuck said he's got nothing else, it's time for listener mail. Yeah, this is the opposite of all the smart stuff we just talked about, I just realized. Hey guys, I love you, love Stuff You Should Know. On a recent airplane flight, I listened to and really enjoyed the coyote episode, wherein Chuck mentioned wolf bait as a euphemism for farts. Coincidentally, on that same flight were Bill Nye the Science Guy and Anthony Michael Hall, the actor. What is this star-studded airplane flight? He said, so naturally, when I arrived at my home, I felt compelled to rewatch the film Weird Science, which Anthony Michael Hall stars in, and I remember this now that he mentions it: in that movie, Anthony Michael Hall uses the term wolf bait as a euphemism for pooping, dropping a wolf bait, which makes sense now, that it would be actual poop and not a fart. Did you say his name before? Who wrote this? No. Your friend who used the word wolf bait? Eddie. Sure. Okay. So is Eddie, like, a big Weird Science fan, or an Anthony Michael Hall fan? I think he's just a Kelly LeBrock fan. Yeah, that must be it. It has been a full-circle day for me and one that I hope you will appreciate hearing about. And that is Jake. Man, can you imagine being on a flight with Bill Nye and Anthony Michael Hall?
Who do you talk to? Who do you hang with? I don't know. I'd just be worried that somebody was gonna, like, take over control of the plane and fly it somewhere to hold us all hostage and make those two, like, perform. Or what if Bill Nye and Anthony Michael Hall are in cahoots, maybe, and they take the plane hostage? Yeah, it would be very suspicious if they didn't talk to one another, you know what I mean? I think so. Who was that? That was Jake. Thanks, Jake, that was a great email, and thank you for joining us. If you want to get in touch with us, like Jake did, you can go onto StuffYouShouldKnow dot com and get lost in the amazingness of it. And you can also just send us an email to stuff podcast at iHeartRadio dot com. Stuff You Should Know is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.