1 00:00:04,280 --> 00:00:06,920 Speaker 1: Hey, and welcome to the Short Stuff. I'm Josh, there's Chuck, 2 00:00:07,000 --> 00:00:09,640 Speaker 1: and we're sitting in for Jerry, who usually sits in 3 00:00:09,720 --> 00:00:12,400 Speaker 1: for Dave. So yeah, let's go. 4 00:00:13,440 --> 00:00:13,880 Speaker 2: Let's go. 5 00:00:14,440 --> 00:00:18,480 Speaker 3: This is, well, it's a follow up to our Doomsday 6 00:00:18,520 --> 00:00:20,479 Speaker 3: Clock episode. But it turns out we didn't do a 7 00:00:20,520 --> 00:00:24,040 Speaker 3: Doomsday Clock episode. I know we talked about this, so 8 00:00:24,200 --> 00:00:25,960 Speaker 3: it might have been in one of when we were 9 00:00:25,960 --> 00:00:29,080 Speaker 3: doing videos years ago. I think it was probably in 10 00:00:29,160 --> 00:00:31,360 Speaker 3: one of those. But I know for a fact the 11 00:00:31,400 --> 00:00:33,640 Speaker 3: only reason I would have known about this is because 12 00:00:33,640 --> 00:00:35,360 Speaker 3: of this job and you. 13 00:00:36,040 --> 00:00:39,440 Speaker 1: That makes me feel good that I wasn't completely unaware 14 00:00:39,479 --> 00:00:41,920 Speaker 1: that we had done an episode on doomsday clocks. 15 00:00:42,040 --> 00:00:45,600 Speaker 3: Yeah, it popped up someplace. But what's the Doomsday Clock? 16 00:00:45,640 --> 00:00:46,120 Speaker 2: Josh? 17 00:00:46,280 --> 00:00:49,639 Speaker 1: So, the Doomsday Clock is a metaphorical clock that is 18 00:00:50,200 --> 00:00:55,880 Speaker 1: operated or overseen by the Bulletin of the Atomic Scientists, which was 19 00:00:55,920 --> 00:00:58,840 Speaker 1: a group of scientists who had worked on the Manhattan Project. 20 00:01:00,080 --> 00:01:02,280 Speaker 1: They got together and they said, we need to create 21 00:01:02,320 --> 00:01:05,000 Speaker 1: a group that is going to basically keep an eye 22 00:01:05,040 --> 00:01:08,720 Speaker 1: on this nuclear proliferation that's starting up. And one of 23 00:01:08,720 --> 00:01:11,479 Speaker 1: the things they did in nineteen forty seven was create 24 00:01:11,560 --> 00:01:18,680 Speaker 1: the Doomsday Clock, and it essentially is this, I guess 25 00:01:18,680 --> 00:01:21,800 Speaker 1: it's a graphic representation of how close humanity is to 26 00:01:21,959 --> 00:01:26,199 Speaker 1: self-inflicted disaster, like a nuclear 27 00:01:25,959 --> 00:01:31,360 Speaker 3: war. Perfectly said, elegantly said. I see. Yeah, so, like 28 00:01:31,400 --> 00:01:33,200 Speaker 3: you said, been around since nineteen forty seven. 29 00:01:34,040 --> 00:01:35,600 Speaker 2: They set the time every year. 30 00:01:35,680 --> 00:01:37,679 Speaker 3: It's sort of a thing where they say, like, all right, 31 00:01:37,720 --> 00:01:40,080 Speaker 3: the time for this year is going to be this. 32 00:01:41,120 --> 00:01:43,280 Speaker 3: Would they move it forward if something really went down 33 00:01:43,319 --> 00:01:43,920 Speaker 3: within a year? 34 00:01:45,120 --> 00:01:47,360 Speaker 1: I think they do it every year, so. 35 00:01:47,480 --> 00:01:50,240 Speaker 3: Like in January they set it, and if like four 36 00:01:50,240 --> 00:01:53,760 Speaker 3: months later, like the s goes down, they wouldn't be 37 00:01:53,840 --> 00:01:55,640 Speaker 3: like, they'd be like, nope, gotta wait till next year. 38 00:01:55,920 --> 00:01:58,640 Speaker 1: Well, I think two things would happen.
Either it would 39 00:01:58,760 --> 00:02:01,600 Speaker 1: be something that they would take into account the next year, yes, 40 00:02:01,880 --> 00:02:04,040 Speaker 1: or the world would end and they wouldn't have anything 41 00:02:04,040 --> 00:02:04,760 Speaker 1: to do anyway. 42 00:02:05,280 --> 00:02:09,920 Speaker 3: Okay. But again, we're talking about it this year because 43 00:02:10,000 --> 00:02:11,359 Speaker 3: there was, and you know, we'll 44 00:02:11,200 --> 00:02:13,200 Speaker 2: talk about a little bit how it's fluctuated over the years. 45 00:02:13,200 --> 00:02:16,000 Speaker 3: But the reason we bring it up is because this year, 46 00:02:16,200 --> 00:02:19,920 Speaker 3: January eighth, twenty twenty five, is when they moved the 47 00:02:19,960 --> 00:02:22,880 Speaker 3: second hand on the clock forward to eighty nine seconds 48 00:02:22,919 --> 00:02:26,080 Speaker 3: to midnight, yeah, which means it's the closest that clock 49 00:02:26,160 --> 00:02:29,040 Speaker 3: has ever been to midnight since they started. 50 00:02:29,600 --> 00:02:32,880 Speaker 1: Yeah, since they started, or when they started in nineteen 51 00:02:32,919 --> 00:02:35,560 Speaker 1: forty seven, when the US and Russia were starting the 52 00:02:35,600 --> 00:02:39,120 Speaker 1: Cold War, creating nukes, testing nukes out in the open, 53 00:02:39,560 --> 00:02:43,760 Speaker 1: underground, in space, it was seven minutes to midnight. We're 54 00:02:43,800 --> 00:02:48,280 Speaker 1: now less than two minutes away from midnight because stuff 55 00:02:48,360 --> 00:02:52,360 Speaker 1: is just so close to hitting the fan. And we 56 00:02:52,360 --> 00:02:56,680 Speaker 1: should say that they've actually moved the clock backward. They've 57 00:02:56,720 --> 00:03:00,320 Speaker 1: moved the second hand backwards, further away from midnight, in the past, 58 00:03:00,840 --> 00:03:03,959 Speaker 1: and the furthest away it was from midnight was nineteen 59 00:03:04,040 --> 00:03:07,640 Speaker 1: ninety one, after the Soviet Union dissolved. Uh huh, 60 00:03:07,919 --> 00:03:10,679 Speaker 1: it was all the way back at seventeen minutes to midnight, 61 00:03:11,280 --> 00:03:11,960 Speaker 1: which is... 62 00:03:12,000 --> 00:03:12,200 Speaker 2: I think. 63 00:03:12,320 --> 00:03:14,040 Speaker 1: I think that's called tiki time. 64 00:03:14,560 --> 00:03:18,000 Speaker 3: Yeah, it's like bust out the rum, everybody, exactly. 65 00:03:18,120 --> 00:03:20,320 Speaker 2: We're all at seventeen minutes. Yeah. 66 00:03:20,360 --> 00:03:24,000 Speaker 3: The closest previous to this time in twenty twenty five was 67 00:03:24,000 --> 00:03:27,760 Speaker 3: in nineteen fifty three. It was two minutes before midnight. 68 00:03:28,440 --> 00:03:30,920 Speaker 3: So we're eighty nine seconds till midnight, and the closest 69 00:03:30,919 --> 00:03:33,840 Speaker 3: previous was two minutes. So that's, you know, it's pretty drastic. 70 00:03:33,880 --> 00:03:36,240 Speaker 3: And again, you know, I guess we can go ahead 71 00:03:36,240 --> 00:03:38,240 Speaker 3: and mention, one of the criticisms of this is that 72 00:03:38,320 --> 00:03:41,840 Speaker 3: it's, it's something that just gins up... some, the critics 73 00:03:41,880 --> 00:03:44,800 Speaker 3: will say, it's something that just gins up paranoia in 74 00:03:44,880 --> 00:03:48,880 Speaker 3: people, and like pushes the panic button, and... 75 00:03:48,680 --> 00:03:51,200 Speaker 2: What is it even doing? But what is it doing?
I 76 00:03:51,240 --> 00:03:51,680 Speaker 2: think it's a 77 00:03:51,720 --> 00:03:54,600 Speaker 3: valuable thing because it just raises awareness every year with people. 78 00:03:55,360 --> 00:03:57,400 Speaker 3: It's just another thing to kind of say, hey, like 79 00:03:57,480 --> 00:04:00,720 Speaker 3: we're not headed in the right direction as humanity goes. 80 00:04:01,360 --> 00:04:05,080 Speaker 1: Yeah. So the first editor of the Bulletin of the Atomic 81 00:04:05,160 --> 00:04:09,360 Speaker 1: Scientists was Eugene Rabinowitch, and Eugene Rabinowitch said that the 82 00:04:09,360 --> 00:04:12,560 Speaker 1: purpose of the Doomsday Clock is to quote frighten men 83 00:04:12,680 --> 00:04:16,760 Speaker 1: into rationality, and to basically say like, hey, you know, 84 00:04:16,960 --> 00:04:19,719 Speaker 1: this is where this stuff's out of control, people. You 85 00:04:19,760 --> 00:04:22,280 Speaker 1: need to be paying attention to these things, because they 86 00:04:22,279 --> 00:04:25,960 Speaker 1: don't just say we're eighty nine seconds from midnight, see 87 00:04:26,000 --> 00:04:29,600 Speaker 1: you next year. They explain what the, what the reasoning 88 00:04:29,720 --> 00:04:32,360 Speaker 1: is for moving, or even not moving, or moving back 89 00:04:32,880 --> 00:04:36,080 Speaker 1: the second hand. And this year, being eighty nine seconds, the 90 00:04:36,120 --> 00:04:38,920 Speaker 1: closest we've ever been, they had a whole crop of 91 00:04:39,120 --> 00:04:43,159 Speaker 1: issues that go well beyond the nuclear risk the 92 00:04:43,200 --> 00:04:46,840 Speaker 1: clock was originally designed to track. And I say 93 00:04:46,880 --> 00:04:48,279 Speaker 1: we take a break and we come back and talk 94 00:04:48,279 --> 00:04:51,400 Speaker 1: about why we're so close to midnight right now according 95 00:04:51,440 --> 00:05:16,960 Speaker 1: to the Bulletin of the Atomic Scientists. 96 00:05:18,240 --> 00:05:19,480 Speaker 2: All right, everyone, we're back. 97 00:05:19,560 --> 00:05:23,640 Speaker 3: We're eighty nine seconds to midnight, not ten minutes to midnight, 98 00:05:23,800 --> 00:05:28,640 Speaker 3: like Charles Bronson was in that great movie. 99 00:05:28,000 --> 00:05:29,600 Speaker 1: What was it called, Ten Minutes to Midnight? 100 00:05:30,000 --> 00:05:32,760 Speaker 3: Yeah, you didn't see that one. No, you should check 101 00:05:32,800 --> 00:05:34,920 Speaker 3: that out. It's got a couple of choice scenes. It's 102 00:05:34,920 --> 00:05:40,160 Speaker 3: about a creepy, uh, serial killer that he's chasing. 103 00:05:40,360 --> 00:05:43,000 Speaker 1: Oh, Bronson's not the creepy serial killer. He's being chased 104 00:05:43,000 --> 00:05:43,880 Speaker 1: by a creepy dude. 105 00:05:44,160 --> 00:05:46,359 Speaker 3: Charles Bronson is always the guy on the hunt for 106 00:05:46,440 --> 00:05:47,000 Speaker 3: the bad guy. 107 00:05:47,640 --> 00:05:50,039 Speaker 1: Have you ever seen Death Wish three, where like the 108 00:05:50,240 --> 00:05:52,120 Speaker 1: group of POSs has taken over the neighborhood? 109 00:05:52,440 --> 00:05:53,800 Speaker 2: Yeah, all the Death Wish movies. 110 00:05:53,839 --> 00:05:55,479 Speaker 3: I mean, the first one was genuinely pretty good, but 111 00:05:55,520 --> 00:05:58,159 Speaker 3: they got really sort of over the top after a while. 112 00:05:58,400 --> 00:05:59,880 Speaker 1: Yeah, it's good though. 113 00:06:00,080 --> 00:06:03,040 Speaker 2: It's good.
You got the Death Wish, pal. This is 114 00:06:03,080 --> 00:06:04,120 Speaker 2: my favorite impression to do. 115 00:06:04,240 --> 00:06:06,640 Speaker 1: But I can't wait till you get your Morgan Freeman down. 116 00:06:07,000 --> 00:06:08,479 Speaker 2: Oh no, I don't think so. 117 00:06:09,440 --> 00:06:12,080 Speaker 3: All right, so how do we get eighty nine seconds 118 00:06:12,120 --> 00:06:16,240 Speaker 3: to midnight? This comes direct from the Bulletin website. Some 119 00:06:16,320 --> 00:06:18,320 Speaker 3: we can kind of summarize, a few of these I'm 120 00:06:18,400 --> 00:06:21,080 Speaker 3: just going to read outright because it's so, like, sort 121 00:06:21,120 --> 00:06:22,400 Speaker 3: of expertly put. 122 00:06:23,080 --> 00:06:24,360 Speaker 2: But the first thing is 123 00:06:24,800 --> 00:06:28,520 Speaker 3: the ongoing war in Ukraine, and not just that, but 124 00:06:28,640 --> 00:06:32,240 Speaker 3: the nuclear risk therein, involved in the third year of 125 00:06:32,240 --> 00:06:37,120 Speaker 3: that conflict, that, you know, hopefully it doesn't go that way. 126 00:06:37,839 --> 00:06:40,599 Speaker 3: Maybe things are wrapping up, but at the peak of 127 00:06:40,600 --> 00:06:43,680 Speaker 3: this thing, like, any weird bad decision could have led 128 00:06:43,720 --> 00:06:45,080 Speaker 3: to something like that happening. 129 00:06:45,560 --> 00:06:49,120 Speaker 1: Yeah. Same with the Middle East right now, that can 130 00:06:49,279 --> 00:06:52,440 Speaker 1: spiral out of control and suck in nuclear powers against 131 00:06:52,480 --> 00:06:56,240 Speaker 1: one another. That's a nuclear risk for sure. And then 132 00:06:56,839 --> 00:07:00,000 Speaker 1: we're back to increasing the size of our nuclear arsenal, 133 00:07:00,160 --> 00:07:01,680 Speaker 1: which is a reverse of what we were doing in 134 00:07:01,720 --> 00:07:05,600 Speaker 1: the eighties and nineties, where we were getting rid of them. 135 00:07:06,080 --> 00:07:08,920 Speaker 1: That's not a good sign. And then one other thing too, 136 00:07:08,960 --> 00:07:11,960 Speaker 1: and this is definitely new: countries that hadn't had nukes 137 00:07:12,000 --> 00:07:14,200 Speaker 1: before were basically like, well, we're never going to have 138 00:07:14,280 --> 00:07:17,240 Speaker 1: nukes because that's just not the way things are. It's 139 00:07:17,360 --> 00:07:20,720 Speaker 1: changed geopolitically, and now countries are starting to think about 140 00:07:20,760 --> 00:07:23,960 Speaker 1: developing their own nuclear programs, where if you have more 141 00:07:24,000 --> 00:07:26,640 Speaker 1: countries with more nukes, you have that much more risk. 142 00:07:27,240 --> 00:07:28,160 Speaker 2: Yeah, for sure. 143 00:07:29,400 --> 00:07:32,680 Speaker 3: Climate change is the next thing they have listed, and 144 00:07:32,760 --> 00:07:34,440 Speaker 3: you know, this one kind of speaks for itself. 145 00:07:34,440 --> 00:07:35,720 Speaker 2: We don't need to beat a dead horse. 146 00:07:35,760 --> 00:07:40,640 Speaker 3: But their take basically is that global greenhouse gas emissions 147 00:07:40,680 --> 00:07:44,720 Speaker 3: are still rising. No one is doing enough to combat this. 148 00:07:44,720 --> 00:07:48,640 Speaker 3: This is bringing on extreme weather and climate change events, 149 00:07:50,200 --> 00:07:53,880 Speaker 3: or climate change-influenced events, and it's affecting people all 150 00:07:53,920 --> 00:07:56,800 Speaker 3: over the world.
And even if we're growing things like 151 00:07:56,880 --> 00:07:59,560 Speaker 3: solar and wind, it's just not fast enough and not 152 00:07:59,640 --> 00:08:01,880 Speaker 3: nearly enough to make a dent in the damage 153 00:08:01,920 --> 00:08:02,720 Speaker 3: that's being done. 154 00:08:03,200 --> 00:08:07,920 Speaker 1: Right. Also, there's the biological arena, as they put 155 00:08:07,760 --> 00:08:10,920 Speaker 2: it. Boy, this one is very scary. 156 00:08:11,240 --> 00:08:12,800 Speaker 1: That's the most mucusy arena. 157 00:08:13,800 --> 00:08:18,560 Speaker 3: Yeah, but obviously coming out of COVID and with avian, 158 00:08:18,880 --> 00:08:22,680 Speaker 3: the avian flu now expanding, you know, to farm animals, 159 00:08:22,680 --> 00:08:27,680 Speaker 3: to dairy products, human cases, all this stuff is very 160 00:08:27,720 --> 00:08:30,000 Speaker 3: scary, and the point of this episode isn't to scare 161 00:08:30,040 --> 00:08:32,720 Speaker 3: the crud out of everybody, but it's hard to read 162 00:08:32,720 --> 00:08:35,360 Speaker 3: the stuff and not get the crud scared out of 163 00:08:35,360 --> 00:08:36,000 Speaker 3: you sometimes. 164 00:08:36,440 --> 00:08:40,280 Speaker 1: Yeah, also, don't leave AI on the sidelines. In their 165 00:08:40,320 --> 00:08:45,240 Speaker 1: disruptive technology part, they were like, like, yes, AI. They 166 00:08:45,240 --> 00:08:48,199 Speaker 1: didn't get into the existential threat that AI itself poses. 167 00:08:48,600 --> 00:08:51,160 Speaker 1: They more looked at it like, hey, some militaries are 168 00:08:51,160 --> 00:08:56,080 Speaker 1: starting to incorporate AI in their, like, battlefield decision making. 169 00:08:56,600 --> 00:08:59,719 Speaker 1: Like we're a step away from AIs deciding whether to 170 00:08:59,800 --> 00:09:03,400 Speaker 1: kill or not kill, and then eventually giving AIs control 171 00:09:03,440 --> 00:09:06,400 Speaker 1: over our nuclear arsenals. That's not a direction we want 172 00:09:06,400 --> 00:09:09,600 Speaker 1: to be going. And then the whole thing, this is 173 00:09:09,640 --> 00:09:12,520 Speaker 1: the reason why all these things that have been around 174 00:09:12,600 --> 00:09:14,880 Speaker 1: for a while or have been developing for a while 175 00:09:15,480 --> 00:09:19,840 Speaker 1: have been accelerated to eighty nine seconds from midnight because 176 00:09:19,920 --> 00:09:25,120 Speaker 1: of the threat multiplier of misinformation and disinformation and conspiracy theories. 177 00:09:25,840 --> 00:09:27,640 Speaker 3: Yeah, and this is the one I wanted to read 178 00:09:28,040 --> 00:09:32,000 Speaker 3: a part or two from this, because it just kind 179 00:09:32,000 --> 00:09:34,640 Speaker 3: of speaks volumes. They really put it very succinctly. 180 00:09:35,559 --> 00:09:39,080 Speaker 3: Spread of misinformation, disinformation, and conspiracy theories that degrade the 181 00:09:39,080 --> 00:09:43,400 Speaker 3: communication ecosystem increasingly blur the lines between truth and falsehood. 182 00:09:43,920 --> 00:09:46,240 Speaker 3: And then they talk about AI making it even, you know, 183 00:09:46,240 --> 00:09:48,240 Speaker 3: we've talked about deepfake video and stuff like that, 184 00:09:48,360 --> 00:09:52,080 Speaker 3: like making all that stuff just so much easier. And 185 00:09:52,120 --> 00:09:55,200 Speaker 3: then this final line is really really good.
The battered 186 00:09:55,200 --> 00:09:58,679 Speaker 3: information landscape is also producing leaders who discount science and 187 00:09:58,760 --> 00:10:02,280 Speaker 3: endeavor to suppress free speech and human rights, compromising 188 00:10:02,320 --> 00:10:06,199 Speaker 3: the fact-based public discussions that are required to combat 189 00:10:06,240 --> 00:10:10,160 Speaker 3: the enormous threats facing the world. So like, all of 190 00:10:10,160 --> 00:10:14,000 Speaker 3: the problems that we've been listing are bad enough, and 191 00:10:14,040 --> 00:10:17,840 Speaker 3: then when you've got disinformation and conspiracy theories and misinformation 192 00:10:17,920 --> 00:10:21,199 Speaker 3: thrown on top of that, and AI exacerbating all that, 193 00:10:21,200 --> 00:10:24,280 Speaker 3: that's when it's like they're moving that clock as close 194 00:10:24,320 --> 00:10:25,600 Speaker 3: to midnight as they've ever been. 195 00:10:25,960 --> 00:10:29,640 Speaker 1: Yeah. And the reason why is because people would be, 196 00:10:29,760 --> 00:10:32,920 Speaker 1: under that circumstance... they're being led away from paying attention 197 00:10:33,679 --> 00:10:37,240 Speaker 1: to the stuff the Doomsday Clock is warning against. And 198 00:10:37,280 --> 00:10:39,640 Speaker 1: that just makes it that much riskier too, because we 199 00:10:39,720 --> 00:10:42,000 Speaker 1: have to be paying attention to it, whether you like 200 00:10:42,080 --> 00:10:45,800 Speaker 1: it or not. For some reason, when I was researching 201 00:10:45,800 --> 00:10:48,120 Speaker 1: this today, I was like, this is striking me as 202 00:10:48,120 --> 00:10:53,160 Speaker 1: a little ridiculous, and like, I get the point of it, 203 00:10:53,200 --> 00:10:56,160 Speaker 1: and I think it is noble and worthy, but there's 204 00:10:56,200 --> 00:11:00,880 Speaker 1: also some like real, I don't know, real criticisms of it. 205 00:11:00,920 --> 00:11:04,040 Speaker 1: And I found one piece by a guy named Stephen 206 00:11:04,120 --> 00:11:09,880 Speaker 1: Johnson on Lifehacker, and he interviewed Lawrence Krauss, who's 207 00:11:09,920 --> 00:11:13,679 Speaker 1: a physicist and a member of the Bulletin of the Atomic Scientists. Sorry, 208 00:11:13,840 --> 00:11:17,040 Speaker 1: the New Republic interviewed Krauss, and he said, it's not scientific. 209 00:11:17,080 --> 00:11:19,040 Speaker 1: It's a number that's arrived at by a group of 210 00:11:19,080 --> 00:11:22,640 Speaker 1: people exploring each of the questions and having a huge 211 00:11:22,679 --> 00:11:26,280 Speaker 1: amount of discussion and ultimately convergence on a number. That 212 00:11:26,400 --> 00:11:31,240 Speaker 1: number is frankly arbitrary. And that's true. You have to 213 00:11:31,240 --> 00:11:34,079 Speaker 1: remember it's a metaphor. There's no way to measure it. Okay, 214 00:11:34,160 --> 00:11:36,400 Speaker 1: we're eighty nine seconds from midnight right now? How much 215 00:11:36,440 --> 00:11:39,760 Speaker 1: longer is the world going to last? And the big 216 00:11:39,800 --> 00:11:44,160 Speaker 1: problem with it, I think, is as National Geographic put it: 217 00:11:44,200 --> 00:11:48,400 Speaker 1: if everything's a crisis, nothing's a crisis. So before, the 218 00:11:48,440 --> 00:11:53,640 Speaker 1: whole thing was created to say this one thing: nuclear proliferation, 219 00:11:53,800 --> 00:11:56,880 Speaker 1: this is what we're warning about.
Now you've got climate change, 220 00:11:56,960 --> 00:12:03,080 Speaker 1: AI, avian flu, information. It's just like being piled on, 221 00:12:03,280 --> 00:12:07,160 Speaker 1: and I think it's really diluted the point and the 222 00:12:07,200 --> 00:12:08,600 Speaker 1: pointedness of the whole thing. 223 00:12:09,840 --> 00:12:12,040 Speaker 3: Yeah, maybe, but that's also the world we're living in 224 00:12:12,120 --> 00:12:12,559 Speaker 3: right now. 225 00:12:13,440 --> 00:12:16,000 Speaker 1: Yeah, but it makes it so easy to just be like, oh, well, 226 00:12:16,200 --> 00:12:18,640 Speaker 1: I give up, I'm going to, I guess, pay attention 227 00:12:18,800 --> 00:12:22,600 Speaker 1: to, I don't know, flowers versus zombies. Do people play 228 00:12:22,679 --> 00:12:25,920 Speaker 1: that still? I know that was a thing. I think 229 00:12:25,960 --> 00:12:28,280 Speaker 1: it was at some point, unless I had a fever dream. 230 00:12:28,760 --> 00:12:31,400 Speaker 3: Uh, well, you didn't have a fever dream. One thing, 231 00:12:31,880 --> 00:12:35,120 Speaker 3: whether you agree with the Doomsday Clock or not, 232 00:12:35,160 --> 00:12:37,679 Speaker 3: one thing we can, I can recommend, because you're too 233 00:12:37,720 --> 00:12:41,319 Speaker 3: humble to. It's a little limited podcast series called The 234 00:12:41,400 --> 00:12:43,440 Speaker 3: End of the World with Josh Clark. That way you 235 00:12:43,480 --> 00:12:46,360 Speaker 3: can really learn something and take a deep dive into 236 00:12:46,720 --> 00:12:49,120 Speaker 3: real existential threats that face humanity. 237 00:12:49,760 --> 00:12:51,280 Speaker 1: Thanks, Chuck, I appreciate that. 238 00:12:51,960 --> 00:12:54,679 Speaker 3: Holds up still, great, I imagine. I haven't gone back 239 00:12:54,720 --> 00:12:55,400 Speaker 3: and listened to it again. 240 00:12:55,480 --> 00:12:56,760 Speaker 2: I bet it still holds up though. 241 00:12:57,080 --> 00:12:59,719 Speaker 1: Yeah. Well, because one of the number one rules in 242 00:12:59,720 --> 00:13:02,920 Speaker 1: show business is leave them wanting more, I say Short Stuff 243 00:13:03,000 --> 00:13:06,480 Speaker 1: is out. 244 00:13:07,040 --> 00:13:09,920 Speaker 2: Stuff You Should Know is a production of iHeartRadio. For 245 00:13:10,000 --> 00:13:14,199 Speaker 2: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 246 00:13:14,320 --> 00:13:16,160 Speaker 2: or wherever you listen to your favorite shows.