Speaker 1: Hello, Maine and greater New England. We're coming to see you guys in Portland and we can't wait. We would love to see you there. Yep, we'll be at the State Theater in August. If you're interested, you can get tickets and information at SYSKlive.com. Throw some lobster at us.

Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks.

Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant, there's Jerry over there, and this is Stuff You Should Know, the podcast. And you're about to say "the Blank Edition." Yeah, I was, but I couldn't think of anything. It was literally the Blank Edition. Was it? I mean, you couldn't think of anything? You were blank. Oh no, that's right, it was the Blank Edition. Oh gosh, it's a terrible start, Chuck. So how about this, just to divert ourselves from that disaster: what was not a disaster were the live shows we just did. Oh yeah, we finally got up on stage, everyone, for the first time since January, kicked the rust off in Chicago and Toronto, and both of them, we just killed.
They were great. Yeah, everybody was great. Everyone had a really great time. Yeah, they told us so. They seemed to be legitimately meaning what they were saying. Yeah, it was really, really great to get back on stage with you, my friend. And also, hats off to Chicago for showing up. They showed up. Like, we called you guys out and you responded, thank you very much. And thank you, Toronto, for not making us call you out. But there are still tickets remaining for August twenty-ninth in Boston at the Wilbur, and in Portland, Maine. You know, we're venturing up into the hinterlands of America next after that, but on August thirtieth there are still plenty of great tickets left there. And then the same can be said for October, in Orlando, October ninth. I think I said that right: October ninth in Orlando, October tenth in New Orleans. Yep, that's right. Brooklyn, I'm not worried about. That's already all sold out, the whole thing, all three nights. Man, should we have added a fourth? Jeez, I don't know. We'll talk about it. Anyway, thanks to everyone who came out.
It was a lot of fun, and this is a good one, so you don't want to miss it. Yeah, so come on out, especially Portland, Maine. Let's get with it.

All right, now: nuclear semiotics, which I didn't know I loved, but I do. Really? Do you remember, 99% Invisible did a very famous episode on this very topic. I didn't hear that. I specifically avoided going back and listening to it, because I don't want to be stunk upon by its taint. Does that make sense? You don't want Roman Mars's taint stinking on you. It's more like, it's just such a classic episode that I don't want it to, like, leak in. I don't want to actually rip it off. Yeah, well, we certainly can't out-Invisible this thing, because that is a show that exists at the top echelon of this industry. Sure, so do we. Sure, we're up there, all right. But if you like this one, if this stuff, like, floats your boat and you're like, I want to know more, go listen to the 99% Invisible episode.
Yeah, this thing really triggered a lot of, like, synapses firing for me, and I think I really enjoy this kind of thought-experiment, problem-solving stuff. I think I would really dig, like, that part of the zombie apocalypse: figuring the stuff out as a team. Because the whole time I was reading this, I was saying, great idea, terrible idea, they should do this, they shouldn't do that, go sit down. Yeah. I like the cut of your jib. It was really cool. I dug this. I'd never heard of it, so thank you. Oh, you're very welcome. I actually heard of it before Roman Mars made the episode, so I can't really thank him. But not before he heard of it, because I think it's well known that Roman's first words were "nuclear semiotics." That's true. Yeah, even before "Mama." I could totally believe that, actually.

So what we're talking about, as Chuck said a couple of times, for those of you who don't know, is nuclear semiotics.
And that is a very specialized, interdisciplinary branch of, I guess, science, though basically any field of research that you can throw at the wall would probably have some function to play in the field of nuclear semiotics. And to make a long story short, to do the too-long-didn't-read version of this, the TL;DR: nuclear semiotics seeks to figure out how to warn the future humans to come, or whatever is here. Sure, good point. I mean, why discriminate, right? To warn the future humans, or the future super-intelligent jellyfish, whatever, to come: hey, this is a very dangerous radioactive dump site that we've put here. Stay away. Yeah, it's that easy. It sounds easy. The problem is, if you presume that it's easy, you're making a lot of assumptions that aren't necessarily going to hold up. Oh yeah. Like, a lot of times I was like, they should just do this, and I would even stop halfway through my thought, like, no, that wouldn't work. It's true, because our languages might be gone by then.
Our symbols don't necessarily make sense outside of the context that we understand them in. Civilization might be ridiculously advanced by then. Civilization might be in a state of collapse by then. We have no idea. But the point of nuclear semiotics is to figure out how to come up with a message that is understandable to everybody, in any situation, in the future. And the current state of the art is: let's figure out how to speak as far as ten thousand years into the future. Yeah, I mean, and that's, like, being generous. It needs to go beyond that. It does, because the whole point of nuclear semiotics, the whole point of warning the future, is that this stuff, this nuclear waste that we're putting into the ground now, is going to be dangerous for tens and tens of thousands of years. Plutonium-239 has a half-life of twenty-four thousand years. There's something called technetium-99 that has a half-life of two hundred and eleven thousand years. And another one has, like, a one-point-seven-million-year half-life.
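For anyone who wants the arithmetic behind those half-life numbers: the fraction of a radioactive sample remaining after time t is 0.5 raised to (t divided by the half-life). A minimal sketch of that calculation (the function name here is ours, not anything from the episode):

```python
def fraction_remaining(t_years: float, half_life_years: float) -> float:
    """Fraction of a radioactive sample left after t_years,
    given its half-life: N(t)/N0 = 0.5 ** (t / half_life)."""
    return 0.5 ** (t_years / half_life_years)

# Plutonium-239, half-life ~24,000 years: after the ten thousand
# years the warnings are designed for, roughly three quarters of
# the plutonium is still there.
print(round(fraction_remaining(10_000, 24_000), 2))  # -> 0.75
```

Which is why ten thousand years of warning is, as they say, being generous.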
This is the nuclear waste that we're creating now and are putting in the ground. Yeah. And Julia Layton, who is one of our writers who does great work for us, made a lot of great points, like: the history of human evolution is two hundred thousand years, and we've only been reading and writing for, what, about five thousand, less than six thousand years. Yeah. So, like you said, it sounds simple. And so many times I thought I had it cracked, only to think otherwise. Like, I was like, why don't they just do something purely visual, and stage a play of people at that site digging in and then dying? And it's like, what do you do with it? Well, I'll just put it on a DVD that plays on a loop. It's like, how are you gonna power that thing? You know what happens when everybody's converted to Blu-ray? Yeah, exactly. Or, you know, well then, solar. Put a solar panel up. Those last forever. But what if there's, like, a forever nuclear storm or whatever?
What if the sun never shines again on Earth in eight thousand years? That could happen. That's the cool thing about thinking into the deep future: all the things that will go wrong. Yeah, it makes you realize, like, how specific everything you think and know and understand really is to your current time. Yeah, it's very cool. She brings up the point about an apple. Like, when you see the word "apple," you don't see the word "apple"; you visualize the thing the symbol stands for, an apple. So it's almost like the words, very much of the words, will just not have meaning anymore at some point. Man. Well, let's dig into this stuff. You ready? Let's do it. So to start, we should talk about where this all came from.
It came from a new type of nuclear storage solution, a nuclear waste storage solution, called long-term geological repositories. And it is basically digging into the earth, a couple of miles into the earth, putting our nuclear waste there, again, waste that's going to be harmful to health for tens of thousands, hundreds of thousands of years, sealing it up, then covering over the site, and then putting a warning on there. And right now the general consensus is that salt beds are the best place to put that nuclear waste. And there's actually some pretty good reasons why. Yeah, we could do an episode on nuclear storage. I think I really want to, in and of itself. Yeah, I don't know if that's a shorty or a long. It's probably a long. But just briefly, the reason salt beds are preferable is because the fact that they're even there suggests that there's no water. If there were water, they would have dissolved long ago. And it's relatively easy to mine into them.
And then what's awesome about salt is that when you mine a shaft into a salt bed and you put your deposit there, then you pull back out, the salt bed actually heals itself over, like, just a few decades. It seals itself back up. Right. Yes. So you put in a container that's been engineered to hold the nuclear waste inside for ten thousand years. Yeah, it's also in a container, you should point that out. Right. You're putting it into a borehole in the salt. The salt is going to grow back around it and entomb it, perhaps permanently, in this salt. It's very strong too, right? Yeah, it is fairly strong. I mean, if you're mining using modern mining equipment, it's really easy to mine into. But if you have, like, a pickaxe or something, it is rock to you. Salt rock is what it's called. Right. So there's a lot of reasons why people have figured out, like, this is not a bad idea, to entomb nuclear waste. But here's the thing. We can't just entomb it and walk away. Like, we have a responsibility, those of us generating this waste today, to warn the future.
And it's on the future if they listen to us or not. That's on them. Right. But we have to make them able to listen to us. Exactly. Like, we have a responsibility to do that. Because some people have proposed, like, hey, let's just bury it and forget about it. The chances of somebody actually finding it are pretty slim. Just bury it and forget about it, and that's probably the best way to go. And people said it's not a bad idea, but it's actually a pretty bad idea. See, I actually thought that one wasn't the worst idea. It's not. That was the behavioral psychologist. And he wasn't like, just forget about it. He was like, maybe the smartest thing to do is to leave it unmarked, right? Because, as we'll see, attracting attention to something, like, exactly, attracts attention. It's an interesting thought experiment, right? That psychologist, by the way, is Dr. Percy Tannenbaum. Really? No wonder I like that. Of the East Hampton Tannenbaums.
So we should point out that there's a couple of big times that this has been commissioned, like, hey, we need to think of something: one for a site that never happened, and one for a site that has happened. The one that has happened is the only one in the United States right now, the only one in the world, as far as they know. Now it's number three. It's the third largest. I didn't see what the other two were. It must have been the first in the world, then. Yeah, probably the first in the world. Yeah, which makes sense, because the other two are bigger. But this is in New Mexico. It's called WIPP, the Waste Isolation Pilot Plant. And this one, they're actively guarding. They've committed, the Department of Energy has committed, to guarding it with people for a hundred years. They have hired Barney Fife to look over this nuclear waste for at least a hundred years. It's not like at the end of the hundred years they're gonna just, like, put a padlock on it and walk away.
I imagine they will keep guarding it as long as they feel like it needs guarding. I don't know if that's something... I don't know, man. I mean, we're talking about a government-run program here. At least a hundred years, we can at least say that. Yes, they agreed to that. So, you know, the whole idea arose before that, though. What was the other one, in Nevada? That's the Yucca Mountain one. That was the first one, right? That's the first one, the one that never happened. But that's when, you know, in the seventies, is when this idea sort of came about. And I think it was two when it was sort of codified as an official, I guess, science. Or, yeah, it is. It's a branch. It's an interdisciplinary branch of science, nuclear semiotics is. And it's because the EPA came up with a rule, a law really. That's... eighty-one. I got that wrong, by the way. So it's eighty-one that they came up with the law. Well, it became a discipline in nineteen eighty-one with that Yucca Mountain Repository project.
And I think from that Yucca Mountain Repository project, because we were starting to figure out how to deposit the stuff for a long time, the EPA came up with a rule that said: if you're going to create these kinds of repositories for nuclear waste, you also have to figure out how to come up with a permanent warning sign. And everybody was like, that's no problem, of course. And then the EPA said, think about it. It's harder than you think. They said, just slap on that nuclear waste logo that everyone knows. And everyone was like, everyone doesn't know that. It's been around forever. Everyone doesn't know that now, right? Much less in two hundred thousand years. Yeah. Did you see how that was created? Yeah, it was a group doodle. I don't know how that happens. I think that means they can't ascribe it to one person. They know there were, like, five people on one of those giant, like, Silver Spoons pencils. Yeah. This was in nineteen forty-six, was it, at Berkeley? And it was a group doodle in the science class.
And is that a band name, Group Doodle? The Wiggles or something? Yeah, I think it's an album title, for sure. So, the Wiggles' Group Doodle. Absolutely. That's probably a real thing. That's our gift to you, Wiggles. But I saw, this was interesting, the symbol came under consideration for wider use, because at first it was just a group doodle, and then the Brookhaven National Laboratory requested a standardized symbol and standardized colors for their radiation safety program. And there was more argument about the colors than the actual symbol, because at first they were like, you can't use yellow, because we use yellow for a lot of stuff. Yeah, they wanted to make sure that it didn't get overused, so people wouldn't just become kind of blind to it because they saw it so much. And they were like, have you heard of Stryper? Yellow and black. They're like, no, I haven't heard of them. Give us forty years, you'll have heard of them, believe me. And then in forty-two years, no one will have heard of them. So I think the original design was... I saw them in concert. You won't even believe it.
Magenta blades on a blue background was the original design, and it was chosen because it was uncommon. But then in Oak Ridge, Tennessee, at the Oak Ridge National Laboratory, they went with the yellow background later on, and I guess it stuck. That's where the Oak Ridge Boys were all scientists. That's right. So it was originally magenta on blue. Right. Yes. And the logo we're talking about, for those of you who don't know, is called the nuclear trefoil. You know, it's a circle and then three blades around it. And from what I saw, one of the original group doodlers explained it as: it's supposed to be an atom with activity around it. Yeah, that's it. Which, I'd never seen it before, but now that I've read that, I can't unsee it. And that is really what it looks like. It's a pretty great little doodle. But it's like you said: that is not a universally accepted symbol, which is a big problem. And it doesn't evoke, like, oh, an atom. Of course I know what an atom looks like.
I just saw one go down the street a second ago, and this looks like an atom. It's a symbolic representation of an atom, which means that after people stop thinking about what atoms look like, maybe a thousand years or five thousand years down the road, if something happens, no one's going to look at that and be like, oh, it's an atom, activity around an atom, that must mean there's radiation here, hence this is a danger sign. That's not going to happen. The other thing you would think is: just put it up in a bunch of languages. Done. Yeah, here's the thing. Languages are disappearing. I'm gonna ask you, actually, what is your best guess: a language dies out every blank. Nine million seconds? Is that right? Did I nail it? Jerk. I gotta get out a calculator. A language dies out every fourteen days. I'm pretty sure that's not nine million seconds. Isn't that staggering? God. What if it was? Are you about to do that? Yeah. You keep talking. So that's about twenty-five languages per year that die out. And that's really sad. It is. It is very sad.
And granted, these aren't, you know, major languages, but they're important to the people who speak them. But that's just sort of to get across the point that throwing it up in a bunch of languages, there's no guarantee. And in fact, in all likelihood, in fifty thousand years there won't be English or German or French. There may not even be humans. What's the calculation? I was off a little. A lot, okay. There... we may all be, like, post-biological humans, you know, having uploaded our consciousness onto, like, the internet or something, at which point it really won't matter, to tell you the truth, where the nuclear waste is buried. But who knows? It could be an intelligent species; it could be humans who don't know how to read or write. The fact is, the stuff that we take for granted changes a lot faster than you think. And even if it doesn't necessarily die out, the changes that come along are pretty alarming. I found a... I've been watching a lot of Silicon Valley lately.
I told you, my, um, vocal 338 00:18:24,520 --> 00:18:29,120 Speaker 1: delivery sounds a lot like Jared's. It's occurred to me 339 00:18:29,359 --> 00:18:34,200 Speaker 1: a lot lately, and I never really put those two together. 340 00:18:34,200 --> 00:18:36,120 Speaker 1: Keep an ear out for it now and see 341 00:18:36,119 --> 00:18:39,560 Speaker 1: what you think. I mean, tell me I'm wrong. I mean, 342 00:18:39,600 --> 00:18:42,359 Speaker 1: I would have to dissociate so much, because I like 343 00:18:42,440 --> 00:18:46,600 Speaker 1: you, and Jared is like such a pedantic bureaucrat. 344 00:18:47,040 --> 00:18:49,960 Speaker 1: I mean, he's fun to watch, but I wouldn't say 345 00:18:50,000 --> 00:18:52,960 Speaker 1: that he's like the most likable character. Maybe he is, 346 00:18:53,000 --> 00:18:55,600 Speaker 1: I don't know. I would say pedantic bureaucrat is not 347 00:18:56,520 --> 00:19:00,840 Speaker 1: entirely off for me. Jared needs a girlfriend, that's just... okay. 348 00:19:01,160 --> 00:19:03,359 Speaker 1: So I do not, because I have a fine wife. 349 00:19:04,200 --> 00:19:05,920 Speaker 1: Uh, so let me give you an example of how 350 00:19:05,960 --> 00:19:08,800 Speaker 1: English has changed. This is a quote from Sir Gawain 351 00:19:08,880 --> 00:19:13,840 Speaker 1: and the Green Knight. It was written in thirteen fifty, six hundred and fifty 352 00:19:13,920 --> 00:19:18,080 Speaker 1: years ago. This is in English. The steel of a 353 00:19:18,160 --> 00:19:22,040 Speaker 1: stiff staff the sterne hit gripped, that was wounden with 354 00:19:22,119 --> 00:19:25,400 Speaker 1: iron into the wand's end, and all be-graven with 355 00:19:25,480 --> 00:19:29,120 Speaker 1: green in gracious works. And you should see it spelled. 356 00:19:29,200 --> 00:19:30,960 Speaker 1: Oh yeah, I mean, I was an English major. We 357 00:19:30,960 --> 00:19:32,520 Speaker 1: had to go through this stuff. It was a slog.
358 00:19:32,720 --> 00:19:34,960 Speaker 1: Do you have a guess at what I just said? Yeah, 359 00:19:35,000 --> 00:19:37,840 Speaker 1: you said that, uh, the Green Knight sat down 360 00:19:37,840 --> 00:19:41,199 Speaker 1: and watched some Silicon Valley. That's right. It's that the 361 00:19:41,280 --> 00:19:44,520 Speaker 1: grim man gripped it by its strong handle, which was 362 00:19:44,560 --> 00:19:47,080 Speaker 1: wound with iron all the way to the end, engraven 363 00:19:47,080 --> 00:19:51,320 Speaker 1: in green with graceful designs. So like, that's English six hundred and fifty 364 00:19:51,359 --> 00:19:54,160 Speaker 1: years ago. English is still around after six hundred and fifty years. 365 00:19:54,160 --> 00:19:57,600 Speaker 1: We're talking about thousands, tens of thousands of years exactly. 366 00:19:57,680 --> 00:20:01,000 Speaker 1: So that's a problem. Languages evolve. Languages die. Symbols 367 00:20:01,040 --> 00:20:03,600 Speaker 1: don't quite make sense out of context. So there's a 368 00:20:03,600 --> 00:20:06,920 Speaker 1: lot of challenges that face the people who try 369 00:20:06,960 --> 00:20:10,600 Speaker 1: to explain this stuff, or figure out how to, um, 370 00:20:10,640 --> 00:20:13,040 Speaker 1: convey it to future people, I think is a better 371 00:20:13,080 --> 00:20:15,760 Speaker 1: way to put it. That's right. Uh, they have looked 372 00:20:15,840 --> 00:20:20,000 Speaker 1: to semioticians, people who really want in on 373 00:20:20,040 --> 00:20:23,520 Speaker 1: this stuff. Um. I think I'm an amateur semiotician after 374 00:20:23,560 --> 00:20:26,520 Speaker 1: reading this.
But one thing that they're looking for, because 375 00:20:26,560 --> 00:20:29,760 Speaker 1: what you want, ideally, is instant recognition, and not 376 00:20:29,920 --> 00:20:32,200 Speaker 1: something... I mean, yeah, maybe, if you have to figure 377 00:20:32,240 --> 00:20:34,840 Speaker 1: it out, but what you want is something that conveys 378 00:20:34,960 --> 00:20:37,679 Speaker 1: danger right when you look at it. Like, just steer 379 00:20:37,760 --> 00:20:41,240 Speaker 1: clear of this place, not come closer and start poking around. 380 00:20:41,359 --> 00:20:44,439 Speaker 1: Just go away. That's right. So, um, she makes a 381 00:20:44,440 --> 00:20:47,640 Speaker 1: great point, though, that like it's a double edged sword, 382 00:20:47,640 --> 00:20:50,840 Speaker 1: like you were talking about earlier. If you... you know, human beings... 383 00:20:51,160 --> 00:20:53,520 Speaker 1: if you show an extreme skier a sign, this 384 00:20:53,600 --> 00:20:56,360 Speaker 1: is danger, don't ski this way, he's gonna say, bra, 385 00:20:56,960 --> 00:20:59,919 Speaker 1: let's do it. Yeah, you know, give me some Hamma 386 00:21:00,040 --> 00:21:03,200 Speaker 1: side power drink. So there's a very, very fine line 387 00:21:04,000 --> 00:21:09,320 Speaker 1: between warning people and enticing people. Yeah, even inadvertently. Exactly, 388 00:21:09,400 --> 00:21:11,560 Speaker 1: you know. I mean, there's, um... she points out 389 00:21:11,560 --> 00:21:13,560 Speaker 1: haunted houses, because I'm like, yeah, not everybody's like a 390 00:21:13,560 --> 00:21:16,879 Speaker 1: Red Bull extreme sports person, but people do like 391 00:21:17,000 --> 00:21:20,359 Speaker 1: haunted houses too. So that abandoned, like, scary place is 392 00:21:20,400 --> 00:21:23,480 Speaker 1: so creepy, let's go there for Halloween. Because maybe Halloween 393 00:21:23,520 --> 00:21:27,240 Speaker 1: survived but the English language didn't. Who knows.
Um, so yeah, 394 00:21:27,280 --> 00:21:29,919 Speaker 1: you really walk a fine line here between 395 00:21:30,040 --> 00:21:34,160 Speaker 1: warning people away and saying, I dare you. Right? Yeah, 396 00:21:34,200 --> 00:21:38,480 Speaker 1: my whole jam is, I think they need to, uh... 397 00:21:38,840 --> 00:21:41,680 Speaker 1: the... what will survive, if there are humans at all, 398 00:21:41,880 --> 00:21:44,639 Speaker 1: is emotion. So I think they need to appeal to 399 00:21:44,720 --> 00:21:50,880 Speaker 1: human emotions like fear, ah, more than words and symbols. Okay, well, 400 00:21:50,960 --> 00:21:52,560 Speaker 1: let's take a break and we'll get back into this, 401 00:21:52,560 --> 00:22:21,040 Speaker 1: all right, because this is fun. Yes. Sorry. All right, Chuck. 402 00:22:21,080 --> 00:22:23,800 Speaker 1: So we've kind of talked about how things go away, 403 00:22:23,920 --> 00:22:28,840 Speaker 1: languages fall away, symbols don't make sense. Anyway, it is... 404 00:22:29,480 --> 00:22:34,000 Speaker 1: it really is, right. Um, so what, what will last? 405 00:22:34,080 --> 00:22:37,720 Speaker 1: What have nuclear semioticians come up with? And should we explain 406 00:22:37,760 --> 00:22:42,480 Speaker 1: what semiotics is in general? What is it? I don't even know. Oh, 407 00:22:42,640 --> 00:22:45,680 Speaker 1: just kind of in shorthand, semiotics is basically the study 408 00:22:45,760 --> 00:22:51,000 Speaker 1: of how and why, um, signs have meanings, right? Like 409 00:22:51,040 --> 00:22:54,600 Speaker 1: you were saying earlier, how the word apple doesn't evoke 410 00:22:54,720 --> 00:22:58,760 Speaker 1: thoughts of the word apple. It evokes thoughts of the round, shiny, 411 00:22:58,840 --> 00:23:01,560 Speaker 1: tasty fruit that grows on a tree. That's a sign 412 00:23:02,119 --> 00:23:06,280 Speaker 1: in semiotics. That's right, specifically a discursive sign, because it 413 00:23:06,359 --> 00:23:10,200 Speaker 1: uses language.
So what they've done, um, in many 414 00:23:10,720 --> 00:23:14,080 Speaker 1: cases, and this is a great idea for stuff 415 00:23:14,119 --> 00:23:17,280 Speaker 1: like this, is to have a competition. Um. They had 416 00:23:17,280 --> 00:23:19,439 Speaker 1: one at UCLA, I think in two 417 00:23:19,480 --> 00:23:24,080 Speaker 1: thousand one, called the Desert Space Competition. Um. 418 00:23:24,119 --> 00:23:28,639 Speaker 1: And what won that year was a cactus, a yucca 419 00:23:29,160 --> 00:23:33,879 Speaker 1: cactus, glowing blue. And the idea was, plant a 420 00:23:33,960 --> 00:23:38,560 Speaker 1: field of these regular green cacti, and then over the 421 00:23:38,960 --> 00:23:42,040 Speaker 1: place where, you know, the waste is, the repository. And 422 00:23:42,080 --> 00:23:44,160 Speaker 1: then if you see the sign of a glowing blue one... 423 00:23:45,720 --> 00:23:47,640 Speaker 1: I mean, I didn't see the rest 424 00:23:47,680 --> 00:23:49,800 Speaker 1: of them, but I didn't think this one was that great. 425 00:23:50,640 --> 00:23:52,359 Speaker 1: I'm sorry to the person who came up with it. 426 00:23:52,440 --> 00:23:55,119 Speaker 1: I know. I think that something they should do 427 00:23:55,240 --> 00:23:58,960 Speaker 1: is go even further back, to younger children, because sometimes... 428 00:23:58,960 --> 00:24:00,840 Speaker 1: like, go to like an elementary school and ask 429 00:24:00,920 --> 00:24:03,520 Speaker 1: kids, or a high school, right. Or you just take 430 00:24:03,720 --> 00:24:06,000 Speaker 1: each kid out and rub their face in the sand 431 00:24:06,040 --> 00:24:08,159 Speaker 1: and be like, you see this? You stay out of here. No, 432 00:24:08,320 --> 00:24:11,240 Speaker 1: I mean, have the kids like throw out ideas, because 433 00:24:11,280 --> 00:24:14,360 Speaker 1: I think... oh yeah, I think, like, my idea...
434 00:24:15,040 --> 00:24:17,680 Speaker 1: I think a lot of times children can cut through 435 00:24:17,720 --> 00:24:20,840 Speaker 1: to the simplicity of something much better than adults 436 00:24:20,840 --> 00:24:23,719 Speaker 1: can. So that's my idea. Throw it out as 437 00:24:23,760 --> 00:24:26,080 Speaker 1: a science fair project. Well, I think that's 438 00:24:26,080 --> 00:24:28,440 Speaker 1: one of the cool things about nuclear semiotics is it's 439 00:24:28,480 --> 00:24:32,119 Speaker 1: so inviting, like anybody can come up with a 440 00:24:32,200 --> 00:24:36,720 Speaker 1: great idea. It's just so confounding, but it's also so accessible. Yeah, 441 00:24:36,760 --> 00:24:39,480 Speaker 1: we'll get ideas. In fact, we want to hear from you. 442 00:24:39,800 --> 00:24:41,320 Speaker 1: If you think you have a cool idea, a 443 00:24:41,320 --> 00:24:43,600 Speaker 1: good idea... like, I guarantee you we're gonna get some 444 00:24:43,640 --> 00:24:46,040 Speaker 1: good ones. We're not gonna pass them along or anything. 445 00:24:46,320 --> 00:24:49,919 Speaker 1: So rather than just, like, pooh-poohing the glowing yucca one, 446 00:24:50,240 --> 00:24:53,479 Speaker 1: here's the problem with the glowing yucca idea. 447 00:24:54,040 --> 00:24:59,639 Speaker 1: It requires explanation, right? Somebody... so part of the glowing 448 00:24:59,720 --> 00:25:02,880 Speaker 1: yucca idea is to say, these things have been genetically engineered 449 00:25:02,920 --> 00:25:06,040 Speaker 1: so that when there's radiation present, they glow. So if 450 00:25:06,080 --> 00:25:08,880 Speaker 1: you see this yucca glowing, it means that there's radiation here. 451 00:25:09,000 --> 00:25:13,600 Speaker 1: Stay away. If you lose that additional story that 452 00:25:13,640 --> 00:25:16,040 Speaker 1: has to go along with the glowing yucca, then you 453 00:25:16,080 --> 00:25:18,520 Speaker 1: just have a glowing yucca.
And I can't think of a 454 00:25:18,560 --> 00:25:20,760 Speaker 1: more attractive thing that's going to draw people to a 455 00:25:20,840 --> 00:25:24,440 Speaker 1: site than the legendary glowing yucca that only glows in 456 00:25:24,520 --> 00:25:27,880 Speaker 1: this one spot on Earth. That's kind of the problem 457 00:25:27,920 --> 00:25:30,080 Speaker 1: with it, you know. I like this other idea from 458 00:25:30,080 --> 00:25:32,440 Speaker 1: that same year a little better, that did not win: 459 00:25:33,320 --> 00:25:37,879 Speaker 1: Fields of Asphodel, which is a Eurasian lily. They said, 460 00:25:38,040 --> 00:25:41,400 Speaker 1: let's just cover the site with metal blades that screech 461 00:25:41,560 --> 00:25:44,120 Speaker 1: when the wind blows. It makes a horrible noise, right? 462 00:25:44,320 --> 00:25:48,440 Speaker 1: Not bad. Here's the problem with that: moving parts. Okay. 463 00:25:48,880 --> 00:25:51,439 Speaker 1: It's been pretty well established that if you're trying to 464 00:25:51,520 --> 00:25:55,479 Speaker 1: convey something to people in the distant future, you 465 00:25:55,600 --> 00:25:59,000 Speaker 1: need to have something that's monolithic and made of one piece, 466 00:25:59,440 --> 00:26:03,000 Speaker 1: because if you have multiple parts, that's an opportunity for 467 00:26:03,119 --> 00:26:06,320 Speaker 1: weathering to occur through the place where the two parts meet, 468 00:26:06,440 --> 00:26:08,480 Speaker 1: or three parts or five parts. And if it's a 469 00:26:08,520 --> 00:26:12,320 Speaker 1: moving part, just kiss the movement goodbye. What about this? 470 00:26:13,280 --> 00:26:17,160 Speaker 1: I had the thought earlier today about just a mountain 471 00:26:17,240 --> 00:26:21,520 Speaker 1: of razor wire. Okay, here's the problem with that.
And 472 00:26:21,560 --> 00:26:23,760 Speaker 1: this is the same problem also with the... what is 473 00:26:23,800 --> 00:26:26,439 Speaker 1: the problem? The steel. The steel, stuff that moves and everything. 474 00:26:26,760 --> 00:26:28,880 Speaker 1: You want to use... I know, but you want 475 00:26:28,880 --> 00:26:33,800 Speaker 1: to use stuff that has no value whatsoever, not just financially, 476 00:26:33,800 --> 00:26:36,600 Speaker 1: but usefulness, because someone will say, I can harvest that 477 00:26:36,680 --> 00:26:38,960 Speaker 1: razor wire. Yeah, I can go use that to keep 478 00:26:39,000 --> 00:26:41,320 Speaker 1: the cows in at my house next door. Yeah. But 479 00:26:41,359 --> 00:26:44,080 Speaker 1: if you have so much of it... over time, over 480 00:26:44,119 --> 00:26:47,760 Speaker 1: ten thousand years, people will take and take. And I 481 00:26:47,800 --> 00:26:50,240 Speaker 1: mean, that's why the pyramids are stripped of their 482 00:26:50,240 --> 00:26:53,159 Speaker 1: more attractive outer layer. They used to have like a white, 483 00:26:53,600 --> 00:26:58,520 Speaker 1: I think limestone, shell encasement. It's gone, because the locals 484 00:26:58,520 --> 00:26:59,840 Speaker 1: were like, oh, I can use that to build a 485 00:27:00,880 --> 00:27:04,880 Speaker 1: Pizza Hut. That's exactly... that's what people will do if 486 00:27:04,920 --> 00:27:08,560 Speaker 1: you place something of any kind of usefulness there. That 487 00:27:08,760 --> 00:27:12,800 Speaker 1: is the beauty of it: every idea is wrong in some way. 488 00:27:13,119 --> 00:27:16,680 Speaker 1: It's so great. It's pretty great.
I love it. So 489 00:27:16,840 --> 00:27:19,600 Speaker 1: the, um, one of the most often, uh, cited bodies 490 00:27:19,600 --> 00:27:22,720 Speaker 1: of work is from eighty three, and this was a 491 00:27:22,720 --> 00:27:26,680 Speaker 1: call for ideas from the German Journal of Semiotics that 492 00:27:26,800 --> 00:27:28,760 Speaker 1: basically said the same thing. It's like, you know, what 493 00:27:28,800 --> 00:27:31,600 Speaker 1: are your ideas? This one got a little goofy, to 494 00:27:31,640 --> 00:27:35,480 Speaker 1: say the least. Someone suggested an artificial moon as a 495 00:27:35,520 --> 00:27:39,320 Speaker 1: storage vessel. There's just a huge flaw in that one, 496 00:27:39,359 --> 00:27:41,880 Speaker 1: if you ask me. I mean, you don't even get that. Well, 497 00:27:41,920 --> 00:27:44,040 Speaker 1: it was like, how do you make sure that the 498 00:27:44,160 --> 00:27:49,199 Speaker 1: information about this site stays protected? Put it into an 499 00:27:49,280 --> 00:27:52,000 Speaker 1: artificial moon in orbit around Earth. But it's like, how 500 00:27:52,000 --> 00:27:53,880 Speaker 1: do you get to the artificial moon? I didn't get that that's 501 00:27:53,880 --> 00:27:56,240 Speaker 1: what they meant. Yeah, that doesn't make any sense. 502 00:27:56,320 --> 00:27:58,080 Speaker 1: That's what I think. I guess they were... I mean, 503 00:27:58,240 --> 00:28:00,640 Speaker 1: it said, uh... Well, will they be aiming it down 504 00:28:00,640 --> 00:28:03,879 Speaker 1: to a TV that won't play? That's a different one. Yeah, 505 00:28:03,960 --> 00:28:06,720 Speaker 1: and then I just don't understand this at all. I 506 00:28:06,720 --> 00:28:10,159 Speaker 1: don't understand the Radioactive Cats either, even though that's a 507 00:28:10,200 --> 00:28:13,040 Speaker 1: decent band name. So there, there... that was a big 508 00:28:13,080 --> 00:28:17,280 Speaker 1: part of the 99% Invisible episode on nuclear semiotics.
They talked about 509 00:28:17,280 --> 00:28:20,119 Speaker 1: the ray cats, um, and I think they actually hired 510 00:28:20,160 --> 00:28:23,320 Speaker 1: a musician to create a song, because just like with 511 00:28:23,359 --> 00:28:26,560 Speaker 1: the glowing yucca, you have to explain what's going on: 512 00:28:26,640 --> 00:28:28,800 Speaker 1: when the cats glow, you need to stay away. So 513 00:28:28,880 --> 00:28:30,640 Speaker 1: they had somebody come up with a ray cat song, 514 00:28:30,680 --> 00:28:33,200 Speaker 1: I believe, for the episode. Was it Hootie and the Blowfish? 515 00:28:33,280 --> 00:28:36,040 Speaker 1: Yes, it was. That was a good guess. Now, this 516 00:28:36,080 --> 00:28:38,160 Speaker 1: one I thought was... I thought it 517 00:28:38,200 --> 00:28:42,080 Speaker 1: was interesting at least. This semiotician, named Thomas 518 00:28:42,280 --> 00:28:47,400 Speaker 1: Sebeok, he said this: what has survived more than anything 519 00:28:47,440 --> 00:28:51,240 Speaker 1: else? Religion. Religious texts that date back, you know, a 520 00:28:51,240 --> 00:28:53,440 Speaker 1: couple of thousand years in the Catholic Church. Not a 521 00:28:53,480 --> 00:28:56,600 Speaker 1: bad start. Yeah, the ideas that you hear at Catholic 522 00:28:56,640 --> 00:28:59,760 Speaker 1: Mass today are a couple of thousand years old and then some. 523 00:29:00,520 --> 00:29:02,640 Speaker 1: And if you go back to the original text, which 524 00:29:02,680 --> 00:29:05,960 Speaker 1: we can still read, fortunately, you can say, yep, this 525 00:29:06,040 --> 00:29:08,680 Speaker 1: is what they're talking about. Like, those ideas have survived 526 00:29:08,720 --> 00:29:12,640 Speaker 1: that long because of the practices they use. So, interesting idea. 527 00:29:12,840 --> 00:29:15,400 Speaker 1: But it gets a little goofy, because he thought, why 528 00:29:15,400 --> 00:29:20,560 Speaker 1: don't we almost create a fake religion around this thing?
529 00:29:21,360 --> 00:29:25,280 Speaker 1: Um, a fearful myth that you can generate, appointing an 530 00:29:25,280 --> 00:29:30,000 Speaker 1: atomic priesthood to tell people, and tell them to tell 531 00:29:30,080 --> 00:29:34,320 Speaker 1: future generations. But I mean, I guess the idea is 532 00:29:34,400 --> 00:29:37,360 Speaker 1: that it's all false, and it's just a big made 533 00:29:37,440 --> 00:29:40,160 Speaker 1: up story. Yeah, this atomic priesthood would know the 534 00:29:40,200 --> 00:29:43,560 Speaker 1: truth and they would indoctrinate people, but out in society 535 00:29:43,600 --> 00:29:47,360 Speaker 1: around them, it would be a closely guarded secret, because 536 00:29:47,360 --> 00:29:50,840 Speaker 1: everybody else thinks that whatever this fake myth about why 537 00:29:50,880 --> 00:29:53,880 Speaker 1: you have to stay away from this haunted, evil area 538 00:29:54,680 --> 00:29:57,640 Speaker 1: is, it's true, when really the atomic priests are the ones 539 00:29:57,680 --> 00:30:01,840 Speaker 1: who know, uh, no, actually there's radioactive stuff here. 540 00:30:02,200 --> 00:30:04,200 Speaker 1: They just came up with this three thousand years ago 541 00:30:04,480 --> 00:30:07,920 Speaker 1: to scare everybody away. But initially a decent idea, as 542 00:30:07,920 --> 00:30:10,680 Speaker 1: far as trying to make it, or incorporate, like, what 543 00:30:10,760 --> 00:30:14,280 Speaker 1: religion does. But it's just definitely strange. It is. To 544 00:30:14,440 --> 00:30:17,520 Speaker 1: me, though, it is at its base despicable.
It's a 545 00:30:17,560 --> 00:30:24,160 Speaker 1: despicable idea because it is purposefully introducing fearful, false superstition 546 00:30:24,240 --> 00:30:28,640 Speaker 1: into the future. Like, we're gonna purposefully introduce fearful, false 547 00:30:28,640 --> 00:30:32,240 Speaker 1: superstition into the future just to scare people off from radioactivity? 548 00:30:32,320 --> 00:30:35,520 Speaker 1: Like, what kind of sweeping side effects? What kind of 549 00:30:35,600 --> 00:30:38,680 Speaker 1: wars might start over this? How many people will die to 550 00:30:38,800 --> 00:30:41,360 Speaker 1: defend this fake thing that they don't realize is fake? 551 00:30:41,560 --> 00:30:44,440 Speaker 1: Because Thomas Sebeok came up with this idea to keep 552 00:30:44,480 --> 00:30:48,200 Speaker 1: people away from a single site in New Mexico. That's crazy. 553 00:30:48,400 --> 00:30:51,160 Speaker 1: It didn't fare too well either among his colleagues, and 554 00:30:51,320 --> 00:30:54,120 Speaker 1: rightfully so, because again, it's a despicable idea. So he 555 00:30:54,240 --> 00:30:57,280 Speaker 1: was on the Human Interference Task Force. We mentioned them, 556 00:30:57,440 --> 00:31:00,960 Speaker 1: the Nevada site. That was what was, uh, what was 557 00:31:01,040 --> 00:31:05,080 Speaker 1: launched for that Yucca Mountain site back in eighty three. 558 00:31:05,480 --> 00:31:10,400 Speaker 1: So whatever Sebeok's original idea was, he had like some 559 00:31:10,520 --> 00:31:15,120 Speaker 1: other closely related ideas that were great, though. Like, he's 560 00:31:15,120 --> 00:31:17,480 Speaker 1: not like a total nutjob or anything. 561 00:31:17,600 --> 00:31:20,400 Speaker 1: I think it was just a misfire in an 562 00:31:20,400 --> 00:31:23,240 Speaker 1: otherwise illustrious career. I think... I don't know that much 563 00:31:23,240 --> 00:31:27,720 Speaker 1: about him.
But, um, one of his other ideas was, okay, 564 00:31:27,880 --> 00:31:30,640 Speaker 1: well, let's take the atomic priesthood away, let's take 565 00:31:31,000 --> 00:31:34,080 Speaker 1: the religion and all that stuff away, and let's just 566 00:31:34,120 --> 00:31:37,120 Speaker 1: give them, like, the facts, but let's figure out a 567 00:31:37,120 --> 00:31:39,400 Speaker 1: way to make sure that those facts get passed down. 568 00:31:40,000 --> 00:31:42,720 Speaker 1: And what he came up with was called the meta message, 569 00:31:42,840 --> 00:31:46,680 Speaker 1: where it's a message that says, this place 570 00:31:46,720 --> 00:31:49,240 Speaker 1: has nuclear radiation, it can kill you, you need to 571 00:31:49,240 --> 00:31:52,600 Speaker 1: stay away from it. And we invite you to take 572 00:31:52,640 --> 00:31:56,160 Speaker 1: this message and translate it into whatever languages you guys have 573 00:31:56,240 --> 00:31:59,080 Speaker 1: on earth at the time, assuming you can read this, right? 574 00:31:59,480 --> 00:32:02,400 Speaker 1: But if you do that often enough, there will always 575 00:32:02,400 --> 00:32:06,240 Speaker 1: be somebody who can translate it. And then that way 576 00:32:06,280 --> 00:32:09,200 Speaker 1: you form a bridge between now and as far into 577 00:32:09,200 --> 00:32:12,040 Speaker 1: the future as people are around to read and add 578 00:32:12,080 --> 00:32:15,600 Speaker 1: their own interpretation or their own translation of it. But 579 00:32:15,640 --> 00:32:17,800 Speaker 1: then you want to leave the original, so that if 580 00:32:17,840 --> 00:32:21,120 Speaker 1: there's ever, like, a disagreement about what some word meant, 581 00:32:21,600 --> 00:32:25,520 Speaker 1: hopefully somebody can go back, language by language, and connect 582 00:32:25,560 --> 00:32:29,480 Speaker 1: them so that they can see the original version.
Yeah, 583 00:32:29,480 --> 00:32:33,040 Speaker 1: but like, what if a society develops in isolation that 584 00:32:33,240 --> 00:32:37,480 Speaker 1: knows none of these languages? You're just totally toast. Yeah, 585 00:32:37,680 --> 00:32:40,640 Speaker 1: that's when the symbols come in. Right. So what they 586 00:32:40,680 --> 00:32:43,600 Speaker 1: settled on as a panel, though, from eighty one to eighty 587 00:32:43,600 --> 00:32:47,720 Speaker 1: three, was that what's called long term communication was going to 588 00:32:47,800 --> 00:32:49,360 Speaker 1: be the most effective thing, like kind of what you 589 00:32:49,400 --> 00:32:51,960 Speaker 1: were just talking about. And they said, a system that 590 00:32:52,000 --> 00:32:55,400 Speaker 1: combines physical markers and archives that cover the two major 591 00:32:55,440 --> 00:33:00,440 Speaker 1: forms of this long term communication: direct and successive. Direct 592 00:33:00,640 --> 00:33:06,640 Speaker 1: utilizes markers, and, uh, successive is humans. Like you were 593 00:33:06,640 --> 00:33:08,800 Speaker 1: talking about, I guess, with this meta message, I guess 594 00:33:08,800 --> 00:33:10,960 Speaker 1: you could write it down, but it's still humans carrying 595 00:33:11,000 --> 00:33:13,600 Speaker 1: a message through time. Well, it's more like the idea of a direct one 596 00:33:13,720 --> 00:33:16,760 Speaker 1: is that, like, you can write an inscription on a 597 00:33:16,880 --> 00:33:20,560 Speaker 1: monument and that monument is going to deliver that message 598 00:33:20,640 --> 00:33:23,400 Speaker 1: directly to people ten thousand years from now. Yeah, I 599 00:33:23,400 --> 00:33:26,280 Speaker 1: mean, it's a physical thing, right? Whereas with successive, it's 600 00:33:26,360 --> 00:33:29,400 Speaker 1: kind of passed along like a game of telephone. Exactly.
601 00:33:29,440 --> 00:33:31,560 Speaker 1: And you know how that goes, right? It can get 602 00:33:31,560 --> 00:33:34,160 Speaker 1: a little hinky. That's right, but it's always fun at 603 00:33:34,160 --> 00:33:38,040 Speaker 1: a somber party. Sure. So they came up with multiple ones, 604 00:33:38,080 --> 00:33:41,240 Speaker 1: like you were saying, um. They settled on a 605 00:33:41,320 --> 00:33:46,320 Speaker 1: monument that had, um, massive stone structures. Remember, you want monoliths. 606 00:33:46,720 --> 00:33:50,560 Speaker 1: They're engraved with warnings in all currently known languages. It's 607 00:33:50,560 --> 00:33:53,400 Speaker 1: a lot of languages. You want a buried vault that 608 00:33:53,480 --> 00:33:56,440 Speaker 1: has all the info you need about radioactivity, about the site, 609 00:33:56,520 --> 00:33:59,600 Speaker 1: all that stuff. You want a bunch of barriers around 610 00:33:59,600 --> 00:34:03,080 Speaker 1: the site, not necessarily to definitely keep people out, but 611 00:34:03,200 --> 00:34:05,560 Speaker 1: enough to basically say, hey, hey, we're trying to impede 612 00:34:05,560 --> 00:34:07,680 Speaker 1: progress here. Yeah. I mean, to me, that's one of 613 00:34:07,680 --> 00:34:10,280 Speaker 1: the most obvious ones. Like, if you see a huge 614 00:34:10,320 --> 00:34:13,600 Speaker 1: wall, again, it might entice you, but it for sure 615 00:34:13,640 --> 00:34:15,919 Speaker 1: indicates to any culture that, like, you're not meant 616 00:34:15,960 --> 00:34:21,120 Speaker 1: to come beyond this. And then, um, the last one 617 00:34:21,200 --> 00:34:24,560 Speaker 1: is a network of archives, basically the same information you 618 00:34:24,560 --> 00:34:27,880 Speaker 1: would have in that buried vault, but elsewhere, scattered around 619 00:34:27,920 --> 00:34:29,920 Speaker 1: the world.
So if something happens to the buried vault, 620 00:34:29,960 --> 00:34:32,120 Speaker 1: somebody can come across the archives somewhere and be like, 621 00:34:32,120 --> 00:34:33,759 Speaker 1: oh wait, wait, we want to stay out of there, 622 00:34:34,000 --> 00:34:36,200 Speaker 1: right. And along with that, they said, while we're at it, 623 00:34:36,320 --> 00:34:38,920 Speaker 1: can we at least, like, all agree around 624 00:34:38,920 --> 00:34:41,680 Speaker 1: the world on a nuclear warning symbol? If it's the 625 00:34:41,719 --> 00:34:46,680 Speaker 1: trefoil or whatever, let's just all codify that as the thing, 626 00:34:46,840 --> 00:34:48,960 Speaker 1: which is not the case right now. Now, theirs was 627 00:34:49,000 --> 00:34:52,080 Speaker 1: a triangle with an arrow pointing down, and then in 628 00:34:52,160 --> 00:34:55,279 Speaker 1: the head of the arrow was the biohazard symbol, which 629 00:34:55,320 --> 00:34:57,160 Speaker 1: is not great, because you want something that's going to 630 00:34:57,200 --> 00:35:00,160 Speaker 1: be so simple that even as people... excuse me, I 631 00:35:00,200 --> 00:35:02,120 Speaker 1: need to see it, I guess. Yeah. Even when 632 00:35:02,160 --> 00:35:04,920 Speaker 1: you see it, you're like, wait, what? Um, but you 633 00:35:04,960 --> 00:35:07,200 Speaker 1: want something simple enough so that as people kind of 634 00:35:07,239 --> 00:35:09,880 Speaker 1: create a shorthand version of it, it still retains 635 00:35:09,880 --> 00:35:13,319 Speaker 1: its meaning, right? Alright. So that stuff was the 636 00:35:13,400 --> 00:35:16,920 Speaker 1: Yucca project in the early eighties. They decided not to 637 00:35:16,960 --> 00:35:21,280 Speaker 1: do that.
They just packed it up, put it away, 638 00:35:21,680 --> 00:35:23,680 Speaker 1: and then it all came back again with this New 639 00:35:23,719 --> 00:35:28,240 Speaker 1: Mexico plant, when the Department of Energy said once again, hey, 640 00:35:28,320 --> 00:35:30,320 Speaker 1: we need to think of a sign and a symbol, 641 00:35:30,960 --> 00:35:33,359 Speaker 1: or whatever you can, you know, come up with, 642 00:35:33,719 --> 00:35:35,640 Speaker 1: and we need the best and the brightest thinking on this. 643 00:35:36,160 --> 00:35:39,880 Speaker 1: So call up Carl Sagan. Get me Sagan, give me Sagan, 644 00:35:40,120 --> 00:35:43,040 Speaker 1: give me Percy Tannenbaum, stat. Uh, and this guy named 645 00:35:43,080 --> 00:35:46,680 Speaker 1: Jon Lomberg, who's a science writer and space illustrator, and 646 00:35:46,800 --> 00:35:49,719 Speaker 1: he had worked in semiotics before for NASA on their 647 00:35:49,719 --> 00:35:52,440 Speaker 1: mission to Mars. Sagan was in ill health, so he 648 00:35:52,520 --> 00:35:56,240 Speaker 1: declined to come, but he sent a message from the present, 649 00:35:56,320 --> 00:36:02,200 Speaker 1: I guess, that said: skull and crossbones, done. Universal. Everyone 650 00:36:02,280 --> 00:36:04,480 Speaker 1: knows it. He gave a really good example. He said 651 00:36:04,640 --> 00:36:08,760 Speaker 1: it has, um... it's marked the lintels of cannibal dwellings, 652 00:36:08,760 --> 00:36:11,680 Speaker 1: the flags of pirates, the insignia of SS divisions 653 00:36:11,680 --> 00:36:14,680 Speaker 1: and motorcycle gangs. Like, he makes a pretty good point. 654 00:36:15,080 --> 00:36:17,000 Speaker 1: A lot of people out there see a skull and 655 00:36:17,040 --> 00:36:23,840 Speaker 1: crossbones and know it means, like, danger, problems. Yes, you know, 656 00:36:23,960 --> 00:36:27,120 Speaker 1: you'll be a skull.
And so the working group for 657 00:36:27,200 --> 00:36:31,520 Speaker 1: the WIPP, um, project, they said, nah, that doesn't work. 658 00:36:31,640 --> 00:36:34,400 Speaker 1: It's a Jungian archetype. It doesn't really exist outside 659 00:36:34,440 --> 00:36:37,400 Speaker 1: of the West. To me, I'm like, no, Sagan was 660 00:36:37,480 --> 00:36:40,200 Speaker 1: definitely onto something. I think so. I mean, tell me, 661 00:36:40,239 --> 00:36:43,640 Speaker 1: if you go to China and hold up a sign 662 00:36:43,640 --> 00:36:47,319 Speaker 1: with a skull and crossbones... I would think so. Wouldn't they? 663 00:36:47,360 --> 00:36:49,680 Speaker 1: I mean, that's a dire warning, isn't it? I think 664 00:36:49,760 --> 00:36:51,799 Speaker 1: their point is that the skull used to be 665 00:36:51,840 --> 00:36:55,719 Speaker 1: like a memento mori, where it meant, like, rebirth and death, 666 00:36:55,760 --> 00:36:57,440 Speaker 1: so they could be like, oh, wonderful, a skull and 667 00:36:57,480 --> 00:37:04,120 Speaker 1: crossbones. But to me, that is the one enduring 668 00:37:04,200 --> 00:37:06,959 Speaker 1: symbol that's always going to be around as long as 669 00:37:07,040 --> 00:37:10,359 Speaker 1: there are humans, because what happens when you die and 670 00:37:10,480 --> 00:37:14,359 Speaker 1: rot? What's left? Your skull. Every human knows that. Even 671 00:37:14,480 --> 00:37:16,719 Speaker 1: humans in the future are going to know that.
Even 672 00:37:16,840 --> 00:37:21,040 Speaker 1: ones that are in, like, post-collapse tribes, where 673 00:37:21,120 --> 00:37:23,359 Speaker 1: they're running around and have lost all of the 674 00:37:23,480 --> 00:37:25,600 Speaker 1: languages that are around today, they're gonna know what a 675 00:37:25,680 --> 00:37:28,000 Speaker 1: skull looks like or what a skull means, or at 676 00:37:28,080 --> 00:37:29,560 Speaker 1: least one of them is going to be like, wait, 677 00:37:29,920 --> 00:37:33,080 Speaker 1: I don't think this is saying that the rainbow is coming. Um, 678 00:37:33,120 --> 00:37:35,759 Speaker 1: I think it means, like, death or danger. All right, 679 00:37:35,800 --> 00:37:39,200 Speaker 1: let's take another break. Yeah, we'll come back and talk 680 00:37:39,239 --> 00:37:42,600 Speaker 1: about the approach that the WIPP panel took and 681 00:37:42,640 --> 00:38:10,440 Speaker 1: what they came up with right after this. Sorry, you know, 682 00:38:10,480 --> 00:38:14,200 Speaker 1: I gotta defend Sagan. He's my boy. I love that guy. 683 00:38:14,400 --> 00:38:19,400 Speaker 1: Someone should ask, uh, Neil deGrasse Tyson. Why not? 684 00:38:19,480 --> 00:38:22,000 Speaker 1: I bet he's got a good idea or two. I'll 685 00:38:22,040 --> 00:38:25,680 Speaker 1: bet they have asked. Although... he's in Atlanta for a show. Oh yeah, 686 00:38:25,680 --> 00:38:30,080 Speaker 1: where, the Fox? I think the Cobb Energy Centre. Oh yeah, yeah, 687 00:38:30,160 --> 00:38:32,360 Speaker 1: well, I think that's even more seats than the Fox. No, 688 00:38:32,480 --> 00:38:36,120 Speaker 1: it's less. Oh, sorry. I think it's like three thousand people, 689 00:38:36,280 --> 00:38:40,640 Speaker 1: which is nothing to, you know, put up a stink about. 690 00:38:40,680 --> 00:38:42,759 Speaker 1: That's a lot of folks. We have not hit that. 691 00:38:43,040 --> 00:38:45,840 Speaker 1: No we're not. No, we haven't.
Did you hear the 692 00:38:45,920 --> 00:38:48,680 Speaker 1: StarTalk I was on? Oh no, was it good? 693 00:38:48,760 --> 00:38:50,680 Speaker 1: It was pretty good. Yeah. If I do say so myself. 694 00:38:50,840 --> 00:38:54,960 Speaker 1: It was supposed to be like rapid-fire responses. 695 00:38:55,880 --> 00:38:58,800 Speaker 1: We got to like four questions in an hour because 696 00:38:58,800 --> 00:39:01,839 Speaker 1: you're like, rapid-fire responses are not my specialty. 697 00:39:02,280 --> 00:39:07,120 Speaker 1: Let me just take a more deliberate approach here. Alright. So, 698 00:39:07,280 --> 00:39:10,440 Speaker 1: speaking of deliberate, the WIPP panel was very deliberate and methodical. 699 00:39:10,520 --> 00:39:14,520 Speaker 1: They divided into teams and, um, approached it from the 700 00:39:14,719 --> 00:39:17,040 Speaker 1: two things we were talking about, direct and successive forms 701 00:39:17,080 --> 00:39:22,240 Speaker 1: of communication. Debated a lot, deliberated a lot on the recommendations. 702 00:39:22,520 --> 00:39:26,000 Speaker 1: They had two proposals, and they did overlap a little bit. Uh, 703 00:39:26,040 --> 00:39:28,239 Speaker 1: what I thought was pretty smart is they both had 704 00:39:28,239 --> 00:39:32,440 Speaker 1: a multi-leveled approach from the surface down that got 705 00:39:32,480 --> 00:39:35,520 Speaker 1: more specific and intense as you went down. Yeah, the 706 00:39:35,560 --> 00:39:39,160 Speaker 1: first one was basically like, you ding-dong, this is dangerous, 707 00:39:39,160 --> 00:39:42,040 Speaker 1: go away. Exactly. That's like level one, and then level 708 00:39:42,080 --> 00:39:45,000 Speaker 1: two is like, okay, ding-dong's kind of smart 709 00:39:45,040 --> 00:39:47,759 Speaker 1: friend, explain to ding-dong that the reason this is 710 00:39:47,800 --> 00:39:50,520 Speaker 1: dangerous is because there's something buried here and it's gonna hurt you.
711 00:39:50,960 --> 00:39:52,520 Speaker 1: All right, we should, we should now talk about 712 00:39:52,560 --> 00:39:55,680 Speaker 1: the real things. Oh sure, I thought I was. Uh, 713 00:39:55,719 --> 00:40:00,319 Speaker 1: so Group A, this was theirs. They studded the face 714 00:40:00,400 --> 00:40:03,960 Speaker 1: of the site with what they called menacing earthworks, so 715 00:40:04,080 --> 00:40:07,560 Speaker 1: a field of spikes and then a big massive disk 716 00:40:08,160 --> 00:40:11,160 Speaker 1: painted to look like a black hole. I didn't quite 717 00:40:11,160 --> 00:40:13,480 Speaker 1: get that part. That's so dumb. I get the spikes. 718 00:40:14,239 --> 00:40:17,320 Speaker 1: I, I think it's the... yeah, of course. But the 719 00:40:17,800 --> 00:40:19,480 Speaker 1: black hole, I think it's supposed to just mean like 720 00:40:19,520 --> 00:40:22,120 Speaker 1: a void or chaos. I don't know. I'm not sure. 721 00:40:22,360 --> 00:40:24,120 Speaker 1: I could see how you would think that that was 722 00:40:24,200 --> 00:40:26,759 Speaker 1: kind of universal, like nobody wants to fall into a 723 00:40:26,800 --> 00:40:28,960 Speaker 1: hole or something, and maybe it evokes that kind of 724 00:40:29,000 --> 00:40:32,320 Speaker 1: like stay away. All right. Then they have large markers 725 00:40:32,600 --> 00:40:34,759 Speaker 1: all around the site, which, like you said, are the 726 00:40:34,800 --> 00:40:38,399 Speaker 1: really basic messages and the warnings, including, and I thought 727 00:40:38,440 --> 00:40:43,200 Speaker 1: this was so interesting, uh, faces that evoke Edvard Munch's 728 00:40:43,320 --> 00:40:48,080 Speaker 1: The Scream. The ones I saw were of The Scream. Yeah, 729 00:40:48,200 --> 00:40:49,759 Speaker 1: like it was a line drawing of the guy from 730 00:40:49,760 --> 00:40:53,520 Speaker 1: the painting. Yeah, like in great agony and pain. That, 731 00:40:53,680 --> 00:40:56,480 Speaker 1: to me, not bad. It isn't bad.
I don't know, 732 00:40:56,560 --> 00:41:01,120 Speaker 1: was that more universally understood than a skull and crossbones? 733 00:41:01,120 --> 00:41:03,320 Speaker 1: I don't know. Or, or if art survives, are 734 00:41:03,360 --> 00:41:05,680 Speaker 1: people like, oh, I wonder if that painting's down there? Well, 735 00:41:05,719 --> 00:41:08,200 Speaker 1: I think what they're saying is, and semioticians kind of 736 00:41:08,200 --> 00:41:12,200 Speaker 1: feel this way, is that Edvard Munch so perfectly nailed 737 00:41:12,239 --> 00:41:14,920 Speaker 1: The Scream that even without the art, like if you 738 00:41:14,960 --> 00:41:17,279 Speaker 1: see that, you understand that that person you're seeing is 739 00:41:17,320 --> 00:41:20,719 Speaker 1: in agony. Did I say Munch? No, I think you 740 00:41:20,719 --> 00:41:23,400 Speaker 1: said Monk. Did I say Much? You said Monk. I 741 00:41:23,520 --> 00:41:25,600 Speaker 1: might have said Much. No, you said... I think he 742 00:41:25,680 --> 00:41:28,799 Speaker 1: said Monk. Is it Munch? I think it's probably Monk. 743 00:41:29,200 --> 00:41:31,600 Speaker 1: There's no way his name is Much. I'm almost positive 744 00:41:31,640 --> 00:41:35,720 Speaker 1: you said Monk. Jerry, can you rewind for a second? Much. 745 00:41:36,360 --> 00:41:38,799 Speaker 1: Oh, you did say Much. I would have sworn you 746 00:41:38,800 --> 00:41:42,839 Speaker 1: said Monk. So Group A, below the surface, uh, this 747 00:41:42,880 --> 00:41:45,400 Speaker 1: is when they actually start talking about the nuclear waste, 748 00:41:45,440 --> 00:41:48,280 Speaker 1: what it does to you, the details about the structure 749 00:41:48,280 --> 00:41:52,400 Speaker 1: and all that stuff, right, where they teach you 750 00:41:52,400 --> 00:41:55,680 Speaker 1: a little bit about radioactivity.
So Group B, uh, 751 00:41:55,719 --> 00:41:59,799 Speaker 1: this was... they went super informative, and really what they 752 00:41:59,840 --> 00:42:01,920 Speaker 1: were relying on was that people had a little bit 753 00:42:02,000 --> 00:42:04,319 Speaker 1: of knowledge in the future about stuff like this. But 754 00:42:04,400 --> 00:42:07,160 Speaker 1: they also trusted that the people didn't have to just 755 00:42:07,239 --> 00:42:10,200 Speaker 1: be spooked or scared or something like that. That, that 756 00:42:10,440 --> 00:42:12,960 Speaker 1: it's like, here are the facts and information. That's why 757 00:42:13,040 --> 00:42:15,520 Speaker 1: you want to stay away from that. Yeah, their big 758 00:42:15,520 --> 00:42:20,600 Speaker 1: above-ground work was these big earthen walls in the shape 759 00:42:20,600 --> 00:42:23,480 Speaker 1: of the nuclear trefoil. Not bad. I imagine you'd have 760 00:42:23,520 --> 00:42:25,000 Speaker 1: to see it from above to even know, though, what 761 00:42:25,080 --> 00:42:27,239 Speaker 1: it was. But that's, that's part of it. One of 762 00:42:27,239 --> 00:42:30,640 Speaker 1: the requirements was that you wanted it to be, um, 763 00:42:30,680 --> 00:42:34,640 Speaker 1: easily visible, not just with human cognition, but like remote 764 00:42:34,640 --> 00:42:38,879 Speaker 1: sensing too, so like magnetic surveys. They said, 765 00:42:38,920 --> 00:42:41,440 Speaker 1: we should put some magnets in here, right? Not just 766 00:42:41,520 --> 00:42:43,960 Speaker 1: from when you walk up to it, right. So, and 767 00:42:43,960 --> 00:42:45,120 Speaker 1: you also have to be able to see it from 768 00:42:45,160 --> 00:42:48,960 Speaker 1: your flying saucer. Exactly. Uh.
And then inside the walls, 769 00:42:49,000 --> 00:42:52,400 Speaker 1: they have, at various steps, these big markers, and 770 00:42:52,440 --> 00:42:56,200 Speaker 1: here's where these, like, symbols and pictographs, uh, all kinds 771 00:42:56,200 --> 00:43:02,480 Speaker 1: of languages, writing in different languages, and more human faces, 772 00:43:02,880 --> 00:43:07,480 Speaker 1: increasingly contorted in agony as you go down. It looks 773 00:43:07,520 --> 00:43:11,640 Speaker 1: to me like the guy's getting drunker and drunker. Yeah, yeah, 774 00:43:12,719 --> 00:43:15,279 Speaker 1: that's what it looks like. Well, maybe that means there's 775 00:43:14,920 --> 00:43:17,799 Speaker 1: a happening bar. Exactly. That's how I would take it 776 00:43:17,840 --> 00:43:20,799 Speaker 1: if I were a future human post-collapse. Gotta go, 777 00:43:20,960 --> 00:43:24,279 Speaker 1: gotta go down here. There were also pictograms you're just 778 00:43:24,320 --> 00:43:26,759 Speaker 1: like digging through the sand to get to. There are also 779 00:43:26,840 --> 00:43:31,040 Speaker 1: pictograms that showed, like, under the ground, like real easy 780 00:43:31,160 --> 00:43:36,000 Speaker 1: to understand drawings of the radioactive waste, the groundwater 781 00:43:36,080 --> 00:43:39,920 Speaker 1: flowing through it, taking the radioactive waste up to the plants, 782 00:43:39,960 --> 00:43:42,080 Speaker 1: which are then eaten by the humans in the picture, 783 00:43:42,840 --> 00:43:46,319 Speaker 1: one of whom dies. Which makes sense. You don't need 784 00:43:46,400 --> 00:43:48,960 Speaker 1: to understand anything about radioactivity. You don't need to be 785 00:43:49,000 --> 00:43:52,120 Speaker 1: able to read anything.
It really, like, it makes sense, 786 00:43:52,360 --> 00:43:55,040 Speaker 1: especially if some people are sitting there thinking about it. 787 00:43:55,080 --> 00:43:57,560 Speaker 1: The final image is a skull and crossbones, or a pile 788 00:43:57,560 --> 00:43:59,200 Speaker 1: of bones. You know, it was like a person... three 789 00:43:59,239 --> 00:44:01,160 Speaker 1: people standing, and in one of them, the last one was 790 00:44:01,239 --> 00:44:03,000 Speaker 1: like dead, and I think he might even have X's 791 00:44:03,000 --> 00:44:04,600 Speaker 1: for eyes. Well, I was about to say, though, I mean, 792 00:44:04,640 --> 00:44:07,239 Speaker 1: if you think about twenty thousand years from now, maybe, maybe 793 00:44:07,280 --> 00:44:10,799 Speaker 1: they're like, oh, this induces a nice nap. Maybe. Like, 794 00:44:11,600 --> 00:44:14,359 Speaker 1: but to your point, though, like, the bones is where 795 00:44:14,400 --> 00:44:17,239 Speaker 1: you need to end up, right? Yeah, maybe somebody would 796 00:44:17,239 --> 00:44:18,960 Speaker 1: be like, oh, these veggies here give you a 797 00:44:19,000 --> 00:44:20,840 Speaker 1: great buzz if you grow them on this ground. The 798 00:44:21,000 --> 00:44:23,120 Speaker 1: X's for eyes, right. Yeah, the bones do make a 799 00:44:23,120 --> 00:44:26,040 Speaker 1: lot more sense. I think Sagan was right. That's... that 800 00:44:26,200 --> 00:44:28,440 Speaker 1: should be a T-shirt. Stuff You Should Know T-shirts: 801 00:44:28,440 --> 00:44:31,680 Speaker 1: Sagan Was Right. Don't even need to have any context. 802 00:44:32,040 --> 00:44:34,160 Speaker 1: We're gonna get an email in a few days from 803 00:44:34,200 --> 00:44:36,279 Speaker 1: the guy from the estate of Carl Sagan saying, do 804 00:44:36,360 --> 00:44:38,880 Speaker 1: not make that T-shirt. So what did they go with?
805 00:44:38,920 --> 00:44:45,319 Speaker 1: In the end, though, they went with, um, an earthen work, 806 00:44:45,719 --> 00:44:49,880 Speaker 1: an earthen berm, basically, to provide an obstacle, um, and to 807 00:44:50,280 --> 00:44:55,800 Speaker 1: block easy access; some granite slabs, monoliths, that have warnings 808 00:44:55,840 --> 00:44:59,960 Speaker 1: written in seven languages. Yeah, Navajo and then the six 809 00:45:00,120 --> 00:45:03,960 Speaker 1: languages of the UN, so Arabic, Chinese, English, Spanish, French, 810 00:45:04,000 --> 00:45:07,840 Speaker 1: and Russian. Um, which makes a lot of sense. But 811 00:45:08,040 --> 00:45:11,520 Speaker 1: then they took Thomas Sebeok up on his idea. They 812 00:45:11,600 --> 00:45:15,440 Speaker 1: kind of built on the earlier one, right? Exactly. Um, and 813 00:45:15,480 --> 00:45:18,600 Speaker 1: they left blank spaces... or, in their plan, 814 00:45:18,719 --> 00:45:22,200 Speaker 1: they leave blank spaces on these slabs for future generations 815 00:45:22,239 --> 00:45:26,719 Speaker 1: to add their own translations of the inscriptions. It's a 816 00:45:26,760 --> 00:45:31,080 Speaker 1: great idea. And the faces of humans in pain, in 817 00:45:31,160 --> 00:45:34,400 Speaker 1: anguish, that did survive in the end. So that was 818 00:45:34,440 --> 00:45:38,120 Speaker 1: the final report of this WIPP panel. It's a pretty 819 00:45:38,120 --> 00:45:41,680 Speaker 1: good idea. Makes a lot of sense, because, so, 820 00:45:41,760 --> 00:45:44,240 Speaker 1: there are two groups that they're trying to say 821 00:45:44,280 --> 00:45:48,279 Speaker 1: stay away to. Not really, like, urban explorers or thrill 822 00:45:48,320 --> 00:45:51,960 Speaker 1: seekers or whatever; they would have, they would have virtually 823 00:45:52,040 --> 00:45:55,000 Speaker 1: no chance of getting down to the actual radioactive 824 00:45:55,000 --> 00:45:57,840 Speaker 1: material.
The people they were worried about were technologically 825 00:45:57,920 --> 00:46:02,920 Speaker 1: advanced civilizations that were drilling for resources by accident, 826 00:46:03,040 --> 00:46:06,959 Speaker 1: like, God help this, this, um, this waste disposal site 827 00:46:07,000 --> 00:46:10,480 Speaker 1: if salt becomes incredibly important in the future, and then, 828 00:46:11,440 --> 00:46:17,160 Speaker 1: um, less advanced civilizations that could accidentally, um, change the 829 00:46:17,160 --> 00:46:20,279 Speaker 1: flow of groundwater to go through the salt bed through 830 00:46:20,360 --> 00:46:23,799 Speaker 1: massive, like, irrigation projects. It covers all of it. Yeah, 831 00:46:23,800 --> 00:46:26,840 Speaker 1: my whole thing is just make it inaccessible. Why is 832 00:46:26,840 --> 00:46:29,879 Speaker 1: it in New Mexico? Why is it out, you know... Well, 833 00:46:29,920 --> 00:46:33,520 Speaker 1: that's, I mean, that's pretty inaccessible. It's not as 834 00:46:33,600 --> 00:46:37,319 Speaker 1: inaccessible as, you know, Siberia. No. One of the 835 00:46:37,400 --> 00:46:41,719 Speaker 1: recommendations for nuclear waste disposal was shooting it into space. Just 836 00:46:41,800 --> 00:46:43,960 Speaker 1: send it out into outer space and forget about it. 837 00:46:44,200 --> 00:46:46,520 Speaker 1: And if you believe in the Fermi paradox, that 838 00:46:46,600 --> 00:46:50,879 Speaker 1: says we're the only intelligent life in the universe, man, 839 00:46:50,960 --> 00:46:54,600 Speaker 1: more power to ya. That's actually not that bad of an idea. 840 00:46:54,800 --> 00:46:56,840 Speaker 1: It's a horrific idea, but it's actually kind of a 841 00:46:56,840 --> 00:46:58,799 Speaker 1: good idea thing. Yeah, but then I wonder about the 842 00:46:58,880 --> 00:47:01,960 Speaker 1: danger and the risk involved. I mean, we've seen 843 00:47:02,040 --> 00:47:04,279 Speaker 1: rockets blow up and space shuttles blow up.
That would 844 00:47:04,280 --> 00:47:06,160 Speaker 1: be bad. Like, what if the thing that they're shooting 845 00:47:06,160 --> 00:47:08,480 Speaker 1: it out there with malfunctioned or something? That'd be really bad. 846 00:47:08,560 --> 00:47:10,600 Speaker 1: Would be really bad. That's a great point. Like, all 847 00:47:10,640 --> 00:47:13,800 Speaker 1: of our nuclear waste has just been released, oh, into 848 00:47:13,840 --> 00:47:18,480 Speaker 1: the atmosphere. Yeah, that's a great, great point. So here's 849 00:47:18,520 --> 00:47:21,200 Speaker 1: the thing: is all of this just wasted effort? Because 850 00:47:21,239 --> 00:47:23,560 Speaker 1: I was getting so into this stuff, and then the 851 00:47:23,640 --> 00:47:27,000 Speaker 1: end of this article was a real, like, sad trombone, 852 00:47:27,160 --> 00:47:30,279 Speaker 1: because it seems like, it seems like nobody really even 853 00:47:30,320 --> 00:47:33,000 Speaker 1: cares, the people that matter. Well, the first, the first 854 00:47:33,000 --> 00:47:36,319 Speaker 1: group, like, their whole thing will probably never be 855 00:47:36,400 --> 00:47:39,759 Speaker 1: implemented because the Yucca Mountain project got shut down. But 856 00:47:39,880 --> 00:47:43,719 Speaker 1: the WIPP group may actually have their plan come 857 00:47:43,760 --> 00:47:46,920 Speaker 1: to fruition, because it is an EPA rule that 858 00:47:46,960 --> 00:47:49,319 Speaker 1: you have to create this kind of marker, and they've 859 00:47:49,320 --> 00:47:52,320 Speaker 1: got until about twenty forty, when they estimate the place is 860 00:47:52,320 --> 00:47:54,920 Speaker 1: going to shut down.
So it's entirely possible that then, 861 00:47:56,120 --> 00:47:59,239 Speaker 1: or sometime in the hundred years after, when the 862 00:47:59,320 --> 00:48:01,400 Speaker 1: DOE stops protecting the site, or the 863 00:48:01,480 --> 00:48:05,319 Speaker 1: DOD, um, they may implement these earthen works and the 864 00:48:05,400 --> 00:48:08,359 Speaker 1: sixteen granite slabs, and we, we may live to see 865 00:48:08,400 --> 00:48:11,280 Speaker 1: something like this. Well, outside of the US, it seems 866 00:48:11,320 --> 00:48:14,640 Speaker 1: like no one is super concerned. Sweden in two thousand 867 00:48:14,640 --> 00:48:18,080 Speaker 1: eleven had an application to build a repository in 868 00:48:18,160 --> 00:48:23,319 Speaker 1: Forsmark, and in their, in their literal application, they basically said, 869 00:48:24,040 --> 00:48:26,279 Speaker 1: you know what, we're gonna worry about that later, in 870 00:48:26,360 --> 00:48:28,960 Speaker 1: seventy years, when this thing's finished. They said, essentially, 871 00:48:29,080 --> 00:48:31,640 Speaker 1: can we just kick this seventy years down the road? 872 00:48:32,080 --> 00:48:35,960 Speaker 1: And the Swedish National Archives, who they consulted on the application, 873 00:48:36,760 --> 00:48:39,080 Speaker 1: they said, that's really insufficient. It said it gives the 874 00:48:39,120 --> 00:48:43,720 Speaker 1: impression that one intends to postpone important documentation efforts until 875 00:48:43,760 --> 00:48:46,920 Speaker 1: the closure of the repository in seventy years. And it's like, 876 00:48:46,960 --> 00:48:50,920 Speaker 1: it doesn't give the impression, it literally said that. So, uh, 877 00:48:51,000 --> 00:48:55,200 Speaker 1: I think they're being ultra-polite. Yeah, I think. Well, Sweden, right? Good 878 00:48:55,200 --> 00:48:58,600 Speaker 1: people. In the US, though, don't tell A$AP Rocky that. 879 00:48:58,600 --> 00:49:02,160 Speaker 1: I don't even know what that means.
That's a singer, right? Yeah, 880 00:49:02,160 --> 00:49:04,600 Speaker 1: he's, he's a rapper. He's in prison in Sweden right now. 881 00:49:04,719 --> 00:49:06,759 Speaker 1: And I did not know that. Oh man, what did 882 00:49:06,760 --> 00:49:09,280 Speaker 1: he do? He got into a fight with some Swedish 883 00:49:09,320 --> 00:49:11,080 Speaker 1: kids, and it may or may not have been their fault. 884 00:49:11,120 --> 00:49:13,880 Speaker 1: It looks on video like they definitely provoked it. But 885 00:49:13,920 --> 00:49:16,319 Speaker 1: the King of Sweden is like, sorry, rule of law 886 00:49:16,400 --> 00:49:21,920 Speaker 1: applies to everybody, including super famous Americans. Donald Trump called 887 00:49:21,960 --> 00:49:23,920 Speaker 1: him to try to get the thing resolved at the 888 00:49:23,960 --> 00:49:27,520 Speaker 1: behest of Kanye West. God. And apparently it just made 889 00:49:27,520 --> 00:49:29,480 Speaker 1: everything worse. And now the King of Sweden is like, 890 00:49:29,520 --> 00:49:33,640 Speaker 1: there's no chance he's getting released early. Wow. Man, where 891 00:49:33,680 --> 00:49:37,200 Speaker 1: have I been? This is reality. What I just said 892 00:49:37,360 --> 00:49:41,360 Speaker 1: is actual fact that actually happened here in two thousand nineteen, everybody. 893 00:49:41,760 --> 00:49:46,440 Speaker 1: Humans of the far future, can you believe it? Humans 894 00:49:46,480 --> 00:49:49,080 Speaker 1: of the near future. John Lomberg, that guy we were talking 895 00:49:49,080 --> 00:49:52,680 Speaker 1: about earlier, who was on that original panel, he told 896 00:49:52,800 --> 00:49:56,319 Speaker 1: Vice just a couple of years ago,
um, a lot 897 00:49:56,360 --> 00:49:58,440 Speaker 1: of us had been around the block a few times before, 898 00:49:58,480 --> 00:50:00,640 Speaker 1: because, you know, he was back then doing the same 899 00:50:00,680 --> 00:50:02,520 Speaker 1: thing and knew this was going to be a report 900 00:50:02,560 --> 00:50:05,080 Speaker 1: the government only did. And this is the US, and 901 00:50:05,120 --> 00:50:08,040 Speaker 1: we're putting more thought towards this than anyone. Yeah, which 902 00:50:08,160 --> 00:50:10,959 Speaker 1: is really surprising. He said they only did this because 903 00:50:10,960 --> 00:50:13,120 Speaker 1: they needed to show compliance. They didn't really care what 904 00:50:13,200 --> 00:50:17,799 Speaker 1: we said. And then, uh, and from the Human Interference 905 00:50:17,840 --> 00:50:21,080 Speaker 1: Task Force, during the competition, they basically said, the most 906 00:50:21,080 --> 00:50:23,800 Speaker 1: effective sign will be the dead bodies of those foolish 907 00:50:23,880 --> 00:50:29,440 Speaker 1: enough to ignore whatever sign. So basically, like, who cares? 908 00:50:29,840 --> 00:50:31,680 Speaker 1: Someone will, someone will get in there and they'll 909 00:50:31,800 --> 00:50:34,000 Speaker 1: all die, and then that'll be the big warning, right? 910 00:50:34,040 --> 00:50:37,120 Speaker 1: Which makes sense if, if humans are in communication around 911 00:50:37,160 --> 00:50:39,279 Speaker 1: the globe and you've got the same warning going around. But 912 00:50:39,320 --> 00:50:43,520 Speaker 1: if they're not, then it's catastrophe, catastrophe, catastrophe. But at 913 00:50:43,600 --> 00:50:45,880 Speaker 1: least we fulfilled our part of the bargain, where we 914 00:50:45,920 --> 00:50:50,319 Speaker 1: really tried to warn everybody. Agreed. You got anything else?
Yeah, 915 00:50:50,880 --> 00:50:53,880 Speaker 1: if you will indulge me, I would like to plug 916 00:50:54,200 --> 00:50:57,000 Speaker 1: The End of the World with Josh Clark. What? The 917 00:50:57,120 --> 00:50:59,600 Speaker 1: End of the World with Josh Clark. If, if, like, 918 00:50:59,719 --> 00:51:02,160 Speaker 1: thinking about things in, like, far deep time and the 919 00:51:02,200 --> 00:51:04,560 Speaker 1: future of humanity and all that stuff kind of floated 920 00:51:04,600 --> 00:51:07,640 Speaker 1: your boat, I would recommend my little podcast series The 921 00:51:07,719 --> 00:51:09,480 Speaker 1: End of the World with Josh Clark, for sure. This 922 00:51:09,840 --> 00:51:12,719 Speaker 1: is right up your alley. Thank you, Chuck. Uh, and 923 00:51:12,760 --> 00:51:14,880 Speaker 1: since Chuck said right up your alley, it's time for 924 00:51:14,920 --> 00:51:20,080 Speaker 1: a listener mail. Hey guys, we are strangers, but we aren't. 925 00:51:20,560 --> 00:51:22,320 Speaker 1: You've been with me during the most challenging times of 926 00:51:22,360 --> 00:51:24,239 Speaker 1: my life. I've listened to your show for about seven years. 927 00:51:24,640 --> 00:51:28,279 Speaker 1: I'm an English teacher. My students have taken to making fun 928 00:51:28,320 --> 00:51:30,520 Speaker 1: of me because I always start lessons with, so, I 929 00:51:30,560 --> 00:51:33,160 Speaker 1: was listening to Stuff You Should Know. I went through 930 00:51:33,200 --> 00:51:35,320 Speaker 1: a huge life change recently. I was in a relationship 931 00:51:35,360 --> 00:51:38,040 Speaker 1: for five years, engaged for four of them, uh, and 932 00:51:38,080 --> 00:51:40,960 Speaker 1: moved from Phoenix to Charlotte after ending that relationship, which 933 00:51:41,000 --> 00:51:43,879 Speaker 1: was incredibly difficult to do. During the drive, I listened 934 00:51:43,880 --> 00:51:46,719 Speaker 1: to you guys for the entire thirty-four hours.
Wow, 935 00:51:47,640 --> 00:51:52,080 Speaker 1: can you imagine? No, I honestly can't. No music, just 936 00:51:52,200 --> 00:51:54,600 Speaker 1: you guys. My heart was so broken. I didn't think 937 00:51:55,239 --> 00:51:57,200 Speaker 1: I would ever be able to recover from that trauma. 938 00:51:57,239 --> 00:51:59,759 Speaker 1: But the trauma of listening to us for thirty-four 939 00:52:00,880 --> 00:52:02,520 Speaker 1: hours... But you didn't know that. You were able to comfort 940 00:52:02,520 --> 00:52:05,040 Speaker 1: me and calm me down. My brother, who helped me move, 941 00:52:05,600 --> 00:52:07,720 Speaker 1: asked me what I needed to listen to during the drive. 942 00:52:07,760 --> 00:52:09,600 Speaker 1: I told him I wanted to listen to Stuff You 943 00:52:09,600 --> 00:52:11,520 Speaker 1: Should Know. He had never heard of it. But now 944 00:52:11,680 --> 00:52:15,000 Speaker 1: my brother Nick is also a fan, whether he likes 945 00:52:15,000 --> 00:52:17,600 Speaker 1: it or not, and we almost always start our conversations 946 00:52:17,600 --> 00:52:19,040 Speaker 1: now with, did you listen to the last Stuff You 947 00:52:19,080 --> 00:52:21,279 Speaker 1: Should Know? So I just want to give you guys 948 00:52:21,360 --> 00:52:24,480 Speaker 1: kudos, kudos for being incredible. Please give a shout-out 949 00:52:24,480 --> 00:52:26,879 Speaker 1: to Justin, a fan that learned about you guys from 950 00:52:26,920 --> 00:52:29,719 Speaker 1: me, in case, uh, he didn't hear it the first time: 951 00:52:29,840 --> 00:52:34,160 Speaker 1: Hello, Justin Potter. Wow. Thanks for giving me calm in times 952 00:52:34,200 --> 00:52:36,600 Speaker 1: of adversity. I know we are strangers, but we are 953 00:52:36,600 --> 00:52:39,200 Speaker 1: not, actually, because you have been with me during struggles 954 00:52:39,200 --> 00:52:41,600 Speaker 1: of my life.
I credit you for getting me through the 955 00:52:41,600 --> 00:52:44,279 Speaker 1: hardest times, and I will be a lifelong fan of 956 00:52:44,320 --> 00:52:47,480 Speaker 1: you both. That is from Kate. Thanks, Kate. I'm really 957 00:52:47,480 --> 00:52:50,440 Speaker 1: glad we got to play some small part in getting 958 00:52:50,440 --> 00:52:53,239 Speaker 1: you back on the road to... yeah, to happiness. Yeah, 959 00:52:53,239 --> 00:52:55,200 Speaker 1: I hope everything's going great for you. Yeah, for real. 960 00:52:55,880 --> 00:52:57,279 Speaker 1: If you want to get in touch with us like 961 00:52:57,400 --> 00:52:59,560 Speaker 1: Kate did, just to say hi or to say thanks, 962 00:52:59,640 --> 00:53:01,919 Speaker 1: or to say, you guys really screwed up, it's cool, 963 00:53:02,239 --> 00:53:04,120 Speaker 1: you can go onto Stuff You Should Know dot com 964 00:53:04,160 --> 00:53:06,760 Speaker 1: and check out our social links, and you can also 965 00:53:06,840 --> 00:53:09,880 Speaker 1: send us a good old-fashioned email to stuff podcast 966 00:53:10,040 --> 00:53:15,600 Speaker 1: at iHeartRadio dot com. Stuff You Should Know is 967 00:53:15,600 --> 00:53:18,200 Speaker 1: a production of iHeartRadio's How Stuff Works. For more 968 00:53:18,239 --> 00:53:20,719 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 969 00:53:20,800 --> 00:53:23,400 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.