1 00:00:01,280 --> 00:00:04,000 Speaker 1: Hey everybody, it's me Josh, and for this week's select, 2 00:00:04,240 --> 00:0006,920 Speaker 1: I've chosen one of my favorite episodes in honor of 3 00:00:06,960 --> 00:00:10,200 Speaker 1: the birthday of my favorite person, my dear sweet wife, Yumi, 4 00:00:10,320 --> 00:00:13,560 Speaker 1: whose birthday is today. Since this episode is so cool 5 00:00:13,720 --> 00:00:16,720 Speaker 1: and Yumi is too, I figured it was just logical, 6 00:00:17,079 --> 00:00:20,560 Speaker 1: and logical stuff really floats my boat. Plus, what better 7 00:00:20,600 --> 00:00:24,000 Speaker 1: way to send off a not too bad year, considering everything, 8 00:00:24,320 --> 00:00:26,759 Speaker 1: especially the last couple of years, than with a real 9 00:00:26,800 --> 00:00:31,200 Speaker 1: head scratcher, fascinating, interesting episode. I hope you enjoy it. 10 00:00:31,480 --> 00:00:36,159 Speaker 1: Happy birthday to Yumi, and happy New Year to everybody. 11 00:00:39,400 --> 00:00:48,320 Speaker 2: Welcome to Stuff You Should Know, a production of iHeartRadio. 12 00:00:48,960 --> 00:00:52,480 Speaker 1: Hey, and welcome to the podcast. I'm Josh Clark. There's 13 00:00:52,560 --> 00:00:56,320 Speaker 1: Charles W. Chuck Bryant, there's Jerry over there. And this 14 00:00:56,440 --> 00:01:00,960 Speaker 1: is Stuff You Should Know. It's a podcast. 15 00:01:01,600 --> 00:01:03,560 Speaker 2: And you're about to say that it's the Blank Edition. 16 00:01:04,200 --> 00:01:05,960 Speaker 1: Yeah, I was, but I couldn't think of anything. 17 00:01:06,200 --> 00:01:09,480 Speaker 2: It was literally the Blank Edition, was it? I mean, 18 00:01:09,480 --> 00:01:10,680 Speaker 2: you couldn't think of anything you were playing? 19 00:01:10,720 --> 00:01:13,160 Speaker 1: No, no, that's right, it was the Blank Edition. Oh gosh, 20 00:01:13,160 --> 00:01:14,520 Speaker 1: it's a terrible start, Chuck.
21 00:01:14,800 --> 00:01:19,520 Speaker 2: So how about this, just to divert ourselves from that disaster. 22 00:01:20,680 --> 00:01:23,640 Speaker 2: What was not a disaster were our live shows we 23 00:01:23,720 --> 00:01:27,200 Speaker 2: just did. Oh yeah, we finally got up on stage, 24 00:01:27,200 --> 00:01:30,960 Speaker 2: everyone, for the first time since January. Yeah, kicked the rust off. 25 00:01:31,120 --> 00:01:33,759 Speaker 2: Sure, in Chicago and Toronto. 26 00:01:33,920 --> 00:01:35,640 Speaker 1: And both of them, we just killed. 27 00:01:35,680 --> 00:01:37,759 Speaker 2: They were great. Yeah, the audiences were great. 28 00:01:37,840 --> 00:01:40,560 Speaker 1: Everyone had a really great time. Yeah, they told us so, 29 00:01:40,800 --> 00:01:43,920 Speaker 1: uh huh. They seemed to be legitimately meaning what they 30 00:01:43,959 --> 00:01:44,440 Speaker 1: were saying. 31 00:01:44,680 --> 00:01:46,880 Speaker 2: Yeah, it was really really great to get back on 32 00:01:46,959 --> 00:01:47,960 Speaker 2: stage with you, my friend. 33 00:01:48,040 --> 00:01:51,960 Speaker 1: And also, also, also, hats off to Chicago for showing up. 34 00:01:52,080 --> 00:01:54,800 Speaker 1: They showed up. Like, we called you guys out and 35 00:01:54,840 --> 00:01:57,760 Speaker 1: you responded, yeah, thank you very much. Right, and thank you 36 00:01:57,800 --> 00:01:59,760 Speaker 1: Toronto for not making us call you out. 37 00:02:00,760 --> 00:02:03,600 Speaker 2: But there are still tickets remaining for August twenty ninth 38 00:02:03,600 --> 00:02:06,520 Speaker 2: in Boston, the Wilbur, and Portland, Maine. You know, we're 39 00:02:06,600 --> 00:02:12,280 Speaker 2: venturing up into the hinterlands of America right after that. 40 00:02:12,320 --> 00:02:16,639 Speaker 2: But August thirtieth, there are still plenty of great tickets 41 00:02:16,720 --> 00:02:19,240 Speaker 2: left there.
And then the same can be said in 42 00:02:19,280 --> 00:02:24,480 Speaker 2: October, in Orlando, and October tenth. I think I said 43 00:02:24,480 --> 00:02:27,320 Speaker 2: October ninth, right, in Orlando. October tenth in New Orleans. Yep, 44 00:02:27,400 --> 00:02:29,440 Speaker 2: that's right. Brooklyn, I'm not worried about that. 45 00:02:29,440 --> 00:02:32,600 Speaker 1: That's already all sold out, the whole thing, all three nights. 46 00:02:33,000 --> 00:02:34,200 Speaker 2: Man, should we add a fourth? 47 00:02:35,200 --> 00:02:36,880 Speaker 1: Jeez, I don't know. 48 00:02:36,919 --> 00:02:39,520 Speaker 2: We'll talk about it anyway. Thanks to everyone who came out. 49 00:02:39,560 --> 00:02:41,160 Speaker 2: It was a lot of fun. And this is a 50 00:02:41,160 --> 00:02:42,840 Speaker 2: good one. So you don't want, you don't want to 51 00:02:42,840 --> 00:02:43,160 Speaker 2: miss it. 52 00:02:43,240 --> 00:02:46,079 Speaker 1: Yeah, so come on out, especially you, Portland, Maine. Let's 53 00:02:46,080 --> 00:02:46,440 Speaker 1: get with it. 54 00:02:46,440 --> 00:02:49,760 Speaker 2: All right now. Nuclear Semiotics, which I didn't know 55 00:02:49,840 --> 00:02:50,720 Speaker 2: I loved, but I do. 56 00:02:51,320 --> 00:02:53,760 Speaker 1: Really? Do you remember, Ninety Nine Percent Invisible did a 57 00:02:53,840 --> 00:02:55,959 Speaker 1: very famous episode on this very topic. 58 00:02:56,040 --> 00:02:56,840 Speaker 2: Oh, I didn't hear that. 59 00:02:57,040 --> 00:03:00,360 Speaker 1: I specifically avoided going back and listening to it because 60 00:03:00,400 --> 00:03:04,120 Speaker 1: I don't want to be stunk upon by its taint. 61 00:03:04,880 --> 00:03:05,640 Speaker 1: Does that make sense? 62 00:03:05,720 --> 00:03:07,240 Speaker 2: You don't want Roman Mars' taint. 63 00:03:08,840 --> 00:03:11,560 Speaker 1: It's more like, it's just such a classic episode that 64 00:03:11,639 --> 00:03:13,480 Speaker 1: I don't want it to like leak in.
I don't want 65 00:03:13,480 --> 00:03:14,680 Speaker 1: to accidentally rip it off. 66 00:03:14,880 --> 00:03:18,880 Speaker 2: Yeah. Well, we certainly can't Ninety Nine Percent Invisible this thing, 67 00:03:18,880 --> 00:03:21,240 Speaker 2: because that is a show that exists at the top 68 00:03:21,280 --> 00:03:22,959 Speaker 2: echelon of this industry. 69 00:03:23,080 --> 00:03:26,359 Speaker 1: Sure, so, so do we. Sure, we're up there, all right? 70 00:03:26,639 --> 00:03:28,760 Speaker 1: But if you like this one, if this stuff like 71 00:03:28,760 --> 00:03:30,360 Speaker 1: floats your boat and you're like, I want to know more, 72 00:03:30,480 --> 00:03:33,079 Speaker 1: go listen to the Ninety Nine Percent Invisible episode. 73 00:03:33,440 --> 00:03:36,680 Speaker 2: Yeah, this thing really triggered a lot of like synapses 74 00:03:36,760 --> 00:03:40,920 Speaker 2: firing for me, and I think, like, I think I 75 00:03:40,960 --> 00:03:46,720 Speaker 2: really enjoy this kind of thought experiment, problem solving stuff. 76 00:03:46,840 --> 00:03:47,600 Speaker 1: Oh yeah. 77 00:03:47,520 --> 00:03:49,520 Speaker 2: I think I would really dig, like, that part of the 78 00:03:49,600 --> 00:03:53,680 Speaker 2: zombie apocalypse is figuring the stuff out as a team, right? 79 00:03:54,120 --> 00:03:56,200 Speaker 2: Because the whole time I was reading this, I was saying, 80 00:03:56,360 --> 00:03:59,320 Speaker 2: great idea, terrible idea. They should do this, they shouldn't 81 00:03:59,320 --> 00:03:59,640 Speaker 2: do that. 82 00:04:00,200 --> 00:04:02,480 Speaker 1: Sit down. Yeah, you. I like the cut of your jib. 83 00:04:02,800 --> 00:04:04,520 Speaker 2: It was really cool. I dug this. I'd never heard 84 00:04:04,560 --> 00:04:05,640 Speaker 2: of it, so thank you. 85 00:04:06,160 --> 00:04:09,200 Speaker 1: Oh, you're very welcome. I actually heard of it before 86 00:04:09,320 --> 00:04:12,680 Speaker 1: Roman Mars made the episode, so I can't really thank him.
87 00:04:12,720 --> 00:04:14,520 Speaker 2: But well, not before he heard of it, because I 88 00:04:14,560 --> 00:04:17,680 Speaker 2: think it's well known that Roman's first words were nuclear semiotics. 89 00:04:18,480 --> 00:04:23,560 Speaker 1: That's true. Yeah, even before Mama, that's right. I could 90 00:04:23,640 --> 00:04:28,040 Speaker 1: totally believe that, actually. Yeah. So what we're talking about, 91 00:04:28,080 --> 00:04:29,640 Speaker 1: as Chuck said a couple of times, for those of 92 00:04:29,640 --> 00:04:33,240 Speaker 1: you who don't know, is nuclear semiotics, and that is 93 00:04:33,279 --> 00:04:39,120 Speaker 1: a very specialized branch, interdisciplinary branch of, I guess, science 94 00:04:39,600 --> 00:04:42,599 Speaker 1: that involves all, basically any field of research that you 95 00:04:42,600 --> 00:04:45,960 Speaker 1: can throw at the wall would probably have some function 96 00:04:46,080 --> 00:04:48,760 Speaker 1: to play in the field of nuclear semiotics. And to 97 00:04:49,160 --> 00:04:51,680 Speaker 1: make a long story short, to do the too long, 98 00:04:51,720 --> 00:04:57,160 Speaker 1: didn't read version of this, the TL semicolon DR, is, 99 00:04:57,839 --> 00:05:01,720 Speaker 1: nuclear semiotics seeks to figure out how to warn 100 00:05:02,400 --> 00:05:04,880 Speaker 1: the future humans to come. 101 00:05:05,120 --> 00:05:07,880 Speaker 2: Or whatever is here. Sure. That's a good point. 102 00:05:07,960 --> 00:05:10,640 Speaker 1: I mean, why discriminate, right? Yeah, to warn the future 103 00:05:10,720 --> 00:05:15,920 Speaker 1: humans or the future super intelligent jellyfish, whatever, to come: Hey, 104 00:05:16,800 --> 00:05:20,320 Speaker 1: this is a very dangerous radioactive dump site that we've 105 00:05:20,360 --> 00:05:22,200 Speaker 1: put here. Stay away. 106 00:05:22,440 --> 00:05:23,600 Speaker 2: Yeah, it's that easy. 107 00:05:23,960 --> 00:05:27,000 Speaker 1: It sounds easy.
The problem is, if you presume 108 00:05:27,040 --> 00:05:30,560 Speaker 1: that it's easy, you're making a lot of assumptions that 109 00:05:30,600 --> 00:05:32,320 Speaker 1: aren't necessarily going to hold up. 110 00:05:32,560 --> 00:05:34,400 Speaker 2: Oh yeah, like a lot of times I was like, they 111 00:05:34,400 --> 00:05:36,520 Speaker 2: should just do... And I would even stop halfway through 112 00:05:36,520 --> 00:05:38,840 Speaker 2: my thought because, like, no, that wouldn't work. 113 00:05:38,920 --> 00:05:42,200 Speaker 1: It's true, because our languages might be gone by then. 114 00:05:42,640 --> 00:05:46,040 Speaker 1: Our symbols don't necessarily make sense outside of the context 115 00:05:46,040 --> 00:05:50,600 Speaker 1: that we understand them in. Civilization might be ridiculously advanced 116 00:05:50,600 --> 00:05:52,680 Speaker 1: by then, or civilization might be in a state of 117 00:05:52,720 --> 00:05:55,400 Speaker 1: collapse by then. We have no idea. But the point 118 00:05:55,440 --> 00:05:58,159 Speaker 1: of nuclear semiotics is to figure out how to come 119 00:05:58,240 --> 00:06:03,280 Speaker 1: up with a message that is understandable to everybody 120 00:06:03,320 --> 00:06:08,320 Speaker 1: in any situation in the future. And the current state 121 00:06:08,360 --> 00:06:10,640 Speaker 1: of the art is, let's figure out how to speak 122 00:06:10,920 --> 00:06:13,320 Speaker 1: as far as ten thousand years into the future. 123 00:06:13,839 --> 00:06:17,159 Speaker 2: Yeah, I mean, and that's like being generous. It needs 124 00:06:17,160 --> 00:06:17,920 Speaker 2: to go beyond that.
125 00:06:18,120 --> 00:06:20,720 Speaker 1: It does, because the whole point of nuclear semiotics, the 126 00:06:20,720 --> 00:06:23,440 Speaker 1: whole point of warning the future, is this stuff, this 127 00:06:23,560 --> 00:06:26,640 Speaker 1: nuclear waste that we're putting into the ground now, is 128 00:06:27,480 --> 00:06:30,279 Speaker 1: going to be dangerous for tens and tens of thousands 129 00:06:30,320 --> 00:06:30,800 Speaker 1: of years. 130 00:06:30,960 --> 00:06:31,279 Speaker 2: Yeah. 131 00:06:31,320 --> 00:06:34,000 Speaker 1: Plutonium two thirty nine has a half life of twenty 132 00:06:34,000 --> 00:06:40,000 Speaker 1: four thousand years. There's something called technetium ninety nine that has 133 00:06:40,000 --> 00:06:42,520 Speaker 1: a half life of two hundred and eleven thousand years. 134 00:06:42,720 --> 00:06:45,360 Speaker 1: Another one is like a one point seven million year 135 00:06:45,440 --> 00:06:48,960 Speaker 1: half life. This is the nuclear waste that we're creating 136 00:06:49,040 --> 00:06:51,120 Speaker 1: now and are putting in the ground. 137 00:06:51,600 --> 00:06:55,200 Speaker 2: Yeah. And Julia Layton, who is one of our writers 138 00:06:55,200 --> 00:06:57,440 Speaker 2: who does great work for us, she made a lot 139 00:06:57,440 --> 00:07:00,240 Speaker 2: of great points, which is like, the history of human 140 00:07:00,240 --> 00:07:03,720 Speaker 2: evolution is two hundred thousand years. Yeah, and like we've 141 00:07:03,760 --> 00:07:06,320 Speaker 2: only been like reading and writing for how long? 142 00:07:06,279 --> 00:07:08,719 Speaker 1: About five thousand, less than six thousand years. 143 00:07:08,920 --> 00:07:12,640 Speaker 2: Yeah. So it's, it sounds, like you said, it 144 00:07:12,680 --> 00:07:15,800 Speaker 2: sounds simple.
And so many times I thought I had, 145 00:07:15,920 --> 00:07:18,600 Speaker 2: I thought I had it cracked, right, only to think, 146 00:07:19,000 --> 00:07:21,680 Speaker 2: like, I was like, why don't they just do something 147 00:07:21,720 --> 00:07:26,000 Speaker 2: purely visual and stage a play of people at that 148 00:07:26,080 --> 00:07:28,840 Speaker 2: site digging in and then dying. And it's like, well, 149 00:07:28,880 --> 00:07:30,680 Speaker 2: what do you do with it? Well, I'll just put 150 00:07:30,680 --> 00:07:34,760 Speaker 2: it on a DVD, sure, that just plays on a loop. Right? 151 00:07:35,400 --> 00:07:36,920 Speaker 2: It's like, well, how are you going to power that thing? 152 00:07:37,120 --> 00:07:40,400 Speaker 1: Right? You know what happens when everybody's converted to Blu-ray? 153 00:07:40,560 --> 00:07:43,480 Speaker 2: Yeah, exactly. Or, you know, I thought, well, then solar, put 154 00:07:43,480 --> 00:07:45,720 Speaker 2: a solar panel up. Will that last forever? But 155 00:07:45,720 --> 00:07:47,800 Speaker 2: what if it doesn't? What if there's like a forever 156 00:07:48,320 --> 00:07:51,760 Speaker 2: nuclear storm or whatever? What if the sun never shines 157 00:07:51,760 --> 00:07:55,320 Speaker 2: again on Earth in eight thousand years? That could happen. 158 00:07:55,400 --> 00:07:58,840 Speaker 1: That's the cool thing about thinking into the deep future 159 00:07:58,960 --> 00:08:01,160 Speaker 1: is all the things that will go. Yeah. It makes 160 00:08:01,200 --> 00:08:05,240 Speaker 1: you realize like how specific everything you think and know 161 00:08:05,360 --> 00:08:07,880 Speaker 1: and understand really is to your current time. 162 00:08:08,240 --> 00:08:10,200 Speaker 2: Yeah, it's very cool.
She brings up the point about 163 00:08:10,200 --> 00:08:13,040 Speaker 2: an apple. Like, when you see the word apple, you 164 00:08:13,040 --> 00:08:15,720 Speaker 2: don't see the word apple, right? You see, you visualize 165 00:08:15,720 --> 00:08:17,720 Speaker 2: the symbol, that is, an apple. 166 00:08:17,920 --> 00:08:18,160 Speaker 1: Right. 167 00:08:18,320 --> 00:08:21,600 Speaker 2: So it's like, it's almost like the words, very much, 168 00:08:21,680 --> 00:08:25,360 Speaker 2: the words will just not have meaning anymore at some point, right? 169 00:08:25,840 --> 00:08:28,880 Speaker 1: Man. Well, let's dig into this stuff. You ready? Let's 170 00:08:28,920 --> 00:08:31,720 Speaker 1: do it. So to start, we should talk about where 171 00:08:31,720 --> 00:08:34,800 Speaker 1: this all came from. It came from a new type 172 00:08:34,880 --> 00:08:41,319 Speaker 1: of nuclear storage solution, nuclear waste storage solution, called long 173 00:08:41,400 --> 00:08:46,600 Speaker 1: term geological repositories, and it is basically digging into the 174 00:08:46,640 --> 00:08:50,079 Speaker 1: earth, a couple of miles into the earth, putting our 175 00:08:50,160 --> 00:08:53,520 Speaker 1: nuclear waste there, again, waste that's going to be harmful 176 00:08:53,600 --> 00:08:57,360 Speaker 1: to health for tens of thousands, hundreds of thousands of years, 177 00:08:57,920 --> 00:09:01,880 Speaker 1: and sealing it up and then covering over the site 178 00:09:01,880 --> 00:09:04,559 Speaker 1: and then putting a warning on there. And right now, 179 00:09:04,960 --> 00:09:09,199 Speaker 1: the general consensus is that salt beds are the best 180 00:09:09,240 --> 00:09:11,760 Speaker 1: place to put that nuclear waste. And there's actually some 181 00:09:11,800 --> 00:09:12,920 Speaker 1: pretty good reasons why. 182 00:09:13,160 --> 00:09:16,240 Speaker 2: Yeah, we could do an episode on nuclear storage.
I 183 00:09:16,240 --> 00:09:18,400 Speaker 2: think I really want to, in and of itself. Yeah, 184 00:09:18,400 --> 00:09:20,120 Speaker 2: I don't know if that's a shorty or a longie. 185 00:09:20,200 --> 00:09:23,079 Speaker 1: It's probably a longie. Yeah. But just briefly, the reason 186 00:09:23,160 --> 00:09:26,120 Speaker 1: salt beds are preferable is because the fact that they're 187 00:09:26,160 --> 00:09:29,719 Speaker 1: even there suggests that there's no water. If there 188 00:09:29,800 --> 00:09:32,360 Speaker 1: was water, they would have been dissolved long ago. It's 189 00:09:32,360 --> 00:09:35,760 Speaker 1: really relatively easy to mine into them. And then what's 190 00:09:35,800 --> 00:09:39,080 Speaker 1: awesome about salt is that when you mine a shaft 191 00:09:39,200 --> 00:09:43,520 Speaker 1: into a salt bed and you put your deposit there 192 00:09:44,240 --> 00:09:48,079 Speaker 1: and then you pull back out, the salt bed actually 193 00:09:48,240 --> 00:09:51,160 Speaker 1: heals itself over, like, just a few 194 00:09:50,920 --> 00:09:52,600 Speaker 2: decades. Seals itself back up, right. 195 00:09:52,679 --> 00:09:55,400 Speaker 1: Yes. So you put a container that's been engineered to 196 00:09:55,440 --> 00:09:57,760 Speaker 1: hold the nuclear waste inside for ten thousand years. 197 00:09:57,840 --> 00:09:59,800 Speaker 2: Yeah, it's also in a container. I should point that 198 00:09:59,800 --> 00:10:00,600 Speaker 2: out, right. 199 00:10:00,800 --> 00:10:05,560 Speaker 1: You're putting it into a borehole in the salt. The 200 00:10:05,600 --> 00:10:08,199 Speaker 1: salt is going to grow back around it and entomb it, 201 00:10:09,040 --> 00:10:11,520 Speaker 1: perhaps permanently, in this salt. 202 00:10:11,720 --> 00:10:13,280 Speaker 2: Yeah, very strong too, right? 203 00:10:13,600 --> 00:10:16,080 Speaker 1: Yeah, it is fairly strong.
I mean, like, if you're 204 00:10:16,160 --> 00:10:20,040 Speaker 1: mining using modern mining equipment, it's really easy to mine into. 205 00:10:20,520 --> 00:10:22,560 Speaker 1: But if you just have like a pickaxe or something, 206 00:10:22,600 --> 00:10:24,920 Speaker 1: it's rock, too. Rock salt is what it's called. Right? 207 00:10:25,440 --> 00:10:27,400 Speaker 1: So there's a lot of reasons why people have figured 208 00:10:27,400 --> 00:10:30,120 Speaker 1: out, like, this is not a bad idea to entomb 209 00:10:30,320 --> 00:10:34,120 Speaker 1: nuclear waste. But here's the thing. We can't just 210 00:10:34,400 --> 00:10:38,240 Speaker 1: entomb it and walk away. Like, we have a responsibility, 211 00:10:38,360 --> 00:10:41,680 Speaker 1: for those of us generating this waste today, to warn 212 00:10:41,720 --> 00:10:44,280 Speaker 1: the future. Sure, and it's on the future if they 213 00:10:44,520 --> 00:10:47,160 Speaker 1: listen to us or not; that's on them. 214 00:10:47,000 --> 00:10:49,239 Speaker 2: Right, but we have to make them able to listen 215 00:10:49,000 --> 00:10:51,719 Speaker 1: to us, exactly. Like, we have a responsibility to do that. 216 00:10:51,760 --> 00:10:54,600 Speaker 1: Because some people have proposed, like, hey, let's just bury 217 00:10:54,640 --> 00:10:56,959 Speaker 1: it and forget about it. The chances of somebody actually finding 218 00:10:56,960 --> 00:11:00,000 Speaker 1: it are pretty slim. Just bury it and forget about it. 219 00:11:00,120 --> 00:11:02,240 Speaker 1: That's probably the best, the best way to go. And 220 00:11:02,280 --> 00:11:04,480 Speaker 1: people said it's not a bad idea, but it's actually 221 00:11:04,520 --> 00:11:05,240 Speaker 1: a pretty bad idea. 222 00:11:05,320 --> 00:11:07,480 Speaker 2: See, actually I thought that one wasn't the worst idea. 223 00:11:07,760 --> 00:11:10,480 Speaker 2: It's not. That was a behavioral psychologist.
He was like, 224 00:11:10,880 --> 00:11:12,800 Speaker 2: and he wasn't like, just forget about it. He was like, 225 00:11:12,880 --> 00:11:16,320 Speaker 2: maybe the smartest thing to do is to leave it unmarked. 226 00:11:15,960 --> 00:11:19,280 Speaker 1: Right, because, as we'll see, attracting attention to something, like, 227 00:11:19,320 --> 00:11:20,600 Speaker 1: exactly, attracts attention. 228 00:11:20,720 --> 00:11:22,680 Speaker 2: I know, it's an interesting thought experiment, right? 229 00:11:22,880 --> 00:11:25,960 Speaker 1: That psychologist, by the way, was doctor Percy Tannenbaum. 230 00:11:26,240 --> 00:11:28,400 Speaker 2: No, really? No wonder I liked it. 231 00:11:28,320 --> 00:11:29,880 Speaker 1: Of the East Hampton Tannenbaums. 232 00:11:29,920 --> 00:11:31,800 Speaker 2: So we should point out that there's a couple of 233 00:11:33,040 --> 00:11:35,920 Speaker 2: big times that this has 234 00:11:35,960 --> 00:11:38,160 Speaker 2: been commissioned, like, hey, we need to think of something. 235 00:11:38,960 --> 00:11:41,960 Speaker 2: One for a site that never happened, and one for 236 00:11:42,000 --> 00:11:44,200 Speaker 2: a site that has happened. The one that has happened, 237 00:11:44,320 --> 00:11:46,240 Speaker 2: it's the only one in the United States right now. 238 00:11:46,280 --> 00:11:47,840 Speaker 1: The only one in the world, as far as I know. 239 00:11:48,160 --> 00:11:51,640 Speaker 2: No, it's number three. Oh really? It's the third largest. Okay, 240 00:11:52,040 --> 00:11:53,240 Speaker 2: I didn't see what the other two were. 241 00:11:53,360 --> 00:11:55,000 Speaker 1: It must have been the first in the world. 242 00:11:55,000 --> 00:11:57,040 Speaker 2: Then, yeah, probably the first in the world. Okay, yeah, 243 00:11:57,040 --> 00:11:59,640 Speaker 2: which makes sense because the other two are bigger. But 244 00:11:59,679 --> 00:12:02,560 Speaker 2: this is New Mexico.
It's called the WIPP, the Waste 245 00:12:02,559 --> 00:12:08,800 Speaker 2: Isolation Pilot Plant, and this one they're actively 246 00:12:08,840 --> 00:12:12,520 Speaker 2: guarding. They've committed, the Department of Energy is committed, 247 00:12:12,520 --> 00:12:15,400 Speaker 2: to guarding it with people for one hundred years. 248 00:12:15,520 --> 00:12:19,360 Speaker 1: They've hired Barney Fife on a one hundred year contract to 249 00:12:19,440 --> 00:12:20,000 Speaker 1: look over 250 00:12:19,880 --> 00:12:22,160 Speaker 2: this nuclear waste for at least one hundred years. It's 251 00:12:22,160 --> 00:12:23,480 Speaker 2: not like at the end of the hundred years they're 252 00:12:23,520 --> 00:12:25,760 Speaker 2: going to just like put a padlock on it and 253 00:12:25,840 --> 00:12:28,880 Speaker 2: walk away. I imagine they will keep guarding it as 254 00:12:28,880 --> 00:12:30,160 Speaker 2: long as they feel like it needs guarding. 255 00:12:31,679 --> 00:12:33,680 Speaker 1: I don't know if that's the thing. I don't know, man. 256 00:12:33,880 --> 00:12:36,040 Speaker 1: I mean, we're talking about a government run program here. 257 00:12:36,160 --> 00:12:38,720 Speaker 2: Well, at least one hundred years, we can at least say. 258 00:12:38,600 --> 00:12:39,959 Speaker 1: That, yes, they agreed to that. 259 00:12:40,400 --> 00:12:46,280 Speaker 2: So, you know, the whole idea arose before that, though. 260 00:12:46,880 --> 00:12:49,280 Speaker 2: What was the other one, in Nevada? 261 00:12:49,480 --> 00:12:50,880 Speaker 1: That's the Yucca Mountain one. 262 00:12:50,920 --> 00:12:52,559 Speaker 2: That was the first one, right? That's the first one 263 00:12:52,600 --> 00:12:56,040 Speaker 2: that never happened, right? But that's when, you know, in 264 00:12:56,080 --> 00:12:58,760 Speaker 2: the seventies, is when this idea sort of came about.
265 00:12:58,960 --> 00:13:00,679 Speaker 2: And I think it was nineteen eighty when it 266 00:13:00,720 --> 00:13:05,679 Speaker 2: was sort of codified as an official, I guess, science or... 267 00:13:06,720 --> 00:13:09,920 Speaker 1: Yeah, it is a branch. It's an interdisciplinary branch of science, 268 00:13:09,960 --> 00:13:15,040 Speaker 1: nuclear semiotics is. And it's because the EPA came up 269 00:13:15,080 --> 00:13:16,840 Speaker 1: with a rule in nineteen eighty two, a law. 270 00:13:16,960 --> 00:13:19,360 Speaker 2: Really? That's eighty one. I got that wrong, by the way. 271 00:13:19,320 --> 00:13:20,960 Speaker 1: So it's eighty one that they came up with the law. 272 00:13:21,640 --> 00:13:24,680 Speaker 2: Well, it became a discipline in nineteen eighty one with 273 00:13:25,000 --> 00:13:27,000 Speaker 2: that Yucca Mountain Repository project. 274 00:13:27,120 --> 00:13:30,160 Speaker 1: And I think from that Yucca Mountain Repository project, because 275 00:13:30,200 --> 00:13:32,439 Speaker 1: we were starting to figure out how to deposit this 276 00:13:32,480 --> 00:13:35,480 Speaker 1: stuff for a long time, the EPA came up with 277 00:13:35,520 --> 00:13:37,800 Speaker 1: a rule, I think it was nineteen eighty two, that said, 278 00:13:37,960 --> 00:13:40,600 Speaker 1: if you're going to create these kind of repositories for 279 00:13:40,720 --> 00:13:43,280 Speaker 1: nuclear waste, you also have to figure out how to 280 00:13:43,640 --> 00:13:47,360 Speaker 1: come up with a permanent warning sign. And everybody was like, 281 00:13:47,400 --> 00:13:49,400 Speaker 1: that's no problem, of course. And then the EPA said, 282 00:13:49,520 --> 00:13:51,440 Speaker 1: think about it. It's harder than you think. 283 00:13:51,559 --> 00:13:56,680 Speaker 2: They said, just slap on that nuclear waste logo that everyone knows. Sure. 284 00:13:56,960 --> 00:13:59,120 Speaker 2: And everyone was like, everyone doesn't know that.
285 00:13:59,280 --> 00:14:00,480 Speaker 1: Well, it's been around forever. 286 00:14:00,559 --> 00:14:05,079 Speaker 2: Everyone doesn't know that now, much less in two hundred 287 00:14:05,120 --> 00:14:05,720 Speaker 2: thousand years. 288 00:14:05,760 --> 00:14:07,840 Speaker 1: Yeah, did you see how that was created? 289 00:14:08,320 --> 00:14:11,440 Speaker 2: Yeah, it was. It was a group doodle. I don't 290 00:14:11,480 --> 00:14:13,480 Speaker 2: know how that happens. I think that means they can't 291 00:14:13,520 --> 00:14:14,760 Speaker 2: ascribe it to one person. 292 00:14:14,960 --> 00:14:16,680 Speaker 1: They know there was like five people and one of 293 00:14:16,679 --> 00:14:20,360 Speaker 1: those giant, like, silver spoons, pencils, or Crayola crayons. 294 00:14:20,560 --> 00:14:24,840 Speaker 2: Yeah, this was in nineteen forty six, was it, at Berkeley? Yeah, 295 00:14:24,880 --> 00:14:27,000 Speaker 2: and it was a group doodle in this science class. 296 00:14:27,760 --> 00:14:30,480 Speaker 1: And is that, is that an album or a 297 00:14:30,520 --> 00:14:31,000 Speaker 1: band name? 298 00:14:31,240 --> 00:14:31,840 Speaker 2: Group Doodle? 299 00:14:31,960 --> 00:14:33,680 Speaker 1: It's like the Wiggles or something. 300 00:14:33,920 --> 00:14:35,440 Speaker 2: Yeah, I think it's an album title for 301 00:14:35,400 --> 00:14:38,280 Speaker 1: sure. So the Wiggles' Group Doodle, absolutely. 302 00:14:38,320 --> 00:14:39,800 Speaker 2: Okay, I bet that's probably a real thing. 303 00:14:40,280 --> 00:14:41,680 Speaker 1: That's our gift to you, Wiggles. 304 00:14:41,880 --> 00:14:44,000 Speaker 2: But I saw this was interesting.
In nineteen forty eight, 305 00:14:44,080 --> 00:14:47,400 Speaker 2: the symbol came under consideration for wider use, because at 306 00:14:47,440 --> 00:14:50,680 Speaker 2: first it was just a group doodle, and then the 307 00:14:50,720 --> 00:14:56,120 Speaker 2: Brookhaven National Laboratory requested a standardized symbol with standardized colors 308 00:14:56,680 --> 00:15:00,400 Speaker 2: for their radiation safety program, and there was more argument 309 00:15:00,400 --> 00:15:04,920 Speaker 2: about the colors than the actual symbol, because at first 310 00:15:04,960 --> 00:15:06,400 Speaker 2: they were like, you can't use yellow because we use 311 00:15:06,480 --> 00:15:07,440 Speaker 2: yellow for a lot of stuff. 312 00:15:07,520 --> 00:15:10,720 Speaker 1: Yeah, they wanted to make sure that it didn't get overused, 313 00:15:10,800 --> 00:15:12,600 Speaker 1: so that people didn't just become kind of blind to it 314 00:15:12,600 --> 00:15:13,600 Speaker 1: because they saw it so much. 315 00:15:13,600 --> 00:15:16,320 Speaker 2: And they were like, have you heard of Stryper? Exactly. 316 00:15:16,320 --> 00:15:17,600 Speaker 2: They can't use yellow and black. 317 00:15:17,680 --> 00:15:19,000 Speaker 1: They were like, no, I haven't heard of them, and they were like, 318 00:15:19,000 --> 00:15:21,120 Speaker 1: give us forty years, you'll have heard of them, believe me. 319 00:15:21,440 --> 00:15:23,040 Speaker 2: And then in forty two years, no one will have 320 00:15:23,080 --> 00:15:28,920 Speaker 2: heard of them. So I think the original design was 321 00:15:28,920 --> 00:15:31,040 Speaker 2: ... I saw them in concert. We weren't even talking. 322 00:15:31,080 --> 00:15:32,880 Speaker 1: Oh, I believe it. 323 00:15:32,880 --> 00:15:35,880 Speaker 2: It was magenta blades on a blue background, was the 324 00:15:35,920 --> 00:15:39,280 Speaker 2: original design, and it was chosen because it was uncommon.
325 00:15:39,360 --> 00:15:43,600 Speaker 2: But then in Oak Ridge, Tennessee, at the Oak Ridge National Laboratory, 326 00:15:43,400 --> 00:15:46,280 Speaker 2: they went with the yellow background in nineteen forty eight, 327 00:15:46,400 --> 00:15:48,680 Speaker 2: later on in nineteen forty eight, and I guess it stuck. 328 00:15:48,840 --> 00:15:51,720 Speaker 1: That's where the Oak Ridge Boys were all scientists. That's right. 329 00:15:52,520 --> 00:15:56,280 Speaker 1: So it was originally magenta on blue. Right? Yes. And 330 00:15:56,360 --> 00:15:58,280 Speaker 1: the logo we're talking about, for those of you, I know, 331 00:15:58,320 --> 00:16:02,360 Speaker 1: it's called the nuclear trefoil. It's a circle and then 332 00:16:02,720 --> 00:16:06,440 Speaker 1: three blades around it. And from what I saw, 333 00:16:06,480 --> 00:16:10,480 Speaker 1: one of the original group doodlers explained it as, it's 334 00:16:10,520 --> 00:16:13,360 Speaker 1: supposed to be an atom with activity around it. 335 00:16:13,520 --> 00:16:15,080 Speaker 2: Yeah, that's it. 336 00:16:14,840 --> 00:16:16,920 Speaker 1: Which I never saw before, but now that I've 337 00:16:16,960 --> 00:16:19,680 Speaker 1: read that, I can't unsee it. And that is really 338 00:16:19,680 --> 00:16:21,600 Speaker 1: what it looks like. It's a pretty great little doodle. 339 00:16:21,840 --> 00:16:24,800 Speaker 1: But like you said, that is not a universally 340 00:16:24,840 --> 00:16:28,800 Speaker 1: accepted symbol, which is a big problem. And it doesn't, 341 00:16:29,080 --> 00:16:31,800 Speaker 1: it doesn't evoke like, oh, an atom, of course, I 342 00:16:31,840 --> 00:16:33,720 Speaker 1: know what an atom looks like. I just saw one 343 00:16:33,800 --> 00:16:35,760 Speaker 1: go down the street a second ago, and this looks 344 00:16:35,760 --> 00:16:39,240 Speaker 1: like an atom.
It's a symbolic representation of an atom, 345 00:16:39,440 --> 00:16:42,960 Speaker 1: which means that after people stop thinking about what atoms 346 00:16:42,960 --> 00:16:46,280 Speaker 1: look like, maybe a thousand years or five thousand years 347 00:16:46,280 --> 00:16:48,800 Speaker 1: down the road, if something happens, no one's going to 348 00:16:48,800 --> 00:16:51,040 Speaker 1: look at that and be like, oh, it's an atom. 349 00:16:51,120 --> 00:16:54,080 Speaker 1: Activity around an atom, that must mean there's radiation here. 350 00:16:54,240 --> 00:16:56,960 Speaker 1: Hence this is a danger sign. That's not going to happen. 351 00:16:57,280 --> 00:16:57,400 Speaker 3: Right. 352 00:16:58,000 --> 00:17:00,200 Speaker 2: The other thing you would think is, just put 353 00:17:00,280 --> 00:17:03,560 Speaker 2: it up in a bunch of languages. Done. Yeah, here's the thing. 354 00:17:04,040 --> 00:17:07,399 Speaker 2: Languages are disappearing. I'm going to ask you, actually, what 355 00:17:07,480 --> 00:17:12,160 Speaker 2: is your best guess? A language dies out every blank. 356 00:17:14,560 --> 00:17:20,119 Speaker 1: Nine million seconds? Is that right? Did I nail it? 357 00:17:20,480 --> 00:17:24,280 Speaker 2: Jerk. I got to get out a calculator. A language 358 00:17:24,320 --> 00:17:25,760 Speaker 2: dies out every fourteen days. 359 00:17:26,080 --> 00:17:27,320 Speaker 1: I'm pretty sure it's nine million. 360 00:17:27,480 --> 00:17:29,520 Speaker 2: Isn't that staggering? God? What if it was? Are you 361 00:17:29,560 --> 00:17:30,520 Speaker 2: about to do that? Yeah? 362 00:17:30,560 --> 00:17:31,320 Speaker 1: You keep talking. 363 00:17:31,600 --> 00:17:34,040 Speaker 2: So that's about twenty five languages per year that die 364 00:17:34,040 --> 00:17:34,640 Speaker 2: out. 365 00:17:35,200 --> 00:17:36,639 Speaker 1: That's really sad. 366 00:17:36,960 --> 00:17:39,000 Speaker 2: It is, and it is very sad.
And granted, these 367 00:17:39,000 --> 00:17:42,159 Speaker 2: aren't, you know, major languages, but they're important to the 368 00:17:42,200 --> 00:17:45,359 Speaker 2: people who speak them. Sure, but that's just sort of 369 00:17:45,400 --> 00:17:50,479 Speaker 2: to get across the point that throwing it up 370 00:17:50,520 --> 00:17:53,760 Speaker 2: in a bunch of languages, there's no guarantee. And in fact, 371 00:17:53,760 --> 00:17:58,320 Speaker 2: in all likelihood, in fifty thousand years, there won't be 372 00:17:58,440 --> 00:18:01,720 Speaker 2: English or German or French. There may not even be humans. 373 00:18:02,720 --> 00:18:06,159 Speaker 2: That's a really good point. We may be... what's the calculation? 374 00:18:05,920 --> 00:18:07,800 Speaker 1: Four hundred and forty six days. I was off a little... a lot. 375 00:18:07,840 --> 00:18:11,159 Speaker 1: Okay. There, there may be... we may all be 376 00:18:11,280 --> 00:18:14,960 Speaker 1: like post-biological humans, you know, uploaded our consciousness onto, 377 00:18:15,200 --> 00:18:17,760 Speaker 1: like, the Internet or something. At which point it really 378 00:18:17,800 --> 00:18:19,800 Speaker 1: won't matter, to tell you the truth, where the nuclear 379 00:18:19,840 --> 00:18:22,040 Speaker 1: waste is buried. But who knows? It could be an 380 00:18:22,080 --> 00:18:24,600 Speaker 1: intelligent species, it could be humans who don't know how 381 00:18:24,600 --> 00:18:27,439 Speaker 1: to read or write. The fact is, the stuff 382 00:18:27,440 --> 00:18:30,960 Speaker 1: that we take for granted changes a lot faster than 383 00:18:31,000 --> 00:18:34,520 Speaker 1: you think, and even if it doesn't necessarily die out, 384 00:18:34,840 --> 00:18:38,200 Speaker 1: the changes that come along are pretty alarming. I found... 385 00:18:38,200 --> 00:18:41,440 Speaker 1: I've been watching a lot of Silicon Valley lately. 386 00:18:41,480 --> 00:18:42,760 Speaker 2: I told you, yeah, great show.
387 00:18:43,560 --> 00:18:49,520 Speaker 1: My vocal delivery sounds a lot like Jared's. It's a 388 00:18:49,600 --> 00:18:52,080 Speaker 1: little eerie, you think? A lot. 389 00:18:53,680 --> 00:18:55,240 Speaker 2: I never really put those two together. 390 00:18:55,280 --> 00:18:57,120 Speaker 1: Well, keep an ear out for it now and see 391 00:18:57,160 --> 00:19:00,400 Speaker 1: what you think. I mean, tell me I'm wrong. 392 00:19:00,440 --> 00:19:03,160 Speaker 2: I mean, I would have to dissociate so much, because 393 00:19:03,160 --> 00:19:06,880 Speaker 2: I like you, and Jared is, like, such a pedantic bureaucrat. 394 00:19:06,920 --> 00:19:07,879 Speaker 1: Oh, I love him. 395 00:19:08,119 --> 00:19:10,840 Speaker 2: I mean, he's fun, yeah, to watch, but I wouldn't 396 00:19:10,840 --> 00:19:14,040 Speaker 2: say that he's, like, the most likable character. Maybe he is, 397 00:19:14,080 --> 00:19:14,479 Speaker 2: I don't know. 398 00:19:14,560 --> 00:19:18,600 Speaker 1: I would say pedantic bureaucrat is not entirely off for me. 399 00:19:19,200 --> 00:19:21,160 Speaker 2: Yeah, Jared needs a girlfriend, that's his deal. 400 00:19:21,160 --> 00:19:24,480 Speaker 1: Okay, so I do not, because I have a fine wife. 401 00:19:24,520 --> 00:19:26,800 Speaker 1: That's right. So let me give you an example of 402 00:19:26,800 --> 00:19:29,480 Speaker 1: how English has changed. This is a quote from Sir 403 00:19:29,520 --> 00:19:31,920 Speaker 1: Gawain and the Green Knight. It was written in thirteen 404 00:19:32,000 --> 00:19:35,360 Speaker 1: seventy five. Oh boy, six hundred and fifty years ago. 405 00:19:35,480 --> 00:19:39,159 Speaker 1: All right, this is in English. The steel of a 406 00:19:39,240 --> 00:19:42,840 Speaker 1: stiff staff the stern hit be gripped, that was wound 407 00:19:42,880 --> 00:19:45,800 Speaker 1: in with iron into the wand's end, and all be 408 00:19:45,880 --> 00:19:49,640 Speaker 1: graven with green in gracious works.
And you should see 409 00:19:49,640 --> 00:19:50,160 Speaker 1: it spelled. 410 00:19:50,280 --> 00:19:52,000 Speaker 2: Oh yeah, I mean, I was an English major. We 411 00:19:52,040 --> 00:19:53,639 Speaker 2: had to go through this stuff. It was a slog. 412 00:19:53,760 --> 00:19:55,320 Speaker 1: Do you have a guess at what I just said? 413 00:19:55,800 --> 00:19:58,760 Speaker 2: Yeah, you said that he, uh, the Green Knight sat 414 00:19:58,760 --> 00:19:59,960 Speaker 2: down and watched some Silicon Valley. 415 00:20:00,040 --> 00:20:03,840 Speaker 1: That's right. It's that the grim man gripped it by 416 00:20:03,880 --> 00:20:06,919 Speaker 1: its strong handle, which was wound with iron all the 417 00:20:06,920 --> 00:20:09,640 Speaker 1: way to the end, and graven in green with graceful designs. 418 00:20:09,920 --> 00:20:12,840 Speaker 1: So like, that's English six hundred and fifty years ago. 419 00:20:12,920 --> 00:20:14,119 Speaker 1: English is still around. 420 00:20:13,880 --> 00:20:17,240 Speaker 2: Six hundred and fifty years. We're talking about thousands, tens 421 00:20:17,240 --> 00:20:18,680 Speaker 2: of thousands of years exactly. 422 00:20:18,720 --> 00:20:22,280 Speaker 1: So that's a problem. Languages evolve, languages die, symbols don't 423 00:20:22,320 --> 00:20:24,880 Speaker 1: quite make sense out of context. So there's a lot 424 00:20:24,920 --> 00:20:29,040 Speaker 1: of challenges that face the people who try to explain 425 00:20:29,119 --> 00:20:32,480 Speaker 1: this stuff, or figure out how to explain it to 426 00:20:32,720 --> 00:20:34,719 Speaker 1: future people, I think is a better way to put it. 427 00:20:35,000 --> 00:20:39,400 Speaker 2: That's right. They have looked to semioticians, people 428 00:20:39,440 --> 00:20:43,120 Speaker 2: who are really into this stuff. Sure, I think I'm 429 00:20:43,119 --> 00:20:45,080 Speaker 2: an amateur semiotician after reading this.
430 00:20:45,320 --> 00:20:45,879 Speaker 1: That's great. 431 00:20:45,920 --> 00:20:47,840 Speaker 2: But one thing that they're looking for, because what you 432 00:20:47,880 --> 00:20:51,640 Speaker 2: want, ideally, is instant recognition and not something... I mean, 433 00:20:51,920 --> 00:20:54,199 Speaker 2: yeah, maybe, if you have to figure it out. But 434 00:20:54,280 --> 00:20:57,199 Speaker 2: what you want is something that conveys danger right when 435 00:20:57,240 --> 00:20:57,680 Speaker 2: you look at 436 00:20:57,640 --> 00:21:00,159 Speaker 1: it. Like, just steer clear of this place, right, not 437 00:21:00,280 --> 00:21:04,080 Speaker 1: come closer and start poking around. Just go away. That's right. 438 00:21:04,280 --> 00:21:07,720 Speaker 2: So she makes a great point, though, that, like, it's 439 00:21:07,760 --> 00:21:09,720 Speaker 2: a double edged sword, like you were talking about earlier. 440 00:21:09,760 --> 00:21:12,960 Speaker 2: If you, you know, human beings... if you show an 441 00:21:13,000 --> 00:21:15,640 Speaker 2: extreme skier a sign that says this is danger, don't ski 442 00:21:15,720 --> 00:21:18,719 Speaker 2: this way. Sure, he's gonna say, brah, let's do it. 443 00:21:18,880 --> 00:21:21,960 Speaker 1: Yeah, you know, give me some Homicide power drink. 444 00:21:22,160 --> 00:21:26,320 Speaker 2: So there's a very, very fine line between warning people 445 00:21:26,400 --> 00:21:27,439 Speaker 2: and enticing people. 446 00:21:28,080 --> 00:21:32,000 Speaker 1: Yeah, even inadvertently, exactly, you know. I mean, she 447 00:21:32,000 --> 00:21:33,920 Speaker 1: points out haunted houses, because, I'm like, yeah, not 448 00:21:33,960 --> 00:21:37,480 Speaker 1: everybody's like a Red Bull extreme sports person, but people 449 00:21:37,520 --> 00:21:40,600 Speaker 1: do like haunted houses too. So that abandoned, like, 450 00:21:40,640 --> 00:21:43,120 Speaker 1: scary place is so creepy.
Let's go there for Halloween, 451 00:21:43,320 --> 00:21:46,359 Speaker 1: right, because maybe Halloween survived but the English language didn't. 452 00:21:46,400 --> 00:21:49,840 Speaker 1: Who knows. So yeah, you, you, you really walk a 453 00:21:49,880 --> 00:21:53,159 Speaker 1: fine line here between warning people away and saying I 454 00:21:53,400 --> 00:21:53,960 Speaker 1: dare you. 455 00:21:54,680 --> 00:22:00,320 Speaker 2: Right? Yeah, my whole jam is, I think they need to... uh, 456 00:22:00,400 --> 00:22:03,520 Speaker 2: what will survive, if there are humans at all, is emotion. 457 00:22:04,520 --> 00:22:06,679 Speaker 2: So I think they need to appeal to human emotions 458 00:22:06,720 --> 00:22:11,280 Speaker 2: like fear, more than words and symbols. 459 00:22:11,359 --> 00:22:13,320 Speaker 1: Okay, well, let's take a break and we'll get back 460 00:22:13,320 --> 00:22:14,879 Speaker 1: into this, all right, because this is fun. 461 00:22:15,080 --> 00:22:32,400 Speaker 4: Yes, any stuffy jaws. 462 00:22:41,560 --> 00:22:43,480 Speaker 1: All right, Chuck. So we've kind of talked about how 463 00:22:44,040 --> 00:22:47,600 Speaker 1: things go away. Languages fall away, symbols don't make sense. 464 00:22:47,440 --> 00:22:49,040 Speaker 2: Any, any ephemera. 465 00:22:49,520 --> 00:22:55,119 Speaker 1: It is, it really is, right. So what, what will last? 466 00:22:55,160 --> 00:22:58,320 Speaker 1: What have nuclear semioticians come up with? And should we 467 00:22:58,320 --> 00:23:00,280 Speaker 1: explain what semiotics is in general? 468 00:23:01,520 --> 00:23:01,960 Speaker 2: What is it? 469 00:23:01,960 --> 00:23:05,040 Speaker 1: I don't even know. Oh, just kind of in shorthand, 470 00:23:05,080 --> 00:23:10,440 Speaker 1: semiotics is basically the study of how and why signs 471 00:23:10,480 --> 00:23:13,440 Speaker 1: have meaning.
Okay, right. Like you were saying earlier, how 472 00:23:13,680 --> 00:23:17,119 Speaker 1: the word apple doesn't evoke thoughts of the word apple, 473 00:23:17,200 --> 00:23:20,639 Speaker 1: it evokes thoughts of the round, shiny, tasty fruit that 474 00:23:20,680 --> 00:23:25,119 Speaker 1: grows on a tree. That's a sign. In semiotics, that's 475 00:23:25,119 --> 00:23:28,240 Speaker 1: specifically a discursive sign, because it uses language. 476 00:23:28,520 --> 00:23:33,760 Speaker 2: So what they've done in many cases, and this 477 00:23:33,800 --> 00:23:35,720 Speaker 2: is a great idea for stuff like this, is to 478 00:23:36,080 --> 00:23:40,240 Speaker 2: have a competition. They had one at UCLA, I think 479 00:23:40,280 --> 00:23:42,200 Speaker 2: in two thousand and one, called the Desert 480 00:23:42,240 --> 00:23:49,040 Speaker 2: Space Competition. And what won that year was a cactus, 481 00:23:49,160 --> 00:23:54,320 Speaker 2: a yucca cactus, glowing blue. And then the idea was, 482 00:23:54,480 --> 00:23:58,960 Speaker 2: plant a field of these regular green cacti, and then 483 00:23:59,280 --> 00:24:02,960 Speaker 2: over the place where, you know, the waste is, the repository, 484 00:24:03,000 --> 00:24:04,760 Speaker 2: and then if you see the sign of a glowing 485 00:24:04,800 --> 00:24:08,440 Speaker 2: blue one... I mean, I didn't see 486 00:24:08,440 --> 00:24:09,800 Speaker 2: the rest of them, but I didn't think this one 487 00:24:09,840 --> 00:24:10,879 Speaker 2: was that great. 488 00:24:11,680 --> 00:24:13,520 Speaker 1: I'm sorry to the person who came up with it. 489 00:24:13,480 --> 00:24:15,800 Speaker 2: Though, I know.
I think something they 490 00:24:15,800 --> 00:24:18,879 Speaker 2: should do is go even further back, to younger children, 491 00:24:19,320 --> 00:24:21,680 Speaker 2: like go to an elementary school and 492 00:24:21,720 --> 00:24:23,280 Speaker 2: ask kids, or a high school. 493 00:24:23,560 --> 00:24:26,160 Speaker 1: Right, or you just take each kid out and rub 494 00:24:26,200 --> 00:24:27,760 Speaker 1: their face in the sand and be like, you see this, 495 00:24:27,880 --> 00:24:28,760 Speaker 1: you stay out of here. 496 00:24:29,200 --> 00:24:32,000 Speaker 2: No, I mean, have the kids, like, throw out ideas, 497 00:24:32,040 --> 00:24:34,080 Speaker 2: because I think... oh, oh, I see. Yeah, I think 498 00:24:34,119 --> 00:24:38,440 Speaker 2: a lot of times children can cut 499 00:24:38,480 --> 00:24:41,439 Speaker 2: through to the simplicity of something much better than 500 00:24:41,480 --> 00:24:44,680 Speaker 2: adults can. So that's my idea. Throw it out 501 00:24:44,680 --> 00:24:46,479 Speaker 2: as a science fair project. 502 00:24:46,560 --> 00:24:48,160 Speaker 1: Well, I think that's one of the cool things about 503 00:24:48,240 --> 00:24:52,560 Speaker 1: nuclear semiotics is it's so inviting, like anybody can 504 00:24:52,600 --> 00:24:55,960 Speaker 1: come up with a great idea. It's just so confounding, 505 00:24:56,000 --> 00:24:57,480 Speaker 1: but it's also so accessible. 506 00:24:57,600 --> 00:25:00,199 Speaker 2: Yeah, we'll get ideas. In fact, we want to hear 507 00:25:00,200 --> 00:25:02,160 Speaker 2: from you if you think you have a cool idea, 508 00:25:02,200 --> 00:25:04,320 Speaker 2: a good idea. Like, I guarantee you we're going 509 00:25:04,359 --> 00:25:06,119 Speaker 2: to get some good ones. Yep. We're not going to 510 00:25:06,119 --> 00:25:07,080 Speaker 2: pass them along or anything.
511 00:25:07,400 --> 00:25:10,880 Speaker 1: So rather than just, like, poo-pooing the glowing yucca one, 512 00:25:11,320 --> 00:25:14,560 Speaker 1: here's the problem with the glowing yucca idea. 513 00:25:15,119 --> 00:25:20,720 Speaker 1: It requires explanation, right? So part of the glowing 514 00:25:20,800 --> 00:25:23,960 Speaker 1: yucca is to say, these things have been genetically engineered 515 00:25:23,960 --> 00:25:27,119 Speaker 1: so that when there's radiation present, they glow. So if 516 00:25:27,160 --> 00:25:29,960 Speaker 1: you see this yucca glowing, it means that there's radiation here, 517 00:25:30,040 --> 00:25:34,520 Speaker 1: stay away. Right. If you lose that additional story 518 00:25:34,560 --> 00:25:36,960 Speaker 1: that has to go along with the glowing yucca, then 519 00:25:37,000 --> 00:25:39,480 Speaker 1: you just have glowing yucca. And I can't think of 520 00:25:39,520 --> 00:25:41,679 Speaker 1: a more attractive thing that's going to draw people to 521 00:25:41,760 --> 00:25:45,480 Speaker 1: a site than the legendary glowing yucca that only glows 522 00:25:45,520 --> 00:25:48,399 Speaker 1: in this one spot on Earth. Yeah, that's kind of 523 00:25:48,440 --> 00:25:49,560 Speaker 1: the problem with it, you know. 524 00:25:49,680 --> 00:25:52,159 Speaker 2: I like this other idea from that same year a 525 00:25:52,160 --> 00:25:56,359 Speaker 2: little better, that did not win: Fields of Asphodel, which 526 00:25:56,440 --> 00:25:59,879 Speaker 2: is a Eurasian lily. They said, let's just cover the site 527 00:26:00,280 --> 00:26:03,560 Speaker 2: with metal blades that screech when the wind blows. It 528 00:26:03,600 --> 00:26:05,840 Speaker 2: makes a horrible noise. Right, not bad. 529 00:26:06,200 --> 00:26:09,840 Speaker 1: Here's the problem with that.
Okay. Moving parts. Okay, sure, 530 00:26:09,920 --> 00:26:12,359 Speaker 1: it's been pretty well established that if you're trying to 531 00:26:12,640 --> 00:26:16,520 Speaker 1: convey something to people in the distant future, you 532 00:26:16,680 --> 00:26:20,080 Speaker 1: need to have something that's monolithic and made of one piece, 533 00:26:20,520 --> 00:26:24,080 Speaker 1: because if you have multiple parts, that's an opportunity for 534 00:26:24,160 --> 00:26:27,400 Speaker 1: weathering to occur through the place where the two parts meet, 535 00:26:27,480 --> 00:26:29,560 Speaker 1: or three parts or five parts. And if it's a 536 00:26:29,600 --> 00:26:32,360 Speaker 1: moving part, just kiss the movement goodbye. 537 00:26:32,680 --> 00:26:36,040 Speaker 2: What about this? Okay, I had the thought earlier today 538 00:26:36,119 --> 00:26:39,119 Speaker 2: about just a mountain of razor wire. 539 00:26:39,560 --> 00:26:41,280 Speaker 1: Okay, here's the problem with that. 540 00:26:41,359 --> 00:26:41,679 Speaker 2: Okay. 541 00:26:42,440 --> 00:26:44,679 Speaker 1: And this is the same problem also with the, what 542 00:26:44,760 --> 00:26:47,040 Speaker 1: is it, the steel, the steel stuff that moves 543 00:26:47,080 --> 00:26:49,240 Speaker 1: and everything. Do you want to use... I don't, 544 00:26:49,480 --> 00:26:53,680 Speaker 1: but you want to use stuff that has no value whatsoever, 545 00:26:53,920 --> 00:26:56,399 Speaker 1: not just financially, but usefulness. 546 00:26:55,800 --> 00:26:58,320 Speaker 2: Well, because someone will say, I can harvest that razor wire. 547 00:26:58,400 --> 00:27:00,520 Speaker 1: Yeah, I can go use that to keep the cows 548 00:27:00,600 --> 00:27:01,880 Speaker 1: in at my house next door. 549 00:27:02,119 --> 00:27:03,919 Speaker 2: Yeah. But if you have so much of it... 550 00:27:04,359 --> 00:27:07,840 Speaker 1: Over time, over ten thousand years, people, like, take and make.
551 00:27:08,800 --> 00:27:11,040 Speaker 1: I mean, that's why the pyramids are stripped of, like, 552 00:27:11,080 --> 00:27:13,840 Speaker 1: their more attractive outer... They used to have, like, a 553 00:27:13,920 --> 00:27:19,080 Speaker 1: white, I think limestone, shell encasement. It's gone, because the 554 00:27:19,160 --> 00:27:20,840 Speaker 1: locals were like, oh, I can use that to build 555 00:27:20,840 --> 00:27:25,000 Speaker 1: a fine-looking Pizza Hut. That's exactly what people 556 00:27:25,320 --> 00:27:27,639 Speaker 1: will do if you place something of any kind of 557 00:27:27,720 --> 00:27:30,520 Speaker 1: usefulness out there. True. Like, that is the beauty in 558 00:27:30,600 --> 00:27:33,560 Speaker 1: this: every idea is wrong. 559 00:27:33,359 --> 00:27:36,520 Speaker 2: As a whole, it's so great. It's pretty great. 560 00:27:36,560 --> 00:27:37,160 Speaker 1: I love it. 561 00:27:37,480 --> 00:27:41,040 Speaker 2: So. One of the most often cited bodies of work 562 00:27:41,119 --> 00:27:43,679 Speaker 2: is from nineteen eighty two, eighty three, and this was 563 00:27:43,720 --> 00:27:47,200 Speaker 2: a call for ideas from the German Journal of Semiotics 564 00:27:47,680 --> 00:27:49,720 Speaker 2: that basically said the same thing. It's like, you know, 565 00:27:49,720 --> 00:27:51,800 Speaker 2: what are your ideas? This one got a little goofy, 566 00:27:52,560 --> 00:27:56,480 Speaker 2: to say the least. Someone suggested an artificial moon as 567 00:27:56,480 --> 00:27:57,720 Speaker 2: a storage vessel. 568 00:27:58,960 --> 00:28:00,399 Speaker 1: There's just a huge flaw in that one. 569 00:28:00,440 --> 00:28:02,119 Speaker 2: If you ask me, I mean, I don't even get that. 570 00:28:02,760 --> 00:28:04,960 Speaker 1: Well, it was like, how do you make sure that 571 00:28:05,000 --> 00:28:10,080 Speaker 1: the information about this site stays protected?
Put it into 572 00:28:10,160 --> 00:28:12,639 Speaker 1: an artificial moon in orbit around Earth? But it's like, 573 00:28:12,920 --> 00:28:14,680 Speaker 1: how do you get to the artificial... 574 00:28:14,280 --> 00:28:16,720 Speaker 2: I don't get it, if that's what they meant. Yeah, that doesn't 575 00:28:16,720 --> 00:28:18,880 Speaker 2: make any sense. That's what I think. I guess they were... 576 00:28:18,920 --> 00:28:21,399 Speaker 2: I mean, it said... oh, well, were they beaming it 577 00:28:21,440 --> 00:28:23,400 Speaker 2: down to a TV that won't play? 578 00:28:23,560 --> 00:28:26,320 Speaker 1: That's a different one. Yeah. And that... I just don't 579 00:28:26,400 --> 00:28:27,360 Speaker 1: understand this at all. 580 00:28:27,720 --> 00:28:30,880 Speaker 2: I don't understand the Radioactive Cats either, even though that's 581 00:28:30,920 --> 00:28:32,120 Speaker 2: a decent band name. 582 00:28:32,320 --> 00:28:35,000 Speaker 1: So there... that was a big part of the Ninety 583 00:28:35,080 --> 00:28:38,360 Speaker 1: Nine Percent Invisible episode on nuclear semiotics. They talked about 584 00:28:38,360 --> 00:28:41,280 Speaker 1: the Ray Cats, and I think they actually hired a 585 00:28:41,360 --> 00:28:44,520 Speaker 1: musician to create a song, because, just like with the 586 00:28:44,560 --> 00:28:47,920 Speaker 1: glowing yucca, you have to explain what's going on: when 587 00:28:47,920 --> 00:28:50,080 Speaker 1: the cats glow, you need to stay away. So they 588 00:28:50,120 --> 00:28:52,200 Speaker 1: had somebody come up with a Ray Cat song, I believe, 589 00:28:52,200 --> 00:28:52,960 Speaker 1: for the episode. 590 00:28:53,160 --> 00:28:54,800 Speaker 2: Was it Hootie and the Blowfish? Yes, it was. 591 00:28:55,880 --> 00:28:56,720 Speaker 1: That was a good guess. 592 00:28:56,760 --> 00:28:58,920 Speaker 2: Now, this one I thought was... had a little... I 593 00:28:58,920 --> 00:29:03,800 Speaker 2: thought it was interesting.
This semiotician named Thomas Sebeok, 594 00:29:05,280 --> 00:29:10,280 Speaker 2: he said this: what has survived more than anything else? Religion, right? 595 00:29:10,600 --> 00:29:12,959 Speaker 2: Religious texts that date back, you know, a couple thousand 596 00:29:13,000 --> 00:29:15,120 Speaker 2: years, and the Catholic Church. Not a bad start. 597 00:29:15,200 --> 00:29:18,360 Speaker 1: Yeah. The ideas that you hear at Catholic Mass today 598 00:29:18,520 --> 00:29:21,680 Speaker 1: are a couple thousand years old in some instances. And 599 00:29:21,760 --> 00:29:23,800 Speaker 1: if you go back to the original text, which we 600 00:29:23,840 --> 00:29:27,160 Speaker 1: can still read, fortunately, you can say, yep, this is 601 00:29:27,200 --> 00:29:29,880 Speaker 1: what they're talking about. Like, those ideas have survived that 602 00:29:30,040 --> 00:29:32,600 Speaker 1: long because of the practices they use. 603 00:29:32,680 --> 00:29:35,400 Speaker 2: So, interesting idea. But it gets a little goofy, yeah, 604 00:29:35,440 --> 00:29:39,080 Speaker 2: because he thought, why don't we almost create a fake 605 00:29:39,240 --> 00:29:45,560 Speaker 2: religion around this thing, a fearful myth that you can generate, 606 00:29:45,720 --> 00:29:50,720 Speaker 2: appointing an atomic priesthood to tell people, and tell them 607 00:29:50,800 --> 00:29:54,640 Speaker 2: to tell future generations. But I mean, I guess the 608 00:29:54,720 --> 00:29:57,560 Speaker 2: idea is that it's all false and it's just a 609 00:29:57,560 --> 00:29:59,000 Speaker 2: big made up story. 610 00:29:59,120 --> 00:30:01,840 Speaker 1: Yeah, the atomic priesthood would know the truth and they 611 00:30:01,840 --> 00:30:06,120 Speaker 1: would indoctrinate people.
But out in society around them, it 612 00:30:06,120 --> 00:30:09,440 Speaker 1: would be a closely guarded secret, because everybody else thinks 613 00:30:09,600 --> 00:30:12,280 Speaker 1: that whatever this fake myth about why you have to 614 00:30:12,320 --> 00:30:16,880 Speaker 1: stay away from this haunted, evil area is true, when 615 00:30:16,960 --> 00:30:20,120 Speaker 1: really the atomic priests are the ones who know: no, 616 00:30:20,240 --> 00:30:23,760 Speaker 1: actually, there's radioactive stuff out here. They just came 617 00:30:23,840 --> 00:30:26,800 Speaker 1: up with this three thousand years ago to scare everybody away. 618 00:30:27,360 --> 00:30:29,880 Speaker 2: But initially a decent idea, as far as trying to 619 00:30:29,920 --> 00:30:32,960 Speaker 2: imitate or incorporate, like, what religion does. But it's 620 00:30:33,080 --> 00:30:34,480 Speaker 2: just definitely strange. 621 00:30:34,600 --> 00:30:38,120 Speaker 1: It is. To me, though, it is at its base despicable. 622 00:30:38,320 --> 00:30:43,800 Speaker 1: It's a despicable idea, because it is purposefully introducing fearful, 623 00:30:44,040 --> 00:30:48,000 Speaker 1: false superstition into the future. Like, we're going to purposefully 624 00:30:48,040 --> 00:30:52,000 Speaker 1: introduce fearful, false superstition into the future just to scare 625 00:30:52,000 --> 00:30:56,000 Speaker 1: people off from radioactivity? Like, what kind of sweeping side effects, 626 00:30:56,120 --> 00:30:58,640 Speaker 1: what kind of wars might start over this? In time, people 627 00:30:58,680 --> 00:31:02,000 Speaker 1: will die over this fake thing that they don't realize 628 00:31:02,120 --> 00:31:05,000 Speaker 1: is fake. Because Thomas Sebeok came up with this idea 629 00:31:05,200 --> 00:31:07,680 Speaker 1: to keep people away from a single site in New Mexico. 630 00:31:08,240 --> 00:31:09,320 Speaker 1: That's crazy.
631 00:31:09,480 --> 00:31:12,040 Speaker 2: It didn't fare too well either among his colleagues. 632 00:31:12,080 --> 00:31:14,719 Speaker 1: No, and rightfully so, because again, it's a despicable idea. 633 00:31:14,880 --> 00:31:17,600 Speaker 2: So he was on the Human Interference Task Force. We 634 00:31:17,680 --> 00:31:22,040 Speaker 2: mentioned the Nevada site. That was what was 635 00:31:22,120 --> 00:31:24,880 Speaker 2: launched for that Yucca Mountain site back in eighty one, 636 00:31:24,960 --> 00:31:26,080 Speaker 2: from eighty one to eighty three. 637 00:31:26,520 --> 00:31:31,400 Speaker 1: So whatever Sebeok's original idea was, he had, like, some 638 00:31:31,600 --> 00:31:35,960 Speaker 1: other closely related ideas that were great, though. Yeah, like, 639 00:31:36,000 --> 00:31:38,680 Speaker 1: he's not, like, a total nut job. I 640 00:31:38,720 --> 00:31:41,800 Speaker 1: think it was just a misfire in an otherwise 641 00:31:41,840 --> 00:31:44,880 Speaker 1: illustrious career, I think. I don't know that much about him. 642 00:31:45,000 --> 00:31:49,360 Speaker 1: But one of his other ideas was, okay, well, let's 643 00:31:49,360 --> 00:31:52,920 Speaker 1: take the atomic priesthood away, let's take the religion and 644 00:31:52,960 --> 00:31:55,760 Speaker 1: all that stuff away, and let's just give them, like, 645 00:31:55,800 --> 00:31:58,760 Speaker 1: the facts, but let's figure out a way to make 646 00:31:58,800 --> 00:32:01,480 Speaker 1: sure that those facts get passed down. And what he 647 00:32:01,560 --> 00:32:04,520 Speaker 1: came up with was called a meta message, where it's 648 00:32:04,560 --> 00:32:09,200 Speaker 1: a message that says, this place has nuclear radiation, it 649 00:32:09,200 --> 00:32:11,280 Speaker 1: can kill you, you need to stay away from it.
650 00:32:12,080 --> 00:32:15,200 Speaker 1: And we invite you to take this message and translate 651 00:32:15,240 --> 00:32:17,960 Speaker 1: it into whatever languages you guys have on Earth at 652 00:32:17,880 --> 00:32:20,160 Speaker 2: the time. Assuming you can read this, right. 653 00:32:20,560 --> 00:32:23,440 Speaker 1: But if you do that often enough, there will always 654 00:32:23,480 --> 00:32:26,200 Speaker 1: be somebody who can translate it. Oh, sure. And then 655 00:32:26,920 --> 00:32:29,760 Speaker 1: that way you form a bridge between now and as 656 00:32:29,800 --> 00:32:32,200 Speaker 1: far into the future as people are around to read 657 00:32:32,280 --> 00:32:36,280 Speaker 1: and add their own interpretation or their own translation of it. 658 00:32:36,560 --> 00:32:38,760 Speaker 1: But then you want to leave the original, so that 659 00:32:38,800 --> 00:32:42,200 Speaker 1: if there's ever, like, a disagreement about what a word meant, 660 00:32:42,680 --> 00:32:46,120 Speaker 1: hopefully somebody can go back, language by language, and 661 00:32:46,200 --> 00:32:49,360 Speaker 1: connect them so that they can see the original version. 662 00:32:50,240 --> 00:32:53,840 Speaker 2: Yeah, but what if a society develops in isolation 663 00:32:54,000 --> 00:32:56,280 Speaker 2: that knows none of these languages? 664 00:32:56,280 --> 00:33:00,200 Speaker 1: You're just totally toast. Yeah, that's when the symbols come in. 665 00:33:00,800 --> 00:33:03,080 Speaker 2: Right.
So what they settled on as a panel, though, 666 00:33:03,160 --> 00:33:06,760 Speaker 2: from eighty one to eighty three, was that what's called 667 00:33:06,800 --> 00:33:09,840 Speaker 2: long term communication was going to be the most effective thing, 668 00:33:09,880 --> 00:33:11,440 Speaker 2: kind of what you were just talking about, right? 669 00:33:11,840 --> 00:33:14,320 Speaker 2: And they said, a system that combines physical markers and 670 00:33:14,440 --> 00:33:17,320 Speaker 2: archives, that covers the two major forms of this long 671 00:33:17,400 --> 00:33:23,760 Speaker 2: term communication, direct and successive. Direct utilizes markers, and 672 00:33:24,600 --> 00:33:28,480 Speaker 2: successive is humans, like you were talking about, I guess, 673 00:33:28,480 --> 00:33:30,520 Speaker 2: with this meta message. I guess you could write it down, 674 00:33:30,560 --> 00:33:33,000 Speaker 2: but it's still humans carrying a message through time. 675 00:33:33,200 --> 00:33:35,760 Speaker 1: Well, it's more like... a direct one is like, you 676 00:33:35,800 --> 00:33:39,920 Speaker 1: can write an inscription on a monument, and that monument 677 00:33:40,000 --> 00:33:43,200 Speaker 1: is going to deliver that message directly to people ten 678 00:33:43,240 --> 00:33:44,440 Speaker 1: thousand years from now. Yeah. 679 00:33:44,440 --> 00:33:45,720 Speaker 2: I mean, it's a physical thing. 680 00:33:45,720 --> 00:33:48,760 Speaker 1: Right. Whereas with successive, it's kind of passed along like 681 00:33:48,800 --> 00:33:49,320 Speaker 1: a game 682 00:33:49,160 --> 00:33:52,160 Speaker 2: of telephone, exactly. And you know how that goes, right? 683 00:33:52,200 --> 00:33:54,480 Speaker 2: It can get a little hinky. That's right. But it's 684 00:33:54,480 --> 00:33:56,240 Speaker 2: always fun at a slumber party.
685 00:33:56,400 --> 00:33:59,360 Speaker 1: Sure, So they came up with multiple ones, Like you 686 00:33:59,400 --> 00:34:05,160 Speaker 1: were saying, they've settled on a monument that had massive 687 00:34:05,200 --> 00:34:09,120 Speaker 1: stone structures. Remember, you want monoliths, they're engraved with warnings 688 00:34:09,400 --> 00:34:12,520 Speaker 1: in all currently known languages. It's a lot of languages. 689 00:34:13,280 --> 00:34:13,719 Speaker 2: You want a. 690 00:34:13,640 --> 00:34:16,840 Speaker 1: Buried vault that has all the info you need about radioactivity, 691 00:34:16,840 --> 00:34:19,360 Speaker 1: about the site, all that stuff. Sure, you want a 692 00:34:19,400 --> 00:34:23,200 Speaker 1: bunch of barriers around the site, not necessarily to definitely 693 00:34:23,280 --> 00:34:25,520 Speaker 1: keep people out, but enough to basically say, hey, hey, 694 00:34:25,560 --> 00:34:27,399 Speaker 1: we're trying to impede progress here. 695 00:34:27,600 --> 00:34:29,080 Speaker 2: Yeah, I mean to me, that's one of the most 696 00:34:29,120 --> 00:34:32,479 Speaker 2: obvious ones. If like you see a huge wall again, 697 00:34:32,520 --> 00:34:35,359 Speaker 2: it might entice you, but it for sure indicates to 698 00:34:35,400 --> 00:34:37,600 Speaker 2: any culture that you're like, you're not meant to come 699 00:34:37,680 --> 00:34:38,839 Speaker 2: beyond this, Right. 700 00:34:39,480 --> 00:34:43,840 Speaker 1: And then the last one is a network of archives 701 00:34:44,000 --> 00:34:47,239 Speaker 1: basically the same information you would have in that buried vault, right, 702 00:34:47,400 --> 00:34:50,200 Speaker 1: but elsewhere scattered around the world. 
So if something happens 703 00:34:50,239 --> 00:34:52,560 Speaker 1: with the buried vault, somebody can come across the archives 704 00:34:52,560 --> 00:34:54,200 Speaker 1: somewhere and be like, oh wait, wait, we want to 705 00:34:54,239 --> 00:34:54,680 Speaker 1: stay out of. 706 00:34:54,680 --> 00:34:57,040 Speaker 2: There, right. And along with that, they said, while we're 707 00:34:57,040 --> 00:34:59,879 Speaker 2: at it, can we at least like all agree around 708 00:35:00,080 --> 00:35:03,280 Speaker 2: the world on a nuclear warning symbol. If it's the trefoil 709 00:35:03,400 --> 00:35:07,759 Speaker 2: or whatever, let's just all codify that as the thing, 710 00:35:07,920 --> 00:35:09,200 Speaker 2: which is not the case right now. 711 00:35:09,320 --> 00:35:12,640 Speaker 1: No, theirs was a triangle with an arrow pointing down, 712 00:35:12,719 --> 00:35:14,440 Speaker 1: and then in the head of the arrow was the 713 00:35:14,480 --> 00:35:17,839 Speaker 1: biohazard symbol, which is not great because you want something 714 00:35:17,880 --> 00:35:19,920 Speaker 1: that's going to be so simple that even as. 715 00:35:19,880 --> 00:35:22,520 Speaker 2: People confuse me, I need to see it. I guess yeah. 716 00:35:22,560 --> 00:35:24,399 Speaker 1: It's even when you see it you're like, wait what. 717 00:35:25,719 --> 00:35:27,919 Speaker 1: But you want something simple enough so that as people 718 00:35:28,000 --> 00:35:30,080 Speaker 1: kind of create a shorthand version of it, it still 719 00:35:30,080 --> 00:35:33,320 Speaker 1: retains its meaning right or visually. 720 00:35:33,160 --> 00:35:35,279 Speaker 2: All right, So that stuff was the Yucca project. In 721 00:35:35,320 --> 00:35:38,759 Speaker 2: the early eighties, they decided not to do that.
They 722 00:35:39,120 --> 00:35:43,239 Speaker 2: just packed it up, put it away, and then it 723 00:35:43,280 --> 00:35:46,279 Speaker 2: all came back again with this New Mexico plant when 724 00:35:46,280 --> 00:35:49,719 Speaker 2: the Department of Energy said once again, hey, we need 725 00:35:49,760 --> 00:35:52,880 Speaker 2: to think of a sign and a symbol or whatever 726 00:35:52,920 --> 00:35:55,600 Speaker 2: you can come up with. And we need the best 727 00:35:55,600 --> 00:35:58,480 Speaker 2: and the brightest thinking on this. So call up Carl Sagan. 728 00:35:58,560 --> 00:36:03,520 Speaker 1: Get me Sagan, give me Sagan, give me Percy Tannenbaum stat. 729 00:36:03,440 --> 00:36:05,879 Speaker 2: And this guy named John Lomberg who's a science writer 730 00:36:05,960 --> 00:36:09,479 Speaker 2: and space illustrator and he had worked in semiotics before 731 00:36:09,520 --> 00:36:12,680 Speaker 2: for NASA on their mission to Mars. Sagan was in 732 00:36:12,800 --> 00:36:15,200 Speaker 2: ill health, so he declined to come, but he sent 733 00:36:15,239 --> 00:36:19,400 Speaker 2: a message from the president. I guess that said skull 734 00:36:19,400 --> 00:36:23,680 Speaker 2: and crossbones. Dude, done. Yeah, universal, everyone knows it. 735 00:36:23,719 --> 00:36:26,200 Speaker 1: He gave a really good example. He said, it has 736 00:36:27,320 --> 00:36:30,840 Speaker 1: it's marked the lintels of cannibal dwellings, the flags of pirates, 737 00:36:31,080 --> 00:36:34,960 Speaker 1: the insignia of SS divisions and motorcycle gangs. He makes 738 00:36:34,960 --> 00:36:37,399 Speaker 1: a pretty good point. A lot of people out there 739 00:36:37,440 --> 00:36:39,440 Speaker 1: see a skull and crossbones and know it means like 740 00:36:39,560 --> 00:36:41,560 Speaker 1: danger, problems. 741 00:36:41,680 --> 00:36:44,920 Speaker 2: Yeah, it means this will be you. Yes, you know, 742 00:36:45,000 --> 00:36:45,880 Speaker 2: you'll be a skull.
743 00:36:46,320 --> 00:36:50,560 Speaker 1: And so the working group for the WIPP project 744 00:36:50,920 --> 00:36:54,160 Speaker 1: they said, no, that doesn't work. It's a Jungian archetype. 745 00:36:54,160 --> 00:36:57,040 Speaker 1: It doesn't really exist outside of the West. Yeah. To me, 746 00:36:57,200 --> 00:36:59,640 Speaker 1: I'm like, no, Sagan was definitely onto something. 747 00:37:00,080 --> 00:37:02,280 Speaker 2: Think so. I mean, tell me, if you go to 748 00:37:02,560 --> 00:37:06,200 Speaker 2: China and hold up a sign with a skull and crossbones, 749 00:37:07,200 --> 00:37:09,000 Speaker 2: I would think so, wouldn't they. I mean, that's a 750 00:37:09,040 --> 00:37:10,120 Speaker 2: dire warning, isn't it. 751 00:37:10,440 --> 00:37:12,640 Speaker 1: I think their point is that the skull used 752 00:37:12,680 --> 00:37:15,239 Speaker 1: to be like a memento mori, where it meant like 753 00:37:15,320 --> 00:37:16,759 Speaker 1: rebirth and prepare for death. 754 00:37:16,800 --> 00:37:18,520 Speaker 2: So they could be like, oh, wonderful, a skull and 755 00:37:18,520 --> 00:37:19,839 Speaker 2: crossbones. Sure, but. 756 00:37:21,680 --> 00:37:26,320 Speaker 1: To me, that is the one enduring symbol that's always 757 00:37:26,360 --> 00:37:28,920 Speaker 1: going to be around as long as there are humans, 758 00:37:29,520 --> 00:37:32,479 Speaker 1: because what happens when you die and rot? What's left? 759 00:37:32,560 --> 00:37:36,080 Speaker 1: Your skull. Every human knows that.
Even humans in the 760 00:37:36,080 --> 00:37:38,400 Speaker 1: future are going to know that, even ones that are 761 00:37:38,480 --> 00:37:43,000 Speaker 1: in like post collapse tribes running around 762 00:37:43,200 --> 00:37:45,760 Speaker 1: and have lost all of the languages that are around today, 763 00:37:45,880 --> 00:37:47,759 Speaker 1: they're going to know what a skull looks like or 764 00:37:47,760 --> 00:37:49,799 Speaker 1: what a skull means, or at least one of them 765 00:37:49,880 --> 00:37:51,640 Speaker 1: is going to be like, wait, I don't think this 766 00:37:51,800 --> 00:37:54,919 Speaker 1: is saying that the rainbow's coming. I think it means 767 00:37:54,960 --> 00:37:55,960 Speaker 1: like death or danger. 768 00:37:56,520 --> 00:37:59,560 Speaker 2: All right, let's take another break, Yeah, sure, and we'll 769 00:37:59,560 --> 00:38:02,480 Speaker 2: come back and talk about the approach that the WIPP 770 00:38:02,560 --> 00:38:05,080 Speaker 2: panel took and what they came up with right after this. 771 00:38:20,200 --> 00:38:31,319 Speaker 3: Stuffy the jawsh shot soft you. 772 00:38:31,320 --> 00:38:34,280 Speaker 1: You know, I gotta defend Sagan. He's my boy. Sure 773 00:38:34,600 --> 00:38:35,360 Speaker 1: love that guy. 774 00:38:35,480 --> 00:38:38,720 Speaker 2: Someone should ask Neil deGrasse Tyson. 775 00:38:39,120 --> 00:38:40,440 Speaker 1: Sure, why not. 776 00:38:40,560 --> 00:38:41,799 Speaker 2: I bet he's got a good idea or two. 777 00:38:42,920 --> 00:38:43,960 Speaker 1: I'll bet they have asked. 778 00:38:44,719 --> 00:38:46,200 Speaker 2: Is he still coming to Atlanta for a show? 779 00:38:46,320 --> 00:38:47,760 Speaker 1: Oh yeah, where, the Fox? 780 00:38:48,040 --> 00:38:49,920 Speaker 2: I think the Cobb Energy Centre. 781 00:38:50,280 --> 00:38:53,279 Speaker 1: Oh yeah, I think that's even more seats than the Fox. 782 00:38:53,360 --> 00:38:54,040 Speaker 2: No, it's less.
783 00:38:54,640 --> 00:38:55,960 Speaker 1: Oh sorry, it's. 784 00:38:55,800 --> 00:38:58,920 Speaker 2: like three thousand people, which is nothing to you know, 785 00:39:00,840 --> 00:39:02,520 Speaker 2: put up a stink about. That's a lot of folks. 786 00:39:02,760 --> 00:39:03,759 Speaker 1: We have not hit that. 787 00:39:04,120 --> 00:39:04,600 Speaker 2: No we're not. 788 00:39:04,760 --> 00:39:07,640 Speaker 1: No, we haven't. Did you hear the StarTalk I 789 00:39:07,680 --> 00:39:07,920 Speaker 1: was on? 790 00:39:08,640 --> 00:39:09,759 Speaker 2: Oh no, was it good? 791 00:39:09,840 --> 00:39:10,480 Speaker 1: It was pretty good. 792 00:39:10,560 --> 00:39:10,719 Speaker 2: Yeah. 793 00:39:10,760 --> 00:39:12,719 Speaker 1: If I do say so. It was supposed to be 794 00:39:12,800 --> 00:39:16,720 Speaker 1: like rapid fast responses. 795 00:39:16,200 --> 00:39:16,880 Speaker 2: Uh huh. 796 00:39:16,920 --> 00:39:20,080 Speaker 1: We got through like four questions in an hour because. 797 00:39:19,880 --> 00:39:22,680 Speaker 2: You were like, rapid fast response is not my specialty, 798 00:39:22,719 --> 00:39:23,279 Speaker 2: Neil. 799 00:39:23,360 --> 00:39:25,000 Speaker 1: Let me just do a little distracting here. 800 00:39:25,640 --> 00:39:29,279 Speaker 2: I'm more deliberate, all right. So, speaking of deliberate, the 801 00:39:29,280 --> 00:39:32,399 Speaker 2: WIPP panel was very deliberate and methodical. They divided it into 802 00:39:32,400 --> 00:39:36,360 Speaker 2: teams and approached it from the two things we were 803 00:39:36,400 --> 00:39:40,279 Speaker 2: talking about, direct and successive forms of communication. Debated a lot, 804 00:39:40,400 --> 00:39:44,799 Speaker 2: deliberated a lot on the recommendations. They had two proposals, and 805 00:39:44,840 --> 00:39:47,680 Speaker 2: they did overlap a little bit.
What I thought was 806 00:39:47,680 --> 00:39:50,640 Speaker 2: pretty smart is they both had a multi leveled approach 807 00:39:51,160 --> 00:39:55,360 Speaker 2: from the surface down that got more specific and intense 808 00:39:55,440 --> 00:39:56,200 Speaker 2: as you went down. 809 00:39:56,280 --> 00:39:59,040 Speaker 1: Yeah, the first one was basically like, you, ding Dong, 810 00:39:59,360 --> 00:40:02,399 Speaker 1: this is danger, go away, exactly. That's like level one, 811 00:40:02,480 --> 00:40:05,239 Speaker 1: and then level two is like, okay, ding Dong and 812 00:40:05,280 --> 00:40:08,200 Speaker 1: your kind of smart friend, explaining to ding Dong that 813 00:40:08,239 --> 00:40:10,439 Speaker 1: the reason this is dangerous is because there's something buried here 814 00:40:10,480 --> 00:40:12,200 Speaker 1: and it's gonna hurt you, all. 815 00:40:12,200 --> 00:40:14,120 Speaker 2: right, should we now talk about the real things? 816 00:40:14,400 --> 00:40:17,440 Speaker 1: Oh, sure. I thought I was. So Group A. 817 00:40:19,320 --> 00:40:22,040 Speaker 2: This was theirs. They studded the surface of the site 818 00:40:22,400 --> 00:40:25,759 Speaker 2: with what they called menacing earthworks, so a field of 819 00:40:25,800 --> 00:40:29,920 Speaker 2: spikes and then a big, massive disc painted to look 820 00:40:29,960 --> 00:40:32,799 Speaker 2: like a black hole. I didn't quite get that part. 821 00:40:32,920 --> 00:40:33,680 Speaker 1: It's so dumb. 822 00:40:33,760 --> 00:40:34,560 Speaker 2: I get the spikes. 823 00:40:35,320 --> 00:40:39,319 Speaker 1: I think it's the yeah, of course. But the black hole, 824 00:40:39,360 --> 00:40:41,200 Speaker 1: I think it's supposed to just mean like a void 825 00:40:41,280 --> 00:40:43,799 Speaker 1: or chaos. I don't know.
I'm not sure I could 826 00:40:43,800 --> 00:40:46,239 Speaker 1: see how you would think that that was kind of universal, 827 00:40:46,320 --> 00:40:48,600 Speaker 1: like nobody wants to fall into a hole or something, 828 00:40:48,640 --> 00:40:50,800 Speaker 1: and maybe it evokes that kind of like stay away 829 00:40:50,920 --> 00:40:51,279 Speaker 1: all right. 830 00:40:52,000 --> 00:40:54,960 Speaker 2: Then they have large markers all around the site, which, 831 00:40:55,120 --> 00:40:57,719 Speaker 2: like you said, are the really basic messages in the warnings, 832 00:40:58,480 --> 00:41:02,480 Speaker 2: including, and I thought this is so interesting, faces that 833 00:41:02,640 --> 00:41:05,000 Speaker 2: evoke Edvard Munch's The Scream. 834 00:41:05,560 --> 00:41:09,440 Speaker 1: The ones I saw were The Scream. Yeah, like it 835 00:41:09,560 --> 00:41:10,920 Speaker 1: was the line drawing of the guy from The. 836 00:41:10,920 --> 00:41:14,600 Speaker 2: Scream. People, yeah, like in great agony and pain. That 837 00:41:14,760 --> 00:41:15,800 Speaker 2: to me, not bad. 838 00:41:15,920 --> 00:41:18,720 Speaker 1: It isn't bad. I don't know though, is that more 839 00:41:19,360 --> 00:41:22,160 Speaker 1: universally understood than a skull and crossbones? 840 00:41:22,239 --> 00:41:25,040 Speaker 2: I don't know, or if art survives or people like, oh, 841 00:41:25,080 --> 00:41:26,640 Speaker 2: I wonder if that painting's down there. 842 00:41:26,640 --> 00:41:29,000 Speaker 1: Well, I think what they're saying is, and semioticians 843 00:41:29,080 --> 00:41:31,600 Speaker 1: kind of feel this way, is that Edvard Monk so 844 00:41:32,280 --> 00:41:35,680 Speaker 1: perfectly nailed the scream that even without the art, like, 845 00:41:35,719 --> 00:41:38,000 Speaker 1: if you see that, you understand that that person you're 846 00:41:38,000 --> 00:41:39,000 Speaker 1: seeing is in agony. 847 00:41:39,280 --> 00:41:41,400 Speaker 2: Did I say Munch?
No? 848 00:41:41,480 --> 00:41:43,399 Speaker 1: I think you said Monk. Did I say Munch. 849 00:41:43,640 --> 00:41:45,400 Speaker 2: You said Monk. I might have said Munch. 850 00:41:45,560 --> 00:41:48,280 Speaker 1: No, you said, I think you said Monk. Is it Munch? 851 00:41:48,840 --> 00:41:51,000 Speaker 2: I think it's probably Monk. There's no way his name 852 00:41:51,080 --> 00:41:51,439 Speaker 2: is Munch. 853 00:41:51,560 --> 00:41:54,240 Speaker 1: I'm almost positive you said Monk. Jerry, can you rewind 854 00:41:54,280 --> 00:41:55,240 Speaker 1: for a second. 855 00:41:56,480 --> 00:41:56,800 Speaker 2: Munch? 856 00:41:57,440 --> 00:41:59,759 Speaker 1: Oh, you did say Munch. I would have sworn you 857 00:41:59,800 --> 00:42:00,359 Speaker 1: said Monk. 858 00:42:01,120 --> 00:42:04,359 Speaker 2: So Group A, below the surface, this is when they 859 00:42:04,480 --> 00:42:06,960 Speaker 2: actually start talking about nuclear waste and what it does 860 00:42:07,000 --> 00:42:09,839 Speaker 2: to you, the details about the structure and all that 861 00:42:09,760 --> 00:42:13,880 Speaker 1: stuff, right, not bad, where they teach a little bit 862 00:42:13,880 --> 00:42:14,960 Speaker 1: about radioactivity. 863 00:42:15,200 --> 00:42:19,759 Speaker 2: So Group B, this was, they went super informative, and 864 00:42:19,800 --> 00:42:22,520 Speaker 2: really what they relied on was that people had a 865 00:42:22,560 --> 00:42:25,160 Speaker 2: little bit of knowledge in the future about stuff like this. 866 00:42:25,320 --> 00:42:28,080 Speaker 1: But they also trusted that the people didn't have to 867 00:42:28,160 --> 00:42:30,680 Speaker 1: just be spooked or scared or something like that. That 868 00:42:31,480 --> 00:42:34,080 Speaker 1: it's like, here are the facts, and the information is why 869 00:42:34,120 --> 00:42:35,480 Speaker 1: you want to stay away from that. 870 00:42:35,840 --> 00:42:39,920 Speaker 2: Yeah.
Their big above ground work was these big earthen 871 00:42:40,080 --> 00:42:43,680 Speaker 2: walls in the shape of the nuclear trefoil, not bad. 872 00:42:43,920 --> 00:42:45,400 Speaker 2: I imagine you have to see it from above to 873 00:42:45,440 --> 00:42:45,880 Speaker 2: even know though. 874 00:42:46,080 --> 00:42:48,440 Speaker 1: Yeah. But that's part of it, one of the 875 00:42:48,480 --> 00:42:52,640 Speaker 1: requirements was that you want it to be easily visible, 876 00:42:52,719 --> 00:42:56,960 Speaker 1: not just with human cognition, but like remote sensing too, right, 877 00:42:57,120 --> 00:43:00,600 Speaker 1: so like magnetic surveys. They said, we should put some 878 00:43:00,640 --> 00:43:03,479 Speaker 1: magnets in here, right, not just from when you walk 879 00:43:03,560 --> 00:43:04,240 Speaker 1: up to it. 880 00:43:04,440 --> 00:43:05,960 Speaker 2: Right, so you'd be able to see 881 00:43:05,960 --> 00:43:09,640 Speaker 2: it from your flying saucer exactly. And then inside the 882 00:43:09,680 --> 00:43:13,320 Speaker 2: walls, at various steps, they have these big markers, 883 00:43:13,320 --> 00:43:17,239 Speaker 2: and here's where these like symbols and pictographs, all kinds 884 00:43:17,280 --> 00:43:22,960 Speaker 2: of languages, writing in different languages, and then more human 885 00:43:23,000 --> 00:43:26,680 Speaker 2: faces increasingly contorted in agony as you go down. 886 00:43:26,840 --> 00:43:30,600 Speaker 1: Yeah, it looks to me like the guy's getting drunker 887 00:43:30,640 --> 00:43:34,640 Speaker 1: and drunker. Yeah, yeah, that's what it looks like. 888 00:43:34,880 --> 00:43:37,840 Speaker 2: Well, maybe that means there's a happening bar exactly. 889 00:43:37,880 --> 00:43:39,480 Speaker 1: That's how I would take it if I were a 890 00:43:39,520 --> 00:43:41,200 Speaker 1: future human post collapse. 891 00:43:41,440 --> 00:43:42,959 Speaker 2: Gotta go, gotta go down here.
892 00:43:43,040 --> 00:43:46,120 Speaker 1: There were also pictograms you're just like digging through the 893 00:43:46,120 --> 00:43:48,680 Speaker 1: sand to get to Yeah. There are also pictograms that 894 00:43:48,800 --> 00:43:52,880 Speaker 1: showed, like under the ground, like real easy to understand 895 00:43:52,960 --> 00:43:57,520 Speaker 1: drawings of the radioactive waste, the groundwater flowing 896 00:43:57,560 --> 00:44:01,000 Speaker 1: through it, taking the radioactive waste up to the plants, 897 00:44:01,000 --> 00:44:03,120 Speaker 1: which are then eaten by the humans in the picture, 898 00:44:03,920 --> 00:44:07,399 Speaker 1: one of whom dies, which makes sense. You don't need 899 00:44:07,440 --> 00:44:10,040 Speaker 1: to understand anything about radioactivity. You don't need to be 900 00:44:10,040 --> 00:44:13,160 Speaker 1: able to read anything. It's a really like it makes sense, 901 00:44:13,440 --> 00:44:16,000 Speaker 1: especially if some people are sitting there thinking about it. 902 00:44:16,040 --> 00:44:18,360 Speaker 2: Was the final image a skull and crossbones or a 903 00:44:18,400 --> 00:44:19,040 Speaker 2: pile of bones. 904 00:44:19,120 --> 00:44:21,040 Speaker 1: No, it was like a person, three people standing, and 905 00:44:21,080 --> 00:44:22,680 Speaker 1: one of them, the last one, was like dead, and 906 00:44:22,719 --> 00:44:24,480 Speaker 1: I think he might even have X's for eyes. 907 00:44:24,520 --> 00:44:25,879 Speaker 2: Well, I was about to say, though, I mean, if 908 00:44:25,880 --> 00:44:28,879 Speaker 2: you think about twenty thousand years from now, maybe they're like, oh, 909 00:44:28,920 --> 00:44:33,120 Speaker 2: this induces a nice nap. Maybe. Like, but to your 910 00:44:33,120 --> 00:44:35,799 Speaker 2: point though, like the bones is where you need to 911 00:44:35,880 --> 00:44:36,400 Speaker 2: end up.
912 00:44:36,360 --> 00:44:39,560 Speaker 1: Right, Yeah, maybe somebody would be like, oh, these veggies 913 00:44:39,560 --> 00:44:41,239 Speaker 1: here give you a great buzz if you grow them 914 00:44:41,239 --> 00:44:43,360 Speaker 1: on this ground. Yeah, X's for eyes, right, Yeah, the 915 00:44:43,400 --> 00:44:45,600 Speaker 1: bones do make a lot more sense. I think Sagan 916 00:44:45,719 --> 00:44:48,640 Speaker 1: was right. That's that should be a T shirt. Stuff 917 00:44:48,640 --> 00:44:51,480 Speaker 1: You Should Know T shirt. Sagan was right, don't even 918 00:44:51,520 --> 00:44:52,759 Speaker 1: need to have any context. 919 00:44:53,120 --> 00:44:55,239 Speaker 2: We're gonna get an email in a few days from 920 00:44:55,239 --> 00:44:55,800 Speaker 2: the guy. 921 00:44:55,719 --> 00:44:57,879 Speaker 1: From the estate of Carl Sagan saying do not make 922 00:44:57,920 --> 00:44:58,760 Speaker 1: that T shirt. 923 00:44:59,080 --> 00:45:00,880 Speaker 2: So what did they go with in the end, though. 924 00:45:02,960 --> 00:45:09,080 Speaker 1: They went with an earthen berm, basically, to provide 925 00:45:09,080 --> 00:45:13,759 Speaker 1: an obstacle and to block easy access, some granite slab 926 00:45:14,160 --> 00:45:19,839 Speaker 1: monoliths that have warnings written in seven languages. 927 00:45:19,600 --> 00:45:22,440 Speaker 2: Yeah, Navajo and then the six languages of the UN. 928 00:45:22,480 --> 00:45:27,760 Speaker 1: So Arabic, Chinese, English, Spanish, French, and Russian, correct, which 929 00:45:27,800 --> 00:45:30,160 Speaker 1: makes a lot of sense. But then they took Thomas 930 00:45:30,200 --> 00:45:33,319 Speaker 1: Sebeok up on his idea. They kind of built on 931 00:45:33,360 --> 00:45:38,120 Speaker 1: the earlier religion right exactly, and they left blank spaces 932 00:45:38,400 --> 00:45:41,000 Speaker 1: or at least in their plan.
They leave blank spaces 933 00:45:41,040 --> 00:45:44,319 Speaker 1: on these slabs for future generations to add their own 934 00:45:44,800 --> 00:45:46,840 Speaker 1: translations of the inscriptions. 935 00:45:47,560 --> 00:45:51,480 Speaker 2: It's a great idea, and the faces of humans in 936 00:45:51,719 --> 00:45:54,719 Speaker 2: pain and anguish, right, that did survive in the end. 937 00:45:55,040 --> 00:45:58,600 Speaker 1: So that was the final report of this WIPP panel. 938 00:45:58,680 --> 00:46:01,000 Speaker 1: It's a pretty good idea, makes a lot of sense, 939 00:46:01,280 --> 00:46:04,000 Speaker 1: because it not only so there are two groups that 940 00:46:04,000 --> 00:46:07,640 Speaker 1: they're trying to say stay away to, not really like 941 00:46:07,800 --> 00:46:10,200 Speaker 1: urban explorers or thrill seekers or whatever. 942 00:46:10,120 --> 00:46:10,879 Speaker 2: Because they can die. 943 00:46:10,960 --> 00:46:14,160 Speaker 1: They would have virtually no chance of 944 00:46:14,239 --> 00:46:17,239 Speaker 1: getting down to the actual radioactive material. The 945 00:46:17,239 --> 00:46:20,880 Speaker 1: people they are worried about were technologically advanced civilizations 946 00:46:21,239 --> 00:46:24,520 Speaker 1: that were drilling for resources, like an accident, like, God 947 00:46:24,560 --> 00:46:28,560 Speaker 1: help this waste disposal site if salt 948 00:46:28,680 --> 00:46:33,400 Speaker 1: becomes incredibly important in the future, right, and then less 949 00:46:33,480 --> 00:46:39,239 Speaker 1: advanced civilizations that could accidentally change the flow of groundwater 950 00:46:39,320 --> 00:46:43,279 Speaker 1: to go through the salt bed through massive like irrigation projects, 951 00:46:43,560 --> 00:46:44,480 Speaker 1: it covers all of it. 952 00:46:44,640 --> 00:46:47,799 Speaker 2: Yeah, my whole thing is just make it inaccessible.
Why 953 00:46:47,840 --> 00:46:50,040 Speaker 2: is it in New Mexico? Why is it out? 954 00:46:50,200 --> 00:46:52,839 Speaker 1: You know, Well, that's, I mean, that's pretty inaccessible. 955 00:46:53,200 --> 00:46:56,799 Speaker 2: It's not as inaccessible as, you know, Siberia. 956 00:46:57,080 --> 00:47:01,600 Speaker 1: No. One of the recommendations for nuclear waste disposal is 957 00:47:01,600 --> 00:47:03,560 Speaker 1: shooting it into space. Just send it out into 958 00:47:03,600 --> 00:47:06,080 Speaker 1: outer space and forget about it. And if you believe 959 00:47:06,120 --> 00:47:09,120 Speaker 1: in the Fermi paradox, that says we're the only 960 00:47:09,600 --> 00:47:13,600 Speaker 1: intelligent life in the universe, man, more power to you. Yeah, 961 00:47:13,680 --> 00:47:16,120 Speaker 1: that's actually not that bad of an idea. It's a 962 00:47:16,160 --> 00:47:18,600 Speaker 1: horrific idea, but it's actually kind of a good idea. 963 00:47:18,440 --> 00:47:20,520 Speaker 2: Too. Yeah, but then I wonder about the danger and 964 00:47:20,560 --> 00:47:23,880 Speaker 2: the risk involved. I mean, we've seen rockets blow up 965 00:47:23,920 --> 00:47:25,920 Speaker 2: and space shuttles blow up. That would be bad, like 966 00:47:25,960 --> 00:47:27,800 Speaker 2: what if the thing that they're shooting it out there 967 00:47:27,800 --> 00:47:29,040 Speaker 2: in malfunctioned or something. 968 00:47:28,840 --> 00:47:29,560 Speaker 1: That'd be really bad. 969 00:47:29,640 --> 00:47:30,399 Speaker 2: That'd be really bad. 970 00:47:30,480 --> 00:47:31,200 Speaker 1: That's a great point. 971 00:47:31,239 --> 00:47:34,720 Speaker 2: It's like all of our nuclear waste has just been released, oh. 972 00:47:34,640 --> 00:47:38,239 Speaker 1: Into the atmosphere. Yeah, that's a great, great point, Chuck.
973 00:47:38,600 --> 00:47:41,480 Speaker 2: So here's the thing, is all of this just wasted 974 00:47:41,520 --> 00:47:44,360 Speaker 2: effort? Because I was getting so into the stuff and 975 00:47:44,400 --> 00:47:46,440 Speaker 2: then the end of this article was a real like 976 00:47:46,800 --> 00:47:50,520 Speaker 2: sad trombone. Yeah, because it seems like it seems like 977 00:47:50,600 --> 00:47:52,560 Speaker 2: nobody really even cares, the people that matter. 978 00:47:52,760 --> 00:47:55,879 Speaker 1: Well, the first group, like, their whole thing 979 00:47:56,040 --> 00:47:59,400 Speaker 1: will probably never be implemented because the Yucca Mountain project 980 00:47:59,400 --> 00:48:02,880 Speaker 1: got shut down, right. But the WIPP group may actually 981 00:48:03,239 --> 00:48:07,160 Speaker 1: have their plan come to fruition, because it is an 982 00:48:07,160 --> 00:48:09,600 Speaker 1: EPA rule that you have to create this kind of marker, 983 00:48:10,080 --> 00:48:12,960 Speaker 1: and they've got until about twenty forty, when they estimate 984 00:48:13,000 --> 00:48:15,320 Speaker 1: the place is going to shut down. So it's entirely 985 00:48:15,360 --> 00:48:18,080 Speaker 1: possible that in twenty forty, or sometime in the one 986 00:48:18,120 --> 00:48:21,560 Speaker 1: hundred years after twenty forty, when the DOE stops protecting 987 00:48:21,560 --> 00:48:25,600 Speaker 1: the site, or the DoD, they may implement these earthen 988 00:48:25,840 --> 00:48:29,120 Speaker 1: works and these sixteen granite slabs, and we may live 989 00:48:29,160 --> 00:48:30,040 Speaker 1: to see something like this. 990 00:48:30,520 --> 00:48:32,759 Speaker 2: Well, outside of the US, it seems like no one 991 00:48:32,840 --> 00:48:36,800 Speaker 2: is super concerned.
Sweden in twenty eleven had an application 992 00:48:36,880 --> 00:48:42,359 Speaker 2: to build a repository in Forsmark, and in their 993 00:48:42,400 --> 00:48:45,799 Speaker 2: literal application they basically said, you know what, we're going 994 00:48:45,840 --> 00:48:48,319 Speaker 2: to worry about that later. Yeah, in seventy years, when 995 00:48:48,320 --> 00:48:49,080 Speaker 2: this thing's finished. 996 00:48:49,080 --> 00:48:51,880 Speaker 1: They said, see this? Can we just kick this seventy 997 00:48:51,960 --> 00:48:52,560 Speaker 1: years down the road. 998 00:48:53,160 --> 00:48:56,960 Speaker 2: And the Swedish National Archives, they consulted on their application, 999 00:48:57,800 --> 00:49:00,200 Speaker 2: they said, that's really insufficient. It said, it gives the 1000 00:49:00,200 --> 00:49:04,799 Speaker 2: impression that one intends to postpone important documentation efforts until 1001 00:49:04,800 --> 00:49:08,000 Speaker 2: the closure of the repository in seventy years. And it's like, 1002 00:49:08,040 --> 00:49:11,239 Speaker 2: it doesn't give the impression. It literally said that, right. So. 1003 00:49:12,080 --> 00:49:13,880 Speaker 1: I think they're being ultra polite. 1004 00:49:14,320 --> 00:49:17,520 Speaker 2: Yeah, I think, well, Sweden, right. Good people in the. 1005 00:49:17,520 --> 00:49:19,680 Speaker 1: US, though, don't tell A$AP Rocky that. 1006 00:49:19,640 --> 00:49:23,200 Speaker 2: I don't even know what that means. That's the singer, right? Yeah, 1007 00:49:23,480 --> 00:49:24,120 Speaker 2: he's a rapper. 1008 00:49:24,120 --> 00:49:25,680 Speaker 1: He's in prison in Sweden right now. 1009 00:49:25,880 --> 00:49:28,040 Speaker 2: I did not know that. Oh man, what did he do? 1010 00:49:28,719 --> 00:49:30,680 Speaker 1: He got into a fight with some Swedish kids and 1011 00:49:30,719 --> 00:49:32,239 Speaker 1: it may or may not have been their fault.
It 1012 00:49:32,239 --> 00:49:34,920 Speaker 1: looks on video like they definitely provoked it. Really, but 1013 00:49:35,000 --> 00:49:37,399 Speaker 1: the King of Sweden is like, sorry, rule of law, 1014 00:49:37,440 --> 00:49:41,120 Speaker 1: it applies to everybody, including super famous Americans. 1015 00:49:41,280 --> 00:49:41,840 Speaker 2: Well, true. 1016 00:49:41,920 --> 00:49:44,160 Speaker 1: Donald Trump called him to try to get the thing 1017 00:49:44,200 --> 00:49:47,560 Speaker 1: resolved at the behest of Kanye West. Oh God, and 1018 00:49:47,640 --> 00:49:49,719 Speaker 1: apparently it's just made everything worse. And now the King 1019 00:49:49,760 --> 00:49:52,400 Speaker 1: of Sweden is like, there's no chance he's getting released early. 1020 00:49:52,560 --> 00:49:55,160 Speaker 2: Wow, man, where have I been? 1021 00:49:55,760 --> 00:49:58,480 Speaker 1: This is reality. I know, what I just said is 1022 00:49:58,680 --> 00:50:02,440 Speaker 1: actual fact that actually happened here in twenty nineteen, everybody. 1023 00:50:02,840 --> 00:50:05,920 Speaker 1: Humans of the far future, can you believe it? 1024 00:50:07,160 --> 00:50:09,960 Speaker 2: Humans of the near future. John Lomberg, that guy we were 1025 00:50:10,000 --> 00:50:12,279 Speaker 2: talking about earlier, who was on that original nineteen ninety 1026 00:50:12,280 --> 00:50:14,680 Speaker 2: one WIPP panel, he told Weiss just a couple of 1027 00:50:14,760 --> 00:50:18,320 Speaker 2: years ago, a lot of us had been around the 1028 00:50:18,320 --> 00:50:20,319 Speaker 2: block a few times before, because you know, he was 1029 00:50:20,360 --> 00:50:22,799 Speaker 2: back then doing the same thing and knew this was 1030 00:50:22,800 --> 00:50:25,160 Speaker 2: only going to be a report. And 1031 00:50:25,200 --> 00:50:27,439 Speaker 2: this is the US, and we're putting more thought toward 1032 00:50:27,680 --> 00:50:28,719 Speaker 2: this than anyone.
1033 00:50:28,520 --> 00:50:30,200 Speaker 1: Yeah, which is really surprising. 1034 00:50:30,360 --> 00:50:32,480 Speaker 2: He said they only did this because they needed to 1035 00:50:32,520 --> 00:50:35,440 Speaker 2: show compliance. They didn't really care what we said. And 1036 00:50:35,480 --> 00:50:39,160 Speaker 2: then from the nineteen eighty one Human Interference Task 1037 00:50:39,200 --> 00:50:42,600 Speaker 2: Force during the competition, they basically said, the most effective 1038 00:50:42,600 --> 00:50:45,120 Speaker 2: sign will be the dead bodies of those foolish enough 1039 00:50:45,120 --> 00:50:48,560 Speaker 2: to ignore whatever sign, which makes sense. So basically, like, 1040 00:50:49,880 --> 00:50:52,480 Speaker 2: who cares? Someone will get in there, 1041 00:50:52,520 --> 00:50:54,920 Speaker 2: and they'll all die, and then that'll be the big warning. 1042 00:50:54,840 --> 00:50:58,200 Speaker 1: Right, which makes sense if humans are in communication around 1043 00:50:58,239 --> 00:51:00,520 Speaker 1: the globe and you've got the same warning. But if 1044 00:51:00,520 --> 00:51:04,960 Speaker 1: they're not, then it's catastrophe, catastrophe, catastrophe. But at least 1045 00:51:05,040 --> 00:51:07,239 Speaker 1: we fulfilled our part of the bargain where we really 1046 00:51:07,239 --> 00:51:11,640 Speaker 1: tried to warn everybody. Agreed. You got anything else? Nah. 1047 00:51:11,960 --> 00:51:15,000 Speaker 1: If you will indulge me, I would like to plug 1048 00:51:15,239 --> 00:51:17,480 Speaker 1: The End of the World with Josh Clark. The what? 1049 00:51:18,000 --> 00:51:20,719 Speaker 1: The End of the World with Josh Clark.
If, like, 1050 00:51:20,800 --> 00:51:23,239 Speaker 1: thinking about things in, like, far deep time in the 1051 00:51:23,239 --> 00:51:25,640 Speaker 1: future of humanity and all that stuff kind of floats 1052 00:51:25,640 --> 00:51:28,680 Speaker 1: your boat, I would recommend my little podcast series, The 1053 00:51:28,760 --> 00:51:29,920 Speaker 1: End of the World with Josh Clark. 1054 00:51:29,960 --> 00:51:32,600 Speaker 2: For sure. This is right up your alley. 1055 00:51:32,360 --> 00:51:35,360 Speaker 1: Thank you, Chuck. And since Chuck said right up your alley, 1056 00:51:35,400 --> 00:51:36,960 Speaker 1: it's time for a listener mail. 1057 00:51:39,200 --> 00:51:41,960 Speaker 2: Hey guys, we are strangers, but we aren't. You've been 1058 00:51:41,960 --> 00:51:43,719 Speaker 2: with me during the most challenging times of my life. 1059 00:51:43,760 --> 00:51:45,879 Speaker 2: I've listened to your show for about seven years. I'm 1060 00:51:45,880 --> 00:51:49,359 Speaker 2: an English teacher, and my students tease me, making fun 1061 00:51:49,400 --> 00:51:51,600 Speaker 2: of me because I always start lessons with, so I 1062 00:51:51,640 --> 00:51:54,239 Speaker 2: was listening to Stuff You Should Know. I went through 1063 00:51:54,280 --> 00:51:56,400 Speaker 2: a huge life change recently. I was in a relationship 1064 00:51:56,400 --> 00:51:59,279 Speaker 2: for five years, engaged for four of them, and moved 1065 00:51:59,320 --> 00:52:02,200 Speaker 2: from Phoenix to Charlotte after ending that relationship, which was 1066 00:52:02,200 --> 00:52:05,000 Speaker 2: incredibly difficult to do. During the drive, I listened to 1067 00:52:05,040 --> 00:52:07,000 Speaker 2: you guys for the entire thirty four hours. 1068 00:52:07,560 --> 00:52:12,440 Speaker 1: Wow. Imagine. No, I honestly can't. 1069 00:52:12,360 --> 00:52:14,760 Speaker 2: No music, just you guys. My heart was so broken.
1070 00:52:14,880 --> 00:52:17,560 Speaker 2: I didn't think I would ever be able to recover 1071 00:52:17,600 --> 00:52:18,440 Speaker 2: from that trauma, but. 1072 00:52:18,880 --> 00:52:21,640 Speaker 1: The trauma of listening to us for thirty four hours. 1073 00:52:21,920 --> 00:52:23,560 Speaker 2: But you didn't know that. You were able to comfort 1074 00:52:23,600 --> 00:52:26,080 Speaker 2: me and calm me down. My brother, who helped me move, 1075 00:52:26,680 --> 00:52:28,759 Speaker 2: asked me what I needed to listen to during the drive. 1076 00:52:28,840 --> 00:52:30,680 Speaker 2: I told him I wanted to listen to Stuff You 1077 00:52:30,680 --> 00:52:32,600 Speaker 2: Should Know. He had never heard of it. But now 1078 00:52:32,719 --> 00:52:36,080 Speaker 2: my brother Nick is also a fan, whether he likes 1079 00:52:36,120 --> 00:52:38,799 Speaker 2: it or not, and we almost always start our conversations now with, 1080 00:52:38,960 --> 00:52:40,520 Speaker 2: did you listen to the last Stuff You Should Know? 1081 00:52:40,600 --> 00:52:41,120 Speaker 1: That's cool. 1082 00:52:41,360 --> 00:52:43,600 Speaker 2: So I just want to give you guys kudos for 1083 00:52:43,640 --> 00:52:46,560 Speaker 2: being incredible. Please give a shout out to Justin, a 1084 00:52:46,600 --> 00:52:48,680 Speaker 2: fan that learned about you guys from me, in case 1085 00:52:49,520 --> 00:52:52,960 Speaker 2: he didn't hear it the first time. Hello, Justin Potter. Wow. 1086 00:52:53,880 --> 00:52:56,359 Speaker 2: Thanks for giving me calm in times of adversity. I 1087 00:52:56,400 --> 00:52:58,480 Speaker 2: know we are strangers, but we are not actually, because 1088 00:52:58,520 --> 00:53:00,720 Speaker 2: you have been with me during struggles in my life. 1089 00:53:01,160 --> 00:53:03,400 Speaker 2: I credit you for getting me through the hardest times, 1090 00:53:03,760 --> 00:53:05,800 Speaker 2: and I will be a lifelong fan of you both.
1091 00:53:06,120 --> 00:53:07,040 Speaker 2: That is from Kate. 1092 00:53:07,400 --> 00:53:09,640 Speaker 1: Thanks, Kate. I'm really glad we got to play some 1093 00:53:10,120 --> 00:53:14,080 Speaker 1: small part in getting you back on the road to happiness. 1094 00:53:14,120 --> 00:53:15,520 Speaker 2: Yeah. I hope everything's going great for you. 1095 00:53:15,640 --> 00:53:17,960 Speaker 1: Yeah, for real. If you want to get in touch 1096 00:53:18,000 --> 00:53:20,000 Speaker 1: with us like Kate did, just to say hi, or 1097 00:53:20,000 --> 00:53:22,359 Speaker 1: to say thanks, or to say you guys really screwed up, 1098 00:53:22,480 --> 00:53:24,600 Speaker 1: it's cool. You can go on to Stuff You Should 1099 00:53:24,600 --> 00:53:27,200 Speaker 1: Know dot com and check out our social links, and 1100 00:53:27,239 --> 00:53:29,440 Speaker 1: you can also send us a good old fashioned email 1101 00:53:29,520 --> 00:53:38,720 Speaker 1: to Stuff Podcast at iHeartRadio dot com. 1102 00:53:35,000 --> 00:53:38,000 Speaker 4: Stuff You Should Know is a production of iHeartRadio. 1103 00:53:38,480 --> 00:53:41,680 Speaker 2: For more podcasts from iHeartRadio, visit the iHeartRadio app, 1104 00:53:41,880 --> 00:53:44,800 Speaker 2: Apple Podcasts, or wherever you listen to your favorite shows.