Brought to you by the reinvented two thousand twelve Camry. It's ready. Are you?

Welcome to Stuff You Should Know from HowStuffWorks.com.

Hey, and welcome to the podcast. Hi. I'm Josh Clark. That was Chuck Bryant, who just said hi. Good times. When you put the two of us together, you're gonna get something called Stuff You Should Know. It's like mixing baking soda and vinegar and chest hair. Oh god. That's Stuff You Should Know. It's explosive. All right. It is, um, Chuck. Yes? Are you okay? Yeah. Great. Okay. Does that sound convincing? And I know that you won't be okay by the time we finish this one. I thought this was very interesting and well written. Thank you very much. I would agree with very interesting. No, of course, well written. I had your fingers doing the typing. Thank you, Chuck. I appreciate that. Um, you want to do this one? Okay.

So, Chuck, there was a study, um, released in a little-known journal called Conservation Letters. You may have heard of it. I hadn't until I read this. I hadn't either until I found it. Um, but it was by Conservation International's Will Turner and a couple other people from that group. And, um, the point of this paper was to point out that even though we may try to mitigate climate change, we're still screwing things up by trying to mitigate it, and by not preparing for the worst, e.g., climate change, we're ultimately going to screw things up after the Earth is already screwed up. Let me give you a couple of examples. First is, um, one-fifth of all the world's tropical forests lie within fifty kilometers of areas that would be totally underwater if sea levels rose by just one meter. Right. Yes, thirty-one miles. Thirty-one miles of heavy human population would be underwater, and the forests are within thirty-one miles of these heavy human populations, right. And so these human populations aren't just gonna sit there and drown and take one for the Earth.
No, they're gonna move upward, and as they move upward, they're gonna encounter these forests, and they're gonna say, hey, that bird looks delicious. Hey, I can burn this tree and cook this bird. Hey, mama, let's get it on. There's gonna be all sorts of weird things said in these forests after climate change takes place. Right. Yeah, these untouched forests will now be touched and plundered. Right. And, um, by the way, this same one-fifth of the forests is home to exactly half of all of the Alliance for Zero Extinction sites, which means that these are sites that are filled with animals on the verge of extinction. Humans show up, it's over. Right. So that's if climate change happens and we don't figure out what we're going to do for shelter and fuel and food ahead of time. That's just natural. The other scenario is, okay, we're trying to mitigate climate change now, before some sort of disaster happens in the future. Right. So say hydroelectric power. Yeah, let's build a dam, and why not, because that's clean and green. When you release water, it makes turbines move, which generates electricity completely cleanly, right. So how can you go wrong, Chuck? Well, uh, it didn't say so in this particular paper, but I remember a little podcast we did about reservoir-induced seismicity, and a Science Channel short film we did on that, where building dams can potentially cause instability in tectonic plates, from the heavy water and then no water, and if you build one too close to a fault, it could cause an earthquake. Cause an earthquake. So that's one way. That's one way. Another way is that when you build a dam, that river backs up and creates a lake in an area that wasn't ever really supposed to be a lake, and all of the deer and the squirrels and the koala bears and the plants are in big trouble. Right. Um.
What's more, you also affect the fresh water downstream of the dam, and basically you screw up the environment. Right. This sounds like a you're-darned-if-you-do, you're-darned-if-you-don't kind of thing. Right. So, Chuck, somebody reading a news report about this paper could say, what's the point? Why do anything? Right. Just let me eat my Hungry-Man dinner. And I don't care anymore. I don't care, like I can't do anything about it. And this, the very understandable and reasonable reaction, has kind of caused concern among a lot of critics of the media. Because, well, let's talk about whether or not science believes that anthropogenic, or man-made, climate change is real. Uh, Josh, the National Academy of Sciences. We actually know some people there. Very nice people. They do great work. We do. Hey, Rick. Hey, Rick. Um, hey, Marty. Thirteen hundred and seventy-two scientists were polled by the Proceedings of the National Academy of Sciences, and ninety-seven percent agreed that anthropogenic climate change is a real thing. And they even went so far as to say to the other three percent, what's up with you three percent? Like, what do you think? And they kind of found out that they didn't have the expertise to really determine that. Right, they went back and shamed them. Yeah, based on that three percent's citation and publication rates, they said these people basically lacked the expertise, which is why they don't believe in man-made climate change. Right. So this poll at least indicates that science says, yeah, man-made climate change is happening. It's a real thing. Okay. So you have science on the one hand saying, yes, there's a problem, people. And you have the public on the other hand saying, all right, um, I don't want to drown and I don't want to drive these species into extinction. What can I do? What can I do, Mr.
Scientist? And the group that serves to connect these two, the people who know there's a problem and maybe have answers, and the people who can actually create change by carrying out these solutions, is the media. And it's about here where that disconnect comes about, especially when there are doomsday scenarios. Yeah, let me read you another stat, which feels a little awkward because you found the stat. A Gallup poll last year found that forty-eight percent of Americans said they believe that the seriousness of global warming is, quote, generally exaggerated. And that was a seventeen percent rise over a period when we supposedly got a lot more information. And you hit the nail on the head. It is because of something called alarmism, thanks to something called the media. Right. And part of it, part of it is alarmism. Another part is that there's such a thing as professional climate skeptics, bloggers, reporters, media influencers who are like, no, no, it's not real, and then take money from PR companies. Sure. But part of it is alarmism. If you put enough problems onto a person, they're going to just throw up their hands and give up. Well, there's also a British study that you found, um, by the think tank Institute for Public Policy Research. They did a review of more than six hundred news articles in the UK, this is just in the UK, on climate change, and they found, well, we should read some of these. This is the language, alarmist language, being used by some of these six hundred articles. The climate of fear. Serious climate change is now inevitable. That was James Lovelock, the person who came up with the Gaia theory. He also said, the same guy said, the Earth has passed the point of no return. He also called for the temporary suspension of democracy until we can handle climate change. Yeah. Um, another one said we're headed for dodo status.
Words like point of no return, civilizational collapse, global chaos, and Malcolm Gladwell's favorite, tipping point. Your favorite person. Um, so that was the alarmist language. Right. And it creates this sense of enormity. Right. And that sense of enormity creates a reasonable distancing from the problem. Like, there's nothing I can do about this, there's nothing any of us can do about this. Yeah. But again, back to my Hungry-Man dinner. I have to admit, well, it's a pretty normal human thing to do. Well, basically you're kind of being charged with, like, save the planet now or humans will be extinct in the blink of an eye. Right. And who's not going to shrink from that a little bit and say, where's my Hungry-Man? Right. But they also found that there are two other large categories that climate change reporting language can be put into. There's also, um, non-pragmatic optimism, which is basically like, it's not gonna happen in our lifetime, who cares? Right. Um, and then there's pragmatic optimism, which is like, you can change your light bulbs and make a difference. We can save ourselves, but we have to do something. That's pragmatic optimism. But what they found was that far and away the most common type was alarmism. Right, because it sells, uh, you know, headlines. It sells headlines. It sells headlines, it sells papers. It does, but there really aren't papers anymore. So. Exactly. So, Chuck. Josh. The guys from this British think tank, what was it, uh, the Institute for Public Policy Research? They, I don't think they meant to, which is fine, but they failed to show causation. Like, you've got alarmist language, and you have people who don't care about climate change any longer, are afraid of the problem, aren't thinking about it. Um, they didn't show that one caused the other. They just showed that they're correlated.
But other studies have shown that if you manipulate the public in a certain way, you're gonna have a counterproductive reaction from them. Right. Yeah, not the reaction that you're intending. Right. Which I thought was really interesting. The Northwestern study you cited, in two thousand ten in Canada, they did a public service announcement that said, uh, binge drinking is a bad thing. You shouldn't binge drink. You should feel guilt and shame for binge drinking. But what happened? Well, they found that people who were already sensitive to the feelings of guilt and shame, and were then shown this, tended to drink too. They were likelier to go binge drink within the next two weeks than people who were exposed to these ads and had some sort of neutral emotion going on. Right. That's right. And even scarier, there was one in two thousand nine in the Journal of Experimental Social Psychology, which is a fun read. Um, cigarette packs. They found that really blatant, aggressive messages like smoking can kill you actually increased smoking in some people. Right, a very specific set of people. But if you're trying to get people across the board to quit smoking, it would be a good idea to not use these death-related messages. You want to go for death-neutral messages. Is that what it's called? That's what it's called. Really? Yeah. So, um, that study, that two thousand nine study from the Journal of Experimental Social Psychology, um, was conducted by some terror management theorists, Chuck. That's right. Terror management theory is the coolest-sounding theory of all time. It has nothing to do with terrorism, so all of you who are listening to this podcast hoping for one on terrorism, you can now be officially disappointed. Yeah, and when I first saw terror management theory, the first thing I thought of was terrorism. I think of terror. You know what I see in my mind? I see, like, a seventies horror anthology paperback cover. Terror Management.
Yeah. Right, like, um, yeah, like maybe a red skull, or maybe the cover to Stephen King's Danse Macabre. I've never seen that. Yeah. Nice reference. Danse Macabre, uh, however you say it. You had that great funny line at the end of that one podcast, which was one that just came out recently. When I was doing the QA for the episode, I genuinely, like, was cracking up at my desk. That is awesome. It's very good. Okay, that happens all the time, though. Um, so, where were we? Terror management theory. This was created in the nineteen eighties by psychologists at the University of Missouri. Go, uh, Tigers. Tigers. Prairie Tigers. Prairie Tigers. Go, Prairie Tigers. And it's based on the work of Ernest Becker, who was the author of The Denial of Death, which a lot of you may have heard of, and we've talked about him before. Plenty. I think we've talked about him, and I think definitely in Is There a Worse Way to Die? he came up big time. Um, but he was an anthropologist who basically said, um, okay, all culture is created to distract us from being obsessed with our inevitable death. I think he was obsessed with death. He definitely was, but through his obsession he managed to free himself, I imagine. So that's what he thinks. It's a big distraction. Everything from sports on TV, television, probably war is used as a distraction and to create meaning in life. Well, actually, war was a big one that he used. He used the, I can't remember what it's called. It's not thanatology, that's like the study of death. Becker's was more like the obsession with death. Um, but basically he was saying, we are aware innately that we create culture to distract ourselves from death. So when we encounter another culture that's totally foreign, alien to us, and we look at their culture, we see how ridiculous it appears, their belief systems just appear ridiculous to us, and it reminds us that we do the same thing. Our beliefs are ridiculous.
And then we start thinking about our own deaths. So we want to destroy this culture that we encounter, because it's reminding us of our own death and we hate it. How does that fit with cultural relativism, I wonder? That's the opposite of it. Cultural relativism is also flawed, you know, we figured out, to a certain extent, because it just allows absolutely anything, you know, wholesale. But it's the opposite of what I just described, cultural relativism is. But terror management theory takes Becker's ideas and introduces them from anthropology to psychology, and has really kind of started to standardize them. So the idea is that you are afraid of your own demise, and so you cling like, uh, mad to the culture that you most identify with and have come to identify with. Yeah, and they keep finding, in study after study, that this theory holds up. And that's called what? Distal defense. Yes. Interesting. So let me give you an example of a study. There were a bunch of judges that were used in a study, um, and I guess they chose prostitution to use as, like, the same case, same offense. It was easily standardized, maybe. But they took some judges and basically were like, hey, you know, you're gonna die eventually, to one group. Then they did another group, where they gave them some activity that basically guaranteed, um, no death saliency, I believe, where they weren't thinking about death at all. And they sent both groups out to handle their prostitution cases, and the judges who had been reminded of their mortality tended to throw the book at prostitutes. And the researchers theorized, or postulated, that the reason these guys threw the book at prostitutes was because they were buying into their culture. They were clinging to their culture's morals and laws and what they believed in. And if you think about it, it makes perfect sense, Chuck.
You've been hungover, like tragically hungover, before, right? Uh, so you don't want to take any risks whatsoever. You kind of feel like you're at death's door. You're on some sort of edge, right there where you're really hanging out there, and you're really vulnerable. And like, maybe, you know, a movie makes you cry while you're watching it on the couch, or like you really need to be around Emily and have her support right then. Right. Yeah, I think that's the root of it, that's the basis of it. When we're reminded of death, we cling to what we find comfort in, and we tend to find comfort in the charade of society or culture that we create. And I bet you could even trickle it down a little bit, even beyond death, to when you're feeling most fragile and vulnerable, which obviously death would be the ultimate vulnerability. Exactly. But that's when you need support most, and you cling to what you know most. Right. But I mean, like, if you got fired from a job, or, you know, there was some disruption to your normal life, you're going to cling to it. So that's distal. That's the distal defense. There's also proximal, and this is what regards climate change. Yeah, and that's when you downplay the seriousness of something. Like, um, that's kind of me. I don't think about death much, and I'm one to really, like, play it down, probably. Like, oh, you know, I'm gonna die when I'm old, like everybody, but I don't want to think about that. You just downplay the significance of it. Right. I wonder, though, I mean, like, is that the best you can hope for, to die when you're old? No, to just kind of downplay it and, like, genuinely not worry about it. Is that really flawed? I hope not.
I never think about death, but I guess that's why I asked, because I have the same approach. Like, I tend to think, well, okay, I'm aware that I'm going to die one day. Um, you know, I'm aware of death. I don't think about it. I'm not obsessed with it, especially not my own. Um, but I wonder sometimes, like, in those last few minutes, am I going to freak out? Because people do. Yeah. You don't see that in the movies. Emily is very preoccupied with death. Yeah? She doesn't talk about it much, but she's got kind of a very dark side. Like, I'm driving home and I could easily slip off the road and hit that tree, or there's a tornado watch in the area and a tree is gonna fall on our house and kill us all in our sleep. Whereas I don't think about that stuff. But how does she react to that? Um, I think there's a low level of anxiety, probably. Yeah, she would probably agree with that. Yeah. She doesn't listen to the show. That's fine, say whatever you want now. Um, so you've got proximal and distal defense mechanisms when reminded of mortality. Right. Yeah, and I imagine both can probably happen too, right? Not one or the other. Right. Um, but with the doomsday scenario and climate change, it seems like terror management can explain that, um, through the proximal defense, where you just downplay it, and you're like, okay, well, who cares? It's not that big of a deal. It's not gonna happen in our lifetime, that kind of thing. Um, so I guess if we were to, uh, give advice to the media, which we never do in an article. Yeah. And the media doesn't listen to us anyway, even though we're sort of a part of it. Are we? I don't know if we are media. Have we fully concluded that? Yeah. Yeah. Um, it would be to basically adopt the more pragmatic, optimistic approach, which I think tends to work. But it doesn't look like the media is gonna do it, Chuck, because, well, tell them about CFLs. Yes, Josh. CFLs. Um.
When they first came out, it was all the rage in green living, and soon everyone was like, yeah, you know what, I can deal with the fact that it looks a little funny and I'm not used to this whiter light. Um, I'm gonna do it, because I'm gonna do my part, my very small part, like I can enact change against climate change. And then the media all of a sudden started reporting stories about mercury. There's mercury in CFLs, and you're gonna die if you use them. Exactly. Alarmist language. It's just sensationalism, man. It's been around since the first words were printed. Extra, extra, read all about it, like, you're going to die from the plague. And it's still around. In this case, it's climate change, and it's making people shrink under their couch instead of doing small things that can actually make a difference. Prescription, Dr. Bryant? Um, you know what I do? I don't read, I don't watch local news. I don't watch, I don't watch any news, really. No? No. Do you? Uh, no, now that I think about it, not really. I mean, I watch selective stuff, like if there's a good video or something, I'll watch it. But I think I get most of my news from Twitter or magazines. Yeah, I get most of mine from the Internet, and a lot of times not from, you know, leading sources on the Internet. You can usually get an honest truth if you seek out some of these other websites. Okay. I wish I could think of one. So, terror management theory, and Ernest Becker makes another cameo. Thank you, Dr. Becker, for showing up. Um, if you want to learn more about doomsday climate scenarios, terror management theory, tobacco warnings, this article's got it all. You can type doomsday and climate change into the handy search bar, and I think it'll bring this article up. Which brings up listener mail. That's right, Josh. We're gonna call this, uh, These Kids Are Ripping Us Off. Is this Catholic Stuff You Should Know now?
This is tongue in cheek, um. So, Josh. So, Chuck. We are teachers. So, Jerry. We are teachers at Mountain Ridge Middle School in Colorado Springs, at the base of America's Mountain, Pikes Peak. We were inspired by your podcasts over the years and became regular listeners, and decided our eighth grade class should write podcasts in the style of Stuff You Should Know, record them, and edit them. Sounds like a cool exercise. Yeah, cool exercise. So we shared portions of some of your podcasts with the kids and asked them to develop the plan and rubric based on the elements that they heard in your show: the Chuck-and-Josh-isms, jazzy theme song, jokes and stats demonstrating a thorough understanding, listener mail, the whole soup to nuts, basically. Um, the students were permitted to choose their topics. We encouraged them to find out what's the deal with whatever they were curious about. Topics ranged from hovercrafts to grilled meat and carcinogens to hypoglycemia. So these are some smart kids. Um, the sound quality isn't as crisp, obviously, but we think you'll get a kick out of it. You should have heard our first one, before we got microphones. Um, the most interesting aspect, though, guys, was we were really surprised who excelled. One pair of students didn't have a history of being the most academically motivated, but their delivery was really smooth and professional. And the emo kids were surprisingly funny. They seem to get the Chuck-and-Josh banter and, uh, could become excellent radio personalities one day. Awesome. Another group, um, shall we say, is not the most socially proficient face to face, but behind the microphone they really came alive, had a fluid delivery, and were animated. The chemistry was amazing between them. So that is from Emily and Sean, who are the TAG coordinator and IB coordinator, the enrichment team leaders. And we have a clip. They sent us a clip, and we're gonna play just a little snippet right now.
Hey from Colorado Springs, Colorado. It's just like mother used to make. And here's our host with the face for radio. Dim sum originated from the older tradition of tea tasting. When people discovered that tea helps digestion, they began adding food to tea time, giving them dim sum. Dim sum is a dish that involves small, individual portions of food, usually served in a steamer basket or on a plate. Dim sum is typically served as a breakfast. This usually includes vegetables, steamed dumplings, roasted chicken and rice, and noodles. Awesome. Pretty cool, huh? Yeah, little emulators. Yes, little rip-off artists. Yeah, I know, I'm just kidding. You guys will be hearing from our lawyers. That was an excellent job, and we want to tell the whole class, way to go. And that's really cool that your teachers did this. Yeah. So thank you, Emily and Sean, for letting us know about this. We appreciate it. You guys keep on keeping on, keep doing all sorts of cool things. Don't be so judgmental of the emo kids. Emo kids are people too. If you have a cool theory that you'd like us to hear, I'm always down for cool theories, aren't you? Send it to us via email at stuffpodcast at howstuffworks.com.

For more on this and thousands of other topics, visit howstuffworks.com. To learn more about the podcast, click on the podcast icon in the upper right corner of our homepage. The HowStuffWorks iPhone app has arrived. Download it today on iTunes.

Brought to you by the reinvented two thousand twelve Camry. It's ready. Are you?