1 00:00:03,080 --> 00:00:05,920 Speaker 1: Welcome to Stuff to Blow your Mind from How Stuff 2 00:00:05,920 --> 00:00:14,560 Speaker 1: Works dot com. Hey you, welcome to Stuff to Blow 3 00:00:14,560 --> 00:00:17,560 Speaker 1: your Mind. My name is Robert Lamb and I'm Joe McCormick. 4 00:00:17,600 --> 00:00:19,560 Speaker 1: And today we're going to do an episode following up 5 00:00:19,560 --> 00:00:22,479 Speaker 1: on a panel Robert saw this year at the World 6 00:00:22,600 --> 00:00:26,920 Speaker 1: Science Festival in New York. So we're gonna talk about 7 00:00:27,040 --> 00:00:31,080 Speaker 1: topics having to do with bias, belief, public opinion, and 8 00:00:31,160 --> 00:00:34,519 Speaker 1: science communication. And I think we should start in some 9 00:00:34,640 --> 00:00:37,840 Speaker 1: territory that's sure to annoy at least a few listeners 10 00:00:38,000 --> 00:00:40,280 Speaker 1: right from the get go. So Robert, I got a 11 00:00:40,320 --> 00:00:42,800 Speaker 1: pop quiz for you. Don't look at the numbers. If 12 00:00:42,800 --> 00:00:45,640 Speaker 1: you had to guess, how many Americans do you think 13 00:00:45,880 --> 00:00:50,240 Speaker 1: accept the scientific consensus on global warming? Oh, well, of 14 00:00:50,280 --> 00:00:52,280 Speaker 1: course that's a big one, because of course, when you 15 00:00:52,320 --> 00:00:54,360 Speaker 1: when you go to answer that question, you think about 16 00:00:54,960 --> 00:00:57,720 Speaker 1: your your immediate, you know, sphere of influence and the 17 00:00:57,720 --> 00:01:01,200 Speaker 1: people you know, or perhaps you think of media representations 18 00:01:01,200 --> 00:01:04,240 Speaker 1: on the question. So it depends on on 19 00:01:04,319 --> 00:01:07,880 Speaker 1: the reporting, on the on the panels of experts you're 20 00:01:07,880 --> 00:01:11,440 Speaker 1: presented with. I would, I would tend to guess, just off 21 00:01:11,440 --> 00:01:13,280 Speaker 1: the top of my head, I'd want to rate 22 00:01:13,360 --> 00:01:20,319 Speaker 1: it pretty high, the share of people who accept the scientific consensus. You're not too 23 00:01:20,360 --> 00:01:24,640 Speaker 1: far off, but that's optimistic. Uh so. Gallup has been 24 00:01:24,680 --> 00:01:28,440 Speaker 1: tracking Americans' beliefs about global warming for over a decade now, 25 00:01:29,160 --> 00:01:31,240 Speaker 1: and some of the questions they tend to ask people 26 00:01:31,280 --> 00:01:34,319 Speaker 1: are more subjective. It's things like, do you worry a 27 00:01:34,360 --> 00:01:39,960 Speaker 1: great deal about global warming? And uh, forty five percent of people 28 00:01:40,000 --> 00:01:43,840 Speaker 1: said yes. That's up from thirty seven percent in sixteen, 29 00:01:43,880 --> 00:01:48,480 Speaker 1: and up from thirty two percent in fifteen. But technically, I mean, 30 00:01:48,480 --> 00:01:51,480 Speaker 1: it's worth pointing out that there's no objective fact about 31 00:01:51,520 --> 00:01:54,160 Speaker 1: whether you should be worried or not. Maybe you don't care. 32 00:01:55,000 --> 00:01:57,360 Speaker 1: Oh yeah, it comes down to worry, and what, what, 33 00:01:57,440 --> 00:02:00,880 Speaker 1: to what extent are you worrying a great deal about something? Right? Like, 34 00:02:00,920 --> 00:02:03,600 Speaker 1: you can you can realize something is a vital threat 35 00:02:03,720 --> 00:02:07,120 Speaker 1: to the human race. You can just not care anyway. 
Right, 36 00:02:07,160 --> 00:02:09,160 Speaker 1: you can say, well, if it happens, it happens, or 37 00:02:09,200 --> 00:02:11,520 Speaker 1: you know, or, well, that's a problem for 38 00:02:11,600 --> 00:02:13,800 Speaker 1: the next generation to figure out. There are various ways 39 00:02:13,840 --> 00:02:16,880 Speaker 1: of calculating that question in your head. Right. It also 40 00:02:16,960 --> 00:02:19,560 Speaker 1: hinges on the word worry, like maybe you do care 41 00:02:19,600 --> 00:02:22,640 Speaker 1: about fighting climate change, but you wouldn't characterize your feeling 42 00:02:22,680 --> 00:02:26,000 Speaker 1: as worried. You're invigorated by the idea of trying to 43 00:02:26,000 --> 00:02:28,200 Speaker 1: do something about it. Right, Are you worried about it 44 00:02:28,400 --> 00:02:31,280 Speaker 1: versus do you think this is, this is a problem 45 00:02:31,360 --> 00:02:34,720 Speaker 1: that governments should work together to address. Right. 46 00:02:34,800 --> 00:02:38,960 Speaker 1: Some of the other questions have straightforwardly right or wrong answers. 47 00:02:39,160 --> 00:02:44,680 Speaker 1: For example, this year seventy one percent of Americans said they 48 00:02:44,720 --> 00:02:49,600 Speaker 1: agree that most scientists believe global warming is occurring. Right, 49 00:02:49,639 --> 00:02:53,200 Speaker 1: So the question is, do you think most scientists believe 50 00:02:53,400 --> 00:02:57,600 Speaker 1: global warming is occurring? Seventy one percent said yes. That's up from 51 00:02:57,680 --> 00:03:03,240 Speaker 1: sixty five percent in sixteen and around sixty percent before that, so there's there's a climb 52 00:03:03,240 --> 00:03:05,919 Speaker 1: in that number. More people are saying yes, I think 53 00:03:05,960 --> 00:03:09,880 Speaker 1: most scientists believe that the Earth is warming. There is 54 00:03:09,960 --> 00:03:12,960 Speaker 1: just an objective fact to the matter about whether most 55 00:03:13,000 --> 00:03:16,680 Speaker 1: scientists or most climate scientists believe the planet is warming. 56 00:03:16,960 --> 00:03:21,079 Speaker 1: They do. There's no debate about that. Now, what has 57 00:03:21,120 --> 00:03:24,720 Speaker 1: been reasonably debated is the exact figure of the agreement, 58 00:03:25,000 --> 00:03:28,639 Speaker 1: because it's not necessarily easy to calculate exactly what number 59 00:03:28,720 --> 00:03:33,120 Speaker 1: of scientists agree with a certain proposition. Right, Yeah, I mean, 60 00:03:33,120 --> 00:03:36,680 Speaker 1: if you if you just give yourself this assignment and 61 00:03:36,720 --> 00:03:40,680 Speaker 1: start hitting Wikipedia. Yeah, you're gonna find lists of scientists 62 00:03:40,680 --> 00:03:43,200 Speaker 1: who are either opponents or proponents. But then when you 63 00:03:43,240 --> 00:03:45,240 Speaker 1: start trying to peel back and figure out who these 64 00:03:45,280 --> 00:03:48,480 Speaker 1: people are and what their fields of expertise are, it 65 00:03:48,800 --> 00:03:52,040 Speaker 1: just gets increasingly complicated. Right, But there are studies that 66 00:03:52,080 --> 00:03:55,000 Speaker 1: look into this. They try to impose a methodology and say, Okay, 67 00:03:55,040 --> 00:03:59,200 Speaker 1: what do scientists think or what has the published literature said. 
Now, 68 00:03:59,240 --> 00:04:01,600 Speaker 1: one study like this was published in two thousand nine 69 00:04:01,640 --> 00:04:07,720 Speaker 1: in EOS, Transactions of the American Geophysical Union, Uh good 70 00:04:07,840 --> 00:04:12,240 Speaker 1: good professional publication title there, and it's called Examining the 71 00:04:12,240 --> 00:04:15,200 Speaker 1: Scientific Consensus on Climate Change. And so what they did 72 00:04:15,320 --> 00:04:18,440 Speaker 1: is they sent invitations out to more than ten thousand 73 00:04:18,520 --> 00:04:22,440 Speaker 1: Earth scientists, basically all of the geoscientists they could find 74 00:04:22,480 --> 00:04:26,800 Speaker 1: at universities and public research institutions, with two survey questions. 75 00:04:26,800 --> 00:04:30,400 Speaker 1: And these were the two questions. First question, when compared 76 00:04:30,440 --> 00:04:33,960 Speaker 1: with pre eighteen hundreds levels, do you think that mean 77 00:04:34,040 --> 00:04:39,000 Speaker 1: global temperatures have generally risen, fallen, or remained relatively constant? 78 00:04:39,720 --> 00:04:43,080 Speaker 1: And then the second question, do you think human activity 79 00:04:43,480 --> 00:04:48,320 Speaker 1: is a significant contributing factor in changing mean global temperatures? 80 00:04:48,360 --> 00:04:52,080 Speaker 1: So of the people they pinged with this survey, three thousand, 81 00:04:52,080 --> 00:04:55,440 Speaker 1: one hundred forty six geoscientists responded. They said, that's about a 82 00:04:55,440 --> 00:04:59,799 Speaker 1: standard survey response rate over all of the earth sciences. 83 00:05:00,120 --> 00:05:04,960 Speaker 1: Ninety percent of participants answered risen to the first question. Now, 84 00:05:05,040 --> 00:05:08,479 Speaker 1: that's geoscientists. That's people who study the Earth in any way, 85 00:05:08,520 --> 00:05:14,919 Speaker 1: so geologists, oceanographers, hydrologists, meteorologists, economic geologists. Um, what about 86 00:05:14,960 --> 00:05:18,480 Speaker 1: people who study the climate specifically? Well, of the subset 87 00:05:18,520 --> 00:05:21,480 Speaker 1: of respondents who were experts in climate science and had 88 00:05:21,480 --> 00:05:24,159 Speaker 1: published more than half of their recent papers weighing in 89 00:05:24,200 --> 00:05:27,599 Speaker 1: on the subject of climate change, ninety six point two 90 00:05:27,600 --> 00:05:31,680 Speaker 1: percent, or seventy six of seventy nine, answered risen to 91 00:05:31,839 --> 00:05:35,280 Speaker 1: the first question. That's a high, that's a high percentage 92 00:05:35,360 --> 00:05:39,640 Speaker 1: right there, right yeah. And there have been multiple other 93 00:05:39,680 --> 00:05:43,679 Speaker 1: studies that use different methodologies to ask slightly different questions, 94 00:05:43,720 --> 00:05:47,359 Speaker 1: but all of them have found overwhelming agreement among scientists 95 00:05:47,360 --> 00:05:50,839 Speaker 1: in general, and especially among climate scientists in particular, that 96 00:05:50,920 --> 00:05:54,400 Speaker 1: the planet is rapidly warming. So those seventy one percent 97 00:05:54,480 --> 00:05:57,440 Speaker 1: of Americans who say that most scientists believe global warming 98 00:05:57,560 --> 00:06:01,880 Speaker 1: is happening, they are factually correct. Those who disagreed are incorrect. 
99 00:06:01,960 --> 00:06:03,960 Speaker 1: Though it is worth saying that a large share of 100 00:06:04,000 --> 00:06:06,960 Speaker 1: the people who didn't agree with that said they were unsure. 101 00:06:07,160 --> 00:06:10,120 Speaker 1: So if you're unsure, you're unsure. But if you if 102 00:06:10,160 --> 00:06:13,680 Speaker 1: you didn't agree with that, you are incorrect. But then, 103 00:06:13,720 --> 00:06:15,920 Speaker 1: of course, there you get into questions in the global 104 00:06:15,920 --> 00:06:18,039 Speaker 1: warming debate that are not quite as cut and dry 105 00:06:18,080 --> 00:06:21,160 Speaker 1: as whether the majority of scientists agree that the Earth 106 00:06:21,240 --> 00:06:26,479 Speaker 1: is warming. For example, what's causing the warming? Is global 107 00:06:26,520 --> 00:06:30,280 Speaker 1: warming caused by human activity, primarily greenhouse gas emissions, or 108 00:06:30,360 --> 00:06:33,640 Speaker 1: by natural causes? Well, it shouldn't be surprising that there 109 00:06:33,640 --> 00:06:36,200 Speaker 1: have been plenty of attempts to study the opinion of 110 00:06:36,240 --> 00:06:39,800 Speaker 1: scientists on this question as well. So, for example, in 111 00:06:39,880 --> 00:06:42,600 Speaker 1: that same survey from two thousand nine we just mentioned, 112 00:06:43,240 --> 00:06:46,719 Speaker 1: eighty two percent of all Earth scientists said yes, 113 00:06:47,040 --> 00:06:50,640 Speaker 1: that humans are a major contributing factor, and ninety seven 114 00:06:50,720 --> 00:06:56,200 Speaker 1: point four percent of active climate researchers said yes. Again, 115 00:06:56,240 --> 00:06:59,080 Speaker 1: these are high percentages. If these were the experts telling 116 00:06:59,120 --> 00:07:01,440 Speaker 1: me that I should cut something out of my diet 117 00:07:02,600 --> 00:07:04,440 Speaker 1: or maybe, you know, make some other sort of major 118 00:07:04,520 --> 00:07:07,200 Speaker 1: change in my life, I would be seriously inclined to 119 00:07:07,200 --> 00:07:09,720 Speaker 1: listen to them. Okay, sure, but maybe maybe you say, well, 120 00:07:09,760 --> 00:07:12,320 Speaker 1: that's just one study, but has anybody else studied this? 121 00:07:12,520 --> 00:07:15,760 Speaker 1: Actually yes. So a commonly cited figure is from a 122 00:07:16,640 --> 00:07:19,800 Speaker 1: two thousand thirteen study in the journal Environmental Research Letters by John Cook 123 00:07:19,960 --> 00:07:23,160 Speaker 1: et al. What they did is they looked at abstracts 124 00:07:23,160 --> 00:07:25,680 Speaker 1: of published papers on the subject. They looked at about 125 00:07:25,680 --> 00:07:29,240 Speaker 1: twelve thousand research papers published over the previous two decades, 126 00:07:29,960 --> 00:07:33,960 Speaker 1: and uh they found quote, sixty six point four percent 127 00:07:34,000 --> 00:07:38,360 Speaker 1: of abstracts expressed no position on anthropogenic global warming, meaning 128 00:07:38,400 --> 00:07:42,600 Speaker 1: human caused global warming, thirty two point six percent endorsed it, 129 00:07:42,880 --> 00:07:46,840 Speaker 1: and zero point seven percent rejected it, and zero point 130 00:07:46,880 --> 00:07:49,560 Speaker 1: three percent were uncertain about the cause of global warming. 
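To make the arithmetic behind those survey figures concrete, here is a minimal sketch in Python. It is not from the episode or from the papers themselves; the variable names and the rounding note are our own. It simply shows how a roughly ninety seven percent consensus share follows from the abstract breakdown just quoted, and repeats the seventy six out of seventy nine calculation from the earlier geoscientist survey.

```python
# Back-of-the-envelope check of the consensus arithmetic quoted in the episode.
# The percentages and counts below come from the discussion above; the variable
# names and this framing are illustrative, not taken from the original studies.

no_position = 66.4   # % of abstracts taking no position on human-caused warming
endorse = 32.6       # % of abstracts endorsing human-caused warming
reject = 0.7         # % rejecting it
uncertain = 0.3      # % uncertain about the cause

# Restrict to abstracts that actually stated a position on the cause.
stated_position = endorse + reject + uncertain

# Share of position-taking abstracts that endorse the consensus.
consensus_share = 100 * endorse / stated_position
print(f"Consensus share among position-taking abstracts: {consensus_share:.1f}%")
# Prints about 97.0%; the slightly higher 97.1% quoted next comes from the
# unrounded abstract counts rather than these rounded percentages.

# The earlier survey's climate-specialist figure works the same way:
specialists_risen, specialists_total = 76, 79
print(f"Specialists answering 'risen': {100 * specialists_risen / specialists_total:.1f}%")
# Prints 96.2%, matching the figure cited above.
```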
131 00:07:50,120 --> 00:07:52,800 Speaker 1: So this means that among papers that expressed a view 132 00:07:52,880 --> 00:07:56,360 Speaker 1: on the cause of global warming, ninety seven point one 133 00:07:56,480 --> 00:07:59,800 Speaker 1: percent endorsed the consensus. Now, I want to just cut 134 00:07:59,800 --> 00:08:01,800 Speaker 1: in here real quick and say if it sounds like 135 00:08:01,840 --> 00:08:03,160 Speaker 1: we're just hitting you over the head with a bunch 136 00:08:03,160 --> 00:08:06,800 Speaker 1: of figures and numbers to drive home the fact that 137 00:08:06,920 --> 00:08:09,560 Speaker 1: climate change is caused by human activity, the vast 138 00:08:09,560 --> 00:08:12,400 Speaker 1: majority of this episode is dealing not with those facts, 139 00:08:12,400 --> 00:08:15,080 Speaker 1: but how we process those facts. Right, We just want 140 00:08:15,080 --> 00:08:18,760 Speaker 1: to establish clearly what the scientific consensus is beyond any 141 00:08:19,000 --> 00:08:22,400 Speaker 1: reasonable doubts. We will get to science communication and public 142 00:08:22,600 --> 00:08:26,080 Speaker 1: consumption of the information in a bit. Right. Uh. So 143 00:08:26,280 --> 00:08:28,680 Speaker 1: you've got this Cook paper, and obviously there are a lot 144 00:08:28,680 --> 00:08:31,840 Speaker 1: of people in the general public who don't agree with climate change. 145 00:08:31,880 --> 00:08:35,520 Speaker 1: So there's been plenty of criticisms of the Cook paper's methodology. 146 00:08:35,559 --> 00:08:39,120 Speaker 1: For example, there was a Dutch born economist named 147 00:08:39,160 --> 00:08:42,040 Speaker 1: Richard Tol who criticized the Cook study and tried to 148 00:08:42,080 --> 00:08:45,520 Speaker 1: revise the estimates down. Uh. Tol is often cited as 149 00:08:45,559 --> 00:08:48,120 Speaker 1: a critic of the consensus on global warming. But even 150 00:08:48,160 --> 00:08:51,880 Speaker 1: when he revised the numbers down, he recrunched them and said, no, 151 00:08:52,040 --> 00:08:54,400 Speaker 1: actually it's not as high as they said. He found 152 00:08:54,559 --> 00:08:59,600 Speaker 1: ninety one percent agreement instead of ninety seven percent agreement. And I should 153 00:08:59,640 --> 00:09:03,160 Speaker 1: add that Cook also defended their original figure in 154 00:09:03,160 --> 00:09:07,200 Speaker 1: a response article that attributed Tol's lower figure to 155 00:09:07,240 --> 00:09:11,800 Speaker 1: a math error. One last study before we move on, Uh, 156 00:09:11,960 --> 00:09:15,920 Speaker 1: Anderegg et al., Expert Credibility in Climate Change, Proceedings of 157 00:09:15,920 --> 00:09:19,720 Speaker 1: the National Academy of Sciences, two thousand ten. 
They used 158 00:10:19,720 --> 00:10:22,720 Speaker 1: a data set of one thousand, three hundred seventy two climate 159 00:10:22,760 --> 00:10:28,040 Speaker 1: researchers to determine that most climate scientists in general were 160 00:10:28,040 --> 00:10:30,720 Speaker 1: convinced that human activity was the main cause of global 161 00:10:30,760 --> 00:10:34,120 Speaker 1: warming, and of climate scientists who were actively publishing in 162 00:10:34,120 --> 00:10:38,760 Speaker 1: the field, between ninety seven and ninety eight percent of them agreed with 163 00:10:38,800 --> 00:10:41,640 Speaker 1: the findings of the Intergovernmental Panel on Climate Change, the 164 00:10:41,679 --> 00:10:44,520 Speaker 1: IPCC, which concluded that the main reason 165 00:10:44,559 --> 00:10:49,440 Speaker 1: the climate is changing is human activity, primarily greenhouse gas emissions. 166 00:10:49,480 --> 00:10:53,480 Speaker 1: So this is pretty unambiguous any way you cut it. The 167 00:10:53,640 --> 00:10:57,320 Speaker 1: large majority of scientists, especially those that study the climate 168 00:10:57,360 --> 00:11:00,360 Speaker 1: directly and publish in the field, agree that the Earth 169 00:11:00,400 --> 00:11:03,320 Speaker 1: is warming and that human activity is the main cause. 170 00:11:04,280 --> 00:11:08,000 Speaker 1: But let's go back to those Gallup results. Only seventy one percent of 171 00:11:08,040 --> 00:11:11,160 Speaker 1: Americans agree with the objective fact that the majority of 172 00:11:11,200 --> 00:11:14,640 Speaker 1: scientists believe the Earth is warming, and only sixty eight 173 00:11:14,679 --> 00:11:18,880 Speaker 1: percent of Americans agree with the clear consensus that human 174 00:11:18,920 --> 00:11:22,560 Speaker 1: activity is a significant cause of this warming. And this 175 00:11:22,679 --> 00:11:27,319 Speaker 1: gap between the expert consensus and public opinion is sometimes 176 00:11:27,360 --> 00:11:31,439 Speaker 1: maddening to scientists and science communicators, Like if you are 177 00:11:31,480 --> 00:11:34,560 Speaker 1: a dissenting climate researcher and you've got some reason of 178 00:11:34,600 --> 00:11:39,240 Speaker 1: your own to disagree, sure, but why would non experts 179 00:11:39,360 --> 00:11:42,920 Speaker 1: like you and me disagree with the overwhelming majority of 180 00:11:42,960 --> 00:11:46,360 Speaker 1: people who know what they're talking about. And the ungenerous 181 00:11:46,440 --> 00:11:49,600 Speaker 1: question that comes out of frustration with this situation is 182 00:11:49,880 --> 00:11:54,320 Speaker 1: why don't they believe in science? And is that really 183 00:11:54,320 --> 00:11:56,840 Speaker 1: what's going on? So that's a question we wanted to 184 00:11:56,880 --> 00:12:00,480 Speaker 1: ask today. Is the situation really that 185 00:12:00,559 --> 00:12:03,800 Speaker 1: they don't believe in science, or is it that there's something 186 00:12:03,880 --> 00:12:08,400 Speaker 1: particular to this issue where their ability to judge science 187 00:12:08,480 --> 00:12:14,240 Speaker 1: has been corrupted. Now, obviously you're listening to a science podcast, 188 00:12:14,920 --> 00:12:17,640 Speaker 1: so we're not gonna try and belabor the point 189 00:12:17,679 --> 00:12:20,800 Speaker 1: here too much. But of course science, science is the 190 00:12:20,800 --> 00:12:24,079 Speaker 1: true path as a systematic exploration of the universe and 191 00:12:24,120 --> 00:12:26,880 Speaker 1: the properties that govern it. 
I mean, this has allowed 192 00:11:26,960 --> 00:11:30,240 Speaker 1: humanity a path out of darkness, ignorance, and disease. It 193 00:11:30,320 --> 00:11:33,520 Speaker 1: underlies all the marvels of modern technology and provides us 194 00:11:33,559 --> 00:11:36,200 Speaker 1: with a chance for humanity's long term survival on 195 00:11:36,440 --> 00:11:40,959 Speaker 1: and potentially beyond Earth. I mean, if you listen to 196 00:11:41,000 --> 00:11:43,160 Speaker 1: the podcast, you know I'm not saying that that science 197 00:11:43,440 --> 00:11:47,800 Speaker 1: will answer all of life's questions, especially teleological questions 198 00:11:47,800 --> 00:11:50,280 Speaker 1: related to, um, like why do I exist? What's my 199 00:11:50,320 --> 00:11:52,680 Speaker 1: purpose in life? Right? But if you're trying to answer 200 00:11:52,760 --> 00:11:57,120 Speaker 1: facts about the material universe with reliable accuracy, uh, I 201 00:11:57,160 --> 00:12:00,360 Speaker 1: mean we're science advocates around here, right, you have the 202 00:12:00,400 --> 00:12:04,120 Speaker 1: science, you have the scientists, and here's the thing. For 203 00:12:04,200 --> 00:12:07,400 Speaker 1: the most part, we trust it, or at the very 204 00:12:07,480 --> 00:12:10,320 Speaker 1: least we claim to trust it. That's right. As pointed 205 00:12:10,320 --> 00:12:13,600 Speaker 1: out in an often cited, uh, Pew Research study from 206 00:12:13,640 --> 00:12:19,040 Speaker 1: two thousand sixteen, Americans in particular trust only the military 207 00:12:19,080 --> 00:12:21,679 Speaker 1: over scientists. So there's a whole ranking. Yeah. So they 208 00:12:21,760 --> 00:12:25,160 Speaker 1: repeatedly do this study where they'll put up public institutions 209 00:12:25,200 --> 00:12:27,760 Speaker 1: and they'll ask you how much trust you have in them, 210 00:12:27,840 --> 00:12:31,080 Speaker 1: like great deal of trust, a significant amount of trust, 211 00:12:31,160 --> 00:12:34,320 Speaker 1: not too much, exactly like that, And people will rank 212 00:12:34,440 --> 00:12:37,440 Speaker 1: these different institutions, including things like the media. You can 213 00:12:37,480 --> 00:12:40,760 Speaker 1: guess where that goes. But how does the ranking work out? Well, 214 00:12:40,920 --> 00:12:43,160 Speaker 1: based on this two thousand and sixteen ranking, it goes 215 00:12:43,200 --> 00:12:47,400 Speaker 1: as follows: military at the top, then medical scientists, scientists, 216 00:12:47,920 --> 00:12:51,920 Speaker 1: K through twelve principals, religious leaders, news media, business, and 217 00:12:51,920 --> 00:12:54,760 Speaker 1: finally elected officials. Yeah. Now, one of the main people 218 00:12:54,760 --> 00:12:57,520 Speaker 1: we're gonna be talking about from this twenty seventeen World Science 219 00:12:57,559 --> 00:13:02,320 Speaker 1: Festival panel, Dan Kahan, he made this point 220 00:13:02,480 --> 00:13:04,320 Speaker 1: in the panel that I thought was great. He said, 221 00:13:04,440 --> 00:13:09,319 Speaker 1: even religious people say on this survey that they generally 222 00:13:09,400 --> 00:13:14,280 Speaker 1: trust science more than they trust religious leaders. And the 223 00:13:14,400 --> 00:13:18,320 Speaker 1: percentages for trusting these groups a great deal, because, like 224 00:13:18,360 --> 00:13:21,520 Speaker 1: you said, there are different levels of it in the questions. 
225 00:13:21,960 --> 00:13:24,880 Speaker 1: Uh So, the percentages for trusting these groups a great 226 00:13:24,920 --> 00:13:28,440 Speaker 1: deal range from thirty three percent for the military and twenty 227 00:13:28,480 --> 00:13:31,439 Speaker 1: four and twenty one percent for medical scientists and scientists, respectively, 228 00:13:32,120 --> 00:13:34,640 Speaker 1: down to a mere three percent for elected officials. Just 229 00:13:34,679 --> 00:13:37,600 Speaker 1: to let you know. What, politicians? Yeah, I don't know 230 00:13:37,600 --> 00:13:41,319 Speaker 1: why we don't trust them more. But but particularly noteworthy 231 00:13:41,400 --> 00:13:43,600 Speaker 1: here is the fact that most of the Americans polled, the 232 00:13:43,600 --> 00:13:48,959 Speaker 1: participants, expressed at least a fair amount of confidence in medical scientists 233 00:13:48,960 --> 00:13:54,240 Speaker 1: and scientists. So, uh, that alone, I think, is 234 00:13:54,360 --> 00:13:59,120 Speaker 1: telling about our overall cultural trust in scientific expertise. 235 00:13:59,240 --> 00:14:00,880 Speaker 1: And of course that's not even getting into the fact 236 00:14:00,880 --> 00:14:02,720 Speaker 1: that if if you're talking about the military, aren't you 237 00:14:02,720 --> 00:14:05,800 Speaker 1: also talking about military scientists. But that's that's kind of 238 00:14:05,800 --> 00:14:09,360 Speaker 1: a separate question. Yeah, and so that's interesting. Generally, people 239 00:14:09,400 --> 00:14:11,920 Speaker 1: say they put at least a fair amount of trust 240 00:14:12,000 --> 00:14:14,280 Speaker 1: in scientists. A lot of people put a great deal 241 00:14:14,320 --> 00:14:17,640 Speaker 1: of trust in scientists. People claim to believe in science, 242 00:14:17,679 --> 00:14:20,480 Speaker 1: people say that scientists are working with the 243 00:14:20,480 --> 00:14:25,400 Speaker 1: public's best interests at heart. But there are particular issues 244 00:14:25,960 --> 00:14:29,479 Speaker 1: where, for some reason, the public's understanding of the scientific 245 00:14:29,880 --> 00:14:33,280 Speaker 1: consensus gets very out of whack. Yeah. Yeah, we have 246 00:14:33,400 --> 00:14:37,320 Speaker 1: this polarizing effect, or we seem polarized based on what we 247 00:14:37,360 --> 00:14:40,840 Speaker 1: see and what we hear in the media. But how 248 00:14:40,880 --> 00:14:43,880 Speaker 1: can we hold such high opinions of scientists and science 249 00:14:43,880 --> 00:14:47,320 Speaker 1: in general and disagree on the clear scientific consensus regarding, 250 00:14:47,440 --> 00:14:50,520 Speaker 1: for instance, human driven climate change? Why are so many 251 00:14:50,520 --> 00:14:54,440 Speaker 1: of us scared away by the prospect of Frankenstein food? Um? 252 00:14:54,520 --> 00:14:59,000 Speaker 1: And then other topics, of course, are vaccine safety, 253 00:14:59,120 --> 00:15:02,160 Speaker 1: uh, um, evolution, and whether your kids should watch 254 00:15:02,200 --> 00:15:05,240 Speaker 1: Dinosaur Train? What is Dinosaur Train? Dinosaur Train's a lovely 255 00:15:05,280 --> 00:15:09,240 Speaker 1: show by the Jim Henson Company where dinosaurs travel in 256 00:15:09,360 --> 00:15:12,600 Speaker 1: time on a train through a wormhole and your child 257 00:15:12,680 --> 00:15:16,240 Speaker 1: memorizes all of these complex dinosaur names. That's wonderful. Yeah. 
258 00:15:16,560 --> 00:15:17,920 Speaker 1: And then of course another big one. This is one 259 00:15:18,040 --> 00:15:20,000 Speaker 1: I hadn't really thought about much, but I understand you've 260 00:15:20,080 --> 00:15:23,520 Speaker 1: you've covered it in the past for Forward Thinking. Yeah, is 261 00:15:24,080 --> 00:15:28,240 Speaker 1: the deep geological isolation of nuclear waste. Yeah, so, high 262 00:15:28,320 --> 00:15:31,840 Speaker 1: level radioactive waste. I mean, there's pretty clear scientific consensus 263 00:15:31,880 --> 00:15:33,600 Speaker 1: that the best thing we need to do with it 264 00:15:33,680 --> 00:15:36,360 Speaker 1: is bury it deep underground. But when you poll the 265 00:15:36,400 --> 00:15:39,920 Speaker 1: general public, even the educated public, people who say they're 266 00:15:39,920 --> 00:15:43,800 Speaker 1: into science, you you do not get the high levels 267 00:15:43,840 --> 00:15:46,960 Speaker 1: of agreement with the scientific consensus on this. And what 268 00:15:47,120 --> 00:15:49,840 Speaker 1: some of these UH things should indicate is, I mean, 269 00:15:49,880 --> 00:15:54,800 Speaker 1: it's no secret that the consensus on climate change has 270 00:15:54,880 --> 00:15:58,480 Speaker 1: been particularly identified with one half of the political spectrum, 271 00:15:58,520 --> 00:16:01,000 Speaker 1: at least in the United States. But this is not 272 00:16:01,040 --> 00:16:04,360 Speaker 1: to criticize conservatives, because there are plenty of these issues 273 00:16:04,480 --> 00:16:09,040 Speaker 1: where, where apparently, if you poll people, liberals are more 274 00:16:09,120 --> 00:16:12,160 Speaker 1: out of tune with the scientific consensus than conservatives are. 275 00:16:12,240 --> 00:16:15,080 Speaker 1: Like last time I saw, more liberals were out of 276 00:16:15,160 --> 00:16:18,920 Speaker 1: tune with the scientific consensus on vaccine safety, namely that 277 00:16:18,960 --> 00:16:23,280 Speaker 1: they are safe, uh, than conservatives. Now we've already mentioned 278 00:16:23,400 --> 00:16:26,640 Speaker 1: that the the World Science Festival panel that I attended, 279 00:16:26,760 --> 00:16:29,480 Speaker 1: UH, and then you watched online, and then you too, listener, 280 00:16:29,600 --> 00:16:32,320 Speaker 1: can watch online. I'll include either a link or embedded 281 00:16:32,400 --> 00:16:34,520 Speaker 1: video of it on the landing page for this episode at 282 00:16:34,560 --> 00:16:36,760 Speaker 1: Stuff to Blow Your Mind dot com. But the title of 283 00:16:36,800 --> 00:16:39,840 Speaker 1: the discussion was Science in a Polarized World. It was 284 00:16:39,880 --> 00:16:43,800 Speaker 1: moderated by author and journalist John Donvan, UH, and 285 00:16:44,120 --> 00:16:50,240 Speaker 1: the panelists included astrophysicist France Cordova, physicist and World Science 286 00:16:50,240 --> 00:16:55,760 Speaker 1: Festival co founder Brian Greene, geneticist Sir Paul Nurse, and, 287 00:16:56,080 --> 00:16:58,200 Speaker 1: most notably, one of the individuals we are probably 288 00:16:58,200 --> 00:17:00,520 Speaker 1: gonna spend the most time with here today, Yale law 289 00:17:00,560 --> 00:17:05,520 Speaker 1: professor and science communication expert Dan Kahan. He's the Elizabeth K. 
290 00:17:05,720 --> 00:17:09,200 Speaker 1: Dollard Professor of Law and Professor of Psychology at Yale 291 00:17:09,280 --> 00:17:13,600 Speaker 1: Law School, and his primary research interests are risk perception, 292 00:17:14,000 --> 00:17:18,240 Speaker 1: science communication, and the application of decision science to law 293 00:17:18,280 --> 00:17:21,080 Speaker 1: and policy making. Yeah, so I was looking at some 294 00:17:21,160 --> 00:17:24,560 Speaker 1: of his research in preparing for this episode, and it's 295 00:17:24,560 --> 00:17:26,880 Speaker 1: an interesting thing he's doing. Obviously he's not the only 296 00:17:26,880 --> 00:17:29,840 Speaker 1: person doing it, but trying to apply, for example, psychology, 297 00:17:30,000 --> 00:17:33,919 Speaker 1: psychological science, to law. So he's published things in the 298 00:17:33,960 --> 00:17:37,080 Speaker 1: Harvard Law Review that are saying, hey, you know, judges 299 00:17:37,119 --> 00:17:41,000 Speaker 1: should be aware that human brains tend to work like X, Y, 300 00:17:41,080 --> 00:17:45,360 Speaker 1: and Z. Like, for example, one thing that judges might 301 00:17:45,440 --> 00:17:48,880 Speaker 1: really benefit from being aware of is that research shows 302 00:17:49,480 --> 00:17:53,040 Speaker 1: that when you tell people to be rational and objective, 303 00:17:53,400 --> 00:17:56,680 Speaker 1: they do not become more rational and objective, they get 304 00:17:56,680 --> 00:18:00,000 Speaker 1: more entrenched. All right. So in this discussion panel, everybody 305 00:18:00,119 --> 00:18:06,280 Speaker 1: had some great commentary on on polarization regarding 306 00:18:06,320 --> 00:18:11,760 Speaker 1: scientific consensus. Um, Sir Paul Nurse was wonderful. Uh, Brian Greene, 307 00:18:12,600 --> 00:18:15,560 Speaker 1: the physicist, he just was really fired up for this 308 00:18:15,600 --> 00:18:18,480 Speaker 1: one and it was just, you know, a joy to watch 309 00:18:18,480 --> 00:18:21,680 Speaker 1: and listen to. But one of the things that Brian 310 00:18:21,760 --> 00:18:24,719 Speaker 1: Greene was saying in the panel was, you know, so 311 00:18:24,760 --> 00:18:28,239 Speaker 1: we've got this problem of polarization of public opinion on 312 00:18:28,280 --> 00:18:30,879 Speaker 1: scientific issues. There are some issues where the public, as 313 00:18:30,920 --> 00:18:33,000 Speaker 1: we talked about at the beginning of the episode, just 314 00:18:33,200 --> 00:18:37,200 Speaker 1: doesn't line up with what scientists are saying. Why is that? 315 00:18:37,240 --> 00:18:39,280 Speaker 1: And I think Brian Greene was coming at it from 316 00:18:39,320 --> 00:18:41,400 Speaker 1: the point of view like, if we could just make 317 00:18:41,440 --> 00:18:45,520 Speaker 1: them understand science better, if we could just teach, we 318 00:18:45,520 --> 00:18:49,600 Speaker 1: could educate people better in the scientific process and what 319 00:18:49,840 --> 00:18:54,200 Speaker 1: science is about, then they would agree with the scientific consensus. 320 00:18:54,800 --> 00:18:57,840 Speaker 1: And Kahan had a really interesting response to this. I 321 00:18:57,880 --> 00:19:01,040 Speaker 1: think his answer was, don't get ahead of yourself. 
322 00:19:01,080 --> 00:19:03,960 Speaker 1: That's not necessarily the case, right, because I mean, because 323 00:19:04,040 --> 00:19:07,720 Speaker 1: Brian Greene's question makes sense, right, because you think, well, 324 00:19:08,040 --> 00:19:09,639 Speaker 1: you want to say, don't you realize that 325 00:19:09,800 --> 00:19:13,320 Speaker 1: science is, to quote Sir Paul Nurse, tentative knowledge? 326 00:19:13,440 --> 00:19:17,439 Speaker 1: It's not a complete, prepackaged understanding of the universe. 327 00:19:17,480 --> 00:19:19,919 Speaker 1: It's a continued exploration. You want to say, don't you 328 00:19:20,000 --> 00:19:23,080 Speaker 1: understand that mistakes are part of it? This was definitely 329 00:19:23,119 --> 00:19:25,760 Speaker 1: Brian Greene's argument, that if you want to talk about 330 00:19:25,760 --> 00:19:29,440 Speaker 1: people who are skeptical of climate change, talk to the 331 00:19:29,480 --> 00:19:33,399 Speaker 1: climate change scientists who have studied it, because science is 332 00:19:33,400 --> 00:19:35,480 Speaker 1: about, you know, if you have a hypothesis and you're 333 00:19:35,480 --> 00:19:38,119 Speaker 1: studying it, you're skeptical about it every step of 334 00:19:38,119 --> 00:19:40,560 Speaker 1: the way. Yeah, if you're not being skeptical about the 335 00:19:40,560 --> 00:19:43,640 Speaker 1: theory you advocate, then you're not doing science right, right, 336 00:19:43,640 --> 00:19:46,160 Speaker 1: You're not being very good at your job. So all 337 00:19:46,200 --> 00:19:48,199 Speaker 1: of that, that entire argument makes sense, and it 338 00:19:48,240 --> 00:19:50,760 Speaker 1: does lead one to think, all right, it's just a 339 00:19:50,760 --> 00:19:55,320 Speaker 1: lack of scientific literacy or reasoning that's the problem here. 340 00:19:55,640 --> 00:19:58,560 Speaker 1: But yeah, Kahan is saying that you'll find plenty of 341 00:19:58,600 --> 00:20:02,520 Speaker 1: people resisting scientific consensus who are highly literate in science 342 00:20:02,520 --> 00:20:06,120 Speaker 1: and highly logical, and they just wind up applying their 343 00:20:06,119 --> 00:20:09,760 Speaker 1: cognitive resources to fit their beliefs and worldview. In fact, 344 00:20:09,760 --> 00:20:13,360 Speaker 1: it gets crazier than this, because it's not just that 345 00:20:13,440 --> 00:20:16,639 Speaker 1: people who are highly educated in science, who apparently, if 346 00:20:16,680 --> 00:20:19,679 Speaker 1: you test them, they understand how science works. It's not 347 00:20:19,760 --> 00:20:23,000 Speaker 1: just that they can disagree with the scientific consensus, but 348 00:20:23,480 --> 00:20:27,120 Speaker 1: that people who have more rational capacity and who have 349 00:20:27,280 --> 00:20:32,000 Speaker 1: greater cognitive resources, in fact, tend to apply those more 350 00:20:32,119 --> 00:20:37,520 Speaker 1: strongly toward entrenching themselves against the scientific consensus when they 351 00:20:37,520 --> 00:20:40,720 Speaker 1: disagree with it. It's like people who are better educated 352 00:20:40,760 --> 00:20:44,199 Speaker 1: in science are better at coming up with reasons to 353 00:20:44,400 --> 00:20:48,439 Speaker 1: explain why they don't agree with scientists or why they 354 00:20:48,480 --> 00:20:51,080 Speaker 1: don't agree. 
Not just don't agree with scientists, but don't 355 00:20:51,080 --> 00:20:54,639 Speaker 1: agree about even objective facts like the fact that the 356 00:20:54,640 --> 00:20:57,600 Speaker 1: majority of scientists do endorse the fact that the Earth 357 00:20:57,640 --> 00:21:00,680 Speaker 1: is warming. Yeah, and it is worth noting here that 358 00:21:00,720 --> 00:21:04,600 Speaker 1: when we're talking about these these dissenting individuals, like it's 359 00:21:04,600 --> 00:21:07,240 Speaker 1: generally it's not across the board. They're not they're not 360 00:21:07,359 --> 00:21:11,320 Speaker 1: dissenting on all scientific consensus. It's about a particular topic, 361 00:21:11,359 --> 00:21:15,760 Speaker 1: be that topic climate change, or be that topic, um, 362 00:21:15,760 --> 00:21:20,240 Speaker 1: you know, vaccines or genetically modified organisms. Yeah, and it's 363 00:21:20,240 --> 00:21:23,400 Speaker 1: it's also important, Kahan points out, that you also see 364 00:21:23,400 --> 00:21:25,560 Speaker 1: this this sort of thing outside of science. You see 365 00:21:25,640 --> 00:21:29,840 Speaker 1: you see the same process involved with, say, abortion or 366 00:21:29,880 --> 00:21:34,199 Speaker 1: military recruitment, any issue in which protest becomes a badge 367 00:21:34,200 --> 00:21:37,119 Speaker 1: of identity. And this plays specifically into one paper that 368 00:21:37,280 --> 00:21:41,600 Speaker 1: Kahan refers to, which is this paper, They Saw a Protest. Uh. 369 00:21:41,760 --> 00:21:45,320 Speaker 1: So this comes up in the conversation where you you 370 00:21:45,400 --> 00:21:49,480 Speaker 1: can show people video of a protest taking place, right, 371 00:21:49,760 --> 00:21:53,040 Speaker 1: and you ask them just objective facts about what they 372 00:21:53,040 --> 00:21:56,440 Speaker 1: see in the video. Did you see the protesters, uh, 373 00:21:56,560 --> 00:22:00,280 Speaker 1: screaming in someone's face, did you see the protesters blocking 374 00:22:00,400 --> 00:22:03,439 Speaker 1: someone's path? Did you see the protesters doing this and that? 375 00:22:03,560 --> 00:22:06,919 Speaker 1: All these sorts of negative behaviors that would cast the 376 00:22:06,960 --> 00:22:10,240 Speaker 1: protest in a bad light. And it turns out people 377 00:22:10,359 --> 00:22:13,760 Speaker 1: claim to see different things in the video depending on 378 00:22:13,920 --> 00:22:17,520 Speaker 1: whether they think the politics of the protesters line up 379 00:22:17,560 --> 00:22:20,879 Speaker 1: with their own. So if you show a person with 380 00:22:20,920 --> 00:22:24,240 Speaker 1: certain politics a video and you say that it's people 381 00:22:24,280 --> 00:22:28,120 Speaker 1: protesting outside an abortion clinic, they'll have a very different 382 00:22:28,320 --> 00:22:30,480 Speaker 1: report of what they see in the video than if 383 00:22:30,480 --> 00:22:33,960 Speaker 1: you tell them that it's protesters outside a military recruiting 384 00:22:34,000 --> 00:22:37,840 Speaker 1: center protesting don't ask, don't tell. And this is a 385 00:22:37,840 --> 00:22:41,080 Speaker 1: specific example of motivated reasoning. Maybe we can talk about 386 00:22:41,200 --> 00:22:44,080 Speaker 1: motivated reasoning more later on, but it's the fact that 387 00:22:44,119 --> 00:22:47,679 Speaker 1: we we just do not process the facts of reality 388 00:22:47,720 --> 00:22:51,600 Speaker 1: and the evidence of our senses with perfect objectivity. 
We 389 00:22:51,680 --> 00:22:55,040 Speaker 1: in fact process them in a highly goal oriented way, 390 00:22:55,080 --> 00:22:57,960 Speaker 1: and a lot of times that goal is I don't 391 00:22:57,960 --> 00:23:00,840 Speaker 1: want people like me or the people in my group 392 00:23:00,920 --> 00:23:04,200 Speaker 1: to look bad. All Right, we're gonna take a quick break, 393 00:23:04,200 --> 00:23:06,560 Speaker 1: and when we come back, we're gonna jump jump right 394 00:23:06,600 --> 00:23:17,359 Speaker 1: back into Kahan's research. All right, we're back. So 395 00:23:17,440 --> 00:23:20,920 Speaker 1: one of the things that people obviously do when they 396 00:23:20,920 --> 00:23:24,520 Speaker 1: are motivated to arrive at a certain conclusion is that 397 00:23:24,600 --> 00:23:28,520 Speaker 1: they cherry pick facts. Right, you can, you can 398 00:23:28,520 --> 00:23:31,800 Speaker 1: always find stuff that makes your worldview look better or 399 00:23:31,840 --> 00:23:35,040 Speaker 1: stuff that makes the other person's worldview look worse. And 400 00:23:35,200 --> 00:23:38,280 Speaker 1: it is, in fact, incredibly easy and causes almost no 401 00:23:38,480 --> 00:23:42,639 Speaker 1: cognitive dissonance whatsoever for people to just say, Okay, this 402 00:23:42,760 --> 00:23:46,000 Speaker 1: fact supports what I already believe. That's a good 403 00:23:46,000 --> 00:23:49,440 Speaker 1: fact, that's legit and real and should be included. And 404 00:23:49,680 --> 00:23:53,280 Speaker 1: a fact I encounter that doesn't support my point of view, well, 405 00:23:53,359 --> 00:23:55,440 Speaker 1: that's that's a bunch of bunk. You know, why would 406 00:23:55,440 --> 00:23:58,760 Speaker 1: anybody believe that? And it extends to experts as well, 407 00:23:59,080 --> 00:24:01,280 Speaker 1: so it's the same thing. It's like, here's this, uh, 408 00:24:01,359 --> 00:24:04,080 Speaker 1: this this expert, and I'm using expert in quotation marks 409 00:24:04,119 --> 00:24:07,439 Speaker 1: because to what degree they're an expert also depends on 410 00:24:07,480 --> 00:24:10,679 Speaker 1: your cherry picking. This individual, let's say this individual with 411 00:24:10,760 --> 00:24:14,280 Speaker 1: some sort of scientific background, uh, they're making a statement. 412 00:24:14,600 --> 00:24:16,919 Speaker 1: I will consider them more of an expert based on 413 00:24:17,040 --> 00:24:22,200 Speaker 1: how their opinion matches up with my preconceived beliefs and worldview. Yeah, 414 00:24:22,240 --> 00:24:24,680 Speaker 1: and this is another point Kahan makes, it comes 415 00:24:24,720 --> 00:24:28,280 Speaker 1: straight out of that. So he says, it's not that 416 00:24:28,520 --> 00:24:33,200 Speaker 1: people who don't, for example, accept the consensus on climate 417 00:24:33,320 --> 00:24:37,640 Speaker 1: change or on vaccine safety, don't believe in scientific expertise. 418 00:24:38,240 --> 00:24:43,000 Speaker 1: They do. They do believe in scientific expertise generally, statistically 419 00:24:43,040 --> 00:24:46,840 Speaker 1: they do, but they don't think that people who disagree 420 00:24:46,880 --> 00:24:51,640 Speaker 1: with them are legitimate experts. Now, Kahan wrote about this 421 00:24:51,680 --> 00:24:56,800 Speaker 1: in a very recent paper, like this month. Um, Misconceptions, Misinformation, 422 00:24:56,840 --> 00:25:01,160 Speaker 1: and the Logic of Identity Protective Cognition. 
When we're talking about 423 00:25:01,440 --> 00:25:04,399 Speaker 1: your views as as a badge of identity, that's what 424 00:25:04,440 --> 00:25:08,520 Speaker 1: we're getting to here. Um. This came out in June for 425 00:25:08,640 --> 00:25:13,360 Speaker 1: the Cultural Cognition Project. So in this paper, Kahan tackles 426 00:25:13,359 --> 00:25:17,360 Speaker 1: what he refers to as the public irrationality thesis, or PIT. 427 00:25:18,320 --> 00:25:20,159 Speaker 1: So this is something he's setting up to be in 428 00:25:20,200 --> 00:25:22,679 Speaker 1: opposition to. Right. This is the idea that we 429 00:25:22,720 --> 00:25:25,240 Speaker 1: touched on earlier, the idea that the general public largely 430 00:25:25,600 --> 00:25:29,840 Speaker 1: quote display only modest familiarity with fundamental scientific findings and 431 00:25:29,920 --> 00:25:33,800 Speaker 1: lack proficiency in the forms of critical reasoning essential to 432 00:25:33,880 --> 00:25:38,440 Speaker 1: science comprehension unquote and uh, and they're therefore easily swayed 433 00:25:38,480 --> 00:25:41,840 Speaker 1: by special interest groups who muddy the waters with non 434 00:25:41,920 --> 00:25:44,639 Speaker 1: scientific information. Yeah, I think this is a, is a 435 00:25:44,680 --> 00:25:49,320 Speaker 1: common thesis among people on both sides of a contentious issue 436 00:25:49,359 --> 00:25:52,560 Speaker 1: of fact in the public debate sphere. They just tend 437 00:25:52,600 --> 00:25:54,920 Speaker 1: to think that, well, people on the other side are 438 00:25:55,000 --> 00:25:59,240 Speaker 1: just ignorant, and they just they just don't understand and 439 00:25:59,280 --> 00:26:01,960 Speaker 1: they're just being swayed by propaganda. Yeah, you listen to 440 00:26:02,000 --> 00:26:05,120 Speaker 1: the wrong news channel, you listen to the wrong radical, 441 00:26:05,480 --> 00:26:08,600 Speaker 1: and now you have you have a faulty understanding of 442 00:26:08,600 --> 00:26:13,640 Speaker 1: the facts. So Kahan argues that PIT reflects a misconception 443 00:26:13,640 --> 00:26:18,040 Speaker 1: of science communication, like a basic misconception. Controversy over so 444 00:26:18,080 --> 00:26:23,560 Speaker 1: called decision relevant science is increasingly tied to identity protective cognition. 445 00:26:23,640 --> 00:26:27,480 Speaker 1: This is the quote tendency to selectively credit and discredit 446 00:26:27,760 --> 00:26:32,640 Speaker 1: evidence in patterns that reflect people's commitments to competing cultural groups. 447 00:26:33,000 --> 00:26:35,400 Speaker 1: And that's a concept, he says, that's rooted in the 448 00:26:35,440 --> 00:26:37,719 Speaker 1: two thousand to two thousand sixteen work of D. K. 449 00:26:37,920 --> 00:26:40,960 Speaker 1: Sherman and G. L. Cohen. Right, so maybe we should 450 00:26:40,960 --> 00:26:43,520 Speaker 1: try to go a little bit deeper into where this 451 00:26:43,640 --> 00:26:49,000 Speaker 1: idea of of identity protective cognition comes from. So obviously 452 00:26:49,160 --> 00:26:51,560 Speaker 1: there are a lot of ways to be wrong. Right. 453 00:26:51,920 --> 00:26:55,199 Speaker 1: You can be mistaken due to pure error, right, But 454 00:26:55,600 --> 00:26:58,600 Speaker 1: as we've already shown, you can also be mistaken for 455 00:26:58,640 --> 00:27:01,880 Speaker 1: a reason. 
Our our brains are not so made as 456 00:27:01,920 --> 00:27:05,480 Speaker 1: to perceive and judge the world objectively. Our reasoning 457 00:27:05,640 --> 00:27:10,200 Speaker 1: and perceptions are skewed by a desire, conscious or unconscious, 458 00:27:10,240 --> 00:27:14,320 Speaker 1: to reach particular conclusions. This is what we call motivated reasoning. 459 00:27:14,800 --> 00:27:17,760 Speaker 1: And Kahan, in an article he did for the Harvard 460 00:27:17,800 --> 00:27:20,919 Speaker 1: Law Review that he reproduced an excerpt from on 461 00:27:20,960 --> 00:27:25,280 Speaker 1: his blog, he said, quote, motivated reasoning refers to 462 00:27:25,359 --> 00:27:29,760 Speaker 1: the unconscious tendency of individuals to process information in a 463 00:27:29,840 --> 00:27:34,119 Speaker 1: manner that suits some end or goal extrinsic to the 464 00:27:34,160 --> 00:27:38,880 Speaker 1: formation of accurate beliefs. That unconscious part, it is very 465 00:27:38,880 --> 00:27:41,840 Speaker 1: critical, because, no, we're not arguing, the argument 466 00:27:41,880 --> 00:27:43,920 Speaker 1: here is not that someone is saying, well, I don't 467 00:27:44,040 --> 00:27:46,480 Speaker 1: I don't like this climate change. This is the expert 468 00:27:46,520 --> 00:07:48,760 Speaker 1: for me. This is it, or, vice versa, 469 00:27:48,840 --> 00:27:51,679 Speaker 1: someone saying, Oh, I don't really like the idea 470 00:27:51,680 --> 00:27:55,399 Speaker 1: of these GMO foods. I'm gonna listen to this expert 471 00:27:55,480 --> 00:27:59,200 Speaker 1: right here. This is taking place, uh, in the unconscious. Yeah. 472 00:27:59,240 --> 00:28:01,640 Speaker 1: You you don't even realize when it's going on. 473 00:28:02,280 --> 00:28:05,320 Speaker 1: And so there there is a classic, highly cited paper 474 00:28:05,320 --> 00:28:07,480 Speaker 1: in the history of psychology that he goes back to 475 00:28:07,480 --> 00:28:10,800 Speaker 1: to talk about early examples of motivated reasoning, and this 476 00:28:10,880 --> 00:28:12,760 Speaker 1: is a precedent, I guess, for his They Saw a 477 00:28:12,760 --> 00:28:15,760 Speaker 1: Protest paper. The original one was this paper called They 478 00:28:15,800 --> 00:28:19,199 Speaker 1: Saw a Game: A Case Study, and it goes to 479 00:28:19,240 --> 00:28:22,359 Speaker 1: stuff that has nothing to do with politics, absolutely nothing. 480 00:28:22,600 --> 00:28:25,040 Speaker 1: You can take the politics and you can take the 481 00:28:25,080 --> 00:28:28,640 Speaker 1: science completely out and you still get the exact same effects. 482 00:28:28,640 --> 00:28:31,400 Speaker 1: And what this is is there was a football game 483 00:28:31,760 --> 00:28:35,879 Speaker 1: between Dartmouth and Princeton in nineteen fifty one that had some highly 484 00:28:35,920 --> 00:28:40,000 Speaker 1: controversial behavior, leading to injuries for a few players. Players were 485 00:28:40,160 --> 00:28:44,920 Speaker 1: hurting each other in the game. Researchers in this study recruited 486 00:28:45,160 --> 00:28:49,080 Speaker 1: Dartmouth and Princeton students to review footage of what happened 487 00:28:49,400 --> 00:28:52,640 Speaker 1: and answer questions about what they saw, and it turns 488 00:28:52,680 --> 00:28:57,520 Speaker 1: out what they saw depended on their school allegiance. Dartmouth 489 00:28:57,560 --> 00:29:02,080 Speaker 1: students claimed to see things favorable to Dartmouth's reputation. 
Princeton 490 00:29:02,160 --> 00:29:06,000 Speaker 1: students claimed to see things favorable to Princeton's reputation. They 491 00:29:06,000 --> 00:29:09,520 Speaker 1: didn't just have different opinions about the game, they apparently 492 00:29:09,600 --> 00:29:13,840 Speaker 1: perceived a different reality based on institutional allegiance. They were 493 00:29:13,840 --> 00:29:18,280 Speaker 1: not reasoning impartially but in a motivated way, and lots 494 00:29:18,280 --> 00:29:20,840 Speaker 1: of studies over the years have reflected other versions of 495 00:29:20,880 --> 00:29:24,160 Speaker 1: these findings. It's totally clear when people have a goal, 496 00:29:24,560 --> 00:29:27,840 Speaker 1: when they consciously or unconsciously want things to be a 497 00:29:27,840 --> 00:29:32,080 Speaker 1: certain way, they're usually not capable of reasoning and perceiving 498 00:29:32,160 --> 00:29:36,440 Speaker 1: reality impartially. And to get back to the main example 499 00:29:36,480 --> 00:29:38,880 Speaker 1: of this that we we came in with, it's the 500 00:29:38,920 --> 00:29:43,760 Speaker 1: idea of identity protective cognition. We want to affirm our 501 00:29:43,800 --> 00:29:47,520 Speaker 1: membership in reference groups because we're social creatures, right, I mean, 502 00:29:48,560 --> 00:29:51,080 Speaker 1: one of one of the main things that's been hypothesized 503 00:29:51,120 --> 00:29:54,960 Speaker 1: that our brains evolved to do is to manage social relationships. 504 00:29:54,960 --> 00:29:58,360 Speaker 1: We were just talking about the social brain hypothesis in another episode. 505 00:29:59,040 --> 00:30:02,000 Speaker 1: Uh yeah, and one of the main things we appear 506 00:30:02,080 --> 00:30:05,720 Speaker 1: to be optimized for is group membership and group 507 00:30:05,800 --> 00:30:10,040 Speaker 1: solidarity and understanding group dynamics. Yeah, I mean, survival 508 00:30:10,080 --> 00:30:13,080 Speaker 1: almost has a different definition when you're talking about 509 00:30:13,080 --> 00:30:17,680 Speaker 1: an individual versus a larger, especially a global, culture. 510 00:30:18,040 --> 00:30:21,600 Speaker 1: We didn't evolve to save the planet from human 511 00:30:21,640 --> 00:30:25,680 Speaker 1: caused climate change or or meteorites. Uh. We evolved to 512 00:30:25,760 --> 00:30:31,280 Speaker 1: survive, um, social dynamics, to to adapt our thinking to 513 00:30:31,320 --> 00:30:33,920 Speaker 1: fit in with the group that has access to the fire, 514 00:30:33,960 --> 00:30:37,600 Speaker 1: that has access to the, uh, to to the food 515 00:30:37,960 --> 00:30:40,840 Speaker 1: and the shelter that is necessary for survival. Yeah. And 516 00:30:40,920 --> 00:30:44,960 Speaker 1: so we deeply, deeply want, we're highly motivated, to affirm 517 00:30:45,000 --> 00:30:48,960 Speaker 1: our membership in reference groups and the character and the 518 00:30:49,040 --> 00:30:52,560 Speaker 1: reputation of those groups. When those things are at stake, 519 00:30:52,680 --> 00:30:56,400 Speaker 1: we are highly motivated to defend them. So the idea 520 00:30:56,440 --> 00:31:00,320 Speaker 1: here is that culture comes before fact. Perceptions of what the facts 521 00:31:00,400 --> 00:31:03,760 Speaker 1: even are are shaped by values. 
So many of these 522 00:31:03,920 --> 00:31:08,080 Speaker 1: individual members of the public simply have a quote bigger 523 00:31:08,080 --> 00:31:11,880 Speaker 1: personal stake in fitting in with important affinity groups than 524 00:31:11,920 --> 00:31:15,280 Speaker 1: in forming correct perceptions of scientific evidence. Yet again, this is 525 00:31:15,280 --> 00:31:19,000 Speaker 1: not necessarily done consciously. In fact, it's almost never done consciously. 526 00:31:19,080 --> 00:31:23,160 Speaker 1: You don't think I'm sacrificing knowing the truth for fitting 527 00:31:23,200 --> 00:31:25,680 Speaker 1: in with my group. That's just what your brain does 528 00:31:25,880 --> 00:31:28,200 Speaker 1: and doesn't really let you in on the fact that 529 00:31:28,200 --> 00:31:31,120 Speaker 1: that's what it's doing. Yeah, and the members of the 530 00:31:31,120 --> 00:31:33,600 Speaker 1: public that are most polarized over a topic are the 531 00:31:33,600 --> 00:31:36,680 Speaker 1: ones that have the highest degree of scientific comprehension, which 532 00:31:36,680 --> 00:31:39,320 Speaker 1: is what we discussed earlier, the nature of the dissenting expert. 533 00:31:39,800 --> 00:31:43,600 Speaker 1: The problem, then, Kahan points out, is not a gullible public, 534 00:31:43,640 --> 00:31:49,719 Speaker 1: not this PIT scenario, but quote, a polluted science communication environment. Now, 535 00:31:50,200 --> 00:31:52,800 Speaker 1: he referred to a two thousand eleven study that that 536 00:31:52,880 --> 00:31:56,400 Speaker 1: he himself worked on with Jenkins-Smith and Braman, in 537 00:31:56,440 --> 00:32:00,480 Speaker 1: which a scientist's headshot and credentials were presented along with 538 00:32:00,560 --> 00:32:04,760 Speaker 1: attributed quotes about climate change, and whether this individual 539 00:32:04,880 --> 00:32:07,200 Speaker 1: was a true expert in the eyes of the subject 540 00:32:07,200 --> 00:32:11,400 Speaker 1: depended entirely on the particulars of their views. So people 541 00:32:11,480 --> 00:32:15,080 Speaker 1: are simply quote, using the consistency of new evidence with 542 00:32:15,200 --> 00:32:18,280 Speaker 1: their group's positions to determine whether the evidence should be 543 00:32:18,320 --> 00:32:21,760 Speaker 1: given any weight at all. And this is how deniers 544 00:32:22,280 --> 00:32:25,560 Speaker 1: of scientific consensus become stuck in their opinions. Right, So, 545 00:32:25,640 --> 00:32:29,560 Speaker 1: if somebody presents you an alternative opinion, the scientist comes 546 00:32:29,600 --> 00:32:32,560 Speaker 1: on TV or writes a book or something like that 547 00:32:33,080 --> 00:32:37,520 Speaker 1: and says, look, here's what the science says. It's pretty clear. 548 00:32:37,680 --> 00:32:40,520 Speaker 1: This is why scientists agree. This is where the consensus 549 00:32:40,520 --> 00:32:42,840 Speaker 1: comes from, and here's why the public should agree with 550 00:32:42,880 --> 00:32:45,560 Speaker 1: it too. If you are part of a group that 551 00:32:45,680 --> 00:32:50,480 Speaker 1: is culturally polarized against that scientific position, you don't think 552 00:32:50,560 --> 00:32:54,120 Speaker 1: I'm being anti science. You just think this person isn't 553 00:32:54,120 --> 00:32:57,160 Speaker 1: a real expert. Why should I trust what they say? 
Indeed, 554 00:32:57,200 --> 00:32:59,240 Speaker 1: and I also want to point out that Kahan 555 00:32:59,320 --> 00:33:03,440 Speaker 1: touches on disinformation. He says that disinformation doesn't seem 556 00:33:03,440 --> 00:33:05,880 Speaker 1: to have as much impact as you might think. And 557 00:33:05,920 --> 00:33:08,640 Speaker 1: bear in mind that there are several flavors of misinformation. 558 00:33:08,640 --> 00:33:13,320 Speaker 1: There's self misinformation, there's motivated consumption of misinformation, there's straight 559 00:33:13,400 --> 00:33:17,280 Speaker 1: up fake news. Uh. Kahan states that while such misinformation 560 00:33:17,560 --> 00:33:20,760 Speaker 1: certainly does have an impact on the world, the 561 00:33:20,840 --> 00:33:23,040 Speaker 1: reality is a little bit different. He says what 562 00:33:23,080 --> 00:33:27,240 Speaker 1: these individuals do with misinformation in most circumstances will not 563 00:33:27,440 --> 00:33:30,719 Speaker 1: differ from what they would have done without it. So 564 00:33:30,840 --> 00:33:34,400 Speaker 1: I find this whole scenario very, very illuminating. Uh, you know, 565 00:33:34,440 --> 00:33:36,760 Speaker 1: it's a helpful model not only in 566 00:33:36,920 --> 00:33:40,960 Speaker 1: understanding or trying to understand individuals who have a differing 567 00:33:41,000 --> 00:33:43,840 Speaker 1: opinion from your own on scientific consensus, but also to 568 00:33:43,840 --> 00:33:46,160 Speaker 1: self reflect and try and think, well, how 569 00:33:46,160 --> 00:33:48,640 Speaker 1: do I think about scientific consensus? Yeah. Well, one of 570 00:33:48,680 --> 00:33:50,480 Speaker 1: the things that you should really take away from this, 571 00:33:50,600 --> 00:33:53,960 Speaker 1: and we should emphasize this very strongly, is that this 572 00:33:54,000 --> 00:33:58,840 Speaker 1: applies to you too. It applies to me and to you. Um, 573 00:33:58,880 --> 00:34:02,920 Speaker 1: it's not so much surprising that motivated reasoning happens, or 574 00:34:02,960 --> 00:34:07,280 Speaker 1: that identity protective cognition happens, but it's surprising that it 575 00:34:07,320 --> 00:34:10,799 Speaker 1: applies to you because it doesn't feel like it does. Yeah. Yeah, 576 00:34:10,800 --> 00:34:13,200 Speaker 1: it just feels like, I'm being objective, I'm trying 577 00:34:13,239 --> 00:34:15,640 Speaker 1: to figure out what's true, it's those other people who 578 00:34:15,719 --> 00:34:19,279 Speaker 1: are reasoning from their cultural point of view. Right, Yeah, 579 00:34:19,320 --> 00:34:21,360 Speaker 1: I mean, that is how it feels. 580 00:34:21,360 --> 00:34:23,479 Speaker 1: I mean, that's one of the tricky parts 581 00:34:23,520 --> 00:34:25,759 Speaker 1: about this: you can't simply hold up the 582 00:34:25,800 --> 00:34:28,759 Speaker 1: mirror and say, look how 583 00:34:28,800 --> 00:34:31,080 Speaker 1: you're thinking. Look at the way you're processing 584 00:34:31,080 --> 00:34:34,680 Speaker 1: your information. Yeah. And so this actually leads to problems, 585 00:34:34,680 --> 00:34:37,880 Speaker 1: and Kahan writes about this
in that piece 586 00:34:37,880 --> 00:34:40,480 Speaker 1: in the Harvard Law Review. He points out how this 587 00:34:40,560 --> 00:34:44,879 Speaker 1: leads to really bad cultural situations, where you've got 588 00:34:44,920 --> 00:34:48,160 Speaker 1: your group and another group who are both motivated to 589 00:34:48,200 --> 00:34:51,400 Speaker 1: perceive facts differently for reasons having nothing to do with 590 00:34:51,440 --> 00:34:55,240 Speaker 1: forming accurate beliefs. You know, you're both using motivated reasoning. 591 00:34:55,800 --> 00:35:00,000 Speaker 1: Each group correctly perceives that the other group is using 592 00:35:00,160 --> 00:35:05,120 Speaker 1: motivated reasoning, but each group incorrectly believes that it is 593 00:35:05,200 --> 00:35:09,000 Speaker 1: just looking at the plain obvious objective truth. And of course, 594 00:35:09,000 --> 00:35:10,920 Speaker 1: when you feel like, well, I'm just looking at the 595 00:35:10,960 --> 00:35:15,480 Speaker 1: plain obvious objective truth and this other group is deluding themselves, 596 00:35:15,640 --> 00:35:19,560 Speaker 1: that can lead to feelings of disgust and polarization. You're like, 597 00:35:19,640 --> 00:35:24,319 Speaker 1: why won't they accept reality? Why are they being so dishonest? Yeah, 598 00:35:24,320 --> 00:35:28,000 Speaker 1: And in this way the divide deepens even more, right, Yeah, 599 00:35:28,120 --> 00:35:31,520 Speaker 1: And of course it leads to these partisanship situations. 600 00:35:31,560 --> 00:35:34,280 Speaker 1: And of course this makes the problem even worse because 601 00:35:34,320 --> 00:35:37,680 Speaker 1: once you get entrenched partisanship on an issue in the 602 00:35:37,680 --> 00:35:41,840 Speaker 1: public conversation, this provides even more incentive for the group to 603 00:35:41,960 --> 00:35:46,520 Speaker 1: align, right, and so it reinforces the motivated reasoning 604 00:35:46,560 --> 00:35:49,800 Speaker 1: that caused you to divide in the first place. Now, 605 00:35:49,920 --> 00:35:52,520 Speaker 1: there are some things that you might think you could 606 00:35:52,520 --> 00:35:55,279 Speaker 1: do to solve the problem. For one thing, you could say, hey, 607 00:35:55,840 --> 00:35:59,239 Speaker 1: what if we just tell people, no, don't think 608 00:35:59,239 --> 00:36:01,840 Speaker 1: with your culture, don't think with your identity, be 609 00:36:02,080 --> 00:36:05,839 Speaker 1: rational and be objective. Does that solve the problem? Well, 610 00:36:05,920 --> 00:36:10,560 Speaker 1: Kahan says, research says no. When people use motivated reasoning, 611 00:36:10,800 --> 00:36:14,399 Speaker 1: they tend to believe they're already being objective. They think, yes, 612 00:36:14,440 --> 00:36:16,759 Speaker 1: I am being objective. And this is due to what 613 00:36:16,800 --> 00:36:20,560 Speaker 1: he calls naive realism. This is just the belief that, well, 614 00:36:20,600 --> 00:36:22,960 Speaker 1: what I'm looking at is a clear and 615 00:36:23,040 --> 00:36:27,600 Speaker 1: accurate perception of reality. So we're all correctly perceiving 616 00:36:27,600 --> 00:36:31,200 Speaker 1: that other people are reasoning with motivation, but we're buying into 617 00:36:31,320 --> 00:36:34,160 Speaker 1: naive realism about our own points of view, saying, well, 618 00:36:34,200 --> 00:36:36,680 Speaker 1: I'm just looking at the facts.
And this leads to 619 00:36:36,760 --> 00:36:40,600 Speaker 1: that horrible state of affairs of cultural cognition, 620 00:36:40,680 --> 00:36:45,440 Speaker 1: where partisanship rules. Uh, these certain issues have been 621 00:36:45,480 --> 00:36:50,760 Speaker 1: infected with the toxic sludge of culture bleeding into questions 622 00:36:50,760 --> 00:36:53,040 Speaker 1: of fact. I always end up coming back to Dr 623 00:36:53,080 --> 00:36:55,719 Speaker 1: Seuss when thinking about these issues, and not only 624 00:36:55,760 --> 00:36:59,600 Speaker 1: the Sneetches, the star-bellied Sneetches, who are so caught 625 00:36:59,680 --> 00:37:02,279 Speaker 1: up in the identity of their groups that 626 00:37:02,320 --> 00:37:05,560 Speaker 1: they're only cured of it due to just catastrophe. And 627 00:37:05,600 --> 00:37:07,879 Speaker 1: then there's a shorter story in that same book where 628 00:37:07,880 --> 00:37:10,720 Speaker 1: we have the North-Going Zax and the South-Going Zax. 629 00:37:10,719 --> 00:37:13,319 Speaker 1: These two individuals that meet in the desert going in 630 00:37:13,360 --> 00:37:16,600 Speaker 1: a straight line, and neither one budges. They can't 631 00:37:16,600 --> 00:37:18,360 Speaker 1: move through each other, but neither one is going to 632 00:37:18,440 --> 00:37:21,600 Speaker 1: go around. Uh. And over time a 633 00:37:21,640 --> 00:37:25,200 Speaker 1: city is just built around them while they're frozen in 634 00:37:25,239 --> 00:37:30,319 Speaker 1: their unshakable inability to either compromise 635 00:37:30,440 --> 00:37:33,480 Speaker 1: or to understand each other. Yeah. Well, so this situation 636 00:37:33,560 --> 00:37:37,360 Speaker 1: can really induce feelings of despair. I mean, there are 637 00:37:37,440 --> 00:37:40,680 Speaker 1: multiple problems here, one of which is that some issues 638 00:37:40,719 --> 00:37:45,000 Speaker 1: are becoming infected with this motivated reasoning, this 639 00:37:45,120 --> 00:37:49,239 Speaker 1: cultural cognition. A toxin is how Kahan referred to it, 640 00:37:49,280 --> 00:37:52,800 Speaker 1: a pollutant. Yeah, it's a pollutant that just infects 641 00:37:52,800 --> 00:37:55,960 Speaker 1: certain issues and then makes it impossible to have a 642 00:37:56,000 --> 00:37:59,000 Speaker 1: clear discussion on them because you get people retrenched in 643 00:37:59,040 --> 00:38:02,840 Speaker 1: their positions who don't budge. But then the retrenchment leads 644 00:38:02,880 --> 00:38:06,200 Speaker 1: to the general worsening of the situation. It's 645 00:38:06,239 --> 00:38:09,280 Speaker 1: a self reinforcing cycle that just gets worse and worse. 646 00:38:09,560 --> 00:38:13,040 Speaker 1: It's like everybody's identity and their politics has all just 647 00:38:13,200 --> 00:38:17,120 Speaker 1: drained out into this body of water. How 648 00:38:17,120 --> 00:38:19,279 Speaker 1: do you unpollute that enough that you can have the 649 00:38:19,400 --> 00:38:23,359 Speaker 1: unpolluted discussion again? Now, maybe that's what we should turn 650 00:38:23,400 --> 00:38:26,160 Speaker 1: to next.
If you're hearing this and you're following 651 00:38:26,200 --> 00:38:28,560 Speaker 1: along with us, like, if you agree that these are 652 00:38:28,640 --> 00:38:31,200 Speaker 1: valid ways of examining what's going on in these 653 00:38:31,200 --> 00:38:34,879 Speaker 1: public conversations, uh, you might be feeling despair, right? 654 00:38:35,000 --> 00:38:36,319 Speaker 1: How do we ever get out of this? If we 655 00:38:36,400 --> 00:38:40,520 Speaker 1: all use motivated reasoning and there are these horrible situations 656 00:38:40,600 --> 00:38:44,560 Speaker 1: where issues of fact and scientific questions are just polluted 657 00:38:44,600 --> 00:38:47,879 Speaker 1: by cultural partisanship, how do we get out of it? 658 00:38:48,480 --> 00:38:50,359 Speaker 1: We'll take a quick break and when we come back, 659 00:38:50,640 --> 00:39:01,040 Speaker 1: we can discuss why it's not necessarily always time to despair. Alright, 660 00:39:01,040 --> 00:39:03,800 Speaker 1: we're back. Okay, So we were saying, it can feel 661 00:39:03,840 --> 00:39:06,000 Speaker 1: like it's time to despair once you look at the 662 00:39:06,040 --> 00:39:12,080 Speaker 1: situation of partisanship, partisan reasoning, cultural cognition. But it's not 663 00:39:12,160 --> 00:39:14,920 Speaker 1: necessarily time to despair. First of all, if you're just 664 00:39:14,960 --> 00:39:17,719 Speaker 1: thinking about motivated reasoning and you accept the fact that 665 00:39:17,800 --> 00:39:20,240 Speaker 1: you use it too, it's not just those other people, 666 00:39:20,680 --> 00:39:23,520 Speaker 1: it's me, it's you. We all use it. How can 667 00:39:23,600 --> 00:39:26,799 Speaker 1: we ever know anything is true? Well, I'd say two 668 00:39:26,840 --> 00:39:29,560 Speaker 1: things to that. First of all, not every question is 669 00:39:29,600 --> 00:39:33,239 Speaker 1: settled through motivated reasoning, right, There are plenty of questions 670 00:39:33,280 --> 00:39:36,920 Speaker 1: where we actually do have the primary motivation of just 671 00:39:36,960 --> 00:39:41,440 Speaker 1: getting an accurate answer. People do show identity based splitting 672 00:39:41,440 --> 00:39:44,680 Speaker 1: on whether climate change is dangerous, but they don't show 673 00:39:44,719 --> 00:39:47,759 Speaker 1: identity based splitting on issues like whether X rays are 674 00:39:47,800 --> 00:39:50,760 Speaker 1: harmful to the human body. If you poll people based 675 00:39:50,800 --> 00:39:54,360 Speaker 1: on their ideology and political affiliation, and all the other stuff 676 00:39:54,360 --> 00:39:57,640 Speaker 1: you'd be looking for there, you know, liberals and conservatives, 677 00:39:58,280 --> 00:40:00,440 Speaker 1: or these other groups that are often cited, like the 678 00:40:00,560 --> 00:40:06,239 Speaker 1: hierarchical individualist versus the egalitarian communitarian, these groups are in 679 00:40:06,280 --> 00:40:09,080 Speaker 1: agreement: X rays are equally harmful to the human body, 680 00:40:09,160 --> 00:40:13,000 Speaker 1: they say. Yeah, because, as Kahan points out again, these 681 00:40:13,080 --> 00:40:17,920 Speaker 1: instances of polarization over scientific consensus are 682 00:40:17,960 --> 00:40:21,200 Speaker 1: pathological in the sense that they're harmful, but they're also rare. Yeah.
683 00:40:21,239 --> 00:40:24,200 Speaker 1: So it's just these certain issues that we're reasoning 684 00:40:24,200 --> 00:40:27,239 Speaker 1: this way about. Not everything suffers from this problem. Many 685 00:40:27,280 --> 00:40:30,919 Speaker 1: issues are uncontroversial. We generally approach them with no real 686 00:40:31,000 --> 00:40:35,600 Speaker 1: motivation other than just knowing what's true. The problem is 687 00:40:35,719 --> 00:40:39,279 Speaker 1: that even though you're not always using motivated reasoning, you're 688 00:40:39,280 --> 00:40:42,839 Speaker 1: probably not going to know it when you are. Uh, 689 00:40:43,120 --> 00:40:48,680 Speaker 1: using motivated reasoning apparently feels similar to using actual objective reasoning. 690 00:40:49,440 --> 00:40:51,919 Speaker 1: You can know this firsthand by the fact 691 00:40:51,960 --> 00:40:54,880 Speaker 1: that you don't ever think you're using motivated reasoning. You 692 00:40:54,960 --> 00:40:57,680 Speaker 1: think you're just honestly judging things. But you also know 693 00:40:57,800 --> 00:41:01,279 Speaker 1: you're not right about everything. Some of those things 694 00:41:01,280 --> 00:41:04,720 Speaker 1: you believe, you're definitely wrong about, even though it feels 695 00:41:04,760 --> 00:41:08,600 Speaker 1: like you're just clearly judging what's true. So is there 696 00:41:08,640 --> 00:41:11,520 Speaker 1: any way to know what's true when issues are controversial 697 00:41:11,560 --> 00:41:14,839 Speaker 1: and when we're motivated to reason one way or another? Well, 698 00:41:14,880 --> 00:41:17,200 Speaker 1: I'd say this is when we come back to our 699 00:41:17,239 --> 00:41:22,120 Speaker 1: starting principle: going with science, right? Science is exactly a 700 00:41:22,160 --> 00:41:25,839 Speaker 1: way of getting around motivated reasoning and bias if you're 701 00:41:25,880 --> 00:41:28,040 Speaker 1: doing it right. I mean, of course it's possible to 702 00:41:28,120 --> 00:41:31,040 Speaker 1: be really bad at science, but if you're following the 703 00:41:31,040 --> 00:41:34,120 Speaker 1: norms of science, what it is supposed to do is 704 00:41:34,160 --> 00:41:37,280 Speaker 1: make it really hard to get away with motivated reasoning 705 00:41:37,320 --> 00:41:40,480 Speaker 1: for an extended period of time. You've got obstacles built 706 00:41:40,560 --> 00:41:45,160 Speaker 1: into science that are specifically designed to kill motivated reasoning. 707 00:41:45,680 --> 00:41:49,840 Speaker 1: So you've got rigorous empirical method using objective measurement criteria, 708 00:41:49,920 --> 00:41:52,760 Speaker 1: trying to take your own subjective judgments out of things. 709 00:41:53,440 --> 00:41:57,120 Speaker 1: You've got blinding and double blinding of experiments, where, you know, 710 00:41:57,160 --> 00:41:59,040 Speaker 1: you get people who don't even know what's going on 711 00:41:59,160 --> 00:42:02,560 Speaker 1: to perform the experiment, and people in the experiment don't necessarily 712 00:42:02,640 --> 00:42:06,200 Speaker 1: know what's going on. You've got peer review by critical experts, 713 00:42:06,200 --> 00:42:09,960 Speaker 1: you've got replication attempts, you've got professional competition.
This is 714 00:42:10,000 --> 00:42:12,480 Speaker 1: the thing that often doesn't get emphasized enough: 715 00:42:12,800 --> 00:42:17,280 Speaker 1: there's professional and career based incentive in science to disprove 716 00:42:17,360 --> 00:42:20,719 Speaker 1: the consensus. Right. Yeah, and again, like we said, 717 00:42:20,719 --> 00:42:25,040 Speaker 1: skepticism is built into the recipe, right, So if you're 718 00:42:25,040 --> 00:42:28,120 Speaker 1: doing science in a motivated way, your science number one 719 00:42:28,280 --> 00:42:30,839 Speaker 1: is not going to look very strong to begin with, 720 00:42:31,120 --> 00:42:33,279 Speaker 1: and number two, you're not going to get away with 721 00:42:33,320 --> 00:42:35,560 Speaker 1: it for very long. People are going to figure out 722 00:42:35,560 --> 00:42:37,560 Speaker 1: what you're up to. And we've seen examples of this 723 00:42:37,719 --> 00:42:40,960 Speaker 1: when people get caught doing scientific fraud. It seems to 724 00:42:41,040 --> 00:42:44,359 Speaker 1: be fairly rare, but they get caught. Maybe people can't 725 00:42:44,360 --> 00:42:48,480 Speaker 1: replicate your results, people start noticing irregularities in your data. 726 00:42:48,520 --> 00:42:50,839 Speaker 1: I mean, it's a system that is just not very 727 00:42:50,920 --> 00:42:53,640 Speaker 1: forgiving to this kind of nonsense. Yeah, I mean, 728 00:42:53,719 --> 00:42:56,000 Speaker 1: it depends in each case, like, I guess, whether it 729 00:42:56,040 --> 00:42:59,800 Speaker 1: falls under fraud or just bad science. But artificial gravity, 730 00:43:00,040 --> 00:43:02,960 Speaker 1: or gravity repelling technology, is one example where you 731 00:43:03,000 --> 00:43:05,160 Speaker 1: do see studies that have come out where someone claims 732 00:43:05,200 --> 00:43:08,279 Speaker 1: to have developed a means of achieving this. Yeah, but 733 00:43:08,440 --> 00:43:11,040 Speaker 1: they can't be replicated. It doesn't work. It 734 00:43:11,080 --> 00:43:13,759 Speaker 1: doesn't pass the tests that are built into 735 00:43:13,800 --> 00:43:16,880 Speaker 1: the scientific process, right, And so this is why science 736 00:43:16,920 --> 00:43:20,200 Speaker 1: is a good way of arriving at correct conclusions about 737 00:43:20,200 --> 00:43:21,880 Speaker 1: the world. I mean, if you go with 738 00:43:21,920 --> 00:43:25,400 Speaker 1: the scientific consensus, you might not be right every time, 739 00:43:25,400 --> 00:43:28,040 Speaker 1: but it's your best bet for being right the most 740 00:43:28,239 --> 00:43:32,040 Speaker 1: times of anything you'd go with, um. So 741 00:43:32,680 --> 00:43:35,160 Speaker 1: the problem is, of course, we can't all be scientists, 742 00:43:35,800 --> 00:43:39,319 Speaker 1: and even scientists themselves can't use all the tools of 743 00:43:39,360 --> 00:43:43,520 Speaker 1: science to solve every controversial question they encounter. Right, So, 744 00:43:43,719 --> 00:43:45,840 Speaker 1: even if you're a scientist, there's tons of stuff in 745 00:43:45,880 --> 00:43:48,719 Speaker 1: your life where you can't bring to bear all of 746 00:43:48,760 --> 00:43:53,960 Speaker 1: that machinery of skepticism and empiricism and impartiality, where you've 747 00:43:54,000 --> 00:43:56,359 Speaker 1: just got to work like everybody else.
You've got to 748 00:43:56,400 --> 00:43:59,879 Speaker 1: decide on some issue of public substance what you think 749 00:44:00,000 --> 00:44:03,920 Speaker 1: about it without having the most impartial method possible. So 750 00:44:03,960 --> 00:44:06,280 Speaker 1: the question there, I guess, is how can we avoid 751 00:44:06,400 --> 00:44:11,000 Speaker 1: deluding ourselves on issues where identity protective cognition comes into play, 752 00:44:11,040 --> 00:44:14,759 Speaker 1: where we can't use the scientific method. Yeah, Like, one 753 00:44:14,760 --> 00:44:17,440 Speaker 1: of the points Kahan brings up is like, how do 754 00:44:17,480 --> 00:44:20,520 Speaker 1: you avoid these scenarios in the future? 755 00:44:20,520 --> 00:44:22,000 Speaker 1: Because it's one thing to figure out how do we 756 00:44:22,200 --> 00:44:25,719 Speaker 1: unpollute this pool of scientific communication. But then how do 757 00:44:25,800 --> 00:44:27,800 Speaker 1: we avoid polluting the next one? How do 758 00:44:27,880 --> 00:44:30,880 Speaker 1: we avoid polluting pools that don't really exist yet as 759 00:44:30,880 --> 00:44:33,640 Speaker 1: a matter of, like, public consideration? Yeah. In fact, 760 00:44:33,680 --> 00:44:36,120 Speaker 1: he mentions in the panel that we should have, quote, 761 00:44:36,120 --> 00:44:40,480 Speaker 1: a science of science communication, um, meaning that science communicators 762 00:44:40,520 --> 00:44:43,319 Speaker 1: should have some experiments they can draw on that show 763 00:44:43,440 --> 00:44:46,840 Speaker 1: them how to predict when an issue, some seemingly 764 00:44:47,000 --> 00:44:52,000 Speaker 1: innocuous question of fact, will become politicized, where people suddenly 765 00:44:52,080 --> 00:44:54,480 Speaker 1: take cultural positions on it. I mean, there are a 766 00:44:54,520 --> 00:44:57,680 Speaker 1: lot of variables involved here. It depends on, you know, 767 00:44:57,760 --> 00:45:02,480 Speaker 1: who's relaying the information there, uh, what their identity is, 768 00:45:02,520 --> 00:45:06,080 Speaker 1: what ideals they're pushing on everybody, and how that 769 00:45:06,160 --> 00:45:08,920 Speaker 1: ends up polluting the message. It also depends on a 770 00:45:08,960 --> 00:45:12,120 Speaker 1: number of cultural factors. I mean there are certain polarizing 771 00:45:12,160 --> 00:45:14,880 Speaker 1: issues that are issues here in the United States that 772 00:45:14,920 --> 00:45:18,080 Speaker 1: are not so in Europe, such as climate change, 773 00:45:18,520 --> 00:45:21,759 Speaker 1: and then the reverse. You see stuff like genetically modified 774 00:45:22,160 --> 00:45:25,880 Speaker 1: organisms being more of a hot topic in, 775 00:45:26,320 --> 00:45:29,160 Speaker 1: say, England than it is in the States. Totally. Yeah, 776 00:45:29,160 --> 00:45:31,919 Speaker 1: in the UK, there's way more controversy over 777 00:45:32,040 --> 00:45:34,200 Speaker 1: GM crops than there is in the United States. Not 778 00:45:34,239 --> 00:45:37,040 Speaker 1: to say there's not some controversy here. And so yeah, 779 00:45:37,080 --> 00:45:39,400 Speaker 1: how do you predict it?
I mean, Kahan mentions the 780 00:45:39,400 --> 00:45:42,560 Speaker 1: possibility of, well, maybe you can run simulations, if there's 781 00:45:42,600 --> 00:45:45,520 Speaker 1: some sort of simulation system you could employ, which I 782 00:45:45,600 --> 00:45:47,800 Speaker 1: love because I instantly get this sort of Star Trek, 783 00:45:48,280 --> 00:45:52,319 Speaker 1: um, holodeck scenario where we're running simulations and 784 00:45:52,440 --> 00:45:56,759 Speaker 1: trying to catch these polarization points, these confusions, these 785 00:45:56,760 --> 00:45:59,880 Speaker 1: pollution points before they occur, and figuring out 786 00:46:00,040 --> 00:46:02,640 Speaker 1: how to communicate ahead of them. Yeah, and some 787 00:46:02,719 --> 00:46:05,520 Speaker 1: things are going to be more predictable than others. Like, 788 00:46:05,640 --> 00:46:10,360 Speaker 1: there are some facts of science that, if true, tend 789 00:46:10,400 --> 00:46:13,839 Speaker 1: to be unfriendly to the world view that certain people hold, 790 00:46:13,840 --> 00:46:17,239 Speaker 1: tend to be unfriendly to their values. A couple of 791 00:46:17,239 --> 00:46:20,520 Speaker 1: examples Kahan gives are that if you're generally more of 792 00:46:20,560 --> 00:46:25,319 Speaker 1: an individualist and against communitarian action, this 793 00:46:25,360 --> 00:46:28,760 Speaker 1: may make you inherently opposed to the idea of climate change, 794 00:46:28,800 --> 00:46:31,120 Speaker 1: because really the only way that you can do anything 795 00:46:31,120 --> 00:46:35,640 Speaker 1: about climate change is with organized communitarian action. Likewise, 796 00:46:35,719 --> 00:46:38,160 Speaker 1: if you are a person whose values are sort of 797 00:46:38,239 --> 00:46:42,480 Speaker 1: anti big business, that might predispose you to be against 798 00:46:42,480 --> 00:46:45,600 Speaker 1: GMOs because you see them as like a tool that's 799 00:46:45,600 --> 00:46:48,680 Speaker 1: being used by large agribusiness to get their 800 00:46:48,719 --> 00:46:51,880 Speaker 1: profits and to drive other people out of business, 801 00:46:51,880 --> 00:46:54,560 Speaker 1: and, you know, whip the environment into the shape 802 00:46:54,600 --> 00:46:57,640 Speaker 1: they want it. And I mean, it'd be worth pointing 803 00:46:57,640 --> 00:47:01,200 Speaker 1: out that, like, you could, for example, use GMOs if 804 00:47:01,200 --> 00:47:03,000 Speaker 1: you're a big business in a way that would be 805 00:47:03,520 --> 00:47:06,960 Speaker 1: very unethical, very damaging to the environment. I mean, the 806 00:47:07,200 --> 00:47:10,480 Speaker 1: whole thing about the scientific consensus on GMOs is that 807 00:47:10,520 --> 00:47:15,279 Speaker 1: there's nothing inherently dangerous about GMOs as a rule, But 808 00:47:15,520 --> 00:47:19,680 Speaker 1: any individual genetically modified organism could be dangerous, just as 809 00:47:19,719 --> 00:47:22,520 Speaker 1: any other organism could. Yeah, the process is not 810 00:47:22,640 --> 00:47:26,480 Speaker 1: the problem. The potential problem is in the product that's 811 00:47:26,520 --> 00:47:29,359 Speaker 1: created with it, which can be said of most processes. Yeah, 812 00:47:29,360 --> 00:47:32,080 Speaker 1: it can be said just as equally of products that 813 00:47:32,080 --> 00:47:35,680 Speaker 1: are created through traditional agriculture.
It's not the 814 00:47:35,760 --> 00:47:39,480 Speaker 1: gene in the lab that makes the problem. But on 815 00:47:39,520 --> 00:47:42,359 Speaker 1: the other hand, uh, Kahan points out, you know, 816 00:47:42,360 --> 00:47:44,759 Speaker 1: there can be other things that are not nearly as 817 00:47:44,840 --> 00:47:49,200 Speaker 1: determined by our core values. It's not necessarily that conservative 818 00:47:49,280 --> 00:47:52,920 Speaker 1: values or liberal values, or whatever other kind of dichotomy 819 00:47:52,960 --> 00:47:56,680 Speaker 1: you want to establish in the culture, determine how 820 00:47:56,719 --> 00:47:58,520 Speaker 1: your opinion comes out on them. Some things are much 821 00:47:58,560 --> 00:48:02,320 Speaker 1: more accidental. It can just be that some prominent figure 822 00:48:02,320 --> 00:48:04,600 Speaker 1: on one side of the political spectrum just sort of 823 00:48:04,640 --> 00:48:09,080 Speaker 1: declares for one side of a factual disagreement, and because 824 00:48:09,120 --> 00:48:12,520 Speaker 1: of group affiliation and identity, the groups just start lining 825 00:48:12,600 --> 00:48:16,240 Speaker 1: up accordingly, even though it's not determined by anything inherent 826 00:48:16,280 --> 00:48:20,000 Speaker 1: to their values. You know what I mean? Yeah, yeah, 827 00:48:20,000 --> 00:48:21,520 Speaker 1: that's a good point, because it's 828 00:48:21,560 --> 00:48:25,040 Speaker 1: easier to see a polarization effect coming together 829 00:48:25,239 --> 00:48:28,759 Speaker 1: when either the problem or the solution disagrees or 830 00:48:28,840 --> 00:48:31,680 Speaker 1: lines up with your worldview, such as, well, the solution 831 00:48:31,760 --> 00:48:33,600 Speaker 1: is for us all to come together as 832 00:48:33,640 --> 00:48:35,759 Speaker 1: a nation and have some sort of top down governmental 833 00:48:35,960 --> 00:48:38,239 Speaker 1: fix. That's going to disagree with some people's worldviews. 834 00:48:38,280 --> 00:48:40,600 Speaker 1: If the solution is we're all going to 835 00:48:40,680 --> 00:48:43,880 Speaker 1: eat, you know, plants that grow naturally in harmony with 836 00:48:44,239 --> 00:48:47,440 Speaker 1: Mother Earth, that's going to fit one worldview more than another. 837 00:48:48,400 --> 00:48:51,520 Speaker 1: But when it occurs outside of those parameters, yeah, 838 00:48:51,520 --> 00:48:54,640 Speaker 1: it becomes increasingly difficult to predict. It's like, what, is it a 839 00:48:54,640 --> 00:48:57,880 Speaker 1: slow news week? Is this just a topic 840 00:48:57,960 --> 00:49:01,880 Speaker 1: that happened to be out there during a particular politician's campaign, 841 00:49:01,960 --> 00:49:03,360 Speaker 1: and they just took it up and ran with it, 842 00:49:03,360 --> 00:49:05,560 Speaker 1: perhaps to distract from something else? I mean, I think 843 00:49:05,600 --> 00:49:08,600 Speaker 1: it looks very, very possible that there can be issues 844 00:49:08,680 --> 00:49:12,799 Speaker 1: where there is cultural cognition going on, where society divides 845 00:49:12,920 --> 00:49:16,360 Speaker 1: on a question of scientific fact along cultural lines in 846 00:49:16,360 --> 00:49:18,640 Speaker 1: a way that really just doesn't have very much to 847 00:49:18,680 --> 00:49:22,160 Speaker 1: do with ideology or values at all.
It's just that one 848 00:49:22,239 --> 00:49:25,400 Speaker 1: side picked one position arbitrarily, and then the other side, 849 00:49:25,719 --> 00:49:29,400 Speaker 1: because they know they always disagree, picks the other. 850 00:49:29,800 --> 00:49:33,400 Speaker 1: All right. So in terms of solutions here, or possible solutions, 851 00:49:33,440 --> 00:49:35,200 Speaker 1: I mean, on one hand, I feel like there 852 00:49:35,320 --> 00:49:37,600 Speaker 1: is validity in the idea that yes, we have to 853 00:49:37,640 --> 00:49:41,960 Speaker 1: continue to trust science as a process and trust scientists 854 00:49:41,960 --> 00:49:45,040 Speaker 1: that are speaking on behalf of it. Yeah. I mean, 855 00:49:45,120 --> 00:49:47,760 Speaker 1: if you're not yourself a scientist, 856 00:49:47,840 --> 00:49:50,040 Speaker 1: if you're not yourself an expert, and you don't have 857 00:49:50,120 --> 00:49:53,320 Speaker 1: a good reason based in expertise in the subject matter 858 00:49:53,360 --> 00:49:56,000 Speaker 1: for disagreeing with the consensus, I'd say it's usually your 859 00:49:56,040 --> 00:49:58,120 Speaker 1: best bet to go with what most of the people 860 00:49:58,120 --> 00:50:00,640 Speaker 1: who know what they're talking about are saying. Yeah. 861 00:50:00,640 --> 00:50:03,840 Speaker 1: And then beyond that, I agree with what Kahan is 862 00:50:03,840 --> 00:50:06,880 Speaker 1: saying, that we do need a science of science communication. 863 00:50:07,120 --> 00:50:11,239 Speaker 1: We need the ability to predict and maneuver around potential 864 00:50:11,640 --> 00:50:15,400 Speaker 1: pollution points in our communication of science. Yeah, 865 00:50:15,440 --> 00:50:21,279 Speaker 1: how can you preemptively defend contentious facts about reality from 866 00:50:21,320 --> 00:50:25,839 Speaker 1: becoming politicized or becoming subject to cultural cognition? That would 867 00:50:25,840 --> 00:50:29,000 Speaker 1: be a really good thing. One thing that Kahan offers 868 00:50:29,040 --> 00:50:31,879 Speaker 1: that I think is very interesting is that I get 869 00:50:31,880 --> 00:50:34,920 Speaker 1: the impression that he subscribes to what he calls 870 00:50:35,239 --> 00:50:38,960 Speaker 1: the law of social proof, meaning that if you want 871 00:50:39,000 --> 00:50:41,880 Speaker 1: to convince somebody to agree on a point of 872 00:50:41,920 --> 00:50:46,400 Speaker 1: fact that they are resistant to for cultural reasons, don't 873 00:50:46,440 --> 00:50:49,319 Speaker 1: try to keep giving them more evidence that you're right 874 00:50:49,360 --> 00:50:52,160 Speaker 1: and they're wrong, because that's just not what works on us. 875 00:50:52,200 --> 00:50:55,000 Speaker 1: I mean, that's what should work. On one hand, 876 00:50:55,280 --> 00:50:57,239 Speaker 1: we feel like we should do that because that would 877 00:50:57,239 --> 00:51:00,560 Speaker 1: be the logical thing to do, but psychologically that is 878 00:51:00,560 --> 00:51:03,480 Speaker 1: not what changes people's positions. What would probably be more 879 00:51:03,480 --> 00:51:07,879 Speaker 1: effective is if they simply see people who are culturally 880 00:51:08,040 --> 00:51:11,200 Speaker 1: like them and part of their in-group agreeing with 881 00:51:11,239 --> 00:51:14,640 Speaker 1: this fact.
But then that can also come back into 882 00:51:14,680 --> 00:51:17,839 Speaker 1: science communication, like, who are the science communicators, then, 883 00:51:18,160 --> 00:51:20,759 Speaker 1: that are reaching out to these groups that 884 00:51:20,920 --> 00:51:24,120 Speaker 1: have a certain amount of polarization present within them? Well, 885 00:51:24,120 --> 00:51:27,319 Speaker 1: it makes me think that if what science communicators 886 00:51:27,360 --> 00:51:29,680 Speaker 1: want to do is try to get everybody across different 887 00:51:29,680 --> 00:51:32,160 Speaker 1: cultural groups on the same page, one thing that should 888 00:51:32,200 --> 00:51:36,759 Speaker 1: really be encouraged is cultural diversity of science communication: 889 00:51:36,840 --> 00:51:40,760 Speaker 1: there should be people from all different cultural groups 890 00:51:40,840 --> 00:51:45,400 Speaker 1: within a society, all communicating, like, hey, here's what the 891 00:51:45,440 --> 00:51:48,440 Speaker 1: science says. So at least when it's a question of science, 892 00:51:48,760 --> 00:51:51,400 Speaker 1: you can be on the same page and not bring 893 00:51:51,400 --> 00:51:53,920 Speaker 1: in these cultural issues, because it's not just people from 894 00:51:53,960 --> 00:51:56,560 Speaker 1: that other cultural group telling you what the science says. 895 00:51:56,760 --> 00:51:59,680 Speaker 1: You're hearing about it from people like you. So there 896 00:51:59,800 --> 00:52:02,360 Speaker 1: you have it. Hopefully we gave you some new tools 897 00:52:02,440 --> 00:52:06,799 Speaker 1: to illuminate not only your understanding of others, but your own understanding, 898 00:52:06,840 --> 00:52:09,920 Speaker 1: and also to better understand how science communication is working, 899 00:52:09,920 --> 00:52:14,400 Speaker 1: how these blockages, these science communication breakdowns, are occurring, 900 00:52:14,400 --> 00:52:16,720 Speaker 1: and how we might even treat them. Yeah, I hope 901 00:52:16,880 --> 00:52:19,520 Speaker 1: today maybe we said something of substance that 902 00:52:19,560 --> 00:52:22,320 Speaker 1: will help people, uh, I don't know, bridge the partisan 903 00:52:22,440 --> 00:52:24,520 Speaker 1: divide and come to some agreement about the things we 904 00:52:24,560 --> 00:52:28,759 Speaker 1: should be able to agree on. Um. But yeah, it's tough, man, 905 00:52:29,680 --> 00:52:32,480 Speaker 1: the partisan divide. It's the thing I often think about 906 00:52:32,560 --> 00:52:35,800 Speaker 1: in how we communicate stuff like this, and it 907 00:52:35,920 --> 00:52:38,040 Speaker 1: can get you down at times, but we shouldn't despair. 908 00:52:38,120 --> 00:52:40,000 Speaker 1: We should try to find ways to get around it. 909 00:52:40,640 --> 00:52:43,440 Speaker 1: Come together, have one of those big happy, uh, 910 00:52:43,680 --> 00:52:47,160 Speaker 1: what were you talking about? Grow food together? Oh yes, yes, 911 00:52:47,360 --> 00:52:52,359 Speaker 1: uh, in a nice communal Kumbaya moment. Yeah, you might 912 00:52:52,400 --> 00:52:55,319 Speaker 1: have just alienated someone. Oh yeah, probably did. So hey, 913 00:52:55,400 --> 00:52:58,160 Speaker 1: did we alienate you? Did we 914 00:52:58,239 --> 00:53:00,839 Speaker 1: illuminate anything for you? We of course love to hear 915 00:53:00,880 --> 00:53:03,000 Speaker 1: from you.
Check out the podcast at Stuff to Blow 916 00:53:03,000 --> 00:53:05,160 Speaker 1: Your Mind dot com, where you'll find all the episodes, videos, 917 00:53:05,719 --> 00:53:08,200 Speaker 1: blog posts, and links out to our various social media 918 00:53:08,239 --> 00:53:11,160 Speaker 1: accounts, and, uh, certainly you can always contact us 919 00:53:11,160 --> 00:53:14,160 Speaker 1: there: Facebook, Twitter, Tumblr, Instagram, you name it. And if 920 00:53:14,160 --> 00:53:15,840 Speaker 1: you want to get in touch with us directly to 921 00:53:15,960 --> 00:53:18,040 Speaker 1: let us know feedback on this episode or any other, 922 00:53:18,160 --> 00:53:21,000 Speaker 1: or to suggest a future episode topic, you can email 923 00:53:21,080 --> 00:53:23,600 Speaker 1: us at blow the mind at how stuff works dot 924 00:53:23,640 --> 00:53:36,440 Speaker 1: com. For more on this and thousands of other topics, 925 00:53:36,680 --> 00:54:00,000 Speaker 1: visit how stuff works dot com