1 00:00:05,320 --> 00:00:09,440 Speaker 1: Why does changing your mind sometimes feel like losing a 2 00:00:09,520 --> 00:00:13,840 Speaker 1: part of yourself? If beliefs are supposed to track truth, 3 00:00:14,400 --> 00:00:18,640 Speaker 1: why do they so often track our tribe instead? Why 4 00:00:18,640 --> 00:00:22,919 Speaker 1: does a challenged belief feel like a personal attack? What 5 00:00:23,079 --> 00:00:27,160 Speaker 1: does your brain think it's defending when it defends an idea? 6 00:00:27,880 --> 00:00:31,720 Speaker 1: What if beliefs didn't evolve to be true, but to 7 00:00:31,760 --> 00:00:37,560 Speaker 1: be useful? What if your strongest convictions are solving social problems, 8 00:00:37,600 --> 00:00:41,760 Speaker 1: not intellectual ones? Today we'll speak with public intellectual and 9 00:00:41,880 --> 00:00:50,800 Speaker 1: neuroscientist Sam Harris about the topic of our beliefs. Welcome 10 00:00:50,800 --> 00:00:53,960 Speaker 1: to Inner Cosmos with me, David Eagleman. I'm a neuroscientist 11 00:00:54,040 --> 00:00:57,240 Speaker 1: and author at Stanford, and in these episodes we sail 12 00:00:57,360 --> 00:01:00,920 Speaker 1: deeply into our three-pound universe to understand how we 13 00:01:01,000 --> 00:01:04,800 Speaker 1: see the world and, importantly today, what we take to 14 00:01:04,920 --> 00:01:21,560 Speaker 1: be true about it. Your brain has a lot of 15 00:01:21,640 --> 00:01:25,920 Speaker 1: beliefs about what is true or false. So if I 16 00:01:25,959 --> 00:01:29,720 Speaker 1: ask you, is Paris the capital of France, your brain 17 00:01:29,840 --> 00:01:34,080 Speaker 1: immediately judges the truth value of that statement: is it 18 00:01:34,480 --> 00:01:37,480 Speaker 1: accurate or inaccurate? How about if I ask you, is 19 00:01:37,640 --> 00:01:41,600 Speaker 1: ten plus ten twenty-one? Your brain has to take 20 00:01:41,600 --> 00:01:44,720 Speaker 1: in the auditory information and compare it against what it 21 00:01:44,800 --> 00:01:48,520 Speaker 1: knows and make a decision. And most people would probably 22 00:01:48,600 --> 00:01:52,200 Speaker 1: agree with your judgments on these questions. But this can 23 00:01:52,240 --> 00:01:55,800 Speaker 1: of course get more complex when it involves other kinds 24 00:01:55,880 --> 00:02:00,720 Speaker 1: of beliefs, like: Mohammad was a holy prophet chosen 25 00:02:00,760 --> 00:02:04,480 Speaker 1: by Allah to deliver a message to humanity. Some people 26 00:02:04,520 --> 00:02:08,520 Speaker 1: think that's a true statement, others think that's false. Or 27 00:02:08,560 --> 00:02:14,120 Speaker 1: the statement: early abortion should be supported. Or: Trump is, 28 00:02:14,360 --> 00:02:18,440 Speaker 1: on balance, a good president. Or whatever. I'm not suggesting 29 00:02:18,480 --> 00:02:21,000 Speaker 1: how you should evaluate these sentences. I'm just pointing out 30 00:02:21,040 --> 00:02:24,360 Speaker 1: there are lots of sentences I can say that you 31 00:02:24,560 --> 00:02:29,239 Speaker 1: might find true or false. So every moment of your life, 32 00:02:29,280 --> 00:02:33,600 Speaker 1: your brain is making commitments. You're not necessarily announcing these 33 00:02:33,800 --> 00:02:37,160 Speaker 1: or thinking about them consciously. They live under the hood. 34 00:02:37,639 --> 00:02:42,120 Speaker 1: But these commitments shape what seems true, what feels threatening, 35 00:02:42,200 --> 00:02:46,520 Speaker 1: what deserves attention, and what can be safely ignored.
These 36 00:02:46,639 --> 00:02:50,400 Speaker 1: commitments we have are what we call beliefs, and beliefs 37 00:02:50,760 --> 00:02:56,600 Speaker 1: are how the brain turns information into action. In other words, 38 00:02:56,600 --> 00:03:01,360 Speaker 1: they determine whether an utterance becomes something you respond to 39 00:03:02,240 --> 00:03:06,080 Speaker 1: or something that fades into the background. From the outside, 40 00:03:06,680 --> 00:03:10,040 Speaker 1: beliefs look like opinions or positions; in other words, something 41 00:03:10,040 --> 00:03:13,720 Speaker 1: you might argue about or align yourself with. But from 42 00:03:13,760 --> 00:03:19,160 Speaker 1: the inside they feel fundamental, as in: this is true, 43 00:03:19,840 --> 00:03:23,960 Speaker 1: this is false. Collectively, your beliefs feel like the structure 44 00:03:24,080 --> 00:03:28,600 Speaker 1: of reality itself. They build the backdrop against which your 45 00:03:28,639 --> 00:03:32,880 Speaker 1: decisions make sense. Now, the puzzle that's always fascinated me 46 00:03:33,040 --> 00:03:36,640 Speaker 1: is that intelligent people who are well informed often reach 47 00:03:37,160 --> 00:03:41,640 Speaker 1: very different conclusions about their beliefs. So you have evidence 48 00:03:41,720 --> 00:03:45,520 Speaker 1: that feels decisive to one person and that feels incomplete 49 00:03:45,600 --> 00:03:49,360 Speaker 1: or unconvincing to another. And what that means is that 50 00:03:49,560 --> 00:03:54,400 Speaker 1: changing one's mind typically feels really hard, and when you 51 00:03:54,480 --> 00:03:58,200 Speaker 1: have a disagreement with someone, this can feel personal, like 52 00:03:58,280 --> 00:04:02,360 Speaker 1: it involves your identity. So when we look at these patterns, 53 00:04:02,400 --> 00:04:05,720 Speaker 1: we see that belief seems to be doing something more 54 00:04:06,200 --> 00:04:09,280 Speaker 1: than building accurate models of the world. That might be 55 00:04:09,480 --> 00:04:13,320 Speaker 1: part of why the brain builds true and false judgments, 56 00:04:13,560 --> 00:04:18,279 Speaker 1: but there's probably more involved. For one, beliefs help coordinate 57 00:04:18,320 --> 00:04:23,479 Speaker 1: your social life by supporting identity within your tribe and 58 00:04:23,640 --> 00:04:28,760 Speaker 1: establishing alignment between people. Also, they can manage uncertainty by 59 00:04:28,800 --> 00:04:32,479 Speaker 1: stabilizing emotion. When we look at it through these lenses, 60 00:04:32,600 --> 00:04:36,360 Speaker 1: belief is of course a biological process as much as 61 00:04:36,360 --> 00:04:41,120 Speaker 1: a philosophical one. Somewhere between encountering a claim and acting 62 00:04:41,160 --> 00:04:44,200 Speaker 1: on it, your brain has to perform a series of 63 00:04:44,240 --> 00:04:48,760 Speaker 1: operations that determine whether a statement should be reacted to, 64 00:04:49,480 --> 00:04:51,960 Speaker 1: or maybe whether it should be questioned, or whether it 65 00:04:51,960 --> 00:04:56,320 Speaker 1: should be dismissed entirely. Whatever your brain decides on the 66 00:04:56,480 --> 00:05:01,240 Speaker 1: truth value of the statement, that shapes what you do next. 67 00:05:01,279 --> 00:05:04,360 Speaker 1: And all of this tends to run below the radar 68 00:05:04,520 --> 00:05:09,000 Speaker 1: of conscious awareness.
Now, understanding what the brain is doing 69 00:05:09,040 --> 00:05:13,479 Speaker 1: with beliefs is massively important because what you believe to 70 00:05:13,600 --> 00:05:18,560 Speaker 1: be true and not true determines everything about cooperation or conflict. 71 00:05:19,000 --> 00:05:21,760 Speaker 1: And so this sits right at the center of all 72 00:05:21,839 --> 00:05:26,800 Speaker 1: that's happening in politics, in moral judgments, in social trust. 73 00:05:27,400 --> 00:05:29,520 Speaker 1: So I focus a lot of my episodes on these 74 00:05:29,600 --> 00:05:34,360 Speaker 1: issues about polarization and in-groups and out-groups and shibboleths. 75 00:05:34,800 --> 00:05:38,080 Speaker 1: But today I want to move upstream a level and 76 00:05:38,120 --> 00:05:42,119 Speaker 1: focus on the machinery beneath them. How does the brain 77 00:05:42,200 --> 00:05:47,120 Speaker 1: decide what to accept as true? How do emotion and 78 00:05:47,279 --> 00:05:52,800 Speaker 1: identity and social context shape what feels credible? And there's 79 00:05:52,800 --> 00:05:56,120 Speaker 1: no one better to explore these questions with than Sam Harris, 80 00:05:56,120 --> 00:06:00,120 Speaker 1: whose work has examined belief from inside the brain and 81 00:06:01,120 --> 00:06:05,960 Speaker 1: within public discourse. Sam is a public intellectual, a philosopher 82 00:06:06,040 --> 00:06:10,120 Speaker 1: and author, and also a neuroscientist. He's well known for 83 00:06:10,160 --> 00:06:15,360 Speaker 1: his writings on rationality, religion, ethics, free will, and meditation. 84 00:06:15,880 --> 00:06:19,760 Speaker 1: He has long researched how science can inform human values, 85 00:06:19,839 --> 00:06:23,480 Speaker 1: with books like The End of Faith and The Moral Landscape. 86 00:06:23,800 --> 00:06:27,760 Speaker 1: He hosts the Making Sense podcast, which focuses on consciousness 87 00:06:27,800 --> 00:06:30,280 Speaker 1: and society and current events. So today we're going to 88 00:06:30,360 --> 00:06:34,200 Speaker 1: ask: what are beliefs really about? Here's my conversation with 89 00:06:34,240 --> 00:06:40,479 Speaker 1: Sam Harris. So, Sam, I want to talk with you 90 00:06:40,480 --> 00:06:44,920 Speaker 1: about belief. Many years ago you did neuroimaging studies on belief, 91 00:06:45,000 --> 00:06:48,640 Speaker 1: so let's start there. Tell us how you think about 92 00:06:48,760 --> 00:06:50,640 Speaker 1: belief and how you measure that in the brain. 93 00:06:50,920 --> 00:06:55,040 Speaker 2: Believing things to be true or false versus being uncertain: 94 00:06:56,200 --> 00:06:59,719 Speaker 2: this is something that our brains do readily, right? This 95 00:06:59,760 --> 00:07:02,520 Speaker 2: is largely what distinguishes us, certainly when we're 96 00:07:02,520 --> 00:07:05,760 Speaker 2: talking about complex representations of the world; this distinguishes us 97 00:07:05,800 --> 00:07:09,640 Speaker 2: from our closest primate cousins. I wouldn't say that 98 00:07:09,680 --> 00:07:12,960 Speaker 2: other mammals, and certainly the other primates, don't 99 00:07:12,960 --> 00:07:16,640 Speaker 2: have beliefs, because clearly they represent states of the world cognitively, 100 00:07:16,680 --> 00:07:19,360 Speaker 2: and they have expectations on that basis. 101 00:07:19,160 --> 00:07:20,600 Speaker 1: Like there'll be grubs under this rock.
102 00:07:20,720 --> 00:07:23,480 Speaker 2: Yeah, so they've got, you know, goal-seeking behavior, and 103 00:07:23,480 --> 00:07:25,960 Speaker 2: those goals are framed by some expectation of the 104 00:07:26,000 --> 00:07:29,000 Speaker 2: way the world is, and they can be surprised by 105 00:07:29,000 --> 00:07:32,320 Speaker 2: different outcomes. But we're the only species, that we 106 00:07:32,360 --> 00:07:36,000 Speaker 2: know of, that trades in these linguistic representations of the world, 107 00:07:36,600 --> 00:07:41,640 Speaker 2: which can be completely divorced from any sensory entanglement with 108 00:07:41,680 --> 00:07:43,720 Speaker 2: the state of the world that is being described. I mean, 109 00:07:43,760 --> 00:07:46,000 Speaker 2: you get a cell phone call and there's a voice 110 00:07:46,000 --> 00:07:48,160 Speaker 2: on the other end of the line and it says, 111 00:07:48,720 --> 00:07:51,000 Speaker 2: you know, we've got your daughter and you've got to 112 00:07:51,000 --> 00:07:53,280 Speaker 2: come up with a million dollars to get her back. Now, 113 00:07:53,560 --> 00:07:56,640 Speaker 2: if believed, all of a sudden those, you know, small 114 00:07:56,680 --> 00:08:00,440 Speaker 2: mouth noises that came through the phone completely 115 00:08:00,480 --> 00:08:03,480 Speaker 2: commandeer your physiology. Your whole world is 116 00:08:03,800 --> 00:08:05,960 Speaker 2: all of a sudden about that, you know, kind of 117 00:08:05,960 --> 00:08:09,160 Speaker 2: bent into the shape of the implications of that 118 00:08:09,360 --> 00:08:11,680 Speaker 2: utterance. Whereas if it's the wrong number, if 119 00:08:11,680 --> 00:08:13,160 Speaker 2: you don't have a daughter and don't know what 120 00:08:13,160 --> 00:08:15,160 Speaker 2: they're talking about, it just bounces off; it 121 00:08:15,160 --> 00:08:18,160 Speaker 2: doesn't become behaviorally or emotionally actionable. So 122 00:08:18,800 --> 00:08:25,760 Speaker 2: it was this difference between this fairly gossamer representation of 123 00:08:25,800 --> 00:08:28,600 Speaker 2: the world, which is just a piece of 124 00:08:28,680 --> 00:08:31,320 Speaker 2: language in the mind. It could be your own thought, 125 00:08:31,360 --> 00:08:33,160 Speaker 2: it could be the thought of others communicated to you, 126 00:08:33,240 --> 00:08:34,920 Speaker 2: it could be something you read on, you know, 127 00:08:34,960 --> 00:08:37,439 Speaker 2: the front page of a newspaper. It was 128 00:08:37,480 --> 00:08:40,880 Speaker 2: a transition from the mere decoding of 129 00:08:40,960 --> 00:08:44,760 Speaker 2: a piece of language to suddenly finding it not only 130 00:08:44,760 --> 00:08:48,400 Speaker 2: behaviorally actionable, but, you know, unthinkable not to 131 00:08:48,440 --> 00:08:51,839 Speaker 2: be motivated by those, you know, phonemes. So 132 00:08:51,840 --> 00:08:54,160 Speaker 2: there are clearly, you know, steps in the chain where you 133 00:08:54,160 --> 00:08:57,400 Speaker 2: parse language, or parse a statement, and 134 00:08:57,600 --> 00:09:03,240 Speaker 2: understand its meaning, and then there's this secondary operation of 135 00:09:03,760 --> 00:09:06,720 Speaker 2: judging it to be true or false, or 136 00:09:06,760 --> 00:09:09,200 Speaker 2: just deciding that you can't judge it one way or the other.
137 00:09:10,240 --> 00:09:14,360 Speaker 2: And it seemed to me that on some level, this 138 00:09:14,440 --> 00:09:16,200 Speaker 2: is the most important thing in the world. I mean, 139 00:09:16,200 --> 00:09:18,280 Speaker 2: if you're talking about the power of ideas, you first 140 00:09:18,360 --> 00:09:23,160 Speaker 2: have to stand upstream of that claim to talk about 141 00:09:23,720 --> 00:09:27,720 Speaker 2: the difference between believing or disbelieving, accepting or rejecting certain 142 00:09:27,760 --> 00:09:31,000 Speaker 2: ideas. And ideas are obviously everything in 143 00:09:31,280 --> 00:09:35,080 Speaker 2: our lives. I mean, it's how we're making a civilization, 144 00:09:35,200 --> 00:09:39,760 Speaker 2: it's how we're failing to secure a civilization. It's how 145 00:09:39,760 --> 00:09:42,760 Speaker 2: we're needlessly producing harm for ourselves and others. I mean, 146 00:09:42,800 --> 00:09:46,480 Speaker 2: belief, claims to knowledge, you 147 00:09:46,520 --> 00:09:49,160 Speaker 2: know, true or otherwise, are everything. 148 00:09:49,240 --> 00:09:49,320 Speaker 3: Right? 149 00:09:49,360 --> 00:09:51,920 Speaker 2: And so the question in my mind is: what was 150 00:09:51,960 --> 00:09:56,040 Speaker 2: the brain doing to find something behaviorally actionable, or to 151 00:09:56,080 --> 00:09:59,559 Speaker 2: find it not, based on accepting or rejecting a claim 152 00:09:59,600 --> 00:10:00,720 Speaker 2: to knowledge about the world? 153 00:10:01,040 --> 00:10:03,200 Speaker 1: And so that's binary. But you also looked 154 00:10:03,000 --> 00:10:05,520 Speaker 3: at uncertainty as a third condition. 155 00:10:05,720 --> 00:10:08,840 Speaker 1: But just so I'm clear, was uncertainty the "I just 156 00:10:08,840 --> 00:10:09,880 Speaker 1: don't know one way or the other"? 157 00:10:10,160 --> 00:10:13,240 Speaker 2: Yeah, okay. And you clearly, I mean, you clearly 158 00:10:13,240 --> 00:10:16,400 Speaker 2: couldn't know. So a question that would 159 00:10:16,400 --> 00:10:20,560 Speaker 2: provoke that would be to give someone some gigantic number 160 00:10:20,600 --> 00:10:23,319 Speaker 2: and say that this number is prime. Right? It 161 00:10:23,360 --> 00:10:25,160 Speaker 2: ends in a one, so it might be prime, 162 00:10:25,240 --> 00:10:28,920 Speaker 2: but obviously I don't know. Or, you know, 163 00:10:28,960 --> 00:10:31,400 Speaker 2: you have an even number of hairs on your body, right? 164 00:10:31,960 --> 00:10:34,440 Speaker 2: You know, presumably you don't know, and so you 165 00:10:34,440 --> 00:10:36,240 Speaker 2: can come up with lists of statements that 166 00:10:36,280 --> 00:10:40,559 Speaker 2: are just as easy to decode as any other statements, 167 00:10:41,160 --> 00:10:43,240 Speaker 2: but what you come up against, you know, once you 168 00:10:43,400 --> 00:10:45,760 Speaker 2: understand them, is that there's really no way for you 169 00:10:45,840 --> 00:10:47,280 Speaker 2: to know what's true. 170 00:10:47,480 --> 00:10:50,480 Speaker 1: Got it. So you've got "I believe this to 171 00:10:50,520 --> 00:10:52,400 Speaker 1: be true" for some statement. Give me an example of what 172 00:10:52,400 --> 00:10:52,679 Speaker 1: you used.
173 00:10:52,800 --> 00:10:59,280 Speaker 2: My hypothesis was that belief, disbelief, and uncertainty were content- 174 00:10:59,360 --> 00:11:02,200 Speaker 2: independent operations of the mind, and therefore of the brain. 175 00:11:02,280 --> 00:11:03,800 Speaker 2: So if I gave you a statement like two 176 00:11:03,800 --> 00:11:06,840 Speaker 2: plus two equals four versus two plus two equals five, 177 00:11:08,600 --> 00:11:13,960 Speaker 2: those are parsed and analyzed very similarly. Right, 178 00:11:14,000 --> 00:11:16,960 Speaker 2: however you understand arithmetic at the level 179 00:11:16,960 --> 00:11:19,760 Speaker 2: of the brain, you're doing that, but you're coming to 180 00:11:19,800 --> 00:11:24,480 Speaker 2: the opposite conclusion based on those two examples. But the 181 00:11:24,559 --> 00:11:28,880 Speaker 2: hypothesis was that if we interrogated many different streams of 182 00:11:29,280 --> 00:11:37,079 Speaker 2: knowledge and belief formation, so mathematical, logical, autobiographical, religious, political, say, 183 00:11:36,960 --> 00:11:41,080 Speaker 2: the ethical, the cognitive and neurophysiological operations 184 00:11:41,120 --> 00:11:44,679 Speaker 2: required to parse and judge the truth value of 185 00:11:44,679 --> 00:11:46,400 Speaker 2: any of those statements had to be very, very different. 186 00:11:46,400 --> 00:11:48,760 Speaker 2: I mean, what you're doing for arithmetic is not the 187 00:11:48,760 --> 00:11:52,120 Speaker 2: same as what you're doing for religion, presumably, or ethics, 188 00:11:52,720 --> 00:11:57,079 Speaker 2: or, you know, episodic memories, say. But there 189 00:11:57,160 --> 00:11:59,800 Speaker 2: had to be this final bottleneck 190 00:12:00,000 --> 00:12:05,439 Speaker 2: which reported acceptance or rejection of the statement, or uncertainty. 191 00:12:05,559 --> 00:12:07,560 Speaker 1: Great. And so you put people in the scanner, and 192 00:12:07,600 --> 00:12:08,240 Speaker 1: what did you find? 193 00:12:09,120 --> 00:12:12,360 Speaker 2: Well, we found that accepting a statement to be true 194 00:12:13,160 --> 00:12:17,880 Speaker 2: versus rejecting it as false did support a kind of 195 00:12:17,880 --> 00:12:22,679 Speaker 2: content-agnostic, you know, subsequent operation. One thing that was interesting, though, 196 00:12:22,679 --> 00:12:26,960 Speaker 2: is that, and this actually goes back many, many centuries 197 00:12:27,080 --> 00:12:32,800 Speaker 2: to the work of Spinoza. Spinoza hypothesized that merely understanding 198 00:12:32,960 --> 00:12:36,560 Speaker 2: a statement was akin to accepting it. It's like, 199 00:12:36,600 --> 00:12:38,440 Speaker 2: if I give you 200 00:12:38,480 --> 00:12:41,200 Speaker 2: just any utterance, you know, any 201 00:12:41,200 --> 00:12:45,600 Speaker 2: claim about the world that you can parse, merely kind 202 00:12:45,600 --> 00:12:48,080 Speaker 2: of getting to the end of the sentence and understanding 203 00:12:48,120 --> 00:12:51,040 Speaker 2: it puts you in a mode of kind of 204 00:12:51,200 --> 00:12:54,920 Speaker 2: tacitly accepting it to be true, and rejecting it as 205 00:12:54,960 --> 00:12:57,840 Speaker 2: false was a subsequent operation.
And one thing you would 206 00:12:57,840 --> 00:13:00,760 Speaker 2: expect, if that were in fact true, is that behaviorally, 207 00:13:01,559 --> 00:13:04,520 Speaker 2: people would be slower to judge 208 00:13:04,559 --> 00:13:08,720 Speaker 2: false statements of equal complexity than true statements. And 209 00:13:08,760 --> 00:13:10,520 Speaker 2: we found that. So you give people a statement like 210 00:13:10,520 --> 00:13:13,839 Speaker 2: two plus two equals five, and they're slower to say no, 211 00:13:13,960 --> 00:13:17,440 Speaker 2: that's false, than to say two plus two equals four is true. 212 00:13:17,480 --> 00:13:19,880 Speaker 2: I mean, that was a fairly 213 00:13:19,920 --> 00:13:24,240 Speaker 2: significant behavioral difference. But in any case... 214 00:13:25,200 --> 00:13:27,319 Speaker 1: Well, let me dive into that for a second. 215 00:13:27,679 --> 00:13:32,760 Speaker 1: Does that mean that hearing other perspectives, being exposed to 216 00:13:32,800 --> 00:13:35,800 Speaker 1: other perspectives which you think, well, that's false, but at 217 00:13:35,880 --> 00:13:39,640 Speaker 1: least you've spent a moment standing in that person's truth? 218 00:13:40,280 --> 00:13:43,360 Speaker 1: So, for example, with political debates, you know, 219 00:13:43,360 --> 00:13:44,560 Speaker 1: one of the things that I think is the most 220 00:13:44,600 --> 00:13:47,880 Speaker 1: important for students is to, you know, get a three- 221 00:13:47,960 --> 00:13:51,200 Speaker 1: sixty view of the arguments, to debate from both sides, 222 00:13:51,280 --> 00:13:54,160 Speaker 1: and be ready to do that. When we were in school, 223 00:13:54,200 --> 00:13:57,520 Speaker 1: I think, I don't know if this is retrospective romanticization, 224 00:13:57,600 --> 00:13:59,760 Speaker 1: but I believe that was more common. You 225 00:13:59,760 --> 00:14:02,440 Speaker 1: would come into a debate ready to do either side, 226 00:14:03,120 --> 00:14:06,280 Speaker 1: and therefore you knew the issues and the nuances 227 00:14:06,320 --> 00:14:10,360 Speaker 1: slightly better than, let's say, a kid who's in college 228 00:14:10,360 --> 00:14:12,200 Speaker 1: who thinks, sorry, this is the truth and I'm not 229 00:14:12,280 --> 00:14:15,000 Speaker 1: going to entertain another hypothesis. Just based on what you said, 230 00:14:15,040 --> 00:14:18,280 Speaker 1: it seems to me interesting that if you're told another perspective, 231 00:14:18,320 --> 00:14:20,240 Speaker 1: you at least stand in that truth for a moment 232 00:14:21,200 --> 00:14:23,000 Speaker 1: before rejecting it, which might be useful. 233 00:14:23,240 --> 00:14:28,760 Speaker 2: Yeah, although it does actually connect with other non-normative biases, 234 00:14:28,800 --> 00:14:32,880 Speaker 2: where there's something called... Obviously, we have a replication crisis 235 00:14:32,920 --> 00:14:36,800 Speaker 2: in much of psychological science, so I don't know 236 00:14:36,800 --> 00:14:39,440 Speaker 2: if this work has been replicated. But at the time, 237 00:14:39,480 --> 00:14:42,520 Speaker 2: there was something called the illusory truth effect, where 238 00:14:43,480 --> 00:14:46,760 Speaker 2: merely being told something... 239 00:14:46,640 --> 00:14:51,080 Speaker 1: Several times, so you're more likely to believe it. 240 00:14:51,000 --> 00:14:52,480 Speaker 3: Even in the act of debunking it.
241 00:14:52,520 --> 00:14:54,240 Speaker 2: So if I tell you that a certain thing was 242 00:14:54,520 --> 00:14:56,760 Speaker 2: thought to be true, but in fact it's not true, 243 00:14:57,360 --> 00:15:00,840 Speaker 2: there's a kind of memory distortion bias where, you know, 244 00:15:01,080 --> 00:15:02,920 Speaker 2: a year from now, if I ask you about that thing, 245 00:15:03,600 --> 00:15:05,960 Speaker 2: there's this reliable bias in the direction of, oh yeah, 246 00:15:05,960 --> 00:15:08,000 Speaker 2: that was true, right? You sort of lose 247 00:15:08,040 --> 00:15:11,160 Speaker 2: the debunking, and this is why misinformation can be so 248 00:15:12,160 --> 00:15:15,200 Speaker 2: pernicious and difficult to debunk. And again, I bracket this 249 00:15:15,520 --> 00:15:18,360 Speaker 2: with not knowing whether or not I am now spreading misinformation, 250 00:15:18,560 --> 00:15:21,000 Speaker 2: because I don't know if this has washed out. 251 00:15:21,040 --> 00:15:24,160 Speaker 1: I've certainly heard the illusory truth effect many times, so 252 00:15:24,200 --> 00:15:26,960 Speaker 1: now it's stuck in me as the truth. Fine. 253 00:15:27,120 --> 00:15:31,160 Speaker 1: So okay, coming back to this then. So Spinoza said, 254 00:15:31,920 --> 00:15:33,880 Speaker 1: you know, you have to evaluate something as though it's true, 255 00:15:34,160 --> 00:15:36,360 Speaker 1: stand in that space for a moment, before you make 256 00:15:36,400 --> 00:15:38,000 Speaker 1: your decision that it's actually false. 257 00:15:38,360 --> 00:15:38,720 Speaker 3: Yeah. 258 00:15:39,400 --> 00:15:44,160 Speaker 2: So the reporter of that in our study was mostly 259 00:15:44,560 --> 00:15:47,240 Speaker 2: activity in the anterior insula, which is, as you know, 260 00:15:47,280 --> 00:15:52,880 Speaker 2: the cortical structure that is, you know, 261 00:15:54,560 --> 00:15:57,680 Speaker 2: very focused on interoception; it's where the 262 00:15:57,760 --> 00:16:01,120 Speaker 2: states of the body are being evaluated. 263 00:16:01,160 --> 00:16:05,560 Speaker 2: And disgust, I mean, the disgust literature suggests that disgust 264 00:16:05,560 --> 00:16:09,280 Speaker 2: is mostly driven by the insula. And so 265 00:16:09,320 --> 00:16:12,760 Speaker 2: this was somewhat unsurprising to us, because it does 266 00:16:12,800 --> 00:16:15,520 Speaker 2: seem, I mean, I think you can get in 267 00:16:15,560 --> 00:16:19,600 Speaker 2: touch with this in a first-person, subjective way: 268 00:16:19,680 --> 00:16:24,320 Speaker 2: to reject a proposition as false feels a 269 00:16:24,320 --> 00:16:26,120 Speaker 2: certain way. I mean, it is a kind of emotion, 270 00:16:26,200 --> 00:16:31,080 Speaker 2: it's a psychological rejection state. And if it's emphatically false, 271 00:16:31,400 --> 00:16:34,520 Speaker 2: certainly on a topic you care about, then just 272 00:16:34,720 --> 00:16:37,560 Speaker 2: to put that on the continuum of disgust was not 273 00:16:37,640 --> 00:16:42,000 Speaker 2: at all surprising, right? And conversely, true statements were evaluated 274 00:16:42,000 --> 00:16:47,760 Speaker 2: with the same kind of positively valenced reward structures that 275 00:16:47,800 --> 00:16:51,560 Speaker 2: we know about in the frontal midline area of the brain.
276 00:16:53,160 --> 00:16:57,480 Speaker 2: And this might be surprising to people who haven't thought 277 00:16:57,480 --> 00:17:00,920 Speaker 2: about neuroscience very much, but generally speaking, we know that 278 00:17:01,480 --> 00:17:05,959 Speaker 2: all of our acts of higher cognition are leveraged 279 00:17:06,040 --> 00:17:08,680 Speaker 2: from areas of the brain which 280 00:17:08,720 --> 00:17:11,280 Speaker 2: were not designed by evolution merely 281 00:17:11,359 --> 00:17:14,400 Speaker 2: to support that higher cognition. 282 00:17:14,720 --> 00:17:16,639 Speaker 2: So clearly you have to do math and language and 283 00:17:16,680 --> 00:17:21,040 Speaker 2: neuroscience and everything else that only humans do using these 284 00:17:21,040 --> 00:17:24,280 Speaker 2: structures that, you know, even our chimp cousins have. 285 00:17:25,760 --> 00:17:28,880 Speaker 1: So when you hear something false, it activates some sense of disgust, 286 00:17:28,920 --> 00:17:31,480 Speaker 1: and when you hear something true, it's beautiful and rewarding. 287 00:17:31,840 --> 00:17:33,800 Speaker 2: And there was one other piece, and this 288 00:17:33,840 --> 00:17:36,080 Speaker 2: came in a subsequent study when we were looking at 289 00:17:36,840 --> 00:17:41,560 Speaker 2: political and religious versus just kind of ordinary, 290 00:17:41,800 --> 00:17:44,960 Speaker 2: more semantic beliefs. It was 291 00:17:45,000 --> 00:17:46,880 Speaker 2: a subsequent study which I ran 292 00:17:46,920 --> 00:17:52,200 Speaker 2: with Jonas Kaplan at USC, he's over in Damasio's lab, 293 00:17:53,240 --> 00:17:57,680 Speaker 2: where we tried in real time to push back against 294 00:17:57,680 --> 00:17:59,960 Speaker 2: people's strongly held beliefs while they were in the scanner. 295 00:18:00,240 --> 00:18:02,760 Speaker 2: So we'd present them with a proposition which we 296 00:18:02,840 --> 00:18:06,720 Speaker 2: knew in advance that, you know, some subjects would 297 00:18:06,840 --> 00:18:09,320 Speaker 2: strongly believe and some strongly disbelieve, and we'd try to push 298 00:18:09,400 --> 00:18:13,760 Speaker 2: their beliefs in the opposite direction, giving them fairly persuasive, 299 00:18:13,760 --> 00:18:16,240 Speaker 2: but still in many cases kind of just sham, evidence. 300 00:18:16,320 --> 00:18:16,439 Speaker 3: Right. 301 00:18:16,560 --> 00:18:18,240 Speaker 2: So someone would be put in 302 00:18:18,280 --> 00:18:20,159 Speaker 2: the scanner and we would ask them to 303 00:18:20,200 --> 00:18:23,679 Speaker 2: rate on a scale of one to seven their confidence 304 00:18:23,720 --> 00:18:28,040 Speaker 2: that secondhand smoke is a significant health concern, right, and 305 00:18:28,080 --> 00:18:30,399 Speaker 2: then we would present, I think in this case it 306 00:18:30,440 --> 00:18:33,600 Speaker 2: was sham evidence against that conviction, seeing if we could 307 00:18:33,640 --> 00:18:37,720 Speaker 2: knock it back.
But we did this for beliefs 308 00:18:37,800 --> 00:18:41,560 Speaker 2: that would nonetheless be strongly held, but around which we 309 00:18:41,600 --> 00:18:45,840 Speaker 2: wouldn't expect someone's identity to be really anchored, versus beliefs 310 00:18:45,880 --> 00:18:48,040 Speaker 2: that were, you know, political or religious, where you 311 00:18:48,040 --> 00:18:50,920 Speaker 2: would imagine they would be more resistant to revising 312 00:18:50,960 --> 00:18:55,320 Speaker 2: their opinion whatever your evidence, right? And we found that 313 00:18:56,960 --> 00:19:00,919 Speaker 2: areas, again midline, but now in the posterior regions 314 00:19:00,920 --> 00:19:04,359 Speaker 2: of the brain, you know, kind of classically 315 00:19:04,359 --> 00:19:12,240 Speaker 2: default mode network areas, were preferentially upregulated. And 316 00:19:12,280 --> 00:19:15,480 Speaker 2: for some of these areas, there's other work showing 317 00:19:15,520 --> 00:19:20,800 Speaker 2: that when people are given stimuli that are 318 00:19:20,840 --> 00:19:23,879 Speaker 2: interrogating self-representation itself, like if I give you 319 00:19:23,880 --> 00:19:25,479 Speaker 2: a list of words and ask you which of these 320 00:19:25,520 --> 00:19:27,640 Speaker 2: words apply to you, you know, like patient 321 00:19:28,040 --> 00:19:31,879 Speaker 2: or kind or anxious, and then if 322 00:19:31,920 --> 00:19:35,480 Speaker 2: the task is, is this me, right? 323 00:19:35,600 --> 00:19:39,960 Speaker 2: it's the same region that gets activated. So, 324 00:19:40,880 --> 00:19:43,400 Speaker 2: you know, at least in that experiment, we did 325 00:19:43,840 --> 00:19:48,159 Speaker 2: produce some evidence that where people were resistant to 326 00:19:48,280 --> 00:19:51,359 Speaker 2: modifying their beliefs no matter what the counterevidence was, 327 00:19:51,840 --> 00:19:54,560 Speaker 2: the beliefs were of a sort that seemed to be 328 00:19:54,600 --> 00:19:58,760 Speaker 2: invoking kind of this self-referential processing. 329 00:19:59,000 --> 00:20:01,760 Speaker 1: So beliefs are tied to identity. 330 00:20:01,800 --> 00:20:04,040 Speaker 2: Not all, but, like, again, there 331 00:20:04,080 --> 00:20:06,359 Speaker 2: were two buckets. Some could be strongly held. But, 332 00:20:07,000 --> 00:20:09,719 Speaker 2: you know, presumably your identity isn't so tied up in, 333 00:20:09,920 --> 00:20:13,720 Speaker 2: you know, whether secondhand smoke is a health hazard versus, 334 00:20:13,960 --> 00:20:17,080 Speaker 2: you know, something like abortion is wrong, or, 335 00:20:17,119 --> 00:20:20,440 Speaker 2: you know, whatever your political conviction was that we analyzed 336 00:20:20,440 --> 00:20:20,960 Speaker 2: in advance. 337 00:20:35,920 --> 00:20:38,000 Speaker 1: So this leads us to the question, I think, of 338 00:20:38,320 --> 00:20:40,960 Speaker 1: what belief is for from an evolutionary point of view: 339 00:20:41,040 --> 00:20:45,919 Speaker 1: what are these beliefs useful for doing for the 340 00:20:45,960 --> 00:20:49,760 Speaker 1: rest of the brain? One idea about beliefs is that 341 00:20:50,840 --> 00:20:54,040 Speaker 1: they're trying to build an accurate model of the world.
342 00:20:54,560 --> 00:20:56,600 Speaker 1: Another version of beliefs is that it has to do 343 00:20:56,720 --> 00:21:01,879 Speaker 1: with something social, as in belonging, with identity, and the 344 00:21:02,240 --> 00:21:05,639 Speaker 1: Kaplan et al. study you just pointed to would certainly 345 00:21:05,640 --> 00:21:08,679 Speaker 1: support that. And then another that people propose has 346 00:21:08,720 --> 00:21:13,000 Speaker 1: to do with emotional regulation. It makes you feel better, more certain, 347 00:21:13,080 --> 00:21:15,640 Speaker 1: less anxious about particular things; especially, I think, that would 348 00:21:15,680 --> 00:21:17,960 Speaker 1: apply to, let us say, religious beliefs. And it may not 349 00:21:18,000 --> 00:21:20,920 Speaker 1: be that there's anything exclusive about these answers, but if 350 00:21:20,920 --> 00:21:24,600 Speaker 1: we were thinking about weighting them, how do you see 351 00:21:24,640 --> 00:21:25,719 Speaker 1: the function of beliefs? 352 00:21:25,720 --> 00:21:31,760 Speaker 2: Well, I think it subsumes almost every act of cognition 353 00:21:32,920 --> 00:21:37,639 Speaker 2: apart from kind of pure sensory engagement with 354 00:21:37,720 --> 00:21:43,159 Speaker 2: the world. And even in that case, you 355 00:21:43,200 --> 00:21:47,680 Speaker 2: can translate the sensory experience into kind of belief talk. 356 00:21:48,000 --> 00:21:49,840 Speaker 2: I mean, it's not 357 00:21:49,920 --> 00:21:52,800 Speaker 2: mediated linguistically in the same way. But, like, my belief 358 00:21:53,520 --> 00:21:56,199 Speaker 2: that that's a table is built up over, you know, 359 00:21:56,440 --> 00:21:58,919 Speaker 2: many channels. One channel is I can 360 00:21:58,960 --> 00:22:00,840 Speaker 2: see it, and I can see it, you know, supporting 361 00:22:00,880 --> 00:22:03,560 Speaker 2: your drink, and so we're using that as a table, 362 00:22:04,040 --> 00:22:09,879 Speaker 2: whatever it is. But the linguistic mediation comes into play 363 00:22:09,920 --> 00:22:14,199 Speaker 2: when we might say, oh, would you 364 00:22:14,200 --> 00:22:16,239 Speaker 2: go put that over on the table? So then 365 00:22:16,640 --> 00:22:18,600 Speaker 2: I now have this abstract representation of the table; 366 00:22:18,640 --> 00:22:21,040 Speaker 2: I walk into the room, I find the table, right? 367 00:22:21,080 --> 00:22:25,760 Speaker 2: So obviously we're engaging this in multiple modes. But once 368 00:22:25,800 --> 00:22:29,480 Speaker 2: you break free of just the moment-to-moment, you know, 369 00:22:30,240 --> 00:22:34,480 Speaker 2: behavioral imperatives of navigating, you know, our immediate environment, 370 00:22:36,119 --> 00:22:42,199 Speaker 2: it's almost all linguistic or, you know, imagistic representations of 371 00:22:42,359 --> 00:22:46,040 Speaker 2: possible states of the future, you know, plausible states of 372 00:22:46,080 --> 00:22:50,119 Speaker 2: the past. You know, we try to reconcile our memory, 373 00:22:50,119 --> 00:22:52,960 Speaker 2: our memories, when there's any kind of discordance there. 374 00:22:53,000 --> 00:22:55,480 Speaker 2: But again, we're just trading in language. Oh no, no, 375 00:22:55,520 --> 00:22:57,359 Speaker 2: he didn't say that, he said this. Or no, no, 376 00:22:57,440 --> 00:22:59,640 Speaker 2: that wasn't on Tuesday.
That was on Wednesday. 377 00:22:59,680 --> 00:23:02,480 Speaker 2: And, I mean, all we're doing is 378 00:23:02,520 --> 00:23:06,000 Speaker 2: talking to ourselves and talking to others about what the 379 00:23:06,040 --> 00:23:08,439 Speaker 2: world is like and what our place in it is 380 00:23:08,520 --> 00:23:11,159 Speaker 2: and what we should do next. I mean, we have 381 00:23:11,240 --> 00:23:18,000 Speaker 2: this massive navigation problem which covers really 382 00:23:18,080 --> 00:23:20,960 Speaker 2: our life in every respect. I mean, even, you know, 383 00:23:21,040 --> 00:23:24,400 Speaker 2: I view morality as a navigation problem, and morality really 384 00:23:24,440 --> 00:23:28,320 Speaker 2: is a story of what should we do next, you know, 385 00:23:28,480 --> 00:23:30,280 Speaker 2: or in the future, right? Like, what's the 386 00:23:30,359 --> 00:23:32,200 Speaker 2: right thing to do? So you get two people in 387 00:23:32,240 --> 00:23:35,560 Speaker 2: a room and, for whatever reason, their incentives aren't aligned, 388 00:23:35,560 --> 00:23:38,040 Speaker 2: and there's some kind of negotiation, and then it immediately 389 00:23:38,080 --> 00:23:40,119 Speaker 2: becomes a conversation about what's the right thing to do, 390 00:23:40,160 --> 00:23:42,000 Speaker 2: what's the fair thing to do, what did we 391 00:23:42,119 --> 00:23:45,520 Speaker 2: do last time? And if you actually kind of 392 00:23:45,600 --> 00:23:48,600 Speaker 2: drill down on what's happening at each point along the way, 393 00:23:48,640 --> 00:23:53,359 Speaker 2: and what is anchoring everyone's emotional response, you're looking at 394 00:23:53,760 --> 00:23:56,240 Speaker 2: knowledge claims about the way the world is, the way 395 00:23:56,240 --> 00:24:00,000 Speaker 2: that it's likely to be, and the weightings, 396 00:24:00,000 --> 00:24:03,359 Speaker 2: the emotional weightings, people are giving these claims with respect 397 00:24:03,400 --> 00:24:06,159 Speaker 2: to kind of a probabilistic sense of the likelihood that 398 00:24:06,160 --> 00:24:10,239 Speaker 2: they're true or false. Right? So, if someone says to you, oh, no, no, 399 00:24:10,280 --> 00:24:15,960 Speaker 2: that's not what happened... There's a very controversial shooting in 400 00:24:16,000 --> 00:24:18,040 Speaker 2: the news right now that has kind of shattered the 401 00:24:18,040 --> 00:24:21,720 Speaker 2: country in the last twelve hours. An ICE officer shot 402 00:24:21,720 --> 00:24:25,520 Speaker 2: a woman protester in her car. You know, millions upon 403 00:24:25,520 --> 00:24:28,000 Speaker 2: millions have seen the video and have rival interpretations of 404 00:24:28,000 --> 00:24:30,840 Speaker 2: the video based on what they believe is plausible to think, 405 00:24:30,880 --> 00:24:33,400 Speaker 2: based on the angle they were looking at, etc. 406 00:24:34,000 --> 00:24:39,440 Speaker 2: All of this is a story of rival beliefs framed 407 00:24:39,440 --> 00:24:43,920 Speaker 2: by different shadings of facts. But again, facts in this 408 00:24:44,000 --> 00:24:49,760 Speaker 2: case are beliefs upon beliefs upon beliefs that people are 409 00:24:49,920 --> 00:24:51,800 Speaker 2: granting more or less credence. 410 00:24:52,040 --> 00:24:52,240 Speaker 3: Right.
411 00:24:52,320 --> 00:24:54,399 Speaker 2: So, like in this case, you know, the woman was 412 00:24:55,000 --> 00:24:57,840 Speaker 2: driving her car toward the officer. So was she trying 413 00:24:57,840 --> 00:24:59,119 Speaker 2: to get away or was she trying to run the 414 00:24:59,119 --> 00:25:04,320 Speaker 2: officer over? How you interpret that maneuver changes your sense 415 00:25:04,359 --> 00:25:07,280 Speaker 2: of the ethics of it completely. What is an officer 416 00:25:07,320 --> 00:25:10,280 Speaker 2: supposed to do when he's no longer physically in danger, 417 00:25:10,720 --> 00:25:12,560 Speaker 2: but his gun is drawn and he's already 418 00:25:12,560 --> 00:25:16,000 Speaker 2: shot one round at the suspect? You either know nothing 419 00:25:16,000 --> 00:25:17,520 Speaker 2: about that or you know a lot about that, and 420 00:25:17,520 --> 00:25:19,359 Speaker 2: you have very strong convictions about what the right thing 421 00:25:19,400 --> 00:25:21,399 Speaker 2: to do is or what the legal thing to do is. Again, it 422 00:25:21,760 --> 00:25:26,879 Speaker 2: is just a blizzard of propositional claims, linguistically mediated propositional 423 00:25:26,920 --> 00:25:28,640 Speaker 2: claims about the way the world is or should be, 424 00:25:29,160 --> 00:25:32,200 Speaker 2: and it's all a matter of belief and our credence around it. 425 00:25:31,960 --> 00:25:35,720 Speaker 1: Agreed. But what I'm interested in is not so 426 00:25:35,800 --> 00:25:38,359 Speaker 1: much the individual beliefs that we might have about the 427 00:25:38,359 --> 00:25:41,800 Speaker 1: trajectory of the car and so on, but the social aspect 428 00:25:41,880 --> 00:25:44,879 Speaker 1: of this, in terms of: what do my friends and 429 00:25:44,960 --> 00:25:48,320 Speaker 1: my colleagues, on whatever my political side of the argument is, 430 00:25:48,400 --> 00:25:51,439 Speaker 1: what do they believe is true? We're so obviously influenced 431 00:25:51,520 --> 00:25:55,320 Speaker 1: by our groups, by our cultures, by our families, by 432 00:25:55,359 --> 00:25:59,680 Speaker 1: our neighborhoods. To what degree is belief fundamentally a social 433 00:25:59,720 --> 00:26:01,639 Speaker 1: issue in our social species? 434 00:26:01,720 --> 00:26:05,080 Speaker 2: Well, I think it is almost entirely, when you look 435 00:26:05,119 --> 00:26:09,479 Speaker 2: at kind of the primacy of language in shaping 436 00:26:09,480 --> 00:26:13,480 Speaker 2: these beliefs. So, language is a social phenomenon, right? 437 00:26:13,480 --> 00:26:16,520 Speaker 2: It's an intersubjective phenomenon. You learn it. I mean, it's 438 00:26:16,520 --> 00:26:18,680 Speaker 2: not that we don't have a propensity to learn it. 439 00:26:18,720 --> 00:26:21,320 Speaker 2: Obviously we do, and that is kind of a private, 440 00:26:21,560 --> 00:26:24,480 Speaker 2: individual fact about each of us. But when you look 441 00:26:24,480 --> 00:26:26,960 Speaker 2: at how it gets trained up in dialogue with your 442 00:26:27,000 --> 00:26:30,679 Speaker 2: parents and caregivers and other people in your environment from 443 00:26:31,080 --> 00:26:35,399 Speaker 2: your first hours of life, it's an intrinsically social phenomenon 444 00:26:35,440 --> 00:26:39,840 Speaker 2: which we then later internalize, such that even when Mommy 445 00:26:39,840 --> 00:26:43,000 Speaker 2: and Daddy leave the room, we're left talking to ourselves.
Right. 446 00:26:43,040 --> 00:26:45,680 Speaker 2: And so there's that kind of inner voice that gets 447 00:26:46,240 --> 00:26:50,679 Speaker 2: kindled and which we never seem to shake. Most people 448 00:26:50,760 --> 00:26:55,840 Speaker 2: are mediating all of their experience with language, and we 449 00:26:56,000 --> 00:26:58,800 Speaker 2: do this with one another. It is the basis of 450 00:26:58,960 --> 00:27:02,920 Speaker 2: virtually all human cooperation, I mean, certainly cooperation with 451 00:27:02,960 --> 00:27:07,400 Speaker 2: respect to any complex task. I mean, like if you got 452 00:27:07,480 --> 00:27:11,200 Speaker 2: up and started to trip, you know, I might wordlessly 453 00:27:11,240 --> 00:27:13,320 Speaker 2: and instantly, you know, grab you to keep you from 454 00:27:13,359 --> 00:27:15,240 Speaker 2: falling over, or you drop something and I try to 455 00:27:15,240 --> 00:27:20,360 Speaker 2: catch it. That's not linguistic. But basically, any cooperation more 456 00:27:20,359 --> 00:27:23,159 Speaker 2: complex than that is a matter of us talking to 457 00:27:23,200 --> 00:27:26,400 Speaker 2: one another about what our goals are and what we're 458 00:27:26,400 --> 00:27:27,080 Speaker 2: going to do next. 459 00:27:27,280 --> 00:27:30,480 Speaker 1: Yeah. So how do you think that applies to beliefs, though? 460 00:27:30,560 --> 00:27:34,640 Speaker 1: My belief about whether the woman was trying to kill 461 00:27:34,640 --> 00:27:39,040 Speaker 1: the cop or get away, my belief about gun control, abortion, whatever: 462 00:27:39,119 --> 00:27:40,800 Speaker 1: what do these things have to do with the people 463 00:27:40,800 --> 00:27:43,880 Speaker 1: I'm surrounded by? In other words, how do we make 464 00:27:43,920 --> 00:27:46,240 Speaker 1: our beliefs about what is true and what is not true? 465 00:27:46,680 --> 00:27:50,879 Speaker 2: So all of our talk about beliefs is occurring in 466 00:27:50,920 --> 00:27:55,240 Speaker 2: a community, right? In the community of people 467 00:27:55,600 --> 00:27:57,840 Speaker 2: we have direct relationships with, and then a community 468 00:27:57,880 --> 00:28:01,160 Speaker 2: that is just a matter of our being embedded 469 00:28:01,200 --> 00:28:03,640 Speaker 2: in an information landscape where we're hearing from other people 470 00:28:03,680 --> 00:28:06,159 Speaker 2: who we may never meet, or we're reading books, you know, 471 00:28:06,200 --> 00:28:08,359 Speaker 2: written by people who may have been dead for hundreds 472 00:28:08,400 --> 00:28:12,760 Speaker 2: of years. But we're trafficking in just 473 00:28:12,840 --> 00:28:18,560 Speaker 2: torrents of linguistic representations of the world, and we're 474 00:28:18,640 --> 00:28:25,560 Speaker 2: constantly revising, or resisting revising, our view of the world 475 00:28:25,560 --> 00:28:28,040 Speaker 2: on that basis.
Right. So you read something, and you 476 00:28:28,080 --> 00:28:32,800 Speaker 2: know, this is where various reasoning heuristics, you know, 477 00:28:32,960 --> 00:28:35,920 Speaker 2: normative and otherwise, come into play, where you could have 478 00:28:36,000 --> 00:28:38,640 Speaker 2: something like confirmation bias, right, which, you know, in science 479 00:28:38,680 --> 00:28:40,880 Speaker 2: we're alert to the fact that this is a kind 480 00:28:40,880 --> 00:28:44,680 Speaker 2: of, you know, cognitive defect or failure mode, where it's like, 481 00:28:44,960 --> 00:28:47,680 Speaker 2: you know, if you're just looking for the confirmation of 482 00:28:47,760 --> 00:28:52,120 Speaker 2: your cherished hypothesis or your cherished opinion, well, then we 483 00:28:52,200 --> 00:28:54,280 Speaker 2: know that's just not a good operation to see if 484 00:28:54,320 --> 00:28:58,040 Speaker 2: in fact you're in touch with reality. 485 00:28:58,880 --> 00:29:01,920 Speaker 2: But in every area of discourse, whether it's a formal 486 00:29:02,000 --> 00:29:04,760 Speaker 2: one like science or we're just, you know, 487 00:29:04,840 --> 00:29:12,200 Speaker 2: gossiping with friends, we're constantly trading in statements about the 488 00:29:12,200 --> 00:29:16,960 Speaker 2: way the world is. And these statements could fall into 489 00:29:17,720 --> 00:29:21,560 Speaker 2: various categories that are more or less morally 490 00:29:21,600 --> 00:29:24,320 Speaker 2: salient or psychologically 491 00:29:25,120 --> 00:29:25,840 Speaker 3: combustible. 492 00:29:25,960 --> 00:29:27,360 Speaker 2: I mean, so, you know, if you're talking 493 00:29:27,400 --> 00:29:30,360 Speaker 2: about religious beliefs, like core religious beliefs that are, you know, 494 00:29:30,400 --> 00:29:33,840 Speaker 2: foundational for a specific community, obviously those have a different 495 00:29:34,240 --> 00:29:38,000 Speaker 2: gravity to them, and they are, pathologically 496 00:29:38,040 --> 00:29:41,680 Speaker 2: in my view, protected by certain norms that 497 00:29:41,760 --> 00:29:44,680 Speaker 2: are really the antithesis of what we do 498 00:29:44,760 --> 00:29:47,520 Speaker 2: in science. I mean, you know, 499 00:29:47,960 --> 00:29:52,200 Speaker 2: religion is the only area of culture where the notion 500 00:29:52,280 --> 00:29:55,560 Speaker 2: of dogma is not pejorative, right? I mean, dogma 501 00:29:55,600 --> 00:29:57,880 Speaker 2: is literally not a bad word; dogma is, 502 00:29:58,000 --> 00:30:01,200 Speaker 2: you know, traditionally a Catholic word. These are 503 00:30:01,240 --> 00:30:04,520 Speaker 2: the things you're gonna believe without, you know, sufficient evidence, 504 00:30:04,520 --> 00:30:06,000 Speaker 2: and you're gonna believe them as much as you can 505 00:30:06,040 --> 00:30:07,960 Speaker 2: believe anything, and they're not on the table to be 506 00:30:08,440 --> 00:30:11,640 Speaker 2: argued against or revised, right? Now in science, if you 507 00:30:11,840 --> 00:30:13,720 Speaker 2: tell someone, well, that's, 508 00:30:13,760 --> 00:30:16,800 Speaker 2: you know, you're being dogmatic, or that's a dogma, 509 00:30:16,880 --> 00:30:19,360 Speaker 2: well, that's, you know, definitely not a nice 510 00:30:19,360 --> 00:30:23,000 Speaker 2: thing to say about somebody's reasoning.
So yeah, this 511 00:30:23,080 --> 00:30:27,800 Speaker 2: is everywhere. And again, the hypothesis 512 00:30:27,880 --> 00:30:31,880 Speaker 2: going into this original neuroscientific work was that there's 513 00:30:31,920 --> 00:30:36,040 Speaker 2: something importantly similar across all of these domains that 514 00:30:36,120 --> 00:30:40,440 Speaker 2: we're doing, where you're just accepting a 515 00:30:40,520 --> 00:30:44,760 Speaker 2: representation as behaviorally valid, or you're rejecting it as not, 516 00:30:45,080 --> 00:30:47,520 Speaker 2: or you're in a kind of holding 517 00:30:47,560 --> 00:30:49,320 Speaker 2: pattern where you just can't know one way or the other. 518 00:30:49,360 --> 00:30:51,360 Speaker 1: But I still want to understand what is the difference, 519 00:30:51,680 --> 00:30:54,240 Speaker 1: either from neuroimaging studies or just how you've thought about 520 00:30:54,280 --> 00:30:56,880 Speaker 1: it over several years. What is the difference between, let's say, scientific 521 00:30:57,200 --> 00:31:01,240 Speaker 1: and political and religious beliefs? These are all beliefs that 522 00:31:01,280 --> 00:31:04,760 Speaker 1: you might take to be true or false. But it 523 00:31:04,840 --> 00:31:09,280 Speaker 1: does feel like the social component is a really important 524 00:31:09,280 --> 00:31:12,800 Speaker 1: one for political beliefs, maybe scientific beliefs too. In some cases, 525 00:31:13,560 --> 00:31:18,280 Speaker 1: the emotional regulation might be really important to religious beliefs, 526 00:31:18,320 --> 00:31:20,200 Speaker 1: as in, I've got a lot of anxiety about this, 527 00:31:20,240 --> 00:31:21,840 Speaker 1: and this makes me feel better if I were to 528 00:31:21,880 --> 00:31:26,160 Speaker 1: accept that. So there are these different components to why 529 00:31:26,240 --> 00:31:29,360 Speaker 1: we believe. In other words, we can take the position 530 00:31:29,440 --> 00:31:33,040 Speaker 1: that belief isn't actually just about figuring out what's true 531 00:31:33,080 --> 00:31:35,280 Speaker 1: in the world; we wish it were, but it's 532 00:31:35,320 --> 00:31:38,640 Speaker 1: clearly not that in brains. So this is my question 533 00:31:38,680 --> 00:31:42,000 Speaker 1: for you: how do you think about when beliefs 534 00:31:42,040 --> 00:31:45,880 Speaker 1: are for the purpose of social cohesion, when they're for 535 00:31:45,920 --> 00:31:48,200 Speaker 1: the purpose of emotional regulation, and when they're for the 536 00:31:48,240 --> 00:31:52,320 Speaker 1: purpose of getting the right answer, for predictive accuracy about the world? 537 00:31:53,360 --> 00:31:56,400 Speaker 2: Well, I think all of that is still pretty knit together.
538 00:31:56,440 --> 00:32:01,480 Speaker 2: But yeah, I would agree that certain beliefs function, 539 00:32:02,640 --> 00:32:06,160 Speaker 2: or certain professions of belief at least function, as kind of, 540 00:32:06,240 --> 00:32:08,880 Speaker 2: you know, loyalty tests or signaling, as a, 541 00:32:09,320 --> 00:32:13,640 Speaker 2: you know, a shibboleth, where you're 542 00:32:14,160 --> 00:32:17,440 Speaker 2: signaling group membership, right? Like, we have a code, 543 00:32:17,640 --> 00:32:20,600 Speaker 2: and on that basis there's a level of kind of 544 00:32:20,600 --> 00:32:25,280 Speaker 2: social trust and expectations about each other's ethical intuitions and 545 00:32:25,280 --> 00:32:29,520 Speaker 2: political commitments. It's a kind of shorthand. But if 546 00:32:29,560 --> 00:32:33,240 Speaker 2: you're going to tear at those sort of core group 547 00:32:33,440 --> 00:32:38,520 Speaker 2: identity assertions, whether it's a religious one, 548 00:32:38,880 --> 00:32:41,200 Speaker 2: you know, a mainstream religion or, you know, smaller 549 00:32:41,200 --> 00:32:45,200 Speaker 2: still, like a spiritual or religious cult, or a political 550 00:32:45,280 --> 00:32:50,600 Speaker 2: cult, say, or a political culture, these are kind of anchoring 551 00:32:50,920 --> 00:32:53,800 Speaker 2: assertions about what is true, about what really happened, about 552 00:32:53,800 --> 00:32:56,200 Speaker 2: what is real, about the unique sanctity of a given book, say, 553 00:32:57,600 --> 00:32:59,960 Speaker 2: which we're not disposed to reinterpret 554 00:32:59,800 --> 00:33:01,000 Speaker 3: very much. 555 00:33:01,080 --> 00:33:03,840 Speaker 2: And when you talk about specific things where 556 00:33:03,880 --> 00:33:09,320 Speaker 2: these different ways of knowing really do collide, there are 557 00:33:09,360 --> 00:33:11,160 Speaker 2: so many examples. Take something like, you know, the 558 00:33:11,160 --> 00:33:13,920 Speaker 2: Shroud of Turin, which was thought to be, and perhaps 559 00:33:13,960 --> 00:33:15,840 Speaker 2: in certain circles it still is thought to be, 560 00:33:15,880 --> 00:33:19,520 Speaker 2: this Christian relic where the shroud had received the 561 00:33:20,840 --> 00:33:23,960 Speaker 2: facial imprint of Jesus; you know, it was thought to 562 00:33:24,000 --> 00:33:28,440 Speaker 2: be his burial shroud. But then carbon dating was done 563 00:33:28,520 --> 00:33:30,520 Speaker 2: on it and it was revealed to be, I think, 564 00:33:31,160 --> 00:33:34,600 Speaker 2: a thirteenth-century relic. So it's, you know, 565 00:33:34,640 --> 00:33:37,920 Speaker 2: kind of a thousand-plus years off from being 566 00:33:37,920 --> 00:33:41,120 Speaker 2: the real shroud. So is this a claim about 567 00:33:42,040 --> 00:33:44,760 Speaker 2: religion purely, or is it now... I mean, once carbon 568 00:33:44,840 --> 00:33:48,560 Speaker 2: dating gets invented and you can use it, 569 00:33:48,560 --> 00:33:51,320 Speaker 2: it suddenly becomes a claim about chemistry, right? 570 00:33:51,640 --> 00:33:55,720 Speaker 2: And so, I mean, that is the enduring tension 571 00:33:55,800 --> 00:33:59,440 Speaker 2: between science and religion.
Whenever you're making claims about the world, 572 00:33:59,480 --> 00:34:03,680 Speaker 2: about the past, about what happens after death, about 573 00:34:03,760 --> 00:34:07,360 Speaker 2: the way the mind works, about the connections between people, 574 00:34:07,520 --> 00:34:12,560 Speaker 2: you know, tangible and invisible, you're, at least in principle, 575 00:34:12,800 --> 00:34:18,160 Speaker 2: risking making claims that wander onto the territory 576 00:34:18,200 --> 00:34:21,440 Speaker 2: of every other way we judge the truth value of 577 00:34:21,480 --> 00:34:22,360 Speaker 2: claims about the world. 578 00:34:22,719 --> 00:34:26,160 Speaker 1: So why is it so hard for, let's say, highly 579 00:34:26,200 --> 00:34:29,319 Speaker 1: intelligent people to come to agreement? I mean, we know 580 00:34:29,400 --> 00:34:33,600 Speaker 1: in any science department in a university, people argue. They 581 00:34:33,600 --> 00:34:36,720 Speaker 1: spend their careers arguing. Again, if this were just about 582 00:34:36,960 --> 00:34:40,680 Speaker 1: the brain trying to find predictive accuracy, that would be 583 00:34:40,760 --> 00:34:42,920 Speaker 1: very difficult to understand. But because of all these other 584 00:34:43,000 --> 00:34:47,239 Speaker 1: components to it, the social, the emotional, it's a 585 00:34:47,280 --> 00:34:48,399 Speaker 1: much more complicated thing. 586 00:34:48,480 --> 00:34:50,759 Speaker 2: The search for truth, it is. I mean, you can 587 00:34:50,800 --> 00:34:54,840 Speaker 2: sharpen it up more than we tend to. For instance, 588 00:34:54,880 --> 00:34:58,200 Speaker 2: you can be alert to the problem of 589 00:34:58,640 --> 00:35:01,480 Speaker 2: someone making claims that are unfalsifiable, right? This is 590 00:35:01,520 --> 00:35:05,000 Speaker 2: something that Carl Popper, the philosopher of science, gave 591 00:35:05,080 --> 00:35:09,200 Speaker 2: us. The litmus test for him was the 592 00:35:09,320 --> 00:35:12,480 Speaker 2: falsifiability, or lack thereof, of a statement. That 593 00:35:12,560 --> 00:35:15,960 Speaker 2: was the boundary between science and 594 00:35:16,600 --> 00:35:20,360 Speaker 2: the rest of our thinking. So what does it mean to 595 00:35:20,920 --> 00:35:23,840 Speaker 2: make a claim that is unfalsifiable? It means 596 00:35:23,840 --> 00:35:28,279 Speaker 2: that you're saying something is true, and yet there's 597 00:35:28,360 --> 00:35:32,440 Speaker 2: no state of the world, not 598 00:35:32,480 --> 00:35:34,640 Speaker 2: even in principle a state of the world that we 599 00:35:34,640 --> 00:35:38,560 Speaker 2: can imagine, that would demonstrate the falsity of the claim. 600 00:35:38,560 --> 00:35:39,879 Speaker 2: Well, then, there it is. 601 00:35:40,280 --> 00:35:44,240 Speaker 2: What's an example? You know, this is one: the world, 602 00:35:44,440 --> 00:35:48,360 Speaker 2: the entire world, the cosmos, everything that is apparently 603 00:35:48,400 --> 00:35:54,680 Speaker 2: real, was created by God just, you know, ten seconds ago. 604 00:35:55,520 --> 00:35:58,440 Speaker 2: But what God also created were all of the memories 605 00:35:58,440 --> 00:36:01,080 Speaker 2: and all of the backward-looking evidence we think we 606 00:36:01,200 --> 00:36:04,800 Speaker 2: have about the antiquity of the cosmos. But the cosmos 607 00:36:04,840 --> 00:36:07,680 Speaker 2: is actually ten seconds old.
We just have all these 608 00:36:07,760 --> 00:36:09,920 Speaker 2: memories jammed in our heads and all of this stuff 609 00:36:09,960 --> 00:36:12,560 Speaker 2: on the internet, et cetera, and the rocks 610 00:36:12,640 --> 00:36:14,640 Speaker 2: are the way they are, and, you know, so 611 00:36:14,960 --> 00:36:17,880 Speaker 2: nothing would be different. But, you know, this 612 00:36:17,960 --> 00:36:19,759 Speaker 2: is my claim. This is my magical claim, right, so 613 00:36:20,200 --> 00:36:22,719 Speaker 2: it's unfalsifiable. You know, every time you're going to give 614 00:36:22,760 --> 00:36:25,160 Speaker 2: me something that you think would falsify it, I can 615 00:36:25,280 --> 00:36:28,239 Speaker 2: just kind of layer it into the thesis. Well, no, 616 00:36:28,320 --> 00:36:30,680 Speaker 2: God made it that way so as to make it 617 00:36:30,719 --> 00:36:33,759 Speaker 2: seem, you know, that way, right? So we 618 00:36:34,040 --> 00:36:37,120 Speaker 2: tend to not be interested in claims like that, 619 00:36:37,200 --> 00:36:40,759 Speaker 2: for good reason, right, because 620 00:36:40,840 --> 00:36:44,440 Speaker 2: they run against the grain of 621 00:36:44,480 --> 00:36:48,919 Speaker 2: a very useful heuristic we have in science, which goes 622 00:36:49,120 --> 00:36:53,600 Speaker 2: by the name of Occam's razor, but more generically, just 623 00:36:53,640 --> 00:36:57,600 Speaker 2: the idea that a thesis should be, insofar 624 00:36:57,680 --> 00:37:01,040 Speaker 2: as it's possible, parsimonious, right? Like, you shouldn't just 625 00:37:01,840 --> 00:37:05,680 Speaker 2: proliferate your, you know, claims so as to 626 00:37:05,719 --> 00:37:08,320 Speaker 2: shore it up. I mean, we don't want epicycles within epicycles 627 00:37:08,360 --> 00:37:11,640 Speaker 2: within epicycles. We want the most elegant, you know, simple 628 00:37:12,520 --> 00:37:14,280 Speaker 2: thesis that still conserves the data. 629 00:37:15,040 --> 00:37:18,000 Speaker 1: But I still want to understand. If you got, let's say, 630 00:37:18,040 --> 00:37:21,040 Speaker 1: Popper and Spinoza in a room together, here you've got 631 00:37:21,040 --> 00:37:24,960 Speaker 1: two very clear thinkers, but there would be beliefs that 632 00:37:25,040 --> 00:37:27,359 Speaker 1: each of them hold that they would end up disagreeing on. 633 00:37:28,360 --> 00:37:30,759 Speaker 1: And again, I'm just circling back to this 634 00:37:30,840 --> 00:37:35,680 Speaker 1: point that it can't be just about the rational. 635 00:37:35,719 --> 00:37:38,960 Speaker 1: There are all these other reasons why humans hold beliefs, 636 00:37:39,239 --> 00:37:40,960 Speaker 1: you know, the social and emotional, the things I 637 00:37:41,040 --> 00:37:43,960 Speaker 1: keep harping on about. But this feels important to me 638 00:37:44,800 --> 00:37:47,239 Speaker 1: in terms of understanding why it is so difficult, why 639 00:37:47,320 --> 00:37:50,600 Speaker 1: there's such polarization.
Because what would be great is if 640 00:37:50,600 --> 00:37:55,319 Speaker 1: we could sit down and, you know, make our statements, 641 00:37:55,560 --> 00:37:58,399 Speaker 1: shout everything in all caps on X or Bluesky, 642 00:37:58,719 --> 00:38:02,440 Speaker 1: and everyone sees the logic and says, oh, okay, we agree, 643 00:38:02,719 --> 00:38:04,360 Speaker 1: and the whole world would come to agreement. But 644 00:38:04,400 --> 00:38:06,920 Speaker 1: that will never happen. And I've just been 645 00:38:06,920 --> 00:38:13,080 Speaker 1: curious about why beliefs are so different in different 646 00:38:13,080 --> 00:38:15,160 Speaker 1: heads and often so intractable. 647 00:38:15,280 --> 00:38:18,200 Speaker 2: Yeah, well, I mean, obviously people are 648 00:38:18,320 --> 00:38:23,000 Speaker 2: driven by more than just a desire to not be 649 00:38:23,080 --> 00:38:25,560 Speaker 2: wrong, right, or to have their 650 00:38:26,239 --> 00:38:32,680 Speaker 2: representations of the world be accurate in the abstract. 651 00:38:32,719 --> 00:38:34,680 Speaker 2: I mean, people are incentivized by other things. You know, 652 00:38:34,719 --> 00:38:37,520 Speaker 2: it's like, you know, there's the famous line that 653 00:38:37,640 --> 00:38:40,040 Speaker 2: now I've actually forgotten how it goes, but something like, 654 00:38:40,080 --> 00:38:42,440 Speaker 2: it's hard to convince a man of something when 655 00:38:42,719 --> 00:38:45,080 Speaker 2: his occupation requires that he not be convinced 656 00:38:45,080 --> 00:38:46,440 Speaker 2: of it, or something like that, right? You know, so 657 00:38:46,440 --> 00:38:48,560 Speaker 2: it's like, yes, if you stand 658 00:38:48,560 --> 00:38:50,560 Speaker 2: to lose a lot of money if a 659 00:38:50,560 --> 00:38:53,080 Speaker 2: certain state of the world is so, you're going to 660 00:38:53,160 --> 00:38:55,239 Speaker 2: tend to avert your eyes from that for as long 661 00:38:55,239 --> 00:38:57,600 Speaker 2: as possible, because, I mean, this is just the 662 00:38:58,160 --> 00:39:01,080 Speaker 2: classic problem of incentives. I mean, it's 663 00:39:01,120 --> 00:39:01,960 Speaker 2: a perverse incentive. 664 00:39:02,040 --> 00:39:02,160 Speaker 3: Right. 665 00:39:02,239 --> 00:39:05,960 Speaker 2: So in science we have worked this out to a 666 00:39:06,000 --> 00:39:08,400 Speaker 2: remarkable degree. I mean, the thing that distinguishes the culture 667 00:39:08,400 --> 00:39:11,960 Speaker 2: of science, it's not perfect, but what distinguishes science from 668 00:39:11,960 --> 00:39:16,160 Speaker 2: basically everything else human beings do is that we have 669 00:39:16,239 --> 00:39:22,920 Speaker 2: created a culture where there's a competitive apparatus that is 670 00:39:23,760 --> 00:39:26,440 Speaker 2: error-correcting by its very nature, right? Like, where, you know, 671 00:39:26,200 --> 00:39:30,759 Speaker 2: we could both be in neuroscience and have rival theses, 672 00:39:31,400 --> 00:39:34,120 Speaker 2: and we're trying to, you know, falsify each other's 673 00:39:34,160 --> 00:39:38,560 Speaker 2: claims, or at least pressure-test them.
And science is 674 00:39:38,600 --> 00:39:41,680 Speaker 2: the only part of culture where you can actually win 675 00:39:41,840 --> 00:39:44,920 Speaker 2: points for proving yourself wrong, right? I mean, 676 00:39:44,920 --> 00:39:46,920 Speaker 2: it doesn't happen as much as we would want 677 00:39:46,960 --> 00:39:51,279 Speaker 2: it to, but it really is a kind 678 00:39:51,280 --> 00:39:55,120 Speaker 2: of gold-standard moment of scientific communication for somebody 679 00:39:55,200 --> 00:39:58,759 Speaker 2: who has had a cherished thesis for, you know, decades 680 00:39:58,880 --> 00:40:02,400 Speaker 2: to admit he or she was wrong about basically everything, 681 00:40:02,960 --> 00:40:07,000 Speaker 2: in public, and to thank the person who did the 682 00:40:07,040 --> 00:40:09,120 Speaker 2: work that showed that he or 683 00:40:09,040 --> 00:40:09,600 Speaker 2: she was wrong. 684 00:40:09,680 --> 00:40:13,120 Speaker 2: Right? I mean, again, 685 00:40:13,200 --> 00:40:15,040 Speaker 2: it probably doesn't happen in real time as much as 686 00:40:15,040 --> 00:40:16,960 Speaker 2: we would want, but it certainly happens. And it 687 00:40:17,320 --> 00:40:21,080 Speaker 2: really is the quintessence of the scientific enterprise. 688 00:40:21,840 --> 00:40:24,960 Speaker 2: And what is not the quintessence is 689 00:40:25,000 --> 00:40:30,960 Speaker 2: to see some scientist who is just clearly emotionally attached 690 00:40:31,480 --> 00:40:34,360 Speaker 2: to all of the sunk costs of his work, you know, 691 00:40:34,400 --> 00:40:36,960 Speaker 2: his work and his reputation, and he's been known for 692 00:40:37,000 --> 00:40:40,480 Speaker 2: this thing, and now this battered thesis is losing 693 00:40:40,480 --> 00:40:45,000 Speaker 2: credibility among the next generation of scientists. And yet 694 00:40:45,080 --> 00:40:48,920 Speaker 2: this old warhorse is holding on to this doomed project 695 00:40:49,360 --> 00:40:52,799 Speaker 2: despite the fact that there's now a mountain of good 696 00:40:52,880 --> 00:40:56,240 Speaker 2: argument and good evidence piling up against him. At that moment, 697 00:40:56,400 --> 00:41:00,640 Speaker 2: it begins to seem more 698 00:41:00,680 --> 00:41:03,799 Speaker 2: correct to say that what he's doing is not 699 00:41:03,880 --> 00:41:06,480 Speaker 2: like an alternative version of science. It's 700 00:41:06,520 --> 00:41:09,200 Speaker 2: no longer science, right? Like, you're failing these 701 00:41:09,239 --> 00:41:13,080 Speaker 2: obvious tests of credibility, and the obstacle course has become 702 00:41:13,239 --> 00:41:16,080 Speaker 2: too complicated for you emotionally for some reason, and you're 703 00:41:16,120 --> 00:41:17,200 Speaker 2: no longer navigating it. 704 00:41:17,400 --> 00:41:31,760 Speaker 4: Yeah. 705 00:41:32,120 --> 00:41:34,920 Speaker 1: So, okay. This brings up something about science that 706 00:41:35,000 --> 00:41:37,440 Speaker 1: I flagged earlier that I wanted to come back to, 707 00:41:37,600 --> 00:41:41,680 Speaker 1: which is, when I read these original papers of yours 708 00:41:41,719 --> 00:41:44,759 Speaker 1: from two thousand and eight and two thousand and nine.
Right. 709 00:41:44,840 --> 00:41:48,000 Speaker 1: So there's a belief that you think is true, a 710 00:41:48,040 --> 00:41:51,000 Speaker 1: belief that is false, and then the uncertain ones that 711 00:41:51,040 --> 00:41:53,680 Speaker 1: you just don't know about. But what's so interesting to me: 712 00:41:54,200 --> 00:41:59,080 Speaker 1: as we know, in science there's instead a weighting or 713 00:41:59,120 --> 00:42:02,120 Speaker 1: a probability that we put on different sorts of beliefs, 714 00:42:02,280 --> 00:42:04,960 Speaker 1: and even for our most cherished beliefs, we never use the 715 00:42:05,000 --> 00:42:07,719 Speaker 1: word truth in science. We say, okay, the weight of 716 00:42:07,760 --> 00:42:11,000 Speaker 1: the evidence supports this, and if in a few years 717 00:42:11,040 --> 00:42:13,200 Speaker 1: the weight of the evidence supports something else, fine, 718 00:42:13,280 --> 00:42:16,120 Speaker 1: we go with that. And so I'm curious how you 719 00:42:16,160 --> 00:42:18,080 Speaker 1: would, I know this wasn't part of the study, but 720 00:42:18,120 --> 00:42:21,480 Speaker 1: how you might think about beliefs of this sort, which 721 00:42:21,560 --> 00:42:24,959 Speaker 1: is the type that we hold in science, of saying, okay, well, 722 00:42:25,400 --> 00:42:27,840 Speaker 1: there's a mountain of evidence here. Some of it points the 723 00:42:27,840 --> 00:42:29,719 Speaker 1: other way, some of it points off this way, but most of 724 00:42:29,760 --> 00:42:30,680 Speaker 1: it is pointing over here. 725 00:42:31,280 --> 00:42:31,440 Speaker 3: Yeah. 726 00:42:31,440 --> 00:42:34,040 Speaker 2: Well, I do think that virtually all of our beliefs 727 00:42:34,040 --> 00:42:37,600 Speaker 2: are probabilistic. I mean, certainly in science explicitly so. I 728 00:42:37,600 --> 00:42:40,240 Speaker 2: mean, we have been beaten into shape 729 00:42:40,239 --> 00:42:43,319 Speaker 2: on that point, largely by 730 00:42:43,320 --> 00:42:47,000 Speaker 2: Popper but by others too, where we just know 731 00:42:47,160 --> 00:42:49,719 Speaker 2: enough now to not say that, you know, we're one 732 00:42:49,800 --> 00:42:52,319 Speaker 2: hundred percent certain of something; it's just the weight 733 00:42:52,360 --> 00:42:54,440 Speaker 2: of the evidence, et cetera. Yeah, I mean, just in 734 00:42:54,800 --> 00:42:59,160 Speaker 2: very simple terms, you could imagine being absolutely certain, 735 00:42:59,320 --> 00:43:02,360 Speaker 2: based on your understanding of mathematics, that the angles of every triangle 736 00:43:02,400 --> 00:43:03,600 Speaker 2: sum to one hundred and eighty degrees. 737 00:43:04,400 --> 00:43:04,680 Speaker 1: Uh. 738 00:43:04,719 --> 00:43:07,760 Speaker 2: And I'd say, well, you know, you're certain? 739 00:43:07,800 --> 00:43:10,399 Speaker 2: And yes, absolutely certain. Would you be willing to bet 740 00:43:10,400 --> 00:43:13,239 Speaker 2: your life that it's so? And you might balk a 741 00:43:13,239 --> 00:43:15,880 Speaker 2: little bit and say, yeah, I'd be. If I'd 742 00:43:15,880 --> 00:43:17,759 Speaker 2: bet on anything, I'd bet on that. It's 743 00:43:17,800 --> 00:43:19,560 Speaker 2: sort of like two plus two equals four for me.
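The weighting of beliefs that Eagleman describes here has a standard formalization, though the episode itself doesn't name it: Bayes' rule, under which the probability assigned to a hypothesis H is revised as each piece of evidence E arrives. A minimal sketch, with all numbers invented for illustration:

\[ P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} \]

\[ P(H) = 0.5, \quad P(E \mid H) = 0.9, \quad P(E \mid \neg H) = 0.3 \quad \Rightarrow \quad P(H \mid E) = \frac{0.45}{0.45 + 0.15} = 0.75 \]

On this picture a belief is never simply flipped between true and false; its weight moves up or down with the accumulating evidence, which is exactly the register Eagleman says science speaks in.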
744 00:43:20,040 --> 00:43:23,880 Speaker 2: But then someone introduces to you a concept of, you know, 745 00:43:23,960 --> 00:43:28,640 Speaker 2: non-Euclidean geometry that you didn't know before, right? 746 00:43:28,680 --> 00:43:30,719 Speaker 2: So, like, what you had in mind was 747 00:43:31,000 --> 00:43:32,520 Speaker 2: a triangle on a flat surface. 748 00:43:32,920 --> 00:43:34,640 Speaker 3: But now, the basketball. 749 00:43:34,719 --> 00:43:37,359 Speaker 2: Yeah, but now someone like Riemann comes along and says, yeah, 750 00:43:37,760 --> 00:43:40,280 Speaker 2: just visualize this. We're now going to curve the space. 751 00:43:40,680 --> 00:43:43,640 Speaker 2: What's happening to that triangle on the basketball? Doesn't it look 752 00:43:43,680 --> 00:43:44,920 Speaker 2: like a little more, or a little less, than one 753 00:43:45,000 --> 00:43:51,000 Speaker 2: hundred and eighty degrees? All of a sudden, your intuitions get knocked around. 754 00:43:51,320 --> 00:43:54,480 Speaker 2: And if you're rational, you are now going to update 755 00:43:54,520 --> 00:43:58,200 Speaker 2: your file about triangles, right? And so if a 756 00:43:58,320 --> 00:44:01,759 Speaker 2: core belief about something as simple as a triangle can 757 00:44:01,800 --> 00:44:04,719 Speaker 2: be overturned when you just had a further utterance from, 758 00:44:04,800 --> 00:44:06,480 Speaker 2: you know, somebody who knows more about math than you, 759 00:44:07,320 --> 00:44:10,520 Speaker 2: what is truly off the table for revision is probably 760 00:44:10,520 --> 00:44:11,080 Speaker 2: not much. 761 00:44:12,640 --> 00:44:13,600 Speaker 1: I wish that were true. 762 00:44:13,600 --> 00:44:15,960 Speaker 1: I think I may be more pessimistic than you on 763 00:44:16,040 --> 00:44:19,600 Speaker 1: this point, which is, you might easily convince me about 764 00:44:19,680 --> 00:44:23,239 Speaker 1: a triangle, but core beliefs that I might have on 765 00:44:23,320 --> 00:44:29,160 Speaker 1: any hot-button political issue, those are tougher. I'm using myself. 766 00:44:29,520 --> 00:44:33,239 Speaker 2: Can you give me an example? 767 00:44:33,000 --> 00:44:36,000 Speaker 1: Oh, whatever your views are on abortion or gun control or whatever, 768 00:44:38,960 --> 00:44:42,120 Speaker 1: those are resistant to change, because it's not as simple 769 00:44:42,120 --> 00:44:45,440 Speaker 1: as a logical thing of, oh, I see the triangle 770 00:44:45,440 --> 00:44:48,720 Speaker 1: on the basketball, got it, I've now changed my model. 771 00:44:49,480 --> 00:44:53,680 Speaker 2: But I'm sure, certainly in both of those cases, there 772 00:44:53,760 --> 00:44:58,040 Speaker 2: is much more information that could be forthcoming on either 773 00:44:58,080 --> 00:45:03,440 Speaker 2: topic that could push someone, because people's intuitions, especially 774 00:45:03,520 --> 00:45:07,560 Speaker 2: on those two topics, tend to be anchored to, like, 775 00:45:07,840 --> 00:45:10,319 Speaker 2: some meme that got in their head, you know, very 776 00:45:10,440 --> 00:45:13,560 Speaker 2: likely early in life, that doesn't actually have a lot 777 00:45:13,600 --> 00:45:15,799 Speaker 2: of content to it. So it's like the claim that 778 00:45:15,840 --> 00:45:19,160 Speaker 2: abortion is wrong. I'll drill 779 00:45:19,200 --> 00:45:21,160 Speaker 2: down on that a little bit.
It's likely 780 00:45:21,200 --> 00:45:25,439 Speaker 2: backed up by utterances like life starts at the moment 781 00:45:25,480 --> 00:45:25,920 Speaker 2: of conception. 782 00:45:26,719 --> 00:45:28,040 Speaker 3: Right. 783 00:45:28,440 --> 00:45:32,240 Speaker 2: That's like, if you're a, you know, a religious believer 784 00:45:32,239 --> 00:45:35,000 Speaker 2: who believes abortion is wrong, well, that's just 785 00:45:35,000 --> 00:45:37,640 Speaker 2: not true if you're a Muslim. That's just 786 00:45:37,680 --> 00:45:41,000 Speaker 2: not true. You know, in the Muslim faith, I believe 787 00:45:43,360 --> 00:45:47,120 Speaker 2: the soul is thought to enter the zygote at, I think, around 788 00:45:47,200 --> 00:45:50,040 Speaker 2: day eighty. It's either day eighty or day one twenty, something 789 00:45:50,120 --> 00:45:53,600 Speaker 2: like that. So definitely not at the moment of conception. 790 00:45:53,760 --> 00:45:53,920 Speaker 3: Right. 791 00:45:53,960 --> 00:45:57,400 Speaker 2: So, okay, perhaps you're a Muslim who just didn't know 792 00:45:57,440 --> 00:45:59,160 Speaker 2: that that was the doctrine, right? So someone told you 793 00:45:59,239 --> 00:46:01,960 Speaker 2: life started at conception. But then they 794 00:46:02,040 --> 00:46:04,520 Speaker 2: actually check the Holy Books, and it 795 00:46:04,600 --> 00:46:06,960 Speaker 2: turns out you're wrong. Okay, that's kind of like triangles 796 00:46:07,000 --> 00:46:09,200 Speaker 2: all of a sudden. So it is with gun control. 797 00:46:09,239 --> 00:46:11,680 Speaker 2: I mean, people just don't know a lot about guns. 798 00:46:11,680 --> 00:46:13,640 Speaker 2: They don't know a lot about the actual statistics of 799 00:46:13,840 --> 00:46:15,640 Speaker 2: human violence, and they don't know how many people get 800 00:46:15,640 --> 00:46:17,719 Speaker 2: shot and how they get shot and why, and they 801 00:46:17,880 --> 00:46:23,080 Speaker 2: just tend to have, I mean, it's 802 00:46:23,120 --> 00:46:25,640 Speaker 2: just a thicket of ignorance and 803 00:46:25,680 --> 00:46:27,920 Speaker 2: assumption on lots of these topics. 804 00:46:28,040 --> 00:46:29,359 Speaker 1: And I'm not 805 00:46:29,280 --> 00:46:31,560 Speaker 1: sure I agree with this, in the following sense: just 806 00:46:31,600 --> 00:46:36,000 Speaker 1: because I've known very closely family members and friends 807 00:46:36,080 --> 00:46:39,480 Speaker 1: on very different sides of this issue who are extraordinarily 808 00:46:39,520 --> 00:46:45,680 Speaker 1: knowledgeable and bring up an infinite encyclopedia of statistics about 809 00:46:45,719 --> 00:46:49,560 Speaker 1: these issues. But they're on opposite sides, right? So ignorance 810 00:46:49,960 --> 00:46:52,200 Speaker 1: isn't, I think, a way that we can explain away 811 00:46:52,880 --> 00:46:54,440 Speaker 1: some people's positions on this. 812 00:46:55,840 --> 00:46:59,919 Speaker 2: Well, I'll grant you that it is possible for two 813 00:47:00,080 --> 00:47:03,279 Speaker 2: people to have more or less the same set of 814 00:47:03,360 --> 00:47:06,400 Speaker 2: facts and to feel very differently about them. And then 815 00:47:06,480 --> 00:47:09,040 Speaker 2: the question is, what explains that? 816 00:47:09,239 --> 00:47:12,279 Speaker 2: Right? So what else do they believe?
817 00:47:12,320 --> 00:47:14,960 Speaker 1: What about their internal model of the world and their 818 00:47:15,000 --> 00:47:18,840 Speaker 1: expectations about the future, and the government of the future, 819 00:47:18,920 --> 00:47:21,279 Speaker 1: and the crime of the future, and so on, that 820 00:47:21,400 --> 00:47:24,200 Speaker 1: makes them feel that, hey, I don't need a gun 821 00:47:24,239 --> 00:47:25,400 Speaker 1: in the house, or I do need a gun in the 822 00:47:25,400 --> 00:47:26,080 Speaker 1: house, whatever it is. 823 00:47:26,160 --> 00:47:29,360 Speaker 2: Yeah. So in that case, I would say they're weighting 824 00:47:29,640 --> 00:47:34,080 Speaker 2: different probabilities. I mean, it's almost never the case, 825 00:47:34,360 --> 00:47:37,239 Speaker 2: especially for that debate, it's almost never the case that 826 00:47:37,239 --> 00:47:40,400 Speaker 2: you're talking about two people who are in touch with 827 00:47:40,440 --> 00:47:42,239 Speaker 2: the same set of facts, because, you know, 828 00:47:42,320 --> 00:47:45,200 Speaker 2: the people who are deep into gun culture, for whom 829 00:47:45,239 --> 00:47:47,680 Speaker 2: it's all about the Second Amendment, and they're happy, they 830 00:47:47,719 --> 00:47:51,080 Speaker 2: love shooting guns, they have guns, they're comfortable around guns, 831 00:47:51,480 --> 00:47:54,879 Speaker 2: they know they can store them safely, say, and they 832 00:47:54,920 --> 00:47:59,160 Speaker 2: have spent ten thousand hours rehearsing scenarios 833 00:47:59,200 --> 00:48:01,160 Speaker 2: where they would have to defend their lives or the 834 00:48:01,200 --> 00:48:04,680 Speaker 2: lives of their loved ones against some malicious intruder, and 835 00:48:04,719 --> 00:48:08,120 Speaker 2: that's become, you know, a deep 836 00:48:08,120 --> 00:48:12,319 Speaker 2: part of their identity: even that person is dealing with 837 00:48:12,360 --> 00:48:14,359 Speaker 2: a very different set of facts than the person who 838 00:48:15,040 --> 00:48:18,080 Speaker 2: has no experience with guns, doesn't want any experience with guns, 839 00:48:18,080 --> 00:48:21,160 Speaker 2: hates guns, thinks they're dangerous, thinks that it's just appalling 840 00:48:21,200 --> 00:48:22,799 Speaker 2: that we live in a society that has so many 841 00:48:22,840 --> 00:48:26,080 Speaker 2: guns and we've got this absurd level of gun violence, 842 00:48:26,440 --> 00:48:29,680 Speaker 2: unrivaled, that makes us an outlier of an 843 00:48:29,680 --> 00:48:32,440 Speaker 2: outlier in, you know, the developed world, that 844 00:48:32,520 --> 00:48:34,480 Speaker 2: we're just, you know, 845 00:48:34,480 --> 00:48:35,480 Speaker 2: a horror show. 846 00:48:36,120 --> 00:48:36,880 Speaker 3: How did we get here? 847 00:48:36,920 --> 00:48:39,719 Speaker 2: It's like just pure masochism that we're living 848 00:48:39,760 --> 00:48:42,200 Speaker 2: with this, you know, the religion of the Second Amendment.
Right. 849 00:48:42,239 --> 00:48:45,440 Speaker 2: So those two people, no matter how well informed 850 00:48:45,480 --> 00:48:49,160 Speaker 2: each representative of those two camps thinks they are, I 851 00:48:49,200 --> 00:48:52,200 Speaker 2: guarantee you that in almost every case, they have very 852 00:48:52,200 --> 00:48:54,560 Speaker 2: different beliefs about what is actually true in the world 853 00:48:54,600 --> 00:48:55,400 Speaker 2: with respect 854 00:48:55,120 --> 00:48:56,480 Speaker 2: to the rates of violence. 855 00:48:56,520 --> 00:48:58,399 Speaker 1: They might both be holding on to facts, though; they're 856 00:48:58,440 --> 00:49:02,520 Speaker 1: paying attention to different facts. So the gun owner is 857 00:49:02,560 --> 00:49:05,560 Speaker 1: paying attention to things like tyrannies of governments and what 858 00:49:05,600 --> 00:49:08,239 Speaker 1: happens when governments take guns away from the citizens and 859 00:49:08,239 --> 00:49:10,279 Speaker 1: so on. Those are the kinds of facts they draw on. 860 00:49:10,640 --> 00:49:14,240 Speaker 1: The gun non-owner is drawing on facts about crime 861 00:49:15,080 --> 00:49:18,000 Speaker 1: and how many accidents happen in households with guns and 862 00:49:18,040 --> 00:49:18,359 Speaker 1: so on. 863 00:49:18,680 --> 00:49:21,600 Speaker 2: But they might have those facts wrong. So, like, the 864 00:49:21,640 --> 00:49:24,400 Speaker 2: non-gun owner will believe that it's almost never the 865 00:49:24,400 --> 00:49:27,000 Speaker 2: case that someone successfully defends himself with a gun or 866 00:49:27,239 --> 00:49:29,160 Speaker 2: defends his family with a gun; that just doesn't happen, 867 00:49:29,200 --> 00:49:33,400 Speaker 2: that's just NRA propaganda; what really happens is kids accidentally 868 00:49:33,440 --> 00:49:35,560 Speaker 2: get hold of their parents' guns 869 00:49:35,600 --> 00:49:38,319 Speaker 2: and shoot themselves or shoot someone else, and it's a tragedy, right? 870 00:49:39,560 --> 00:49:41,200 Speaker 2: And the statistics 871 00:49:40,560 --> 00:49:44,360 Speaker 2: around gun violence are highly politicized, and it's very hard, 872 00:49:44,560 --> 00:49:47,040 Speaker 2: in fact, to go in with an open mind and 873 00:49:47,080 --> 00:49:50,000 Speaker 2: figure out what is actually just real, you know, empirically, 874 00:49:51,080 --> 00:49:54,840 Speaker 2: because it's been so polarized. But even if we create 875 00:49:55,200 --> 00:49:58,439 Speaker 2: the perfect case where they really do have the same 876 00:49:58,480 --> 00:50:02,359 Speaker 2: sets of facts with respect to what is being 877 00:50:02,400 --> 00:50:08,320 Speaker 2: talked about, there are other facts and other propositional attitudes, 878 00:50:08,440 --> 00:50:12,120 Speaker 2: beliefs, that are covertly working to change the 879 00:50:12,200 --> 00:50:14,239 Speaker 2: weightings that they're giving to all of these facts. 880 00:50:14,280 --> 00:50:17,400 Speaker 2: So, like, there are people for whom 881 00:50:18,160 --> 00:50:23,680 Speaker 2: the prospect of finding themselves helpless to defend themselves 882 00:50:23,800 --> 00:50:27,760 Speaker 2: or their loved ones from evil.
That is 883 00:50:27,840 --> 00:50:30,560 Speaker 2: an outcome that is so bad, I mean, 884 00:50:30,600 --> 00:50:33,680 Speaker 2: in terms of the utility function, that it is so 885 00:50:33,800 --> 00:50:35,480 Speaker 2: much worse than all the other bad things that could 886 00:50:35,480 --> 00:50:39,880 Speaker 2: happen in life, that whatever low probability you put on it, 887 00:50:40,320 --> 00:50:42,040 Speaker 2: like, let's say we could agree about the 888 00:50:42,040 --> 00:50:44,560 Speaker 2: probability of a home invasion, you know, where the 889 00:50:44,560 --> 00:50:46,719 Speaker 2: person has come to murder you and your family, right, 890 00:50:46,760 --> 00:50:51,440 Speaker 2: whatever that probability is, if I weight that as worse 891 00:50:51,640 --> 00:50:56,320 Speaker 2: than everyone dying in a fire, and a 892 00:50:56,480 --> 00:50:58,880 Speaker 2: hundred other things that you think are probably worse 893 00:50:58,920 --> 00:51:01,839 Speaker 2: than a home invasion, then we 894 00:51:01,960 --> 00:51:05,160 Speaker 2: just have different utility functions. Yeah, right. And 895 00:51:05,280 --> 00:51:07,800 Speaker 2: that is often at play. I mean, that's where emotion 896 00:51:08,520 --> 00:51:11,480 Speaker 2: comes into it, where you see people get really, you know... 897 00:51:11,520 --> 00:51:16,120 Speaker 2: We seem like we're just having a conversation about facts, 898 00:51:16,200 --> 00:51:21,200 Speaker 2: but you see people, even granting the same sets of facts, 899 00:51:21,840 --> 00:51:28,480 Speaker 2: become highly polarized and energized around the significance of those facts. 900 00:51:28,800 --> 00:51:32,080 Speaker 1: Yes, yeah, I couldn't agree more. And this is why 901 00:51:32,120 --> 00:51:34,560 Speaker 1: I feel like I'm still trying to figure out how 902 00:51:34,600 --> 00:51:39,880 Speaker 1: to think about beliefs, because so much of our positioning 903 00:51:39,960 --> 00:51:43,120 Speaker 1: on everything has to do with these other issues. And 904 00:51:43,520 --> 00:51:47,080 Speaker 1: I mean, I watched this all through the last 905 00:51:47,080 --> 00:51:50,960 Speaker 1: decade, where people will end up switching political allegiances not 906 00:51:51,200 --> 00:51:53,799 Speaker 1: because they like the facts of the other 907 00:51:54,120 --> 00:51:58,680 Speaker 1: side; it's because something disgusts them about their own side, 908 00:51:59,000 --> 00:52:01,440 Speaker 1: something gets said to them, or someone accuses them of 909 00:52:01,480 --> 00:52:03,960 Speaker 1: something and they get so mad about it, and that's 910 00:52:04,000 --> 00:52:04,959 Speaker 1: what drives them. 911 00:52:05,920 --> 00:52:06,120 Speaker 4: Well. 912 00:52:06,160 --> 00:52:10,400 Speaker 2: But politics, I mean, one thing that's so frustrating about 913 00:52:10,440 --> 00:52:14,440 Speaker 2: politics is that it should be the space in which 914 00:52:14,480 --> 00:52:22,960 Speaker 2: we rationally operate so as to cooperatively produce outcomes 915 00:52:23,000 --> 00:52:24,880 Speaker 2: that, you know, improve our lives. 916 00:52:24,960 --> 00:52:27,279 Speaker 2: Right. So it's like, politics really is an art of cooperation.
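Harris's point about utility functions can be made concrete with a toy expected-utility comparison. In the sketch below, both people accept identical probabilities, the gun is crudely assumed to neutralize the invasion risk entirely, and every number is invented for illustration:

\[ \mathrm{EU} = \sum_i p_i \, U_i, \qquad p(\text{invasion}) = 0.001, \qquad p(\text{gun accident}) = 0.002 \]

\[ \text{Person A } (U_{\text{invasion}} = -5000,\ U_{\text{accident}} = -100): \quad \mathrm{EU}(\text{gun}) = -0.2 \ > \ \mathrm{EU}(\text{no gun}) = -5 \]

\[ \text{Person B } (U_{\text{invasion}} = -100,\ U_{\text{accident}} = -100): \quad \mathrm{EU}(\text{gun}) = -0.2 \ < \ \mathrm{EU}(\text{no gun}) = -0.1 \]

Same probabilities, opposite choices: A arms the house and B doesn't, with no disagreement about the facts at all, only about how bad each outcome would be.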
917 00:52:28,400 --> 00:52:32,480 Speaker 2: And yet, to do that well, 918 00:52:33,520 --> 00:52:36,560 Speaker 2: we really have to, I mean, you know, reason has 919 00:52:36,600 --> 00:52:41,160 Speaker 2: to be the master value, and what's true, what's plausible. 920 00:52:41,920 --> 00:52:42,320 Speaker 3: Uh. 921 00:52:42,400 --> 00:52:45,200 Speaker 2: And then the ethics. You know, you can't keep ethics 922 00:52:45,200 --> 00:52:46,520 Speaker 2: out of it. I mean, there are questions of what's 923 00:52:46,520 --> 00:52:50,120 Speaker 2: fair, et cetera, what is good, 924 00:52:50,640 --> 00:52:54,520 Speaker 2: what should we value. But so much of it is 925 00:52:55,200 --> 00:52:59,120 Speaker 2: pure tribalism, right? It's pure in-group signaling and 926 00:52:59,280 --> 00:53:03,400 Speaker 2: out-group disparagement. And when that is 927 00:53:03,440 --> 00:53:08,520 Speaker 2: becoming the master value, it's just not 928 00:53:08,719 --> 00:53:12,560 Speaker 2: about honestly talking about the world at all. Right? 929 00:53:12,680 --> 00:53:14,600 Speaker 2: Like, I mean, insofar as you're 930 00:53:14,640 --> 00:53:17,560 Speaker 2: trading in claims that, you know, purport to be 931 00:53:17,640 --> 00:53:22,280 Speaker 2: fact-based, what you're really doing is still 932 00:53:22,320 --> 00:53:24,160 Speaker 2: a team sport, and you're trying to figure out how 933 00:53:24,160 --> 00:53:26,000 Speaker 2: to win, right? So if you're 934 00:53:26,000 --> 00:53:31,520 Speaker 2: saying negative things about the opposition, you begin to not 935 00:53:31,680 --> 00:53:35,359 Speaker 2: care whether they're even true. Let's just see what can stick, right? 936 00:53:35,640 --> 00:53:38,480 Speaker 2: This guy's a, you know, this guy's a child molester. Well, 937 00:53:38,600 --> 00:53:41,000 Speaker 2: what's your evidence for that? I, you know, I don't know, 938 00:53:41,040 --> 00:53:44,520 Speaker 2: I saw something somewhere on social media. But, you know, 939 00:53:44,600 --> 00:53:46,879 Speaker 2: if I keep repeating it, you know, enough people 940 00:53:46,880 --> 00:53:48,400 Speaker 2: are going to think he's a child molester that it 941 00:53:48,440 --> 00:53:50,759 Speaker 2: could harm his chances come election time. 942 00:53:50,840 --> 00:53:50,960 Speaker 3: Right. 943 00:53:51,000 --> 00:53:53,520 Speaker 2: So it's like, that's the game, and it's awful. Right? 944 00:53:53,560 --> 00:53:59,000 Speaker 2: It's just so degrading in principle, because it's leveraging the 945 00:53:59,040 --> 00:54:04,000 Speaker 2: most antisocial emotions. Mostly it's leveraging fear and outrage, 946 00:54:04,000 --> 00:54:07,880 Speaker 2: I mean, again, irrational fear and outrage, and false 947 00:54:07,920 --> 00:54:10,560 Speaker 2: certainty, or the pretense of certainty where you actually have 948 00:54:10,600 --> 00:54:12,200 Speaker 2: no rational basis for certainty. 949 00:54:12,400 --> 00:54:14,560 Speaker 1: Yeah. And in fact, this reminds me, I want to 950 00:54:14,560 --> 00:54:17,520 Speaker 1: come back to something at the very beginning, about a 951 00:54:18,040 --> 00:54:22,040 Speaker 1: belief that one might think is false. You found that 952 00:54:22,040 --> 00:54:25,680 Speaker 1: that activates the anterior insula, presumably a sense of disgust 953 00:54:25,719 --> 00:54:30,279 Speaker 1: about that.
I wonder the degree to which this might 954 00:54:30,360 --> 00:54:34,840 Speaker 1: differ based on different false information. For example, 955 00:54:34,880 --> 00:54:41,279 Speaker 1: you see something online, clearly misinformation, disinformation, and one might 956 00:54:41,360 --> 00:54:46,000 Speaker 1: think, this is really a concern for my side of 957 00:54:46,040 --> 00:54:49,040 Speaker 1: the political argument, or for my, you know, ethnic group, 958 00:54:49,160 --> 00:54:53,920 Speaker 1: or for my anything. With this kind of misinformation, I'm simulating 959 00:54:53,920 --> 00:54:57,239 Speaker 1: the future: what if everyone believed this piece of misinformation? 960 00:54:57,600 --> 00:55:02,840 Speaker 1: That's really disastrous. And so I wonder if for some sorts 961 00:55:02,880 --> 00:55:07,359 Speaker 1: of misinformation, let's say false beliefs, our reaction to them 962 00:55:07,440 --> 00:55:10,360 Speaker 1: isn't simply disgust, isn't simply, ah, okay, I've categorized that 963 00:55:10,400 --> 00:55:13,120 Speaker 1: as false, but instead it's, hey, I've categorized this as 964 00:55:13,120 --> 00:55:15,160 Speaker 1: something that's really a concern for me. 965 00:55:15,960 --> 00:55:16,200 Speaker 3: Yeah. 966 00:55:16,239 --> 00:55:19,879 Speaker 2: Well, it's very easy to see. Again, this comes back 967 00:55:19,920 --> 00:55:23,560 Speaker 2: to the power of ideas, when you're thinking of just 968 00:55:23,960 --> 00:55:28,359 Speaker 2: what the effect could be of a 969 00:55:28,400 --> 00:55:32,360 Speaker 2: single sentence. You know, in any language, the sky really 970 00:55:32,440 --> 00:55:34,160 Speaker 2: is the limit. If the front page of the New 971 00:55:34,239 --> 00:55:38,680 Speaker 2: York Times reads that nuclear war has started, in thirty minutes 972 00:55:38,680 --> 00:55:41,480 Speaker 2: it's going to get to you. Right? That's everything, right? 973 00:55:41,520 --> 00:55:45,400 Speaker 2: This is, like, I mean, we're just decoding words, in 974 00:55:45,440 --> 00:55:48,960 Speaker 2: this case on a website, and yet, unless we can 975 00:55:49,000 --> 00:55:52,160 Speaker 2: figure out some reason not to grant them credence, right, 976 00:55:52,160 --> 00:55:53,720 Speaker 2: it's like, okay, wait a minute, has this been hacked? 977 00:55:54,360 --> 00:55:56,920 Speaker 2: Is it, you know, is it April Fool's Day? Is there 978 00:55:56,920 --> 00:55:58,960 Speaker 2: something wrong with my computer? Is this really the 979 00:55:58,960 --> 00:56:01,439 Speaker 2: New York Times? Is that really the website? Now, with AI, 980 00:56:01,600 --> 00:56:03,840 Speaker 2: we're on the cusp of having to declare, you know, 981 00:56:03,960 --> 00:56:07,360 Speaker 2: epistemological bankruptcy with respect to the internet, because any video 982 00:56:07,560 --> 00:56:11,000 Speaker 2: can now, you know, virtually be faked. Right? So even 983 00:56:11,040 --> 00:56:13,040 Speaker 2: now, if we saw, you know, a video of Vladimir 984 00:56:13,080 --> 00:56:15,880 Speaker 2: Putin saying, I've just launched, you know, our full arsenal at 985 00:56:15,960 --> 00:56:18,480 Speaker 2: the evil empire of the United States, it'll be there 986 00:56:18,480 --> 00:56:20,960 Speaker 2: in twenty-four minutes.
If we saw that and it 987 00:56:21,000 --> 00:56:24,680 Speaker 2: was translated, perfect Russian, and you saw, you know, Garry 988 00:56:24,760 --> 00:56:27,319 Speaker 2: Kasparov saying, yeah, that's really Putin, and that's really 989 00:56:27,360 --> 00:56:29,680 Speaker 2: Russian, and that's really what he said, and it's now 990 00:56:29,680 --> 00:56:31,640 Speaker 2: on the New York Times website. Now you're thinking, wait 991 00:56:31,680 --> 00:56:34,640 Speaker 2: a minute, does the New York Times have sufficient 992 00:56:34,840 --> 00:56:38,040 Speaker 2: technology to detect deepfakes? Do I really have to 993 00:56:38,080 --> 00:56:43,880 Speaker 2: believe this? But once you can't grab a handhold to 994 00:56:43,920 --> 00:56:51,520 Speaker 2: resist this slide into just helplessly imbibing the propositional import 995 00:56:51,600 --> 00:56:56,680 Speaker 2: of those words, right, World War Three, you're emotionally and 996 00:56:56,719 --> 00:57:02,160 Speaker 2: behaviorally completely exposed to the implication of that claim. 997 00:57:02,280 --> 00:57:02,960 Speaker 3: Oh, that's interesting. 998 00:57:03,040 --> 00:57:05,360 Speaker 1: You're exposed to it, or you've put up walls against 999 00:57:05,440 --> 00:57:06,040 Speaker 1: it as well. 1000 00:57:06,640 --> 00:57:08,560 Speaker 2: The way to put up walls against it is to 1001 00:57:08,600 --> 00:57:11,400 Speaker 2: find, I mean, there are others; if it's big enough, 1002 00:57:12,200 --> 00:57:14,839 Speaker 2: you know, it becomes irresistible. But 1003 00:57:14,880 --> 00:57:18,360 Speaker 2: there are many ways to put up walls against beliefs 1004 00:57:18,360 --> 00:57:20,240 Speaker 2: we don't like the taste of, right? Like, we can 1005 00:57:20,280 --> 00:57:23,080 Speaker 2: ignore them, we can move on, we can distract ourselves. 1006 00:57:23,480 --> 00:57:25,200 Speaker 2: You know, this is kind of the purview 1007 00:57:25,280 --> 00:57:28,240 Speaker 2: of what we think of as self-deception, right? Like, 1008 00:57:28,280 --> 00:57:29,960 Speaker 2: how is it, how did the guy not 1009 00:57:30,120 --> 00:57:31,680 Speaker 2: know his wife was cheating on him? I mean, there 1010 00:57:31,680 --> 00:57:33,240 Speaker 2: were these signs, and he sort of, I mean, he 1011 00:57:33,320 --> 00:57:36,080 Speaker 2: must have seen that thing, or heard that thing, or 1012 00:57:36,400 --> 00:57:38,800 Speaker 2: she did that and I saw that. So 1013 00:57:38,840 --> 00:57:41,600 Speaker 2: you can say that there's this 1014 00:57:41,680 --> 00:57:45,280 Speaker 2: sort of boundary, even within the precincts 1015 00:57:45,280 --> 00:57:49,960 Speaker 2: of one mind, where we might say there's 1016 00:57:50,000 --> 00:57:54,840 Speaker 2: an avoidance of knowledge when the knowledge is really there. 1017 00:57:54,880 --> 00:57:58,800 Speaker 2: But if the whole world kind 1018 00:57:58,800 --> 00:58:02,960 Speaker 2: of sticks your face in a claim, right, again, it's 1019 00:58:03,000 --> 00:58:05,520 Speaker 2: on the internet, the internet's on fire with this thing 1020 00:58:05,680 --> 00:58:10,200 Speaker 2: being news, and you have to figure out whether it's true, again, 1021 00:58:10,240 --> 00:58:14,400 Speaker 2: it's really just a matter of words or images that 1022 00:58:14,440 --> 00:58:20,560 Speaker 2: purport to represent the world suddenly transforming your life.
Right? 1023 00:58:20,600 --> 00:58:22,040 Speaker 2: And this is, all of a sudden, this is, 1024 00:58:22,040 --> 00:58:24,840 Speaker 2: like, literally, a sentence could be spoken to both of 1025 00:58:24,920 --> 00:58:30,840 Speaker 2: us that was so incontrovertible, given the framing around it, 1026 00:58:30,880 --> 00:58:33,320 Speaker 2: given the way we both know the world is, given 1027 00:58:33,800 --> 00:58:36,919 Speaker 2: how much would have to have been faked in order 1028 00:58:36,960 --> 00:58:39,720 Speaker 2: for it to come that way to us from that source, right, 1029 00:58:40,080 --> 00:58:43,040 Speaker 2: and this goes to the role of authority and gatekeeping, 1030 00:58:43,120 --> 00:58:47,040 Speaker 2: right. So, provided it comes, you know, over 1031 00:58:47,080 --> 00:58:50,360 Speaker 2: the transom at just the right angle, you know, the 1032 00:58:51,200 --> 00:58:55,080 Speaker 2: distance between us in this moment, you're reasonably comfortable and 1033 00:58:56,200 --> 00:58:58,640 Speaker 2: without much of a care in the world, and absolute 1034 00:58:58,720 --> 00:59:02,280 Speaker 2: panic is just, you know, not much longer than 1035 00:59:02,280 --> 00:59:04,240 Speaker 2: it took to decode the sentence in the first place. 1036 00:59:04,280 --> 00:59:07,640 Speaker 2: It's like, holy shit, right? I mean, it's rare that 1037 00:59:07,640 --> 00:59:12,160 Speaker 2: the stakes are that high and the testimony is that, 1038 00:59:13,160 --> 00:59:16,840 Speaker 2: you know, unimpeachable and unequivocal and irresistible. 1039 00:59:16,920 --> 00:59:17,080 Speaker 3: Right. 1040 00:59:17,120 --> 00:59:20,320 Speaker 2: So more often we're trading in 1041 00:59:20,360 --> 00:59:23,520 Speaker 2: the land of, well, it doesn't matter that much whether 1042 00:59:23,520 --> 00:59:25,720 Speaker 2: it's true or false; you know, I'll take it on 1043 00:59:25,920 --> 00:59:28,160 Speaker 2: or not; I can act as if it's true, but 1044 00:59:28,200 --> 00:59:30,040 Speaker 2: if it turns out not to be true... and so 1045 00:59:30,080 --> 00:59:34,000 Speaker 2: we're dealing in probabilities, where, yeah, 1046 00:59:34,040 --> 00:59:36,320 Speaker 2: we're willing to bet something in that direction, but we're 1047 00:59:36,320 --> 00:59:37,840 Speaker 2: not willing to bet everything in that direction. 1048 00:59:38,400 --> 00:59:41,640 Speaker 1: What I mean is, when you see a piece of obvious, 1049 00:59:41,760 --> 00:59:47,920 Speaker 1: blatant misinformation online, someone posts something stupid, and it 1050 00:59:48,040 --> 00:59:52,800 Speaker 1: feels like what your mind does is simulate: wait, if 1051 00:59:52,880 --> 00:59:56,600 Speaker 1: everyone thought this, if this spreads, if this meme goes viral, 1052 00:59:57,560 --> 01:00:01,480 Speaker 1: that's going to have consequences. Yeah. That's what I'm pointing to. 1053 01:00:01,600 --> 01:00:04,800 Speaker 1: It's not necessarily like the nuclear war thing. But here's 1054 01:00:04,840 --> 01:00:08,400 Speaker 1: an example.
Right after the Bondi Beach massacre of Jews 1055 01:00:08,400 --> 01:00:12,480 Speaker 1: celebrating Hanukkah, somebody posted something saying, oh, it turns out 1056 01:00:12,520 --> 01:00:15,640 Speaker 1: one of the shooters is Jewish, and they showed a 1057 01:00:15,640 --> 01:00:20,560 Speaker 1: Facebook page of the guy wearing a yarmulke, with all 1058 01:00:20,640 --> 01:00:23,320 Speaker 1: his friends on there. It was such 1059 01:00:23,320 --> 01:00:25,280 Speaker 1: an obvious AI fake: when you zoom in on the 1060 01:00:25,280 --> 01:00:27,680 Speaker 1: buttons and so on, the buttons didn't make sense, and 1061 01:00:27,680 --> 01:00:31,200 Speaker 1: so on. But I think the feeling that one might 1062 01:00:31,320 --> 01:00:37,040 Speaker 1: have viewing that is, hey, that's a dangerous piece of misinformation. 1063 01:00:37,800 --> 01:00:41,200 Speaker 1: Why? Because if it spread, if lots of people saw it, 1064 01:00:41,480 --> 01:00:44,480 Speaker 1: blah blah blah. So what I mean is, it's not 1065 01:00:44,920 --> 01:00:47,920 Speaker 1: just the evaluation of, oh, is this true or false; 1066 01:00:48,600 --> 01:00:51,600 Speaker 1: it's the, I'm just wondering if the disgust element in 1067 01:00:51,640 --> 01:00:55,480 Speaker 1: real life might be the, what are the future consequences 1068 01:00:55,480 --> 01:00:56,520 Speaker 1: of that piece of misinformation? 1069 01:00:56,520 --> 01:00:59,160 Speaker 2: Oh, yes, well, yeah. I think that can 1070 01:00:59,240 --> 01:01:03,800 Speaker 2: make the feeling of disgust not only a neurophysiological analogy; 1071 01:01:04,040 --> 01:01:06,080 Speaker 2: it really can make it quite salient. I mean, if 1072 01:01:06,120 --> 01:01:11,040 Speaker 2: someone says something that you know to be false, or 1073 01:01:11,120 --> 01:01:13,200 Speaker 2: you have every reason to believe is false, and, 1074 01:01:13,240 --> 01:01:17,360 Speaker 2: if believed, it 1075 01:01:17,400 --> 01:01:20,720 Speaker 2: will open the door to just manifest harms in the world, 1076 01:01:21,240 --> 01:01:25,680 Speaker 2: and it's an ugly, divisive idea that 1077 01:01:25,720 --> 01:01:29,880 Speaker 2: you believe was maliciously concocted so as to produce those harms, right, 1078 01:01:29,920 --> 01:01:33,040 Speaker 2: so there's all this 1079 01:01:33,200 --> 01:01:36,320 Speaker 2: negativity stacked on top of it. Yeah, I mean, I, 1080 01:01:36,360 --> 01:01:38,640 Speaker 2: you know, I feel disgust all the time, and consciously 1081 01:01:38,680 --> 01:01:41,800 Speaker 2: feel disgust all the time, at what people pretend to believe, 1082 01:01:42,040 --> 01:01:44,080 Speaker 2: you know, or what some people actually wind up being 1083 01:01:44,120 --> 01:01:46,520 Speaker 2: taken in by and actually believe. And the consequences in 1084 01:01:46,520 --> 01:01:48,680 Speaker 2: the world are obvious.
I mean, in fact, 1085 01:01:48,680 --> 01:01:51,320 Speaker 2: once you put this kind of lens on 1086 01:01:52,280 --> 01:01:57,800 Speaker 2: to your view of more or less everything people do, 1087 01:01:57,880 --> 01:02:01,800 Speaker 2: you just see that all of the world's mayhem 1088 01:02:02,520 --> 01:02:07,000 Speaker 2: is a story of these kinds of failures. I mean, apart 1089 01:02:07,040 --> 01:02:11,440 Speaker 2: from the things that are visited upon us by nature, 1090 01:02:12,200 --> 01:02:14,080 Speaker 2: just, like, bad luck, you know, just, like, you know, 1091 01:02:14,120 --> 01:02:18,280 Speaker 2: an asteroid impact, or a virus, you know, jumping 1092 01:02:18,360 --> 01:02:21,840 Speaker 2: species, and now, you know, we're suffering its consequences, everything 1093 01:02:21,840 --> 01:02:27,160 Speaker 2: else is a self-inflicted wound based on bad ideas, 1094 01:02:27,680 --> 01:02:33,880 Speaker 2: you know, just rank tribalism and conspiracy thinking and lies, 1095 01:02:33,960 --> 01:02:36,800 Speaker 2: I mean conscious lies, meant to divide people and to 1096 01:02:37,280 --> 01:02:41,960 Speaker 2: motivate them to produce harm. And it's all a 1097 01:02:42,000 --> 01:02:46,480 Speaker 2: matter of this kind of belief-making and belief-grasping 1098 01:02:46,720 --> 01:02:52,000 Speaker 2: machinery of our minds that is just ceaselessly doing its work. 1099 01:02:52,200 --> 01:02:54,720 Speaker 1: It makes me wonder why changing our minds is so 1100 01:02:55,280 --> 01:02:59,920 Speaker 1: difficult. When it comes to a scientific proposition, it seems 1101 01:02:59,920 --> 01:03:03,200 Speaker 1: to be easier, maybe because often those are like chess 1102 01:03:03,240 --> 01:03:06,320 Speaker 1: moves, in the sense that they're cognitive; there's not a 1103 01:03:06,360 --> 01:03:08,680 Speaker 1: lot of emotional weight to them. But, like the example 1104 01:03:08,720 --> 01:03:12,280 Speaker 1: you gave of the older professor who's emotionally so weighted 1105 01:03:12,320 --> 01:03:15,560 Speaker 1: down by what he's come up with, sometimes it's very 1106 01:03:15,560 --> 01:03:18,680 Speaker 1: hard to change our minds about things. What's your take 1107 01:03:18,760 --> 01:03:22,960 Speaker 1: about the challenges, and any hopes, about people changing their 1108 01:03:23,000 --> 01:03:23,760 Speaker 1: minds on things? 1109 01:03:23,960 --> 01:03:26,440 Speaker 2: Well, I think one needs to internalize, I 1110 01:03:26,520 --> 01:03:28,320 Speaker 2: mean, we do this in science, but I think this 1111 01:03:28,600 --> 01:03:30,760 Speaker 2: just should be exported to the rest of culture, I 1112 01:03:30,800 --> 01:03:37,280 Speaker 2: think we all have an ethical obligation to internalize a 1113 01:03:37,400 --> 01:03:42,560 Speaker 2: value that is deeper than any place where our kind 1114 01:03:42,600 --> 01:03:45,080 Speaker 2: of wishful thinking, or our preferences 1115 01:03:45,080 --> 01:03:47,000 Speaker 2: for the way the world is, could be anchored, right? 1116 01:03:47,000 --> 01:03:50,560 Speaker 2: And the deeper value is: we actually want to know 1117 01:03:50,600 --> 01:03:53,160 Speaker 2: what's real. Right? We would like, we have a 1118 01:03:53,200 --> 01:03:58,800 Speaker 2: reality bias.
And it's not to say that you 1119 01:03:58,840 --> 01:04:02,400 Speaker 2: can't find these sorts of local instances where a person's 1120 01:04:02,400 --> 01:04:04,920 Speaker 2: interest might be better served by not really being in 1121 01:04:04,960 --> 01:04:08,040 Speaker 2: touch with reality, like some scope for delusion, you know, 1122 01:04:08,080 --> 01:04:11,000 Speaker 2: self-serving delusion. We can certainly concoct 1123 01:04:11,000 --> 01:04:14,120 Speaker 2: those examples, where, like, if the person 1124 01:04:14,160 --> 01:04:16,480 Speaker 2: knew what the people in the room really 1125 01:04:16,480 --> 01:04:18,480 Speaker 2: thought of him, well, then he wouldn't perform nearly as 1126 01:04:18,480 --> 01:04:20,240 Speaker 2: well as he did when he thought he was popular, 1127 01:04:20,320 --> 01:04:22,600 Speaker 2: or, you know, thought he was handsome, or whatever it was. 1128 01:04:22,680 --> 01:04:27,480 Speaker 2: But generally speaking, we shouldn't want to be deceived, or 1129 01:04:27,520 --> 01:04:34,200 Speaker 2: self-deceived, or psychotic, or trading in these 1130 01:04:34,240 --> 01:04:39,240 Speaker 2: hallucinations, and watching our culture get bent every 1131 01:04:39,280 --> 01:04:43,560 Speaker 2: which way by these kinds of mass adoptions of delusion, 1132 01:04:43,680 --> 01:04:47,640 Speaker 2: you know, the moral panics and paranoia and lies and 1133 01:04:47,680 --> 01:04:49,280 Speaker 2: half-truths that become operative. 1134 01:04:49,560 --> 01:04:49,720 Speaker 3: You know. 1135 01:04:49,880 --> 01:04:53,280 Speaker 2: Again, politics is so much the story of this, but, 1136 01:04:53,360 --> 01:04:57,120 Speaker 2: you know, I would say it embraces, you know, 1137 01:04:57,120 --> 01:04:59,080 Speaker 2: almost everything we do. It's just, I mean, it's not 1138 01:04:59,160 --> 01:05:01,280 Speaker 2: that understanding the world is everything; it's 1139 01:05:01,320 --> 01:05:05,520 Speaker 2: not that being rational is everything. But it is 1140 01:05:05,880 --> 01:05:10,120 Speaker 2: the thing that can safeguard everything else that 1141 01:05:10,160 --> 01:05:13,080 Speaker 2: we care about. I mean, I view reason as, 1142 01:05:14,120 --> 01:05:17,040 Speaker 2: among other things, the guardian of love, right? Like, 1143 01:05:17,040 --> 01:05:19,360 Speaker 2: there are other things you can love, 1144 01:05:19,520 --> 01:05:22,080 Speaker 2: other experiences you can have, that are not a matter 1145 01:05:22,160 --> 01:05:27,600 Speaker 2: of just rigorously understanding what is so, right? But rigorously 1146 01:05:27,680 --> 01:05:30,400 Speaker 2: understanding what is so is the thing that 1147 01:05:30,400 --> 01:05:34,040 Speaker 2: stands at the gate to protect all 1148 01:05:34,080 --> 01:05:36,200 Speaker 2: those other moments, where you just want to, you know, 1149 01:05:37,000 --> 01:05:39,640 Speaker 2: hug your lover or, you know, your best friend, 1150 01:05:39,720 --> 01:05:43,840 Speaker 2: or play frisbee, or appreciate art, or just watch a movie.
Well, 1151 01:05:44,960 --> 01:05:47,600 Speaker 2: when it comes time to figure out what's actually happening 1152 01:05:47,600 --> 01:05:50,800 Speaker 2: in the world, when there's a pandemic raging, or when, 1153 01:05:51,520 --> 01:05:54,040 Speaker 2: you know, people are flying airplanes into our buildings, or 1154 01:05:54,360 --> 01:05:58,560 Speaker 2: whatever it is, we should have a 1155 01:05:58,720 --> 01:06:01,600 Speaker 2: very deep bias for the truth. 1156 01:06:02,080 --> 01:06:04,160 Speaker 1: How would you teach that to high schoolers? What would 1157 01:06:04,160 --> 01:06:04,880 Speaker 1: you tell them? 1158 01:06:05,200 --> 01:06:10,360 Speaker 2: Well, one, you want to remove the stigma around honestly 1159 01:06:10,400 --> 01:06:12,880 Speaker 2: confessing that you don't know what's true, right? So saying 1160 01:06:12,920 --> 01:06:16,880 Speaker 2: I don't know should have absolutely no discomfort 1161 01:06:17,040 --> 01:06:20,360 Speaker 2: attached to it. In fact, failing to say that you 1162 01:06:20,400 --> 01:06:23,240 Speaker 2: don't know when you clearly don't know, that's 1163 01:06:23,280 --> 01:06:25,400 Speaker 2: where the stigma should be, right? So to pretend to 1164 01:06:25,440 --> 01:06:28,400 Speaker 2: know something you don't know, that 1165 01:06:28,400 --> 01:06:33,120 Speaker 2: should be, you know, scored as kind of unseemly, 1166 01:06:33,240 --> 01:06:35,400 Speaker 2: and people should get the social feedback around that. 1167 01:06:35,480 --> 01:06:38,760 Speaker 2: So whenever you were caught bullshitting, right, that's the thing 1168 01:06:38,840 --> 01:06:41,320 Speaker 2: that should be embarrassing, not when the teacher asked you 1169 01:06:41,880 --> 01:06:44,840 Speaker 2: the question and you just didn't know. And 1170 01:06:45,160 --> 01:06:48,880 Speaker 2: people should never be afraid to ask what seems like 1171 01:06:48,920 --> 01:06:51,520 Speaker 2: a stupid question, even in a scientific context. I mean, 1172 01:06:51,960 --> 01:06:54,360 Speaker 2: you know, there are great accounts of, you know, the 1173 01:06:54,360 --> 01:06:57,640 Speaker 2: behavior of some of the most eminent scientists you can 1174 01:06:57,680 --> 01:07:01,840 Speaker 2: think of, where one of the things that made their 1175 01:07:02,920 --> 01:07:10,360 Speaker 2: company so edifying, and actually kind of discombobulating for, 1176 01:07:10,680 --> 01:07:13,880 Speaker 2: you know, lesser scientists, is that they would ask a 1177 01:07:13,920 --> 01:07:17,120 Speaker 2: lot of dumb questions, you know, apparently dumb questions, which, 1178 01:07:17,120 --> 01:07:20,000 Speaker 2: if a novice were to ask that question, 1179 01:07:20,040 --> 01:07:23,360 Speaker 2: there might actually be some kind of 1180 01:07:23,400 --> 01:07:26,360 Speaker 2: social embarrassment around that: that's kind of a dumb, that's 1181 01:07:26,360 --> 01:07:28,640 Speaker 2: really an elementary question. But when, you know, Richard Feynman 1182 01:07:28,760 --> 01:07:31,800 Speaker 2: is asking you that dumb question, you know, you know 1183 01:07:31,880 --> 01:07:34,200 Speaker 2: you're in the company of Richard Feynman, right, you know 1184 01:07:34,360 --> 01:07:36,760 Speaker 2: he knows physics at least as well as you know physics, 1185 01:07:37,360 --> 01:07:42,040 Speaker 2: and he's asking that question.
He's actually now putting pressure 1186 01:07:42,160 --> 01:07:46,360 Speaker 2: on what is kind of a load-bearing wall, and 1187 01:07:46,400 --> 01:07:49,040 Speaker 2: may in fact be a load-bearing fiction of the 1188 01:07:49,080 --> 01:07:53,360 Speaker 2: whole enterprise, right? And so you should be not at 1189 01:07:53,360 --> 01:07:58,360 Speaker 2: all afraid to ask dumb questions. And there should be 1190 01:07:58,360 --> 01:08:01,480 Speaker 2: this kind of master value where you want to calibrate 1191 01:08:01,720 --> 01:08:06,960 Speaker 2: your conviction rather finely with the weight of the evidence 1192 01:08:07,040 --> 01:08:09,320 Speaker 2: and the solidity of the argument. 1193 01:08:09,200 --> 01:08:11,600 Speaker 2: Like, you don't want to 1194 01:08:11,600 --> 01:08:15,800 Speaker 2: be at eleven with your certainty when there's actually nothing 1195 01:08:15,840 --> 01:08:20,360 Speaker 2: to stand on [a sketch of this calibration idea appears below]. And you also should not want to 1196 01:08:20,360 --> 01:08:22,519 Speaker 2: be wrong for a moment longer than you 1197 01:08:22,479 --> 01:08:23,280 Speaker 3: have to be, right? 1198 01:08:23,320 --> 01:08:24,960 Speaker 2: So this is 1199 01:08:25,479 --> 01:08:27,240 Speaker 2: the spirit in which I would have any 1200 01:08:28,280 --> 01:08:30,040 Speaker 2: debate on any topic. I mean, 1201 01:08:30,960 --> 01:08:34,040 Speaker 2: I tend to know what I think, certainly 1202 01:08:34,040 --> 01:08:36,360 Speaker 2: on a topic where I'm in any kind of 1203 01:08:36,360 --> 01:08:40,520 Speaker 2: debate mode with somebody. But if someone makes a point 1204 01:08:40,680 --> 01:08:44,760 Speaker 2: where I see they're right and I was wrong, 1205 01:08:45,240 --> 01:08:48,000 Speaker 2: my master value in that 1206 01:08:48,080 --> 01:08:52,080 Speaker 2: moment is not to defend myself, not to 1207 01:08:52,080 --> 01:08:54,960 Speaker 2: put a brave face on my error and 1208 01:08:55,040 --> 01:08:57,080 Speaker 2: try to just kind of beat back the 1209 01:08:57,120 --> 01:09:00,720 Speaker 2: truth that they just articulated. And there are 1210 01:09:00,760 --> 01:09:02,559 Speaker 2: two reasons for this. One is, if the place I'm 1211 01:09:02,600 --> 01:09:07,360 Speaker 2: standing actually has no support, and the person who's criticizing 1212 01:09:07,400 --> 01:09:09,640 Speaker 2: me can see it, and they point it out, the 1213 01:09:09,680 --> 01:09:12,240 Speaker 2: audience is likely to see it. If they can't see 1214 01:09:12,240 --> 01:09:14,479 Speaker 2: it in the first second, they're likely to see it 1215 01:09:14,560 --> 01:09:17,200 Speaker 2: like thirty seconds from now. What I really want to 1216 01:09:17,200 --> 01:09:20,360 Speaker 2: do is no longer be standing there, both as 1217 01:09:20,360 --> 01:09:24,840 Speaker 2: a matter of just intellectual integrity, 1218 01:09:24,880 --> 01:09:28,160 Speaker 2: but also just as a matter of not losing 1219 01:09:28,400 --> 01:09:30,680 Speaker 2: egregiously, insofar as I 1220 01:09:30,720 --> 01:09:33,040 Speaker 2: care about the way I'm being perceived by the audience. 1221 01:09:33,080 --> 01:09:37,080 Speaker 2: So the master move there would be to actually let 1222 01:09:37,080 --> 01:09:40,040 Speaker 2: the person educate you on that point right now.
Hopefully 1223 01:09:40,080 --> 01:09:43,000 Speaker 2: it's not a point that is so central to 1224 01:09:44,200 --> 01:09:46,160 Speaker 2: the topic you were debating such that 1225 01:09:46,240 --> 01:09:48,680 Speaker 2: you now have to just admit you're completely wrong and you've 1226 01:09:48,760 --> 01:09:51,559 Speaker 2: changed your mind. But if in fact that were the case, 1227 01:09:52,280 --> 01:09:54,120 Speaker 2: that's what I would want to do. Again, 1228 01:09:54,160 --> 01:09:56,240 Speaker 2: that's the thing. I go back to the scientist who 1229 01:09:56,600 --> 01:09:59,880 Speaker 2: might have spent his entire career banging away on some 1230 01:10:00,640 --> 01:10:05,400 Speaker 2: doomed theory, the moment he's really proven wrong. 1231 01:10:05,960 --> 01:10:08,880 Speaker 2: I mean, again, these are situations that are 1232 01:10:08,880 --> 01:10:12,200 Speaker 2: few and far between, but it does happen. The pivot 1233 01:10:12,240 --> 01:10:14,960 Speaker 2: you want to be capable of making is just to 1234 01:10:15,040 --> 01:10:19,040 Speaker 2: be grateful to now know what's real, right? 1235 01:10:19,040 --> 01:10:21,799 Speaker 2: We should be hungry 1236 01:10:22,720 --> 01:10:28,439 Speaker 2: to wake up from the dream of misapprehension. This is 1237 01:10:28,479 --> 01:10:31,600 Speaker 2: a spell we have to 1238 01:10:31,680 --> 01:10:34,000 Speaker 2: keep breaking. We come into this world not 1239 01:10:34,160 --> 01:10:38,000 Speaker 2: knowing what is going on, and our entanglement with it, 1240 01:10:38,040 --> 01:10:42,040 Speaker 2: and with others, and with culture, and with increasingly sophisticated 1241 01:10:42,080 --> 01:10:47,640 Speaker 2: ways of knowing, is continually trimming down this 1242 01:10:48,040 --> 01:10:52,479 Speaker 2: misunderstanding and aligning us with something. We 1243 01:10:52,600 --> 01:10:56,280 Speaker 2: never get some final account where we can step outside 1244 01:10:56,280 --> 01:10:59,080 Speaker 2: of our view of reality so as to look back 1245 01:10:59,120 --> 01:11:00,840 Speaker 2: at it and say, oh yeah, 1246 01:11:01,000 --> 01:11:04,120 Speaker 2: this is exactly how the map fits the territory. We're 1247 01:11:04,160 --> 01:11:07,120 Speaker 2: always just kind of living the map. But when we 1248 01:11:07,320 --> 01:11:11,000 Speaker 2: bump into a hard object 1249 01:11:11,120 --> 01:11:13,439 Speaker 2: in the darkness, we know we don't want to 1250 01:11:13,439 --> 01:11:15,280 Speaker 2: do that again. It comes down to 1251 01:11:15,360 --> 01:11:21,519 Speaker 2: a very clear sense of what intellectual integrity is. And 1252 01:11:21,600 --> 01:11:24,160 Speaker 2: we don't teach that very well. We teach it well 1253 01:11:24,320 --> 01:11:28,360 Speaker 2: in science and in kind of rigorous areas 1254 01:11:28,080 --> 01:11:32,120 Speaker 2: of academia, like philosophy, and in other areas 1255 01:11:32,880 --> 01:11:35,439 Speaker 2: we seem to teach the antithesis.
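[Editor's note: Sam's idea of calibrating conviction "rather finely with the weight of the evidence" maps naturally onto Bayesian updating. The Python sketch below is one way to make that concrete; the prior of 0.5, the likelihood-ratio values, and the update_belief helper are hypothetical illustrations, not anything specified in the episode.]

def update_belief(prior: float, likelihood_ratio: float) -> float:
    """Return the posterior probability of a claim after one piece of evidence.

    likelihood_ratio = P(evidence | claim true) / P(evidence | claim false).
    Values above 1 support the claim; values below 1 count against it.
    """
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Start agnostic about some hypothetical claim, then fold in three
# pieces of evidence of varying strength (numbers invented for illustration).
belief = 0.5
for lr in [4.0, 1.2, 0.5]:  # strong support, weak support, mild counter-evidence
    belief = update_belief(belief, lr)
    print(f"evidence with LR={lr}: belief is now {belief:.2f}")

[Run as written, belief rises to about 0.80, then 0.83, then falls back to about 0.71 when the counter-evidence arrives: the certainty dial moves with the evidence but never pegs at eleven, which is the calibration Sam is describing.]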
I mean, 1256 01:11:35,960 --> 01:11:38,760 Speaker 2: one of the worst things that's happened of late, and 1257 01:11:38,800 --> 01:11:41,759 Speaker 2: it has been a source of great cultural division, 1258 01:11:42,120 --> 01:11:45,439 Speaker 2: is that so much of what purports to be knowledge 1259 01:11:45,640 --> 01:11:49,720 Speaker 2: gathering in the ivory tower has become just rankly politicized, 1260 01:11:49,800 --> 01:11:52,920 Speaker 2: such that you have whole disciplines which have just 1261 01:11:52,960 --> 01:11:54,520 Speaker 2: been vitiated by politics. 1262 01:11:54,600 --> 01:11:56,360 Speaker 3: Right. So it's a culture. 1263 01:11:56,120 --> 01:12:00,479 Speaker 2: Of activism that is masquerading as a culture of 1264 01:12:00,880 --> 01:12:05,360 Speaker 2: knowledge acquisition, and it's completely upside down. And therefore 1265 01:12:05,520 --> 01:12:09,599 Speaker 2: the revolt against 1266 01:12:09,600 --> 01:12:13,519 Speaker 2: the elites is all too understandable in light of this, 1267 01:12:13,760 --> 01:12:15,799 Speaker 2: and we're going to take a generation 1268 01:12:15,920 --> 01:12:17,320 Speaker 2: to recover, I think, from 1269 01:12:17,240 --> 01:12:18,080 Speaker 3: it. Agreed. 1270 01:12:18,560 --> 01:12:20,080 Speaker 1: I will say, by the way, as a total 1271 01:12:20,120 --> 01:12:22,759 Speaker 1: side note, I'm very hopeful about the role of AI 1272 01:12:23,720 --> 01:12:27,880 Speaker 1: in helping with this problem. Here's why: we can set up 1273 01:12:27,960 --> 01:12:31,800 Speaker 1: a debate bot so that each student debates the bot on 1274 01:12:32,000 --> 01:12:35,280 Speaker 1: some hot-button issue. They get graded on 1275 01:12:35,360 --> 01:12:40,040 Speaker 1: the quality of their arguments, and then they switch sides 1276 01:12:40,080 --> 01:12:42,240 Speaker 1: and argue the other side. Why? Because the AI 1277 01:12:42,400 --> 01:12:47,240 Speaker 1: is great at doing this. It's calm, it's patient, it's 1278 01:12:47,439 --> 01:12:50,000 Speaker 1: able to do that with every student for as long 1279 01:12:50,040 --> 01:12:52,640 Speaker 1: as it wants, whereas no teacher could ever have that 1280 01:12:52,760 --> 01:12:55,200 Speaker 1: kind of time or patience. So I think this is 1281 01:12:55,520 --> 01:12:58,160 Speaker 1: going to be really helpful in terms of teaching students 1282 01:12:58,240 --> 01:13:01,479 Speaker 1: this three-sixty view [a sketch of this exercise follows this exchange]. Is there anything you've changed your 1283 01:13:01,800 --> 01:13:04,040 Speaker 1: belief about in the last decade or two? 1284 01:13:04,520 --> 01:13:08,360 Speaker 2: This actually connects to the part of the AI conversation we had previously. 1285 01:13:10,080 --> 01:13:14,080 Speaker 2: So I've been very worried about AI and the fundamental 1286 01:13:14,160 --> 01:13:16,680 Speaker 2: question of the alignment problem.
Just what it's going 1287 01:13:16,720 --> 01:13:19,880 Speaker 2: to look like to build truly superhuman general 1288 01:13:19,880 --> 01:13:23,439 Speaker 2: intelligence, and the ways in which we might 1289 01:13:23,479 --> 01:13:28,200 Speaker 2: do that wrong and effectively ruin everything, 1290 01:13:28,200 --> 01:13:30,560 Speaker 2: because suddenly we have created a technology that 1291 01:13:30,680 --> 01:13:32,800 Speaker 2: is more powerful than we are, and autonomous, and not 1292 01:13:32,880 --> 01:13:35,480 Speaker 2: aligned with our well-being, and we're kind of negotiating 1293 01:13:35,520 --> 01:13:38,080 Speaker 2: with it now, and it becomes this doomed chess game. 1294 01:13:38,240 --> 01:13:40,960 Speaker 2: You're not going to play chess harder 1295 01:13:41,080 --> 01:13:43,640 Speaker 2: so as to realign the chess engine that 1296 01:13:43,920 --> 01:13:45,559 Speaker 2: is now just beating you at chess, and you're not 1297 01:13:45,560 --> 01:13:48,400 Speaker 2: going to negotiate with a superintelligent AI that's not 1298 01:13:49,080 --> 01:13:54,400 Speaker 2: disposed to take your advice hereafter. So I'm worried 1299 01:13:54,439 --> 01:13:56,920 Speaker 2: about AI. And I did think, at least five or ten 1300 01:13:57,000 --> 01:14:00,040 Speaker 2: years ago, that autonomous weapons would be something that we 1301 01:14:00,080 --> 01:14:03,040 Speaker 2: wouldn't want to do. That just seems like a very 1302 01:14:03,080 --> 01:14:04,800 Speaker 2: dangerous thing, and I think I even signed an 1303 01:14:04,840 --> 01:14:08,320 Speaker 2: open letter that we shouldn't have autonomous weapons. 1304 01:14:08,360 --> 01:14:12,040 Speaker 2: Again, this is not really a happy epiphany, but 1305 01:14:12,240 --> 01:14:17,360 Speaker 2: I do think that, leaving the alignment concern aside, given 1306 01:14:17,400 --> 01:14:21,560 Speaker 2: the arms race with China, clearly, and other potential adversaries 1307 01:14:21,600 --> 01:14:23,920 Speaker 2: around AI, I feel like we just need to win 1308 01:14:24,000 --> 01:14:27,720 Speaker 2: that arms race. So, until I hear 1309 01:14:27,720 --> 01:14:30,840 Speaker 2: a better argument, I'm now in favor of all the 1310 01:14:30,840 --> 01:14:33,520 Speaker 2: ways in which AI can be adopted by the military, 1311 01:14:33,960 --> 01:14:38,000 Speaker 2: autonomous weapons included. I just think the failure mode of 1312 01:14:38,280 --> 01:14:41,760 Speaker 2: declining to do that work and waiting for China to 1313 01:14:41,800 --> 01:14:45,599 Speaker 2: do it is so obvious and excruciating that I'm willing 1314 01:14:45,680 --> 01:14:50,760 Speaker 2: to roll the dice on let's get there first. The 1315 01:14:50,760 --> 01:14:52,280 Speaker 2: first time I heard about it, I was sort of 1316 01:14:52,320 --> 01:14:54,800 Speaker 2: like the gun control guy, like, I don't want anything 1317 01:14:54,840 --> 01:14:56,679 Speaker 2: to do with guns; they're just dangerous and awful. 1318 01:14:57,800 --> 01:15:00,800 Speaker 2: And then I sort of just flipped to no, no, 1319 01:15:00,840 --> 01:15:02,559 Speaker 2: we need as many guns as possible, and we need 1320 01:15:02,600 --> 01:15:07,840 Speaker 2: them fast. But again, I'm obviously open to argument on that, 1321 01:15:07,880 --> 01:15:09,559 Speaker 2: but that was a very clear change of heart.
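[Editor's note: a minimal Python sketch of the debate-bot exercise Eagleman describes above. The llm stub, the prompt wording, and the grading rubric are hypothetical placeholders invented for illustration; a real deployment would wire llm to an actual chat-model API.]

def llm(prompt: str) -> str:
    """Stand-in for a call to any chat model; connect a real client here."""
    raise NotImplementedError("wire this up to an actual LLM API")

def debate_round(topic: str, student_side: str, student_argument: str) -> str:
    """Have the bot calmly rebut the student's argument from the other side."""
    return llm(
        f"You are a calm, patient debate partner. Topic: {topic}. "
        f"The student is arguing the {student_side} side and said: "
        f"{student_argument!r}. Offer your strongest good-faith rebuttal."
    )

def grade_argument(topic: str, argument: str) -> str:
    """Ask the model to score argument quality, not the position taken."""
    return llm(
        f"Grade this argument on the topic {topic!r} from 1-10 for evidence, "
        f"logic, and fairness to the other side, regardless of which side "
        f"it defends: {argument!r}"
    )

def full_exercise(topic: str, get_student_argument) -> None:
    """Run the exercise on both sides so the student builds a 360 view."""
    for side in ("pro", "con"):
        argument = get_student_argument(topic, side)
        print(debate_round(topic, side, argument))
        print(grade_argument(topic, argument))

[The design point is that the grader scores argument quality regardless of which side is defended, and the loop forces the student through both sides, which is what delivers the "three-sixty view" described above.]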
1322 01:15:10,360 --> 01:15:12,439 Speaker 1: This comes back to the central theme that we've been 1323 01:15:12,479 --> 01:15:16,160 Speaker 1: discussing about beliefs. In the laboratory, the 1324 01:15:16,200 --> 01:15:18,639 Speaker 1: way one has to study them is with simple beliefs; 1325 01:15:18,680 --> 01:15:21,280 Speaker 1: in there, they're true or they're false. But in 1326 01:15:21,320 --> 01:15:25,479 Speaker 1: real life they're tied to everything else, including our views 1327 01:15:25,520 --> 01:15:29,080 Speaker 1: of larger and larger geopolitical issues, such that you might 1328 01:15:29,200 --> 01:15:32,719 Speaker 1: have a flip of a belief predicated on these other, 1329 01:15:33,360 --> 01:15:36,640 Speaker 1: larger issues about where you think the world is going politically. So, 1330 01:15:36,840 --> 01:15:40,479 Speaker 1: given this other constellation of beliefs that people have, which 1331 01:15:40,600 --> 01:15:45,120 Speaker 1: navigates their particular beliefs about politics or abortion or gun 1332 01:15:45,200 --> 01:15:51,000 Speaker 1: control or whatever, is healthy disagreement something that you see 1333 01:15:51,200 --> 01:15:55,040 Speaker 1: as possible? And if so, in what ways would you 1334 01:15:55,080 --> 01:15:55,759 Speaker 1: like to see that happen? 1335 01:15:56,240 --> 01:15:58,680 Speaker 2: Well, I think we live in the zone of 1336 01:15:58,800 --> 01:16:02,480 Speaker 2: healthy disagreement all the time, and we're constantly disagreeing 1337 01:16:02,479 --> 01:16:04,200 Speaker 2: with people we love about things. 1338 01:16:04,400 --> 01:16:05,479 Speaker 3: The question is how much 1339 01:16:05,360 --> 01:16:07,479 Speaker 2: does it matter? And when it matters, when 1340 01:16:07,479 --> 01:16:11,080 Speaker 2: you raise the stakes, then you begin to get uncomfortable 1341 01:16:11,320 --> 01:16:14,559 Speaker 2: and you feel like you have to converge. I mean, 1342 01:16:14,560 --> 01:16:18,679 Speaker 2: it becomes behaviorally and emotionally imperative to converge, 1343 01:16:19,120 --> 01:16:20,880 Speaker 2: certainly when lives depend on it. Then all of 1344 01:16:20,920 --> 01:16:22,560 Speaker 2: a sudden it's a kind of 1345 01:16:23,320 --> 01:16:26,439 Speaker 2: moral emergency to get your facts straight, 1346 01:16:26,560 --> 01:16:29,360 Speaker 2: get the best arguments, and win. 1347 01:16:29,960 --> 01:16:32,840 Speaker 2: So it really comes down to what the 1348 01:16:32,880 --> 01:16:37,360 Speaker 2: stakes are and whether we're in a zone that's anything 1349 01:16:37,520 --> 01:16:42,360 Speaker 2: like that, which is indicated by the common phrase 1350 01:16:42,400 --> 01:16:46,719 Speaker 2: "we're just going to have to agree to disagree." Like, yeah, 1351 01:16:46,720 --> 01:16:49,479 Speaker 2: over here, we can do that 1352 01:16:49,560 --> 01:16:51,920 Speaker 2: and still be friends, and this is perfect, right? There's 1353 01:16:51,960 --> 01:16:54,160 Speaker 2: no problem; that's kind of the spice of life, 1354 01:16:54,200 --> 01:16:59,400 Speaker 2: to agree to disagree about that. But over here, 1355 01:16:59,520 --> 01:17:03,439 Speaker 2: we can't take another step until we figure out who's right. 1356 01:17:07,920 --> 01:17:10,920 Speaker 1: That was my interview with Sam Harris.
Sam's research here 1357 01:17:11,000 --> 01:17:14,200 Speaker 1: shows that even at the most basic level, the brain 1358 01:17:14,280 --> 01:17:19,360 Speaker 1: distinguishes between accepting a claim, rejecting it, or remaining uncertain 1359 01:17:19,400 --> 01:17:23,479 Speaker 1: about it, and that distinction the brain makes 1360 01:17:23,560 --> 01:17:27,560 Speaker 1: then roots what you do in terms of feeling and behavior. 1361 01:17:27,800 --> 01:17:30,800 Speaker 1: Some things you act on right away because you believe 1362 01:17:30,840 --> 01:17:33,720 Speaker 1: they're true, and other things you dismiss, and they don't 1363 01:17:33,760 --> 01:17:37,760 Speaker 1: move any further through your neural system. So your brain's 1364 01:17:37,920 --> 01:17:43,040 Speaker 1: assessment of the truth or falsity of something determines everything 1365 01:17:43,080 --> 01:17:47,679 Speaker 1: about what happens next. As Sam also mentioned, beliefs differ 1366 01:17:47,880 --> 01:17:52,000 Speaker 1: in terms of how tightly they bind to identity. Some 1367 01:17:52,040 --> 01:17:55,840 Speaker 1: beliefs are pretty easy to revise when new evidence appears, 1368 01:17:56,080 --> 01:18:00,679 Speaker 1: like, here's data about how you could make nuclear reactors safe. 1369 01:18:01,320 --> 01:18:05,360 Speaker 1: But other beliefs are anchored to social belonging, or moral 1370 01:18:05,360 --> 01:18:09,720 Speaker 1: commitments you have, or longstanding narratives about who you are 1371 01:18:09,840 --> 01:18:13,840 Speaker 1: and how the world operates. In those cases, changing a 1372 01:18:13,920 --> 01:18:17,400 Speaker 1: belief can feel like changing sides, or stepping away from 1373 01:18:17,439 --> 01:18:21,520 Speaker 1: a community, or giving something up that once provided stability. 1374 01:18:22,320 --> 01:18:26,120 Speaker 1: What emerges is a picture of belief as a kind 1375 01:18:26,120 --> 01:18:31,000 Speaker 1: of navigation system. It helps us decide what to act 1376 01:18:31,080 --> 01:18:34,200 Speaker 1: on, and who to trust, and what to fear, and 1377 01:18:34,240 --> 01:18:38,479 Speaker 1: what futures to prepare for. And belief is shaped by 1378 01:18:38,600 --> 01:18:43,880 Speaker 1: evidence and facts, yes, but also by incentives and emotions 1379 01:18:44,240 --> 01:18:49,040 Speaker 1: and the social environments we inhabit. Zooming out, this brain research 1380 01:18:49,080 --> 01:18:54,760 Speaker 1: has consequences far beyond individual brains, because the same mechanisms 1381 01:18:54,760 --> 01:18:58,479 Speaker 1: that allow us to cooperate and reason and build shared 1382 01:18:58,600 --> 01:19:02,720 Speaker 1: models of the world can also amplify division and 1383 01:19:03,120 --> 01:19:10,120 Speaker 1: harden identities and spread ideas whose consequences outpace their accuracy. 1384 01:19:11,040 --> 01:19:14,240 Speaker 1: In an age where language and images and claims are 1385 01:19:14,280 --> 01:19:19,240 Speaker 1: moving faster than ever, understanding how belief forms and how 1386 01:19:19,280 --> 01:19:23,680 Speaker 1: it hardens is becoming something of a civic duty. If 1387 01:19:23,720 --> 01:19:28,679 Speaker 1: belief is the gateway between information and action, then learning 1388 01:19:29,000 --> 01:19:32,240 Speaker 1: how that gateway works is one of the most important 1389 01:19:32,320 --> 01:19:37,240 Speaker 1: challenges of our time.
We'll presumably never all agree on 1390 01:19:37,280 --> 01:19:40,639 Speaker 1: our beliefs, but a deeper understanding of this, I hope, 1391 01:19:41,040 --> 01:19:47,520 Speaker 1: can allow disagreement to remain tethered to reality, proportion, 1392 01:19:48,120 --> 01:19:59,240 Speaker 1: and humility. Go to eagleman dot com slash podcast for 1393 01:19:59,280 --> 01:20:03,599 Speaker 1: more information and to find further reading. Join the weekly 1394 01:20:03,640 --> 01:20:07,080 Speaker 1: discussions on my Substack, and check out and subscribe to 1395 01:20:07,120 --> 01:20:10,360 Speaker 1: Inner Cosmos on YouTube for videos of each episode and 1396 01:20:10,400 --> 01:20:16,080 Speaker 1: to leave comments. Until next time, I'm David Eagleman, and 1397 01:20:16,120 --> 01:20:17,720 Speaker 1: this is Inner Cosmos.