Robert: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio. Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

Joe: And I'm Joe McCormick. And today we're going to be talking about an interesting observation in cognitive psychology that deals with language. It starts off as kind of just a funny little quirk about the way we process certain kinds of sentences, but it ends up having some broader and more interesting implications about knowledge and language and thought. But I thought the best way to start here would just be to illustrate the prime example of the effect we're going to be talking about, and to do that, I think we need to do a bit of Bible trivia. Rob, are you ready to go to Sunday school?

Robert: Let's do it. Let's go to Sunday school.

Joe: Okay, I'm going to ask you a few questions about the Bible. If you get one wrong, you are going to get a paddling.

Robert: Whoa, what denomination is this? One of the ones that means business?

Joe: Okay, so, let's see. In the Garden of Eden, what type of animal is it that tempts Eve to eat from the tree?
Robert: Oh, that's a snake.

Joe: That's right, the serpent it is. Okay. After God created the world, on which day of the week did he rest?

Robert: Oh, that was the seventh day.

Joe: He got that one right. Okay, next one: how many animals of each kind did Moses take on the ark?

Robert: Two, of course.

Joe: And there you go. That is the prime example. Now, Rob, I know you were playing along because you already know the trick in what I actually said when I asked that question. Hopefully you were playing along at home as you're listening, or maybe you're not at home, wherever the heck you are, and you may have thought the same thing: Moses took two of each animal on the ark. But in fact, in the Bible story, which maybe not everybody knows, but maybe you do know this story of the ark in the Book of Genesis, and you do in fact know that it was not Moses who did that. It was Noah in the story who took animals on the ark. And yet you thought, after I asked the question, that the answer is two, and you didn't even register the fact that the name was wrong.
Robert: Yeah, it's an interesting phenomenon to encounter in others, but also in yourself, because there are several different ways to look at it, and we'll get into a number of these here. But even just now, when you asked me those questions, like the serpent one, I'm totally firm on that. I know that I know that aspect of the story inside and out. And of course I know it's the seventh day that God rested on, he, she, however you want to look at it. But there was still this moment of hesitation, because I was like, it's seven, right? It is seven. I don't want to come off with the wrong answer on the podcast. And granted, I already knew the answer to the third one, but there is this temptation: when you know the answer to something, you can just jump in without hesitation. There's a certainty that just propels you. You're excited to get your answer in and then, you know, get the acclaim and the praise for getting it right.
Joe: Yeah, there's a certain way in which a question, especially a question posed in quiz format, where you feel you are under performance pressure and you're being evaluated on whether or not you're going to get the right answer, sort of takes away some amount of the critical thinking that would normally go into reading a sentence, and it causes you to focus more exclusively on: can I just get the right answer? And so it's not hard to see, and this of course might not be the only explanation for why this is happening, but it's not hard to see why you could pretty easily miss a major error in a question about something that is not necessarily something you're fuzzy on to begin with. You could know perfectly well that it's Noah in the story, and yet it just goes completely over your head.
Robert: Yeah, and you know, we've been doing this podcast quite a while at this point, and occasionally this comes up for us, not so much in things that we've researched for the podcast, because I feel like if we've been crunching the facts or the numbers, we're more likely to be putting a lot of thought into the situation, and we're maybe just a little hesitant anyway. But the times where I've personally said something that was absolutely incorrect, it would be something that I felt so sure about that I just belted it out without fact-checking it at all. Generally it's something not directly related to the episode, but something that just kind of comes up in organic conversation.

Joe: That's exactly right. Yeah, it's when you feel so confident that you're not even being careful that you can really make some big blunders. There were some other questions I was reading about in, I think, the earliest study on this phenomenon we're talking about today.
Some of the other questions were: in the Biblical story, what is Joshua swallowed by? Of course, that's Jonah who is swallowed by the whale, or the great fish, the sea monster. Joshua, of course, is the conquering leader of the Israelites as they go about Canaan. Another one I really liked was: in the novel Moby Dick, what color was the whale that Captain Nemo was after?

Robert: I think I might have fallen for that one. Yeah, I mean, I wonder how much of the ego is involved here, because you're kind of like, all right, let's get to the part of this where I get to talk and get to be the one who is correct. You know, like, fast-forward through all this other stuff. I don't care. I have an answer and it is the correct one.

Joe: Yeah, that's quite perceptive, and I think that's right. But anyway, so this question that we're looking at today, this effect of not noticing that the question says Moses and just barreling right on through to the answer, even if you know that it's actually Noah in the story and not Moses.
This effect has a name: it's known as the Moses illusion. It's a particular type of semantic illusion that occurs when we are trying to process certain kinds of sentences. And it was first explored in a classic study in psychology called "From Words to Meaning: A Semantic Illusion," published in the Journal of Verbal Learning and Verbal Behavior in 1981 by Thomas D. Erickson and Mark E. Mattson. And I think it's interesting that this original observation, this question about Moses, comes out of a mysterious question about how we process the meaning of sentences. The authors of this study ask, quote, "How are the meanings of individual words combined to form a more global description of meaning?" And if you start to think hard about this question about the human capacity for language, I would argue it is absolutely astonishing.
It's almost baffling the way that we're not only able to associate symbolic meaning with certain sounds coming out of our mouths or glyphs on a page, but we're able to combine those things endlessly, to form and comprehend infinite variations of combinations of those sounds, to create sentences that actually mean something, and other people can understand what you mean when you say them. I think this type of capacity for language is one of the features of the natural world that, to me, seems closest to magic.

Robert: Yeah, absolutely. And I feel like the Moses illusion is one of those things that reveals the magic, that makes you more aware of the magic trick that is inherent to your everyday perception of reality and how you engage with facts and information. It's crazy that we're just constantly throwing together sentences, almost effortlessly, that are combining all these words together.
Each word has a huge range of possible meanings and associations, and we are able to do this with such fluency, I mean, sometimes with more fluency than other times, but yeah, it is truly astounding to me.

Joe: And so the authors here are sort of talking about this process and some of the question marks that existed at the time in science about how we form sentences and how we comprehend sentences. So they start in their introduction by talking about how, quote, "A central process in language comprehension is the construction of a global description of the sentence meaning from the meanings of individual words which make up the sentence." Right, so you know what individual words mean, but somehow, like we were just talking about, you can combine them into these overall gist forms of what somebody's getting at, you know, like what kind of answer is being requested by a question that might be made up of ten different words that are all throwing your brain in ten different directions.
Yet you can get the gist of the question and figure out what it's getting at pretty quickly, actually. And they talk about how there's been a lot of work on how language processing works in the realm of artificial intelligence, but at the time of this paper, there was still a lot that we didn't know about the global, whole meaning of a sentence and how that's constructed in the brain. And so they summarize the way they're starting this paper by saying, quote, "It has become widely assumed that sentences are subject to exhaustive analysis and consistency checks during processing, but this is not the case. People do not always understand what is said to them. Sometimes they fail to understand, sometimes they misunderstand." And while these failures of comprehension are sometimes due to lack of appropriate knowledge or error on the part of the speaker, there are other cases in which such failures occur when the understander possesses all the knowledge necessary for correct understanding. This paper explores such a phenomenon. And then they give the example of the Moses illusion that we already talked about.
The question that they pose is: how many animals of each kind did Moses take on the ark? And what the authors found in their original study in 1981 was that the majority of people fail to notice a problem with the question and simply answer "two," despite later displaying knowledge that it was in fact Noah in the story and not Moses. So it's not that they just don't know that much about the Bible; they can answer the question correctly when it's posed like, hey, was it Noah or Moses who took animals onto the ark? They can answer that correctly and yet still fail to notice a problem in the question. And studies find that people do this even when they're not rushed. They still make the mistake when they are given unlimited time to think about it.
Another interesting thing they found here was that the effect is not caused by people misreading or mishearing the question, because people still make the Moses illusion mistake even if they themselves read the question out loud, including the name Moses. So they are saying "Moses" with their own lips and they still might not notice it. Now, in this first study, the authors conclude, and this is very important because they're getting at things about the semantics of words in a sentence and how the meanings of sentences are formed, they conclude that shared semantic features of the mix-up are probably significantly contributing to the effect. In other words, this effect would probably not be nearly as pronounced, maybe even totally nonexistent, if the items were not in some way closely related in the way that these two Bible characters are. If you ask, you know, how many animals of each kind did Captain Hook take onto the ark, the effect probably vanishes. Another study I was looking at cited an example I found really funny, which was: how many animals of each kind did Nixon take on the ark?
Robert: Yeah, and I like that, because they were asking, okay, well, what if it's just phonological similarities? Nixon and Noah have some similarities: they start with the same sound, they've got the same number of syllables. But clearly, when you put Nixon in the sentence, people notice.

Joe: And so the Moses illusion is just one persistent example from a class of mental phenomena that could be called knowledge neglect. This is a term used by a couple of authors that we'll cite later in the episode. But knowledge neglect, in simplified terms, is when you behave as if you don't know something even though you definitely do know it. And the Moses illusion is of course an example of knowledge neglect, because the problem isn't that people think Moses was the biblical character who built the ark. You can know that it was Noah, not Moses; if you're asked directly, you'll get the answer right. But you don't notice the problem when it's phrased in a question like this. And of course it's not just Moses and Noah. There are plenty of other sentences in studies that have shown the same thing.
Though it is interesting that Moses and Noah are sort of the perfect example of it. I think there might be particular characteristics of these two names and characters that make people especially prone to the mix-up in this case, though it is true for lots of other types of, you know, words and objects.

Robert: Well, speaking of that, let's do a quick breakdown, especially for folks who are not up on Moses and Noah, just to give a little basic information about each of them.

Joe: Give me the Magic: The Gathering card on each one.

Robert: Okay, well, let's start with Noah. Certainly the older of the two, the first in chronological order. So Noah is written as an antediluvian patriarch in Jewish, Christian, and Islamic traditions.
The basic story: God grows sick of humanity, so he tells Noah to round up his family and two of every animal and get them on a big old boat, the ark, the first of two arks we're going to discuss here, so they alone can survive the global flood that's about to happen.

Joe: Yeah. Now, one interesting variation, and I think most people's brains probably wouldn't even go this far into the question: it is actually more complicated than two of every kind, because the Noah's ark story also says that I think they're supposed to bring more of certain types of animals, certain clean animals, and just two of the unclean animals, or something. So when you get into the nitty-gritty of it, it gets a little more complicated.

Robert: Right, I mean, it's all kinds of animal management. Which, I would love to see somebody fail the test of the, the Noah illusion, the Moses illusion here, by going into a lot of detail about the actual biblical text while still failing. I think that was...
Joe: Well, it was fourteen of every kind of clean animal.

Robert: All right. Well, anyway: Noah. Strengths: megaproject management and animal handling, obviously. Weakness: alcoholism. That's a major part of the story. Actors of note who have portrayed him, and this is not a complete list, but these are the main ones: John Huston, Russell Crowe, David Threlfall, this is a guy on Shameless, he also played Dr. John Dee in Elizabeth: The Golden Age, Jon Voight, and David Rintoul. David Rintoul is the guy who played Aerys Targaryen on the Game of Thrones show.

Joe: Oh, interesting. Wait, Aerys the Mad King?

Robert: I believe so. That's the main Aerys, right?

Joe: Okay, yeah. Well, I guess for some reason I thought there was another one. I am wrong. Okay, so I've got a really funny story about Jon Voight playing Noah.

Robert: Oh, I remember seeing this one.

Joe: It was made for TV, I think it came out when I was in, like, middle school, and it is not at all faithful to the Bible. It's a, shall we say, very Hollywooded-up version of the Noah's ark story. Jon Voight does play Noah, and the ark is attacked by pirates.

Robert: What?
Joe: Yeah, it's attacked by, like, Waterworld pirates. I mean, it might as well be Dennis Hopper and the Smokers, but I think they actually get attacked by pirates led by the biblical character Lot.

Robert: Okay, all right. Well, if that is in the Bible, at least they're playing around with it. Was this brought up at all when Darren Aronofsky was being criticized for the plot of his Noah movie, which has, like, giants and Nephilim in it?

Joe: Oh, I kind of liked his Noah movie. It was way more, I mean, I think it included stuff from non-canonical ancient texts, but it was actually inspired by ancient texts.

Robert: Okay, all right. I still haven't seen it. It's been on the list for a while. All right, let's talk about Moses real quick.

Joe: Okay.

Robert: So Moses comes later. He's an Old Testament prophet, a central figure in the narrative of the Exodus. In the account, he helps the Jewish people in their liberation from Egyptian captivity, and following the Ten Plagues of Egypt, he assists them in the Exodus. And he also is involved with an ark.
But it's the Ark of the Covenant, which we've discussed on the show before: not a boat, but a golden vessel that contains sacred items.

Joe: Yeah. I would assume that the words are related, because they're both like a container of a kind, like a big box.

Robert: Okay. So, Moses. His strengths: community organizing, of course, and sorcery. His weaknesses, and this is kind of interesting, I guess, because it's either not obeying God in everything or obeying God in everything, depending on who you ask.

Joe: Right. I mean, if you ask God, he would say, well, he didn't obey me in everything; that's why he didn't get to go into the Promised Land. But especially modern critics are like, it seems like he maybe followed the letter of the law a little bit too seriously. I seem to recall at one point him commanding the death penalty for a dude who was working on the Sabbath. That seems a little harsh.

Robert: Yeah, it seems a little harsh.
Okay. So 329 00:17:34,840 --> 00:17:38,480 Speaker 1: actors of note who have portrayed Moses: well, Charlton Heston obviously, 330 00:17:38,720 --> 00:17:43,760 Speaker 1: Burt Lancaster, Mel Brooks, Ben Kingsley, Val Kilmer, though that 331 00:17:43,760 --> 00:17:48,040 Speaker 1: may have just been a voice role. And Christian Bale. Now, 332 00:17:48,119 --> 00:17:50,560 Speaker 1: the last one is interesting, because as I was looking 333 00:17:50,600 --> 00:17:52,439 Speaker 1: at these actors, one of the interesting things is, 334 00:17:52,480 --> 00:17:56,720 Speaker 1: even though they're basically interchangeable, like the same... um, you know, 335 00:17:57,000 --> 00:17:58,960 Speaker 1: in most of these cases, you're dealing with the same 336 00:17:59,000 --> 00:18:01,240 Speaker 1: white dude that could play either of these characters in 337 00:18:01,240 --> 00:18:05,600 Speaker 1: a big Hollywood production. Um, but it's interesting that I 338 00:18:05,640 --> 00:18:09,600 Speaker 1: don't think anyone has actually played both Moses and Noah, 339 00:18:09,800 --> 00:18:14,480 Speaker 1: though Christian Bale reportedly came very close, because Darren Aronofsky 340 00:18:14,680 --> 00:18:17,800 Speaker 1: originally wanted Christian Bale to play the title role in 341 00:18:17,920 --> 00:18:21,840 Speaker 1: his Noah film, but scheduling conflicts prohibited that from happening. 342 00:18:22,080 --> 00:18:25,080 Speaker 1: Oh, he couldn't because he was filming, like, Terminator with McG 343 00:18:25,160 --> 00:18:28,439 Speaker 1: or whatever. Yeah, I don't know. But 344 00:18:28,520 --> 00:18:32,760 Speaker 1: imagine if Bale had played both Noah and Moses. 345 00:18:32,840 --> 00:18:35,480 Speaker 1: What would that have meant for the Moses illusion? Would 346 00:18:35,480 --> 00:18:37,920 Speaker 1: it have just destroyed our 347 00:18:37,920 --> 00:18:42,080 Speaker 1: semantic understanding of reality?
Maybe there's a secret council. There's 348 00:18:42,160 --> 00:18:45,520 Speaker 1: like, no Hollywood actor can play both of these 349 00:18:45,600 --> 00:18:49,000 Speaker 1: roles, because it will totally tear our understanding of 350 00:18:49,000 --> 00:18:52,840 Speaker 1: facts and fiction apart. I could see that. I mean, so, 351 00:18:52,920 --> 00:18:54,960 Speaker 1: I think what some of the authors here are proposing 352 00:18:55,359 --> 00:18:58,320 Speaker 1: is that it's not just that 353 00:18:58,440 --> 00:19:01,400 Speaker 1: Moses and Noah are words that kind of sound similar. 354 00:19:01,440 --> 00:19:04,520 Speaker 1: They've got some similar consonants and the same number 355 00:19:04,560 --> 00:19:07,600 Speaker 1: of syllables, similar vowel sounds. That's all true, and that 356 00:19:07,640 --> 00:19:10,880 Speaker 1: does seem to matter, but it's also very important that 357 00:19:10,920 --> 00:19:14,520 Speaker 1: they are semantically related, that they are both characters from 358 00:19:14,560 --> 00:19:17,719 Speaker 1: the Torah, from the Old Testament, and that sort of 359 00:19:17,840 --> 00:19:19,800 Speaker 1: links them together. And I think the more you could 360 00:19:19,840 --> 00:19:22,680 Speaker 1: do to link them even further together and associate them 361 00:19:22,680 --> 00:19:25,480 Speaker 1: in our minds, like, yes, having one actor play both, 362 00:19:25,520 --> 00:19:30,120 Speaker 1: I think that would actually probably make people even more susceptible. Yeah.
Um, 363 00:19:30,400 --> 00:19:32,920 Speaker 1: I was thinking about this too. Like, obviously we've already 364 00:19:32,960 --> 00:19:35,480 Speaker 1: touched on a few extra examples of this, but I 365 00:19:35,520 --> 00:19:37,480 Speaker 1: was trying to come up with other examples that 366 00:19:37,520 --> 00:19:39,960 Speaker 1: would play on the same energy here, and one that 367 00:19:40,000 --> 00:19:42,040 Speaker 1: came to mind would be, uh, if we were to 368 00:19:42,080 --> 00:19:44,880 Speaker 1: look to Chinese mythology, if we were to say, hey, 369 00:19:44,920 --> 00:19:47,080 Speaker 1: how did the Yellow Emperor decide how to order the 370 00:19:47,119 --> 00:19:50,080 Speaker 1: animals of the zodiac? And you might respond with, oh, well, 371 00:19:50,119 --> 00:19:54,280 Speaker 1: there's this cool little story about a race for the animals, etcetera. Um, 372 00:19:54,320 --> 00:19:56,600 Speaker 1: but it wasn't the Yellow Emperor. It was the Jade Emperor, 373 00:19:56,600 --> 00:20:01,200 Speaker 1: who's an even more primordial god ruler than the Yellow Emperor. Um, 374 00:20:01,320 --> 00:20:02,800 Speaker 1: so I don't know. That seems like it 375 00:20:03,320 --> 00:20:07,280 Speaker 1: could work in a similar 376 00:20:07,320 --> 00:20:11,320 Speaker 1: way to the Moses and Noah illusion. Or how about 377 00:20:11,320 --> 00:20:13,239 Speaker 1: this: in Return of the Jedi, what was Jango Fett 378 00:20:13,359 --> 00:20:16,200 Speaker 1: swallowed by? Oh, I see. For some reason, I 379 00:20:16,240 --> 00:20:18,359 Speaker 1: feel like that one doesn't work, because 380 00:20:18,359 --> 00:20:20,600 Speaker 1: as soon as you say the word Jango, like, 381 00:20:20,680 --> 00:20:23,280 Speaker 1: people's alarms go off, and like, wait a minute, what 382 00:20:23,280 --> 00:20:26,640 Speaker 1: are we talking about?
Yeah, yeah, well, maybe 383 00:20:26,680 --> 00:20:29,840 Speaker 1: it would. Okay, here's one for Avatar: The Last Airbender 384 00:20:29,880 --> 00:20:32,880 Speaker 1: fans out there. Um, we're hearing from several of them: 385 00:20:33,000 --> 00:20:37,240 Speaker 1: which nation was the Avatar Appa born into? I don't 386 00:20:37,240 --> 00:20:39,040 Speaker 1: know if that one worked or not, but of course 387 00:20:39,119 --> 00:20:41,399 Speaker 1: Aang is the Last Airbender. Aang 388 00:20:41,640 --> 00:20:44,400 Speaker 1: is the Avatar; uh, Appa is the sky bison 389 00:20:44,480 --> 00:20:47,320 Speaker 1: that he rides on. Ah, I see. So, I don't know, 390 00:20:47,400 --> 00:20:50,440 Speaker 1: Aang, Appa, maybe that works. Not sure. Well, that went 391 00:20:50,480 --> 00:20:59,240 Speaker 1: over my head anyway. So you might think, well, now 392 00:20:59,320 --> 00:21:01,000 Speaker 1: that we have told you there is such a 393 00:21:01,040 --> 00:21:03,639 Speaker 1: thing as the Moses illusion, uh, you know, you would 394 00:21:03,640 --> 00:21:07,040 Speaker 1: never fall for it, right? Because you will 395 00:21:07,119 --> 00:21:10,600 Speaker 1: now, always having this knowledge in your mind, notice when 396 00:21:10,640 --> 00:21:13,400 Speaker 1: there are substitutions of this kind in a question 397 00:21:13,480 --> 00:21:16,240 Speaker 1: or a sentence. But it turns out that's not necessarily true. 398 00:21:16,480 --> 00:21:19,800 Speaker 1: Uh, so there was this original research from nineteen eighty one, 399 00:21:19,880 --> 00:21:22,119 Speaker 1: but there have been a bunch of studies in the 400 00:21:22,119 --> 00:21:26,680 Speaker 1: decades since then replicating the original finding and further probing 401 00:21:26,760 --> 00:21:29,480 Speaker 1: the effect to figure out what's going on in our brains. 402 00:21:30,080 --> 00:21:32,560 Speaker 1: Uh, so I wanted to talk about some typical findings.
403 00:21:32,840 --> 00:21:35,960 Speaker 1: First of all, some things that were summarized in 404 00:21:36,040 --> 00:21:38,720 Speaker 1: a few literature reviews I was looking at. One 405 00:21:38,840 --> 00:21:42,840 Speaker 1: was in a book chapter by Elizabeth J. Marsh and Sharda Umanath, 406 00:21:43,400 --> 00:21:47,960 Speaker 1: in a book called Processing Inaccurate Information, published by 407 00:21:48,080 --> 00:21:51,720 Speaker 1: MIT Press in twenty fourteen. That book sounds like a scream, 408 00:21:51,760 --> 00:21:54,960 Speaker 1: but their chapter is called "Knowledge Neglect: Failures to Notice 409 00:21:55,000 --> 00:21:58,360 Speaker 1: Contradictions with Stored Knowledge." We'll revisit this chapter a few 410 00:21:58,359 --> 00:22:01,520 Speaker 1: times later in the episode. But they summarize some 411 00:22:01,600 --> 00:22:05,479 Speaker 1: things about the Moses illusion. Uh, so they say that 412 00:22:05,560 --> 00:22:07,919 Speaker 1: most of the time people will fall for the Moses 413 00:22:07,960 --> 00:22:11,440 Speaker 1: illusion even though they actually know the difference between Moses 414 00:22:11,440 --> 00:22:14,639 Speaker 1: and Noah, as demonstrated with later interrogation. So you can 415 00:22:14,680 --> 00:22:17,800 Speaker 1: ask people questions like who built the ark, or who 416 00:22:17,880 --> 00:22:19,919 Speaker 1: took the animals into the ark, and they'll get the 417 00:22:19,920 --> 00:22:22,480 Speaker 1: answer right, but they still fail to notice that it's 418 00:22:22,520 --> 00:22:25,080 Speaker 1: Moses in the question. And this can be accomplished with 419 00:22:25,119 --> 00:22:29,000 Speaker 1: other similar switcheroos. I actually included, Rob, a list for 420 00:22:29,040 --> 00:22:31,679 Speaker 1: you to look at of questions like this. One I like 421 00:22:31,920 --> 00:22:35,399 Speaker 1: is, um, what did Goldilocks eat at the Three Little 422 00:22:35,400 --> 00:22:38,920 Speaker 1: Pigs' house?
And a lot of people will just answer porridge, 423 00:22:38,920 --> 00:22:41,879 Speaker 1: even though you can later ask them, like, hey, whose 424 00:22:41,920 --> 00:22:44,600 Speaker 1: house did Goldilocks go into, the Three Bears' or the 425 00:22:44,600 --> 00:22:46,520 Speaker 1: Three Little Pigs'? And they of course know that it 426 00:22:46,600 --> 00:22:50,560 Speaker 1: was the bears. Now, that one's interesting because, for me anyway, 427 00:22:50,680 --> 00:22:54,080 Speaker 1: there's an associated mental image of the bears 428 00:22:54,280 --> 00:22:57,680 Speaker 1: or the pigs. Uh, they look rather different, 429 00:22:57,720 --> 00:23:00,480 Speaker 1: and ultimately they have different functions in the stories, 430 00:23:01,280 --> 00:23:04,280 Speaker 1: whereas Moses and Noah are more interchangeable; it 431 00:23:04,280 --> 00:23:06,439 Speaker 1: is the same sort of character, and they're of course the 432 00:23:06,480 --> 00:23:08,879 Speaker 1: same species. The pigs are there to be the 433 00:23:08,960 --> 00:23:11,600 Speaker 1: victims of the big bad wolf and to get eaten, 434 00:23:12,280 --> 00:23:14,960 Speaker 1: and the bears are there to, I don't know, 435 00:23:15,119 --> 00:23:17,159 Speaker 1: just hang out in their house, I guess, right. But 436 00:23:17,200 --> 00:23:20,520 Speaker 1: I can still imagine someone falling for this, or, 437 00:23:20,640 --> 00:23:24,320 Speaker 1: you know, erring in answering this question, because 438 00:23:24,359 --> 00:23:26,320 Speaker 1: in a way, again, you're racing to the finish line; 439 00:23:26,320 --> 00:23:29,439 Speaker 1: you're picking up on, you know, the basics of 440 00:23:29,480 --> 00:23:33,040 Speaker 1: the question, even though you're skipping over 441 00:23:33,040 --> 00:23:37,800 Speaker 1: this misinformation that's embedded in the middle of it. Right. Uh.
442 00:23:37,800 --> 00:23:40,000 Speaker 1: Though it's interesting that you mentioned racing to get to 443 00:23:40,040 --> 00:23:42,320 Speaker 1: the answer. I do think you're basically right about that, 444 00:23:42,400 --> 00:23:45,840 Speaker 1: except it doesn't really seem that time is a factor here, 445 00:23:45,880 --> 00:23:49,720 Speaker 1: because giving people extra or even unlimited time to think 446 00:23:49,760 --> 00:23:53,639 Speaker 1: about the question does not eliminate the effect, does it? 447 00:23:53,720 --> 00:23:55,760 Speaker 1: So it doesn't seem to result from people being in 448 00:23:55,800 --> 00:23:58,240 Speaker 1: a hurry in terms of time, though I think you 449 00:23:58,240 --> 00:24:00,600 Speaker 1: could still think about it as people being in a 450 00:24:00,680 --> 00:24:03,439 Speaker 1: hurry in terms of just, like, wanting to get to 451 00:24:03,480 --> 00:24:05,840 Speaker 1: the part where they answer the question. I don't know, 452 00:24:05,880 --> 00:24:08,400 Speaker 1: maybe that could be, like, self-imposed time limits, even 453 00:24:08,400 --> 00:24:12,120 Speaker 1: if they're not imposed by somebody externally trying to rush 454 00:24:12,160 --> 00:24:14,960 Speaker 1: you through. Now, also, in a typical setup for these 455 00:24:15,160 --> 00:24:19,160 Speaker 1: Moses illusion experiments, readers will be warned that some questions 456 00:24:19,160 --> 00:24:22,280 Speaker 1: will contain incorrect presuppositions, so it's not just like a 457 00:24:22,280 --> 00:24:24,960 Speaker 1: trick question where they don't know this is coming.
They'll 458 00:24:25,000 --> 00:24:28,880 Speaker 1: be told, okay, some of these questions will be valid questions, 459 00:24:28,920 --> 00:24:31,720 Speaker 1: in which case you should just answer them, but other 460 00:24:31,840 --> 00:24:35,959 Speaker 1: questions will have incorrect presuppositions, and when you come across 461 00:24:36,000 --> 00:24:38,280 Speaker 1: one of those, you should note that the question is 462 00:24:38,320 --> 00:24:41,520 Speaker 1: not valid. Now, the interesting thing is, I would think 463 00:24:41,600 --> 00:24:44,480 Speaker 1: something like that would almost completely erase the effect, because 464 00:24:44,480 --> 00:24:47,680 Speaker 1: you're putting people on guard to be, like, interrogating the questions. 465 00:24:47,880 --> 00:24:50,320 Speaker 1: But it doesn't. You can put people on guard like 466 00:24:50,359 --> 00:24:53,959 Speaker 1: that and they still fall for the Moses illusion. In 467 00:24:54,000 --> 00:24:57,160 Speaker 1: these experiments, it does seem to be a very robust effect; 468 00:24:57,240 --> 00:24:59,960 Speaker 1: like, a substantial number of people will fail to detect 469 00:25:00,040 --> 00:25:02,679 Speaker 1: the errors in questions, even though they later showed that 470 00:25:02,720 --> 00:25:06,359 Speaker 1: they possessed the knowledge to answer them correctly. Uh, the 471 00:25:06,400 --> 00:25:09,440 Speaker 1: exact percentages of the effect, though, vary a good bit. 472 00:25:09,880 --> 00:25:14,600 Speaker 1: From that chapter by Marsh and Umanath, they write, quote: Overall, 473 00:25:14,680 --> 00:25:18,199 Speaker 1: the Moses illusion is robust, with readers answering from fourteen 474 00:25:18,280 --> 00:25:22,520 Speaker 1: percent to forty percent to fifty-two percent to seventy-seven 475 00:25:22,560 --> 00:25:26,760 Speaker 1: percent of distorted questions depending on the particular experiments.
476 00:25:26,760 --> 00:25:28,760 Speaker 1: So they're citing a number of different results there. The 477 00:25:28,800 --> 00:25:35,840 Speaker 1: fourteen percent was Van Jaarsveld, Dijkstra, and Hermans; the forty percent was 478 00:25:35,880 --> 00:25:40,080 Speaker 1: Hannon and Daneman in two thousand and one; the fifty-two percent was Erickson 479 00:25:40,119 --> 00:25:43,720 Speaker 1: and Mattson in nineteen eighty-one; and the seventy-seven percent was 480 00:25:43,760 --> 00:25:46,880 Speaker 1: Barton and Sanford in nineteen ninety-three. And I would imagine these 481 00:25:46,880 --> 00:25:49,320 Speaker 1: differences have a lot to do with, like, what 482 00:25:49,480 --> 00:25:52,960 Speaker 1: exact types of warnings you're giving people ahead of time, 483 00:25:53,000 --> 00:25:56,320 Speaker 1: what exact examples are used. As we've said, 484 00:25:56,480 --> 00:26:00,000 Speaker 1: you know, it's clear that different questions are more 485 00:26:00,040 --> 00:26:02,240 Speaker 1: prone than others. So, like, I think more people would 486 00:26:02,280 --> 00:26:05,800 Speaker 1: probably fall for the Moses/Noah confusion than for the 487 00:26:05,880 --> 00:26:09,000 Speaker 1: Three Little Pigs/Three Bears confusion. Yeah, I have to 488 00:26:09,040 --> 00:26:11,760 Speaker 1: say, some of the examples that you included on 489 00:26:11,760 --> 00:26:13,600 Speaker 1: a list here, it's interesting to run through this, 490 00:26:13,680 --> 00:26:18,000 Speaker 1: because even though I'm not encountering them as actual questions, 491 00:26:18,040 --> 00:26:19,800 Speaker 1: like someone in one of these studies would 492 00:26:19,800 --> 00:26:22,080 Speaker 1: be, I can certainly pick up on the ones that 493 00:26:22,119 --> 00:26:26,439 Speaker 1: I feel like would have been more likely to fool me, like, 494 00:26:26,480 --> 00:26:29,520 Speaker 1: for instance, what kind of tree did Lincoln chop down?
What 495 00:26:29,600 --> 00:26:33,200 Speaker 1: kind of tree did Washington chop down? Um, like, I 496 00:26:33,280 --> 00:26:36,480 Speaker 1: can imagine myself, sort of, this being a story I'm 497 00:26:36,480 --> 00:26:40,080 Speaker 1: not tremendously invested in, but have a version of stored away, 498 00:26:40,800 --> 00:26:44,160 Speaker 1: I can instantly skip, or even not instantly, but even 499 00:26:44,160 --> 00:26:46,520 Speaker 1: with some thought, would be like, I think, yeah, cherry tree, 500 00:26:46,640 --> 00:26:49,920 Speaker 1: cherry tree, that's the one, you know, even if it said Lincoln. Yeah, 501 00:26:49,920 --> 00:26:52,800 Speaker 1: even if it said Lincoln, because also, I don't know, Lincoln... 502 00:26:54,040 --> 00:26:56,520 Speaker 1: something about, like, there are stories about him, you know, we 503 00:26:56,560 --> 00:26:59,080 Speaker 1: also have sort of tall tales about him and his exploits, 504 00:26:59,119 --> 00:27:03,399 Speaker 1: and, uh, there's one about him 505 00:27:03,440 --> 00:27:05,760 Speaker 1: answering a duel. Someone challenged him to a duel, 506 00:27:05,760 --> 00:27:08,720 Speaker 1: and he says, well, I get to choose the place 507 00:27:08,760 --> 00:27:11,919 Speaker 1: and the weapon. So I choose, uh, let's say sledgehammers 508 00:27:11,960 --> 00:27:14,680 Speaker 1: and five feet of water or something. The idea there being 509 00:27:14,680 --> 00:27:17,000 Speaker 1: he's tall and the other person was short, something like that. 510 00:27:17,040 --> 00:27:19,720 Speaker 1: I have no idea if that's even a legitimate story, 511 00:27:19,760 --> 00:27:21,520 Speaker 1: but I have it in my head. So I have 512 00:27:21,680 --> 00:27:24,520 Speaker 1: an image of Lincoln holding some sort of long-handled 513 00:27:24,560 --> 00:27:28,600 Speaker 1: tool. So it fits in nicely into the story, 514 00:27:28,680 --> 00:27:32,640 Speaker 1: like I can easily overlay one over the other. Yeah.
515 00:27:32,680 --> 00:27:35,960 Speaker 1: One of the examples that I feel extremely confident that 516 00:27:36,000 --> 00:27:38,919 Speaker 1: I would not fall for is the one of: what 517 00:27:39,119 --> 00:27:42,360 Speaker 1: is the name of the Mexican dip made with mashed artichokes? 518 00:27:43,840 --> 00:27:46,840 Speaker 1: I definitely... I mean, I just know, artichokes? No, that 519 00:27:47,000 --> 00:27:49,879 Speaker 1: is not what it is. You don't mash artichokes, do you? 520 00:27:49,960 --> 00:27:53,040 Speaker 1: I mean, I haven't seen them mashed. You could, like, 521 00:27:53,080 --> 00:27:58,119 Speaker 1: make an artichoke paste, but artichoke guacamole? That sounds gross. 522 00:27:58,400 --> 00:28:01,320 Speaker 1: I mean, artichoke dip is amazing, but 523 00:28:03,880 --> 00:28:09,080 Speaker 1: it just doesn't sound right. But anyway, so Marsh and Umanath also note 524 00:28:09,119 --> 00:28:12,800 Speaker 1: that, um, error detection is lower when the items 525 00:28:12,960 --> 00:28:15,720 Speaker 1: that are swapped are similar in a couple 526 00:28:15,760 --> 00:28:18,760 Speaker 1: of ways. We've already mentioned these, but they reiterate that 527 00:28:19,400 --> 00:28:22,560 Speaker 1: it helps when there's phonological similarity. So, do the words 528 00:28:22,600 --> 00:28:26,200 Speaker 1: sound close to each other? I feel like, uh, avocados 529 00:28:26,240 --> 00:28:29,040 Speaker 1: and artichokes, like, they have some similar vowel sounds, and 530 00:28:29,080 --> 00:28:31,439 Speaker 1: they start with the same letter, but they sound different 531 00:28:31,560 --> 00:28:34,439 Speaker 1: enough to me that I immediately know it's wrong. I think somehow, 532 00:28:34,520 --> 00:28:37,119 Speaker 1: like, the hard K sound coming towards the end of 533 00:28:37,160 --> 00:28:41,080 Speaker 1: the word artichoke, but coming towards the beginning of, or 534 00:28:41,120 --> 00:28:43,120 Speaker 1: I guess in the middle of, avocado.
Somehow that makes 535 00:28:43,120 --> 00:28:46,880 Speaker 1: a big difference in my brain. And then, of course, 536 00:28:47,160 --> 00:28:50,200 Speaker 1: as we've been saying, semantic similarity: are the concepts somehow 537 00:28:50,320 --> 00:28:53,080 Speaker 1: similar or related? Would we put them in a kind 538 00:28:53,080 --> 00:28:56,520 Speaker 1: of meaning nexus together in the brain? Uh, and 539 00:28:56,520 --> 00:28:59,280 Speaker 1: of course it's notable that the Moses versus Noah 540 00:28:59,320 --> 00:29:02,160 Speaker 1: one meets both of the criteria: they sound similar and 541 00:29:02,200 --> 00:29:06,040 Speaker 1: they're related. So anyway, it's just this interesting fact about 542 00:29:06,040 --> 00:29:09,840 Speaker 1: our brains, that something about being asked a question like this, 543 00:29:09,960 --> 00:29:12,760 Speaker 1: or trying to process a sentence like the questions in 544 00:29:12,800 --> 00:29:16,160 Speaker 1: these studies, causes us to ignore the fact that the 545 00:29:16,200 --> 00:29:19,000 Speaker 1: contents of the sentence conflict with things that we know 546 00:29:19,120 --> 00:29:22,080 Speaker 1: to be true. And I wanted to mention one other 547 00:29:22,120 --> 00:29:24,680 Speaker 1: study I was looking at. This one is by 548 00:29:25,440 --> 00:29:29,120 Speaker 1: H. C. Bottoms, Andrea N. Eslick, and Elizabeth J. Marsh, from 549 00:29:29,880 --> 00:29:33,200 Speaker 1: twenty ten, published in the journal Memory, called "Memory and the Moses 550 00:29:33,280 --> 00:29:37,680 Speaker 1: Illusion: Failures to Detect Contradictions with Stored Knowledge Yield Negative 551 00:29:37,720 --> 00:29:41,600 Speaker 1: Memorial Consequences." Now, we can revisit some of the things 552 00:29:41,600 --> 00:29:43,400 Speaker 1: in this more as we go on, but I just 553 00:29:43,440 --> 00:29:45,560 Speaker 1: wanted to note a few things that they bring up.
554 00:29:46,120 --> 00:29:48,920 Speaker 1: Uh, so first of all, they note some other previous 555 00:29:48,960 --> 00:29:54,880 Speaker 1: findings in their introduction. One is that, um, error detection improves, 556 00:29:54,960 --> 00:29:57,320 Speaker 1: so people are less likely to fall for the Moses 557 00:29:57,360 --> 00:30:00,640 Speaker 1: illusion, when the error appears in what they called the 558 00:30:00,720 --> 00:30:04,840 Speaker 1: cleft phrase, or the main focus of the sentence. So 559 00:30:04,880 --> 00:30:06,920 Speaker 1: there are ways that you can basically ask the same 560 00:30:07,040 --> 00:30:10,040 Speaker 1: question but just sort of rearrange the words to make 561 00:30:10,080 --> 00:30:13,640 Speaker 1: people more likely to notice the problem. So, if you 562 00:30:13,680 --> 00:30:16,680 Speaker 1: take the sentence, how many animals of each kind did 563 00:30:16,720 --> 00:30:19,320 Speaker 1: Moses take on the ark, the word Moses is kind 564 00:30:19,360 --> 00:30:23,360 Speaker 1: of syntactically de-emphasized in that sentence; you know, it's 565 00:30:23,400 --> 00:30:25,760 Speaker 1: not like the main focus of the way the sentence 566 00:30:25,800 --> 00:30:29,520 Speaker 1: is phrased. You can reorient the words to make Moses 567 00:30:29,600 --> 00:30:31,920 Speaker 1: more prominent in the sentence, in which case people are 568 00:30:31,920 --> 00:30:34,920 Speaker 1: more likely to catch the problem. Yeah, like, I also 569 00:30:34,920 --> 00:30:37,120 Speaker 1: feel like having the word show up so late in 570 00:30:37,160 --> 00:30:41,120 Speaker 1: the sentence... I mean, you're always predicting where 571 00:30:41,160 --> 00:30:44,040 Speaker 1: sentences are going, you know, so you've kind of already 572 00:30:44,040 --> 00:30:45,960 Speaker 1: filled it in to a certain extent, like, you know, 573 00:30:46,080 --> 00:30:48,480 Speaker 1: you know who we're talking about.
Uh, even if you 574 00:30:48,560 --> 00:30:51,520 Speaker 1: end up using the wrong name. Um, yeah, I think 575 00:30:51,520 --> 00:30:54,520 Speaker 1: you're exactly right about that. Like, once you've heard, 576 00:30:54,560 --> 00:30:56,360 Speaker 1: I don't know, you get like four or five words 577 00:30:56,400 --> 00:30:59,080 Speaker 1: into the sentence, you sort of are like, you already 578 00:30:59,080 --> 00:31:01,120 Speaker 1: know what it's going to be, and you're just sort 579 00:31:01,120 --> 00:31:03,680 Speaker 1: of, like, okay, you're, like, mostly ignoring the words that 580 00:31:03,720 --> 00:31:06,640 Speaker 1: come after that. Another thing that they point out that's 581 00:31:06,640 --> 00:31:11,280 Speaker 1: interesting is that error detection improves when questions appear in 582 00:31:11,320 --> 00:31:14,840 Speaker 1: a difficult-to-read font, and they say this is 583 00:31:14,880 --> 00:31:19,080 Speaker 1: because it reduces processing fluency, which in turn makes material 584 00:31:19,480 --> 00:31:23,120 Speaker 1: seem less familiar and less true. And this was found 585 00:31:23,120 --> 00:31:26,000 Speaker 1: by Song and Schwarz in two thousand and eight. And this, 586 00:31:26,080 --> 00:31:28,560 Speaker 1: of course, comes back to our old friend, processing 587 00:31:28,560 --> 00:31:32,320 Speaker 1: fluency, a cognitive factor that I believe is one of 588 00:31:32,320 --> 00:31:36,960 Speaker 1: the most underappreciated influences on our thoughts and beliefs and behavior. 589 00:31:37,480 --> 00:31:39,920 Speaker 1: We talked about it in our episode on the illusory 590 00:31:39,960 --> 00:31:44,120 Speaker 1: truth effect. Basically, processing fluency means: how easy is it 591 00:31:44,200 --> 00:31:48,440 Speaker 1: for this stimulus to be processed by the brain?
And, uh, 592 00:31:48,440 --> 00:31:50,800 Speaker 1: it came up in the illusory truth effect episode 593 00:31:50,880 --> 00:31:54,280 Speaker 1: because, remember, the illusory truth effect is where statements 594 00:31:54,320 --> 00:31:58,240 Speaker 1: you've encountered before seem more true than statements that are 595 00:31:58,280 --> 00:32:02,040 Speaker 1: new to you. And one possible explanation for this is 596 00:32:02,240 --> 00:32:05,680 Speaker 1: that familiar statements are easier for the brain to process 597 00:32:05,760 --> 00:32:09,200 Speaker 1: than unfamiliar ones are, and at some level, the brain 598 00:32:09,520 --> 00:32:13,600 Speaker 1: makes an equivalence between that processing fluency, how easy it 599 00:32:13,680 --> 00:32:17,440 Speaker 1: is to process this incoming sentence because it's familiar, and 600 00:32:17,760 --> 00:32:21,440 Speaker 1: factual trustworthiness. They actually have nothing to do with one another, 601 00:32:21,480 --> 00:32:24,040 Speaker 1: but the brain maybe uses a little bit of a shortcut there. 602 00:32:25,200 --> 00:32:28,040 Speaker 1: So are you saying that in the future, for our 603 00:32:28,080 --> 00:32:31,160 Speaker 1: shared notes, Joe, we should use Chiller font instead 604 00:32:31,200 --> 00:32:35,160 Speaker 1: of whatever we're using now?
Yeah, would that make 605 00:32:35,200 --> 00:32:37,520 Speaker 1: it less... like, I mean, I think that would generally 606 00:32:37,680 --> 00:32:40,360 Speaker 1: slow us down and make it harder to do the podcast, 607 00:32:40,440 --> 00:32:43,280 Speaker 1: but it also might make it less likely that we 608 00:32:43,320 --> 00:32:47,680 Speaker 1: would just, like, flub words here and there, because it 609 00:32:47,680 --> 00:32:51,240 Speaker 1: would be a really effortful, laborious process to get 610 00:32:51,280 --> 00:32:54,600 Speaker 1: through every single thought, which, you know, sometimes it is anyway, 611 00:32:54,720 --> 00:32:58,080 Speaker 1: but that's on us. Um, but anyway, so Song 612 00:32:58,160 --> 00:32:59,840 Speaker 1: and Schwarz here in two thousand and eight found that, 613 00:33:00,040 --> 00:33:02,720 Speaker 1: simply by making statements harder to read, so you put 614 00:33:02,760 --> 00:33:05,600 Speaker 1: them in... you said Chiller, I was thinking Papyrus, I 615 00:33:05,600 --> 00:33:08,480 Speaker 1: don't know what actual font they used, but it 616 00:33:08,520 --> 00:33:11,200 Speaker 1: would just make people more likely to spot errors in 617 00:33:11,200 --> 00:33:14,680 Speaker 1: the questions instead of just rolling right over them without noticing. 618 00:33:14,920 --> 00:33:18,400 Speaker 1: And, you know, that makes sense to me. Yeah, yeah, 619 00:33:18,720 --> 00:33:22,400 Speaker 1: it does. It is interesting that that's how our brains work, though. Yeah, 620 00:33:22,480 --> 00:33:24,760 Speaker 1: it is sort of counterintuitive at the same time; like, 621 00:33:24,840 --> 00:33:27,680 Speaker 1: you might just assume that if something's harder to read, 622 00:33:27,840 --> 00:33:30,520 Speaker 1: you would be less likely to catch errors in it.
623 00:33:30,560 --> 00:33:32,720 Speaker 1: But yeah, I think there's some kind of process where 624 00:33:32,720 --> 00:33:35,280 Speaker 1: it's like slowing you down. It's not allowing you to 625 00:33:35,360 --> 00:33:39,120 Speaker 1: just, like, skip over the parts that seemed like, yeah, yeah, okay, 626 00:33:39,280 --> 00:33:41,600 Speaker 1: Moses, whatever. Yeah, it's like a bit of food 627 00:33:41,640 --> 00:33:44,320 Speaker 1: that's extra chewy, so you're going to really taste this, 628 00:33:44,520 --> 00:33:46,160 Speaker 1: you're really going to get a feel for the text here. 629 00:33:46,160 --> 00:33:48,800 Speaker 1: There's no just wolfing this down. Yeah. Now, in 630 00:33:48,840 --> 00:33:51,400 Speaker 1: the study by Bottoms et al., they were looking at 631 00:33:51,400 --> 00:33:55,200 Speaker 1: the question of whether participants can detect errors in questions 632 00:33:55,320 --> 00:33:59,240 Speaker 1: better if there are just more errors overall in the 633 00:33:59,280 --> 00:34:01,000 Speaker 1: sample of questions. So, if I give you a 634 00:34:01,000 --> 00:34:04,160 Speaker 1: bunch of questions and, like, I don't know, seventy of 635 00:34:04,200 --> 00:34:07,320 Speaker 1: them contain errors of this kind in them, are people 636 00:34:07,360 --> 00:34:09,680 Speaker 1: more likely to catch them? And it looks like the 637 00:34:09,719 --> 00:34:11,920 Speaker 1: answer is yes. Like, if you've got people 638 00:34:11,960 --> 00:34:15,560 Speaker 1: on guard because there were just constantly problems with these questions, 639 00:34:15,840 --> 00:34:17,879 Speaker 1: their guard goes up, and they do seem to make 640 00:34:17,920 --> 00:34:21,560 Speaker 1: the Moses illusion mistake less often.
And it strikes me 641 00:34:21,640 --> 00:34:24,680 Speaker 1: that that could possibly be, or at least partially be, because, 642 00:34:25,440 --> 00:34:29,240 Speaker 1: once you start, you know, showing people questions where most 643 00:34:29,320 --> 00:34:32,160 Speaker 1: of them contain a problem, or even just a large 644 00:34:32,200 --> 00:34:36,240 Speaker 1: minority of them contain a problem, people probably start, uh, 645 00:34:36,400 --> 00:34:40,719 Speaker 1: interacting with the questions less as questions, becoming less 646 00:34:40,760 --> 00:34:44,200 Speaker 1: focused on just getting the answer, and start looking at 647 00:34:44,239 --> 00:34:46,680 Speaker 1: them more like a puzzle, where you're trying to 648 00:34:46,760 --> 00:34:50,160 Speaker 1: parse the sentence very clearly. Yeah, yeah, it's like, how 649 00:34:50,280 --> 00:34:54,040 Speaker 1: is this trying to trick me? Yeah. But then there's 650 00:34:54,080 --> 00:34:57,720 Speaker 1: one kind of scary implication from this paper. The authors 651 00:34:57,840 --> 00:35:01,040 Speaker 1: write, quote: More generally, the failure to detect errors had 652 00:35:01,120 --> 00:35:05,640 Speaker 1: negative memorial consequences, increasing the likelihood that errors were used 653 00:35:05,840 --> 00:35:11,200 Speaker 1: to answer later general knowledge questions. Methodological implications of this 654 00:35:11,280 --> 00:35:14,720 Speaker 1: finding are discussed, as it suggests that typical analyses likely 655 00:35:14,880 --> 00:35:19,560 Speaker 1: underestimate the size of the Moses illusion. Overall, answering distorted 656 00:35:19,640 --> 00:35:23,960 Speaker 1: questions can yield errors in the knowledge base. More importantly, 657 00:35:24,160 --> 00:35:28,520 Speaker 1: prior knowledge does not protect against these negative memorial consequences. 658 00:35:28,960 --> 00:35:30,600 Speaker 1: And Robert, I think you had a note about that.
659 00:35:30,640 --> 00:35:32,319 Speaker 1: We can talk a little bit more about that in 660 00:35:32,320 --> 00:35:35,400 Speaker 1: a bit, but yeah, basically, there is some evidence 661 00:35:35,480 --> 00:35:40,000 Speaker 1: that just steamrolling over an incorrect fact in a sentence, 662 00:35:40,120 --> 00:35:44,040 Speaker 1: even when you know otherwise, can later damage your 663 00:35:44,080 --> 00:35:48,080 Speaker 1: ability to recall that fact correctly. Yeah, yeah, so, 664 00:35:49,120 --> 00:35:51,920 Speaker 1: as we'll discuss here, it's not just a 665 00:35:51,920 --> 00:35:54,800 Speaker 1: situation where, oh, well, this is a quirk, this is interesting, 666 00:35:54,840 --> 00:35:57,040 Speaker 1: the brain does this. I mean, it is that, but 667 00:35:57,200 --> 00:36:01,440 Speaker 1: it has greater implications. Yeah. Now, I 668 00:36:01,480 --> 00:36:04,120 Speaker 1: want to go back on the other side and say that, uh, 669 00:36:04,960 --> 00:36:07,759 Speaker 1: when we encounter things like this, you know, illusions that 670 00:36:07,920 --> 00:36:10,400 Speaker 1: humans often fall for, when you read about a certain 671 00:36:10,440 --> 00:36:14,280 Speaker 1: type of, I don't know, cognitive bias or something, 672 00:36:14,760 --> 00:36:17,920 Speaker 1: I think our tendency is often to at first react 673 00:36:18,000 --> 00:36:21,319 Speaker 1: like, wow, our dumb brains, we're so stupid. But 674 00:36:21,440 --> 00:36:24,440 Speaker 1: I think there's another way to think about it, and 675 00:36:24,520 --> 00:36:28,520 Speaker 1: that's this: How amazing is it that we have such 676 00:36:28,520 --> 00:36:32,359 Speaker 1: a powerful command of language based reasoning that we can 677 00:36:32,400 --> 00:36:37,239 Speaker 1: answer questions even though key elements of the sentence do 678 00:36:37,280 --> 00:36:39,840 Speaker 1: not match with our knowledge base?
I mean, think about 679 00:36:39,880 --> 00:36:43,880 Speaker 1: the trouble that a computer would run into trying to 680 00:36:43,960 --> 00:36:47,560 Speaker 1: do the same thing. Like, while it's an interesting case 681 00:36:47,560 --> 00:36:50,600 Speaker 1: of an illusion, failing to notice facts that conflict with 682 00:36:50,640 --> 00:36:55,120 Speaker 1: our existing knowledge, it's also a demonstration of an absolutely 683 00:36:55,160 --> 00:36:59,880 Speaker 1: amazing capacity for language comprehension, even when there are severe 684 00:37:00,239 --> 00:37:03,560 Speaker 1: errors in the questions or sentences that we're trying to comprehend. 685 00:37:03,680 --> 00:37:07,440 Speaker 1: Like, somehow our brains are so good at getting what 686 00:37:07,560 --> 00:37:11,000 Speaker 1: seems to be the gist, the intended global meaning, of 687 00:37:11,040 --> 00:37:14,560 Speaker 1: a sentence, even when pivotal items in that sentence are 688 00:37:14,600 --> 00:37:17,160 Speaker 1: wrong and should be pointing you off in the wrong 689 00:37:17,200 --> 00:37:22,000 Speaker 1: direction and making you totally confused. Yeah, yeah, um, you know, 690 00:37:22,040 --> 00:37:24,160 Speaker 1: I can't help but be reminded in all this about 691 00:37:24,200 --> 00:37:26,680 Speaker 1: the drawing of the bicycle that we've touched on before. 692 00:37:26,719 --> 00:37:29,520 Speaker 1: I mean, it's different. We're not dealing with language, 693 00:37:29,560 --> 00:37:33,000 Speaker 1: we're dealing with, uh, like a mental image. Like 694 00:37:33,040 --> 00:37:34,799 Speaker 1: we all think we have the mental image of a 695 00:37:34,800 --> 00:37:37,560 Speaker 1: bicycle pretty firm in our heads, and yet when put 696 00:37:37,560 --> 00:37:40,560 Speaker 1: to the test, when asked to draw a bicycle, um, 697 00:37:40,800 --> 00:37:44,120 Speaker 1: we're often floored. Yeah.
That was a different one of 698 00:37:44,120 --> 00:37:47,120 Speaker 1: our Cognitive Illusions episodes. That was the illusion of 699 00:37:47,160 --> 00:37:51,680 Speaker 1: explanatory depth. Yeah, the issue where people tend to 700 00:37:51,680 --> 00:37:55,160 Speaker 1: think that they understand how something works until they're 701 00:37:55,160 --> 00:37:57,920 Speaker 1: asked to explain it. So somehow the brain has a 702 00:37:57,920 --> 00:38:02,680 Speaker 1: way of representing a sort of Potemkin comprehension. You 703 00:38:02,719 --> 00:38:04,839 Speaker 1: know, it puts up this facade of, yeah, you 704 00:38:04,880 --> 00:38:07,040 Speaker 1: know how that works. Yeah, I know how... I 705 00:38:07,080 --> 00:38:09,120 Speaker 1: know the parts of a bicycle. I know all 706 00:38:09,120 --> 00:38:11,600 Speaker 1: the parts of a can opener. I could make one, basically. 707 00:38:11,960 --> 00:38:14,080 Speaker 1: But then if you are asked to, like, explain the 708 00:38:14,120 --> 00:38:16,400 Speaker 1: steps of how it works or draw all the parts, 709 00:38:16,440 --> 00:38:20,399 Speaker 1: you're like, uh, yeah.
I thought about this a lot 710 00:38:20,520 --> 00:38:24,040 Speaker 1: watching the Outlander TV show, where the time traveler goes 711 00:38:24,080 --> 00:38:26,680 Speaker 1: back in time and she's recreating various things that she 712 00:38:26,760 --> 00:38:29,200 Speaker 1: knows about from the future, and I'm like, God, like, 713 00:38:29,239 --> 00:38:30,680 Speaker 1: how many of us, you know, if we 714 00:38:30,680 --> 00:38:31,960 Speaker 1: were to do that, if we were to go back 715 00:38:31,960 --> 00:38:34,800 Speaker 1: in time, we might tell somebody about all these marvelous 716 00:38:34,840 --> 00:38:38,200 Speaker 1: things like, oh yeah, penicillin and, you know, bicycles 717 00:38:38,200 --> 00:38:39,799 Speaker 1: and whatnot, and they'd be like, oh great, how does 718 00:38:39,800 --> 00:38:41,800 Speaker 1: it work? And we'd be like, um, yeah, no, no idea. 719 00:38:41,920 --> 00:38:44,120 Speaker 1: I have some vague... so I have some of 720 00:38:44,160 --> 00:38:46,319 Speaker 1: the facts in my head, but not near enough to 721 00:38:46,480 --> 00:38:55,279 Speaker 1: reproduce anything that I'm talking about. Coming back 722 00:38:55,320 --> 00:38:58,560 Speaker 1: to this thing about how the Moses illusion is, 723 00:38:58,640 --> 00:39:01,600 Speaker 1: and could be looked at as, an example of 724 00:39:01,600 --> 00:39:06,680 Speaker 1: how amazingly adaptive at comprehension our brains are, I actually found 725 00:39:06,680 --> 00:39:09,800 Speaker 1: a book chapter discussing this very aspect of the effect. 726 00:39:10,120 --> 00:39:14,040 Speaker 1: So the authors here were Heekyeong Park and Lynn M. 727 00:39:14,160 --> 00:39:17,840 Speaker 1: Reder, and this was a chapter in a book, 728 00:39:18,200 --> 00:39:20,520 Speaker 1: and the chapter was called The Moses Illusion. I think 729 00:39:20,520 --> 00:39:22,919 Speaker 1: it was published in two thousand four.
And so they're 730 00:39:22,960 --> 00:39:26,560 Speaker 1: talking about different potential explanations for the Moses illusion, what's 731 00:39:26,600 --> 00:39:29,080 Speaker 1: going on in the brain, and they conclude, 732 00:39:29,160 --> 00:39:31,759 Speaker 1: or at least they argue, that the most likely explanation 733 00:39:31,920 --> 00:39:35,040 Speaker 1: for what's going on when we fall for this is 734 00:39:35,320 --> 00:39:38,640 Speaker 1: something they call the partial match hypothesis. So I just 735 00:39:38,680 --> 00:39:41,239 Speaker 1: want to read from their conclusion that's along the lines 736 00:39:41,280 --> 00:39:44,520 Speaker 1: of what we've just been talking about. Quote: Research on 737 00:39:44,560 --> 00:39:48,080 Speaker 1: the Moses illusion demonstrates that people have difficulty in detecting 738 00:39:48,120 --> 00:39:52,240 Speaker 1: distortions or inaccuracies when a distorted element is semantically related 739 00:39:52,280 --> 00:39:55,480 Speaker 1: to the theme of the sentence. Why should our cognitive 740 00:39:55,520 --> 00:39:59,000 Speaker 1: system be so tolerant of distortions and find it so 741 00:39:59,080 --> 00:40:02,720 Speaker 1: difficult to do careful matches to memory? It might seem 742 00:40:02,760 --> 00:40:05,360 Speaker 1: that partial matching is a less than ideal way to 743 00:40:05,360 --> 00:40:09,879 Speaker 1: process information. However, the partial match process is not only 744 00:40:10,000 --> 00:40:13,440 Speaker 1: common and normal, but also a necessary mechanism of our 745 00:40:13,480 --> 00:40:18,879 Speaker 1: cognitive system. This partial match process enables useful communication and comprehension. 746 00:40:19,320 --> 00:40:22,480 Speaker 1: Very few things that we see or hear will perfectly 747 00:40:22,560 --> 00:40:26,440 Speaker 1: match the representation that we already have stored in memory.
748 00:40:26,840 --> 00:40:29,400 Speaker 1: In order to answer questions, we need to be able 749 00:40:29,440 --> 00:40:32,920 Speaker 1: to use an acceptable match. In order to understand a 750 00:40:32,960 --> 00:40:35,880 Speaker 1: new situation and map it onto something we have already 751 00:40:35,920 --> 00:40:40,480 Speaker 1: seen or done, we must accept slight variations. Every day, 752 00:40:40,520 --> 00:40:44,280 Speaker 1: at many levels, we accept slight distortions without even noticing 753 00:40:44,320 --> 00:40:47,960 Speaker 1: the process. Occasionally we notice a distortion and choose to 754 00:40:48,000 --> 00:40:52,080 Speaker 1: ignore it, but more frequently we do not even realize 755 00:40:52,120 --> 00:40:56,759 Speaker 1: that distortions have occurred. A rigid comprehension system would have 756 00:40:56,800 --> 00:41:00,400 Speaker 1: a difficult time. Indeed, many of our cognitive operations are 757 00:41:00,480 --> 00:41:05,360 Speaker 1: driven by familiarity based heuristics rather than careful matching operations. 758 00:41:05,680 --> 00:41:08,680 Speaker 1: The Moses illusion is an example of how the adaptive 759 00:41:08,760 --> 00:41:13,640 Speaker 1: human cognitive system works. Everyday cognitive processing must be based 760 00:41:13,640 --> 00:41:17,439 Speaker 1: on simple heuristics, such as matching sets of features, rather 761 00:41:17,480 --> 00:41:22,160 Speaker 1: than exact matches, as very few tasks require exact matches. 762 00:41:22,640 --> 00:41:27,560 Speaker 1: Sentences do not match stored information, faces change, voices may 763 00:41:27,680 --> 00:41:32,040 Speaker 1: change slightly, even our pets and friends change over time. Therefore, 764 00:41:32,040 --> 00:41:35,080 Speaker 1: it makes sense that people do use partial matches in 765 00:41:35,120 --> 00:41:38,719 Speaker 1: the normal course of matching to memory.
Partial matching is 766 00:41:38,800 --> 00:41:42,000 Speaker 1: inevitable because it is the most efficient way for memory 767 00:41:42,000 --> 00:41:44,839 Speaker 1: to operate given the nature of the environment in which 768 00:41:44,880 --> 00:41:47,600 Speaker 1: we live. And so, yeah, this really makes me think 769 00:41:47,600 --> 00:41:50,000 Speaker 1: along the lines of what we were just saying a 770 00:41:50,000 --> 00:41:52,960 Speaker 1: few minutes ago. Like, the Moses illusion is kind of 771 00:41:52,960 --> 00:41:56,000 Speaker 1: funny when you notice yourself doing it, but it's also 772 00:41:56,360 --> 00:42:01,120 Speaker 1: it's also kind of a superpower. Like, imagine if 773 00:42:01,160 --> 00:42:03,440 Speaker 1: you went to a video store, which we still have 774 00:42:03,520 --> 00:42:05,799 Speaker 1: one in Atlanta. Imagine you went there and you were 775 00:42:05,840 --> 00:42:09,239 Speaker 1: to say, um, yeah, I'm looking for a particular movie. Um, 776 00:42:09,280 --> 00:42:12,040 Speaker 1: it starred Anthony Hopkins and it had a puppet in it. 777 00:42:12,520 --> 00:42:15,759 Speaker 1: And instead of being able to piece that together and 778 00:42:15,880 --> 00:42:18,359 Speaker 1: tell you which movie you're talking about, what if they 779 00:42:18,360 --> 00:42:20,440 Speaker 1: were to say, okay, keep listening, I need you to 780 00:42:20,480 --> 00:42:23,120 Speaker 1: list the entire cast. I need all of the details. 781 00:42:23,320 --> 00:42:26,080 Speaker 1: We have to make a match here. Or, yeah, imagine 782 00:42:26,120 --> 00:42:28,680 Speaker 1: somebody comes into the video store and they say, I'm 783 00:42:28,719 --> 00:42:31,880 Speaker 1: looking for The Godfather Two, and they say, sorry, we 784 00:42:31,960 --> 00:42:35,320 Speaker 1: don't have that. What they actually have is The Godfather, 785 00:42:35,400 --> 00:42:40,040 Speaker 1: colon, Part Two.
Oh man, that's not completely unbelievable, 786 00:42:40,200 --> 00:42:41,840 Speaker 1: not with our video store, but just sort of like 787 00:42:41,920 --> 00:42:49,319 Speaker 1: the cliche video store. You mean The Godfather Part Two? Philistine. 788 00:42:50,760 --> 00:42:52,600 Speaker 1: I mean, that's a kind of silly example. But I 789 00:42:52,600 --> 00:42:54,840 Speaker 1: think the authors of this chapter are exactly right that 790 00:42:55,239 --> 00:42:59,080 Speaker 1: basically every single moment of our lives, we are 791 00:42:59,320 --> 00:43:02,520 Speaker 1: testing reality against our memories, and we have to do 792 00:43:02,560 --> 00:43:05,600 Speaker 1: so in a fast and loose way. And our ability 793 00:43:05,680 --> 00:43:07,800 Speaker 1: to do so in a fast and loose way, without 794 00:43:07,840 --> 00:43:11,200 Speaker 1: relying on every detail to be an exact correct match, 795 00:43:11,920 --> 00:43:15,719 Speaker 1: is what allows us to live adaptively, to sort 796 00:43:15,719 --> 00:43:20,920 Speaker 1: of like be thinking creatures, rather than looking for exact matches between 797 00:43:21,000 --> 00:43:24,280 Speaker 1: the current case you're observing and what's stored in your memory. 798 00:43:24,680 --> 00:43:27,960 Speaker 1: Like I made the comparison to a computer earlier. Today, 799 00:43:28,000 --> 00:43:30,680 Speaker 1: I guess we're more familiar with more adaptive types of 800 00:43:30,719 --> 00:43:33,319 Speaker 1: computer functions that are based on, like, AI or, like, 801 00:43:33,440 --> 00:43:35,840 Speaker 1: huge amounts of machine learning or something like that.
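[Editor's note: the "video store" kind of loose lookup described above, matching a partial, even partly wrong, description against stored items instead of demanding an exact match, can be sketched in a few lines of Python. This is purely an illustrative toy: the `partial_match` function and the movie feature sets are invented for this sketch, not drawn from the Park and Reder chapter.]

```python
def partial_match(query, candidates, threshold=0.5):
    """Pick the stored item whose features best overlap a partial query.

    Instead of requiring every detail to match exactly, accept the best
    candidate as long as it covers enough of the queried features.
    """
    def coverage(features):
        # Fraction of the queried features that this stored item shares.
        return len(query & features) / len(query)

    best = max(candidates, key=lambda name: coverage(candidates[name]))
    return best if coverage(candidates[best]) >= threshold else None


# A tiny "video store" memory: each title stored as a set of features.
movies = {
    "Magic": {"anthony hopkins", "puppet", "thriller"},
    "The Silence of the Lambs": {"anthony hopkins", "jodie foster", "thriller"},
    "The Godfather Part II": {"al pacino", "crime", "sequel"},
}

# A vague two-feature description is enough; no exact match is required.
print(partial_match({"anthony hopkins", "puppet"}, movies))  # Magic
```

A rigid system would return a match only when every stored feature lined up; the threshold here is the "fast and loose" part, trading occasional wrong matches (the Moses illusion) for the ability to answer vague questions at all.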
It 802 00:43:35,880 --> 00:43:39,000 Speaker 1: makes me think about, like, the early old days of 803 00:43:39,520 --> 00:43:42,680 Speaker 1: dealing with, you know, computer programming, where, like, if 804 00:43:42,719 --> 00:43:46,280 Speaker 1: you slightly misspelled something, like, you know, um, you're playing 805 00:43:46,360 --> 00:43:49,560 Speaker 1: Zork or something and you type, like, "wolk north," 806 00:43:49,680 --> 00:43:51,799 Speaker 1: w-o-l-k, it's just going to be 807 00:43:51,840 --> 00:43:54,560 Speaker 1: like, that is not a valid action. Like, yeah, it's 808 00:43:54,560 --> 00:43:58,600 Speaker 1: amazing nowadays just how much thumb fumbling I 809 00:43:58,600 --> 00:44:01,680 Speaker 1: can put into typing something in search and it still 810 00:44:01,719 --> 00:44:03,920 Speaker 1: knows what I'm talking about. I'm still able to 811 00:44:03,920 --> 00:44:05,640 Speaker 1: floor it every now and then, because I'll get really 812 00:44:05,640 --> 00:44:09,319 Speaker 1: reckless and it'll just have no clue. But 813 00:44:09,480 --> 00:44:12,319 Speaker 1: more often than not, it'll guess what I'm 814 00:44:12,320 --> 00:44:15,400 Speaker 1: going for. But that is amazing, because that is the 815 00:44:15,400 --> 00:44:18,680 Speaker 1: input receiver, whatever, you know, this piece 816 00:44:18,719 --> 00:44:22,040 Speaker 1: of technology, it's called AI because it's becoming more like 817 00:44:22,080 --> 00:44:25,719 Speaker 1: our brains. It's becoming usefully sloppy and loose in 818 00:44:25,760 --> 00:44:28,759 Speaker 1: the way our brains are. Now, I guess we could 819 00:44:28,760 --> 00:44:31,360 Speaker 1: talk about a couple of other possible examples of knowledge 820 00:44:31,440 --> 00:44:35,000 Speaker 1: neglect or implications of knowledge neglect.
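[Editor's note: the contrast drawn above, a parser that rejects "wolk" outright versus a search box that forgives a typo, is essentially exact matching versus edit-distance matching. The sketch below is a toy illustration of that idea; `loose_lookup` and the command list are invented for the example and have nothing to do with how Zork's actual parser worked.]

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # delete from a
                           cur[j - 1] + 1,               # insert into a
                           prev[j - 1] + (ca != cb)))    # substitute
        prev = cur
    return prev[-1]


def loose_lookup(word, vocabulary, max_typos=1):
    """Exact match if possible; otherwise the closest word within budget."""
    if word in vocabulary:
        return word
    best = min(vocabulary, key=lambda v: edit_distance(word, v))
    return best if edit_distance(word, best) <= max_typos else None


commands = ["walk", "look", "take", "open"]
print(loose_lookup("wolk", commands))   # walk (one substitution away)
print(loose_lookup("xyzzy", commands))  # None (too garbled to guess)
```

The old text-adventure behavior is the `word in vocabulary` line alone; everything after it is the "usefully sloppy" tolerance that modern search adds on top.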
One that I came 821 00:44:35,040 --> 00:44:37,560 Speaker 1: across that I thought was pretty funny is something that 822 00:44:37,760 --> 00:44:42,480 Speaker 1: seems fairly narrow, but it's known as the yolk phenomenon. 823 00:44:42,560 --> 00:44:45,400 Speaker 1: So it goes like this. It apparently was originally described in 824 00:44:45,440 --> 00:44:49,880 Speaker 1: an article in the Psychological Review by Gregory Kimble and 825 00:44:50,080 --> 00:44:53,240 Speaker 1: Lawrence Perlmuter. Uh, this was in the year nineteen seventy, 826 00:44:53,320 --> 00:44:56,600 Speaker 1: if I didn't already say that. But it consists of 827 00:44:56,719 --> 00:45:00,480 Speaker 1: asking somebody a list of questions, and it's designed 828 00:45:00,520 --> 00:45:03,360 Speaker 1: to produce a certain answer. So you say, what do 829 00:45:03,400 --> 00:45:06,120 Speaker 1: we call the tree that grows from acorns? And they 830 00:45:06,160 --> 00:45:08,200 Speaker 1: say, an oak. And then you say, what do you 831 00:45:08,239 --> 00:45:11,680 Speaker 1: call a funny story? Joke. What's the sound made by 832 00:45:11,680 --> 00:45:15,320 Speaker 1: a frog? Croak. What's another word for a cape? Cloak. 833 00:45:15,760 --> 00:45:18,040 Speaker 1: What do we call the white of an egg? And 834 00:45:18,160 --> 00:45:22,640 Speaker 1: most people say yolk, which is obviously wrong.
And 835 00:45:22,680 --> 00:45:25,319 Speaker 1: people are not confused about the white of an egg 836 00:45:25,360 --> 00:45:27,759 Speaker 1: being called the yolk. But it seems like instead, the 837 00:45:27,760 --> 00:45:30,720 Speaker 1: implication is that there's a certain kind of pattern seeking 838 00:45:30,840 --> 00:45:35,239 Speaker 1: that overtakes semantic processing here, like the brain starts to 839 00:45:35,280 --> 00:45:39,520 Speaker 1: conclude, while you're answering these questions, because of the established 840 00:45:39,560 --> 00:45:43,480 Speaker 1: pattern, that rhyming is more important than the actual meaning 841 00:45:43,520 --> 00:45:47,400 Speaker 1: of the word. If it rhymes, it matches. Yeah, exactly, 842 00:45:47,440 --> 00:45:50,040 Speaker 1: it's the rhyme-as-reason effect, sort of. I mean, uh, 843 00:45:50,920 --> 00:45:52,960 Speaker 1: which I think we talked about in our episode 844 00:45:53,080 --> 00:45:56,879 Speaker 1: on antimetabole. But I was wondering, I wonder how 845 00:45:56,920 --> 00:45:59,720 Speaker 1: many items in a list like this it takes before 846 00:45:59,800 --> 00:46:03,480 Speaker 1: the majority of respondents will give the yolk type answer, 847 00:46:03,520 --> 00:46:05,560 Speaker 1: will ignore the known meaning of a word and just 848 00:46:05,600 --> 00:46:09,520 Speaker 1: supply the nonsensical rhyming match. I don't know. I feel 849 00:46:09,520 --> 00:46:12,000 Speaker 1: like I'm very susceptible to this one because I 850 00:46:12,200 --> 00:46:15,520 Speaker 1: recently was trying to do a recipe and it got 851 00:46:15,560 --> 00:46:17,960 Speaker 1: kind of confusing, and I had a moment where I 852 00:46:18,000 --> 00:46:20,600 Speaker 1: had to ask myself, wait, which part is the yolk 853 00:46:20,640 --> 00:46:24,359 Speaker 1: and which is the egg white? Um.
It was only a 854 00:46:24,360 --> 00:46:26,759 Speaker 1: momentary lapse, but there were a lot of things going on. 855 00:46:26,800 --> 00:46:28,560 Speaker 1: There was a lot, like, I was having to take 856 00:46:28,600 --> 00:46:30,000 Speaker 1: them apart, you know, as one of those where you have 857 00:46:30,040 --> 00:46:32,080 Speaker 1: to have the egg whites in one bowl and the 858 00:46:32,160 --> 00:46:34,200 Speaker 1: yolks in the other. And I was making 859 00:46:34,200 --> 00:46:38,640 Speaker 1: a soufflé, that's what it was. And yeah, and I 860 00:46:38,680 --> 00:46:40,640 Speaker 1: had not had coffee yet either, so 861 00:46:40,719 --> 00:46:43,640 Speaker 1: I had that going for me. Um, it was successful. 862 00:46:43,680 --> 00:46:45,480 Speaker 1: But yeah, there was that moment where I'm like, okay, 863 00:46:45,640 --> 00:46:47,640 Speaker 1: I have to have so many egg whites and then 864 00:46:47,640 --> 00:46:51,279 Speaker 1: a different number of yolks, and which one's which? Now, 865 00:46:52,000 --> 00:46:54,719 Speaker 1: so I would totally fall for this. I mean, did 866 00:46:54,760 --> 00:46:57,360 Speaker 1: you succeed? Did it rise? Yeah, it rose. It 867 00:46:57,400 --> 00:46:59,279 Speaker 1: was good. Yeah, I don't think I want to put 868 00:46:59,320 --> 00:47:01,520 Speaker 1: it in the regular weekly rotation, but it was 869 00:47:01,560 --> 00:47:03,640 Speaker 1: good for a special treat. I feel like the 870 00:47:03,719 --> 00:47:07,000 Speaker 1: soufflé, that is just one of the most notoriously 871 00:47:07,000 --> 00:47:09,760 Speaker 1: tricky dishes for people who aren't, I guess, like, working 872 00:47:09,760 --> 00:47:12,360 Speaker 1: in, you know, kitchens or bakeries every day. Yeah, it 873 00:47:12,400 --> 00:47:14,200 Speaker 1: was still tricky.
It was tricky for me, 874 00:47:14,239 --> 00:47:16,160 Speaker 1: even though I went with what seemed like 875 00:47:16,160 --> 00:47:19,319 Speaker 1: a very simple recipe that didn't steer me too wrong, 876 00:47:19,360 --> 00:47:23,399 Speaker 1: but still I got lost a little bit for a moment. Well, 877 00:47:23,400 --> 00:47:25,920 Speaker 1: I'm impressed. So, I was reading through this 878 00:47:25,960 --> 00:47:29,320 Speaker 1: book chapter as well, um, on knowledge neglect, by Marsh 879 00:47:29,400 --> 00:47:33,200 Speaker 1: and Umanath, and uh, yeah, this was very interesting. 880 00:47:33,360 --> 00:47:36,440 Speaker 1: Um, so yeah, they point to a couple of other misconceptions. 881 00:47:36,440 --> 00:47:39,600 Speaker 1: I don't think we've mentioned these on the episode 882 00:47:39,600 --> 00:47:42,719 Speaker 1: thus far, but one of them was Toronto is the 883 00:47:42,760 --> 00:47:46,800 Speaker 1: capital of Canada, and a blow to the head cures amnesia, 884 00:47:46,920 --> 00:47:49,000 Speaker 1: which I guess is like a TV, you know, cartoon 885 00:47:49,080 --> 00:47:51,800 Speaker 1: kind of a thing. But these are all, like, examples 886 00:47:51,880 --> 00:47:53,960 Speaker 1: of misconceptions that you might have in your head that 887 00:47:54,040 --> 00:47:56,799 Speaker 1: are not true. They point out that, you know, 888 00:47:56,840 --> 00:48:01,080 Speaker 1: try as we might, misconceptions are impossible to avoid, and uh, 889 00:48:01,719 --> 00:48:05,320 Speaker 1: your best hope, if you can't avoid hearing misconceptions altogether, 890 00:48:05,400 --> 00:48:08,200 Speaker 1: which again is probably impossible, uh, is to have them 891 00:48:08,239 --> 00:48:12,040 Speaker 1: immediately corrected.
But that would be difficult. Like, you'd have 892 00:48:12,080 --> 00:48:16,160 Speaker 1: to have, like, a standing conversation with somebody who would 893 00:48:16,160 --> 00:48:19,040 Speaker 1: not fall for your misconception, you know, or you'd have 894 00:48:19,080 --> 00:48:22,920 Speaker 1: to just be constantly, uh, like, with paranoia, just 895 00:48:23,200 --> 00:48:26,600 Speaker 1: fact checking everything you come across. Otherwise, some of them 896 00:48:26,600 --> 00:48:29,680 Speaker 1: are going to get past your guard and they're 897 00:48:29,719 --> 00:48:32,640 Speaker 1: not going to be instantly corrected, and then they're just 898 00:48:32,719 --> 00:48:34,920 Speaker 1: kind of in there. Like, even if 899 00:48:34,920 --> 00:48:37,799 Speaker 1: you hear otherwise later, you might still fall back to 900 00:48:37,880 --> 00:48:41,000 Speaker 1: the earlier misconception. Yeah, or it's something 901 00:48:41,040 --> 00:48:43,319 Speaker 1: that doesn't come up in daily life, you know, so 902 00:48:43,360 --> 00:48:45,239 Speaker 1: there's just never been an opportunity for it to 903 00:48:45,239 --> 00:48:49,240 Speaker 1: be corrected.
I'm reminded of that episode of This American 904 00:48:49,280 --> 00:48:52,000 Speaker 1: Life where they started off by talking about this 905 00:48:52,120 --> 00:48:55,640 Speaker 1: particular individual who had just grown up thinking 906 00:48:55,680 --> 00:48:59,160 Speaker 1: that unicorns existed, like it had just never been corrected, 907 00:48:59,560 --> 00:49:02,360 Speaker 1: and so she just had that misconception in her head until 908 00:49:02,400 --> 00:49:05,719 Speaker 1: finally she's at a party and they're in a conversation, 909 00:49:05,800 --> 00:49:08,200 Speaker 1: like, just random chatter about, hey, what are your 910 00:49:08,200 --> 00:49:11,240 Speaker 1: favorite animals or something, and she mentions the unicorn, 911 00:49:11,280 --> 00:49:15,080 Speaker 1: and there's, like, this awkward silence. So why would that 912 00:49:15,120 --> 00:49:18,120 Speaker 1: be all that awkward? I mean, would she not like the unicorn, 913 00:49:18,200 --> 00:49:20,840 Speaker 1: which is real? Well, I think, 914 00:49:20,920 --> 00:49:23,480 Speaker 1: if I'm remembering it correctly, it was this.
There's 915 00:49:23,480 --> 00:49:25,480 Speaker 1: a certain bit of ambiguity where people are like, is 916 00:49:25,520 --> 00:49:28,359 Speaker 1: she joking? Or, oh my goodness, she's not joking, she really thinks that. 917 00:49:29,239 --> 00:49:31,280 Speaker 1: But it also makes all of us, I think, wonder: 918 00:49:31,640 --> 00:49:35,680 Speaker 1: what misconceptions do we have just rattling around 919 00:49:35,719 --> 00:49:38,239 Speaker 1: in our brains right now? We have no idea, but 920 00:49:38,280 --> 00:49:40,440 Speaker 1: they're just ready to go at any moment, you know. 921 00:49:40,520 --> 00:49:43,279 Speaker 1: They can be loaded into the torpedo tube of conversation 922 00:49:43,400 --> 00:49:47,440 Speaker 1: or podcasting or the next job interview, just 923 00:49:47,480 --> 00:49:49,560 Speaker 1: ready to go, and you have no idea. I'd say 924 00:49:49,560 --> 00:49:51,719 Speaker 1: one of the most common edits I have to make 925 00:49:51,719 --> 00:49:54,080 Speaker 1: to this show before we release it is I realize 926 00:49:54,120 --> 00:49:56,439 Speaker 1: that I just sort of said something that I knew 927 00:49:56,560 --> 00:49:59,200 Speaker 1: was true, and then later I'm listening back to it, 928 00:49:59,239 --> 00:50:02,640 Speaker 1: and I'm like, wait a minute, I don't think that's right. Yeah, yeah, 929 00:50:02,640 --> 00:50:05,440 Speaker 1: I've definitely done that before. Well, I mean, 930 00:50:05,440 --> 00:50:07,360 Speaker 1: when I said it, I wasn't even wondering, you know, 931 00:50:07,760 --> 00:50:11,000 Speaker 1: just now.
Now, the authors here touch on, 932 00:50:11,040 --> 00:50:13,520 Speaker 1: of course, the fact that prior knowledge seems like 933 00:50:13,560 --> 00:50:15,839 Speaker 1: it should be able to protect us, uh, you know, 934 00:50:16,080 --> 00:50:18,880 Speaker 1: and yet, quote, surprisingly, the effects of exposure to 935 00:50:19,040 --> 00:50:21,840 Speaker 1: misconceptions are not limited to cases where people are ignorant 936 00:50:21,840 --> 00:50:24,080 Speaker 1: of the true state of the world. We touched on 937 00:50:24,120 --> 00:50:27,680 Speaker 1: that already. Um, another great example they bring out 938 00:50:27,840 --> 00:50:32,280 Speaker 1: is: a plane crashed, where did they bury the survivors? Okay, 939 00:50:32,320 --> 00:50:35,360 Speaker 1: which, you know, obviously you're not going to bury survivors. 940 00:50:35,400 --> 00:50:37,640 Speaker 1: You're going to bury the dead. But again, this 941 00:50:37,719 --> 00:50:39,680 Speaker 1: is another question where you've kind of filled in all 942 00:50:39,680 --> 00:50:42,640 Speaker 1: the blanks, you know, by the time 943 00:50:42,680 --> 00:50:45,840 Speaker 1: "survivors" arrives, it's the last word in the sentence, and 944 00:50:45,880 --> 00:50:48,120 Speaker 1: you fall for it, right? So it's not like you 945 00:50:48,239 --> 00:50:50,600 Speaker 1: think that the survivors get buried, but you could be 946 00:50:50,640 --> 00:50:53,120 Speaker 1: trying to answer the question just because, like, that's gone 947 00:50:53,160 --> 00:50:56,279 Speaker 1: straight past you. Yeah. And they really drive home in 948 00:50:56,280 --> 00:50:59,320 Speaker 1: this that knowledge neglect isn't just a momentary lapse in memory, 949 00:50:59,360 --> 00:51:02,600 Speaker 1: but rather something with real consequences for memory.
If you 950 00:51:02,600 --> 00:51:06,320 Speaker 1: don't recognize the error, the error can become encoded into 951 00:51:06,320 --> 00:51:10,560 Speaker 1: your memory, into your worldview, as fact. And because 952 00:51:10,600 --> 00:51:15,440 Speaker 1: that error was recently encountered, it's more easily accessed. So 953 00:51:15,560 --> 00:51:18,040 Speaker 1: again we have to remember that items in our memory 954 00:51:18,040 --> 00:51:21,120 Speaker 1: are not made of stone, they're made of clay. Merely 955 00:51:21,160 --> 00:51:25,000 Speaker 1: accessing them can change them. And our most accessed memories, 956 00:51:25,200 --> 00:51:27,920 Speaker 1: the most changed memories of all, are the ones 957 00:51:28,000 --> 00:51:30,520 Speaker 1: we can trust the least. Um, so an error that 958 00:51:30,560 --> 00:51:33,640 Speaker 1: pops to mind quickly is more likely to be thought 959 00:51:33,640 --> 00:51:36,960 Speaker 1: of as fact, not, oh, I heard once that X, 960 00:51:37,040 --> 00:51:39,400 Speaker 1: I'm not sure about X, but I think X, but 961 00:51:39,520 --> 00:51:42,160 Speaker 1: rather just, X is true, X is the answer. Yes, 962 00:51:42,239 --> 00:51:44,040 Speaker 1: so I guess this is connecting back to 963 00:51:44,280 --> 00:51:46,839 Speaker 1: that finding we talked about earlier, that, you know, um, 964 00:51:47,200 --> 00:51:52,680 Speaker 1: even against your existing prior knowledge, like, misconceptions or 965 00:51:52,760 --> 00:51:55,480 Speaker 1: errors that get by you unnoticed in one of these 966 00:51:55,520 --> 00:51:59,080 Speaker 1: Moses illusion type sentences can later damage your ability to 967 00:51:59,120 --> 00:52:03,000 Speaker 1: remember the actual fact of that sentence correctly.
Um, it 968 00:52:03,040 --> 00:52:06,480 Speaker 1: can undermine your knowledge that it was in fact Noah, potentially. 969 00:52:06,960 --> 00:52:09,200 Speaker 1: And this makes me think about the broader phenomenon of 970 00:52:10,239 --> 00:52:12,279 Speaker 1: people who are really trying to argue a point will 971 00:52:12,360 --> 00:52:18,120 Speaker 1: often structure sentences to try to get something past you 972 00:52:18,280 --> 00:52:21,640 Speaker 1: really quickly in the non-pivotal part of the sentence. 973 00:52:21,719 --> 00:52:23,920 Speaker 1: It's almost like we have an intuitive grasp of the 974 00:52:24,239 --> 00:52:28,360 Speaker 1: Moses illusion type thing, where, like, I don't know, 975 00:52:28,400 --> 00:52:30,960 Speaker 1: you see people, like, arguing about politics on TV 976 00:52:31,160 --> 00:52:33,640 Speaker 1: or something, and, like, one person will pose a 977 00:52:33,719 --> 00:52:37,520 Speaker 1: question to the other person, and the pivotal part 978 00:52:37,520 --> 00:52:39,880 Speaker 1: of the sentence, the part that's supposed to be in dispute, 979 00:52:40,000 --> 00:52:42,279 Speaker 1: maybe is one part of the sentence, but then in 980 00:52:42,320 --> 00:52:44,799 Speaker 1: a different part of the sentence, there's also, like, a 981 00:52:44,840 --> 00:52:48,520 Speaker 1: disputable claim that's just, like, shoved in there and goes 982 00:52:48,560 --> 00:52:51,120 Speaker 1: by real quick. Right, right. Yeah, if you end up 983 00:52:51,120 --> 00:52:54,279 Speaker 1: with a statement that has some mistruths sort of 984 00:52:54,320 --> 00:52:57,399 Speaker 1: sprinkled in there, that are not key to, like, the 985 00:52:57,440 --> 00:53:00,560 Speaker 1: main, you know, talking point, or even the main untruth, 986 00:53:00,680 --> 00:53:03,200 Speaker 1: you know, that can often be the nefarious 987 00:53:03,239 --> 00:53:07,640 Speaker 1: thing too.
It's like you catch the larger misconception 988 00:53:07,840 --> 00:53:09,759 Speaker 1: or lie in the statement, but then there are other 989 00:53:09,800 --> 00:53:12,000 Speaker 1: lies in there that you're not paying attention to because 990 00:53:12,000 --> 00:53:13,960 Speaker 1: of the big one. Now, the authors here point 991 00:53:14,000 --> 00:53:16,680 Speaker 1: out that improved monitoring can help. You know, this is 992 00:53:16,680 --> 00:53:18,880 Speaker 1: stuff like we were talking about, like putting things into a 993 00:53:18,880 --> 00:53:23,719 Speaker 1: different font, etcetera. But drawing attention to errors can 994 00:53:23,800 --> 00:53:28,399 Speaker 1: have the opposite effect, increasing suggestibility, which is weird. 995 00:53:28,440 --> 00:53:32,080 Speaker 1: They refer to it as an ironic effect. Plus, many 996 00:53:32,120 --> 00:53:36,239 Speaker 1: manipulations designed to promote monitoring may actually fail to do so, 997 00:53:36,280 --> 00:53:39,160 Speaker 1: and they say it's difficult to predict which manipulations will 998 00:53:39,200 --> 00:53:42,279 Speaker 1: actually work. So again, there's no one 999 00:53:42,360 --> 00:53:44,480 Speaker 1: guide, like, here are the three steps you need to 1000 00:53:44,520 --> 00:53:47,960 Speaker 1: take to keep this misinformation from leaking into 1001 00:53:48,000 --> 00:53:49,920 Speaker 1: your brain. I think a lot of what I take 1002 00:53:49,960 --> 00:53:52,960 Speaker 1: away from this is that, I don't know, being 1003 00:53:53,040 --> 00:53:57,680 Speaker 1: well informed is an ongoing process that lasts your entire life. 1004 00:53:57,680 --> 00:54:00,040 Speaker 1: And it's not a question of just getting the 1005 00:54:00,120 --> 00:54:03,120 Speaker 1: right facts in the bank one time and then you're set, 1006 00:54:03,440 --> 00:54:07,200 Speaker 1: you know.
Yeah, there's a lot of upkeep involved and 1007 00:54:07,360 --> 00:54:10,280 Speaker 1: a lot of just continual pruning, and not just new weeds, 1008 00:54:10,320 --> 00:54:13,279 Speaker 1: weeds that have been in there your whole life sometimes, 1009 00:54:13,400 --> 00:54:17,399 Speaker 1: or so it seems, right? They happen. At the very least, yeah, 1010 00:54:17,480 --> 00:54:20,120 Speaker 1: the authors also drive home that ultimately we 1011 00:54:20,160 --> 00:54:23,200 Speaker 1: know a lot more about how people come to misremember 1012 00:54:23,280 --> 00:54:28,680 Speaker 1: events versus misremember facts, especially when the 1013 00:54:28,800 --> 00:54:32,880 Speaker 1: errors involved contradict stored knowledge. So, you know, 1014 00:54:32,920 --> 00:54:35,880 Speaker 1: again we get into the complexity of memory, the different 1015 00:54:35,880 --> 00:54:39,120 Speaker 1: types of memory that we have going on in the brain, 1016 00:54:39,200 --> 00:54:41,600 Speaker 1: and we still have a lot more to learn 1017 00:54:41,680 --> 00:54:45,400 Speaker 1: about just how this all comes together. Yeah. Now, you know, 1018 00:54:45,440 --> 00:54:47,440 Speaker 1: here's a question that comes to mind. I wonder 1019 00:54:47,440 --> 00:54:51,040 Speaker 1: if anyone has constructed a Moses illusion statement using Bilbo 1020 00:54:51,120 --> 00:54:55,600 Speaker 1: and Frodo. Oh, yes, that might work. So, like, 1021 00:54:55,719 --> 00:54:59,680 Speaker 1: what was Bilbo carrying into the fires of Mount Doom? Yeah, 1022 00:54:59,760 --> 00:55:01,160 Speaker 1: that sort of thing. I don't know, of course, I 1023 00:55:01,200 --> 00:55:03,319 Speaker 1: guess you would want to.
You'd want to try and 1024 00:55:03,360 --> 00:55:05,480 Speaker 1: construct it right so that you get Bilbo there at 1025 00:55:05,480 --> 00:55:08,160 Speaker 1: the very end or Frodo at the very end, depending 1026 00:55:08,200 --> 00:55:11,520 Speaker 1: on how you're messing around with it. 1027 00:55:12,400 --> 00:55:15,520 Speaker 1: Who was the dragon whose lair was infiltrated 1028 00:55:15,560 --> 00:55:18,560 Speaker 1: by Frodo Baggins? Yeah, yeah, that sort of thing, that 1029 00:55:18,640 --> 00:55:21,279 Speaker 1: might work. Yeah, I'd say Bilbo and Frodo are even 1030 00:55:21,280 --> 00:55:24,080 Speaker 1: closer together than Noah and Moses. Yeah, I mean they 1031 00:55:24,120 --> 00:55:28,960 Speaker 1: certainly are; they actually overlap, as opposed to being 1032 00:55:29,000 --> 00:55:33,360 Speaker 1: separated by long stretches of time. Very similar characters, 1033 00:55:33,400 --> 00:55:39,440 Speaker 1: actually related, right? They are related? Yeah. What, uncle? 1034 00:55:39,600 --> 00:55:43,480 Speaker 1: Great uncle? Uncle? I always forget what happened to Frodo's parents. 1035 00:55:43,560 --> 00:55:47,239 Speaker 1: I've read it and I still forget it. I'm gonna say uncle. 1036 00:55:47,280 --> 00:55:50,640 Speaker 1: All the Hobbits are cousins. Yeah, they're all related. Actually, yes. 1037 00:55:51,960 --> 00:55:54,000 Speaker 1: All right, well, there you have it. We'd love to 1038 00:55:54,040 --> 00:55:55,840 Speaker 1: hear from everybody about this, because, of course, this just 1039 00:55:55,920 --> 00:55:58,120 Speaker 1: touches on how our brains work, how 1040 00:55:58,239 --> 00:56:01,759 Speaker 1: our brains work with new information, be it 1041 00:56:02,120 --> 00:56:06,640 Speaker 1: accurate or a misconception. So I 1042 00:56:06,640 --> 00:56:09,600 Speaker 1: think everybody out there has something to share.
Which of 1043 00:56:09,640 --> 00:56:13,520 Speaker 1: these Moses illusions worked the most on you? Which ones 1044 00:56:13,520 --> 00:56:15,799 Speaker 1: have worked on you in the past? We'd love 1045 00:56:15,840 --> 00:56:18,200 Speaker 1: to hear from you. All right. If you want to 1046 00:56:18,239 --> 00:56:20,200 Speaker 1: check out other episodes of Stuff to Blow Your Mind, 1047 00:56:20,480 --> 00:56:22,120 Speaker 1: you know where to find it. You can find the 1048 00:56:22,120 --> 00:56:24,160 Speaker 1: Stuff to Blow Your Mind podcast feed wherever you get 1049 00:56:24,200 --> 00:56:27,080 Speaker 1: your podcasts, and we'll have core episodes of Stuff 1050 00:56:27,080 --> 00:56:29,520 Speaker 1: to Blow Your Mind on Tuesdays and Thursdays. You've got 1051 00:56:29,560 --> 00:56:32,080 Speaker 1: listener mail on Mondays, we've got the 1052 00:56:32,160 --> 00:56:35,280 Speaker 1: Artifact on Wednesdays, you've got Weird House Cinema on Fridays, 1053 00:56:35,280 --> 00:56:38,480 Speaker 1: and a vault episode on the weekends. Huge thanks, as 1054 00:56:38,480 --> 00:56:41,920 Speaker 1: always, to our excellent audio producer Seth Nicholas Johnson. If 1055 00:56:41,920 --> 00:56:43,600 Speaker 1: you would like to get in touch with us with 1056 00:56:43,719 --> 00:56:46,360 Speaker 1: feedback on this episode or any other, to suggest a 1057 00:56:46,440 --> 00:56:48,480 Speaker 1: topic for the future, or just to say hello, you 1058 00:56:48,480 --> 00:56:51,080 Speaker 1: can email us at contact at Stuff to Blow Your 1059 00:56:51,080 --> 00:57:01,200 Speaker 1: Mind dot com. Stuff to Blow Your Mind is a production 1060 00:57:01,280 --> 00:57:04,000 Speaker 1: of iHeartRadio. For more podcasts from iHeartRadio, 1061 00:57:04,200 --> 00:57:06,560 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or 1062 00:57:06,560 --> 00:57:20,880 Speaker 1: wherever you listen to your favorite shows.