Hey, welcome to Stuff to Blow Your Mind. This is Robert Lamb and I'm Joe McCormick, and today we're bringing you an episode from the Vault. This one was about a psychological effect called the Moses illusion. It originally published in February. All right, let's jump right in.

Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb and I'm Joe McCormick, and today we're going to be talking about an interesting observation in cognitive psychology that deals with language, one that starts off as kind of just a funny little quirk about the way we process certain kinds of sentences but ends up having some broader and more interesting implications about knowledge, language, and thought. The best way to start here would just be to illustrate the prime example of the effect we're going to be talking about, and to do that, I think we need to do a bit of Bible trivia. Rob, are you ready to go to Sunday School?

Let's do it. Let's go to Sunday School.

Okay. I'm gonna ask you a few questions about the Bible.
If you get one wrong, you are going to get a paddling.

Whoa, what denomination? Is this one of the ones that means business?

All right. Okay, so how about, let's see... um, in the Garden of Eden, what type of animal is it that tempts Eve to eat from the tree?

Oh, that's a snake.

That's right, the serpent it is. Okay. After God created the world, on which day of the week did he rest?

Oh, that was the seventh day.

You got that one right. Okay, next one: how many animals of each kind did Moses take on the ark?

Oh, two, of course.

And there you go. That is the prime example. Now, Rob, I know you were playing along, because you already know the trick in here, in what I actually said when I asked that question. Hopefully you were playing along at home as you're listening, or maybe you're not at home, wherever the heck you are. You may have thought the same thing, right? Moses took two of each animal on the ark.
But in fact, in the Bible story, which maybe not everybody knows, but maybe you do know this story of the ark in the Book of Genesis, and you do in fact know that it was not Moses who did that. It was Noah in the story who took animals on the ark. And yet you thought, after I said the question, that the answer is two, and didn't even register the fact that the name was wrong.

Yeah, it's an interesting phenomenon to encounter, you know, in others, but also in yourself. There's several different ways to look at it, and we'll get into a number of these here. But even just now when you asked me those questions, like the serpent one, I'm totally firm on that. I know that aspect of the story inside and out. And of course I know it's the seventh day that he rested on, that God rested on, he, she, it, however you want to look at it. But there was still, like, this moment of hesitation, because I was like, it's seven, right? It is seven. I don't want to come off with the wrong answer on the podcast.
And granted, I already knew the answer to the third one, but there is this temptation too, like, when you know the answer to something, you can just jump in without hesitation. There's a certainty that just propels you. You're excited to get your answer in and then, you know, get the acclaim and the praise for getting it right.

Yeah, there's a certain kind of way in which a question, especially a question posed in quiz format, where you feel you are under performance pressure and you're being evaluated for whether or not you're going to get the right answer, it sort of takes away some amount of critical thinking that would normally go into reading a sentence, and causes you to focus more exclusively on, like, just, can I get the right answer?
And so, and this of course might not be the only explanation for why this is happening, but it's not hard to see why you could pretty easily miss a major error in a question about something that is not necessarily something you're fuzzy on to begin with. Like, you could know perfectly well that it's Noah in the story, and yet it just goes completely over your head.

Yeah, and you know, we've been doing this podcast quite a while at this point, and occasionally this comes up in our day-to-day, not so much in things that we've researched for the podcast, because I feel like if we've been crunching the facts or the numbers, we're more likely to be putting a lot of thought into the situation, and we're maybe just, you know, a little hesitant anyway. But the times where I've personally said something that was absolutely incorrect, it would be something that I felt so sure about that I just belted it out without fact-checking it at all.
You know, generally it's something not directly related to the episode, but something that just kind of comes up in organic conversation.

That's exactly right. Yeah, it's when you feel so confident that you're not even being careful that you can really make some big blunders. There were some other questions I was reading about in, I think, the earliest study on this phenomenon we're talking about today. Some of the other questions were: in the biblical story, what is Joshua swallowed by? Of course, that's Jonah who is swallowed by the whale, or the great fish, the sea monster. Joshua, of course, is the conquering leader of the Israelites as they go about Canaan. Another one I really liked was: in the novel Moby Dick, what color was the whale that Captain Nemo was after?

I think I might have fallen for that one.
Yeah, I mean, I wonder how much of the ego is involved here, because it's like, you're kind of like, all right, let's get to the part of this where I get to talk and get to be the one who is correct. Like, let's fast-forward through all this other stuff. I don't care. I have an answer, and it is the correct one.

Yeah. That's quite perceptive, and I think that's right. But anyway, so this question that we're looking at today, this effect of not noticing that the question says Moses and just barreling right on through to the answer, even if you know that it's actually Noah in the story and not Moses, this effect has a name, and it's known as the Moses illusion. It's a particular type of semantic illusion that occurs when we are trying to process certain kinds of sentences, and this was first explored in a classic study in psychology. It was a study called "From Words to Meaning: A Semantic Illusion," published in the Journal of Verbal Learning and Verbal Behavior in nineteen eighty-one by Thomas D. Erickson and Mark E. Mattson.
And I think it's interesting that this original observation, this question about Moses, comes out of a mysterious question about how we process the meaning of sentences. The authors of this study ask, quote, "How are the meanings of individual words combined to form a more global description of meaning?" And if you start to think hard about this question about the human capacity for language, I would argue it is absolutely astonishing. It's almost baffling the way that we're not only able to associate symbolic meaning with certain sounds coming out of our mouths or glyphs on a page, but we're able to combine those things endlessly to form and comprehend infinite variations of combinations of those sounds, to create sentences that actually mean something, and other people can understand what you mean when you say them. Like, I think this type of capacity for language is one of the features of the natural world that to me seems closest to magic.
Yeah, absolutely. And I feel like the Moses illusion is one of those things that reveals the magic, that makes you more aware of the magic trick that is inherent to your everyday perception of reality and how you engage with facts and information. It's crazy that we're just constantly throwing together sentences, almost effortlessly, that are combining all these words together. Each word has a huge range of possible meanings and associations, and we are able to do this with such fluency, I mean, sometimes with more fluency than other times. But yeah, it is truly astounding to me.

And so the authors here are sort of talking about this process and some of the question marks that existed at the time in science about how we form sentences and how we comprehend sentences. So they start in their introduction by talking about how, quote, "a central process in language comprehension is the construction of a global description of the sentence meaning from the meanings of individual words which make up the sentence."
Right. So you know what individual words mean, but somehow, like we were just talking about, you can combine them into these overall gist forms of what somebody is getting at. You know, like, what kind of answer is being requested by a question that might be made up of ten different words that are all, you know, throwing your brain in ten different directions, yet you can get the gist of the question and figure out what it is getting at pretty quickly, actually.

And they talk about how there's been a lot of work on how language processing works in the realm of artificial intelligence, but at the time of this paper, there was still a lot that we didn't know about the global meaning of a sentence and how that's constructed in the brain. And so they summarize the way they're starting this paper by saying, quote, it has become widely assumed that sentences are subject to exhaustive analysis and consistency checks during processing, but this is not the case. People do not always understand what is said to them. Sometimes they fail to understand, sometimes they misunderstand.
And while these failures of comprehension are sometimes due to lack of appropriate knowledge or error on the part of the speaker, there are other cases in which such failures occur when the understander possesses all the knowledge necessary for correct understanding. This paper explores such a phenomenon. And then they give the example of the Moses illusion that we already talked about. The question that they pose is: how many animals of each kind did Moses take on the ark? And so what the authors here found in their original study in eighty-one was that the majority of people failed to notice a problem with the question and simply answered "two," despite later displaying knowledge that it was in fact Noah in the story and not Moses. So it's not that they just don't know that much about the Bible. They can answer the question correctly when it's posed like, hey, was it Noah or Moses who took animals onto the ark? They can answer that correctly and yet still fail to notice a problem in the question. And studies find that people do this even when they're not rushed.
They still make the mistake when they are given unlimited time to think about it. Another interesting thing here they found was that the effect is not caused by people misreading or mishearing the question, because people still make the Moses illusion mistake even if they themselves read the question out loud, including the name Moses. So they are saying Moses with their own lips, and they still might not notice it. Now, in this first study, the authors conclude, and this is very important, because they're getting at things about the semantics of words in a sentence and how the meanings of sentences are formed, they conclude that shared semantic features of the mix-up are probably significantly contributing to the effect. In other words, this effect would probably not be nearly as pronounced, maybe even totally nonexistent, if the items were not in some way closely related in the way that the two Bible characters are. If you ask, you know, how many of each kind did Captain Hook take onto the ark, the effect probably vanishes.
Another study I was looking at cited an example I found really funny, which was: how many animals of each kind did Nixon take on the ark?

Yeah, and I like that, because they were saying, okay, well, what if it's just, like, phonological similarities? Nixon and Noah have some similarities. They start with the same sound, they've got the same number of syllables. But clearly, when you put Nixon in the sentence, people notice. And so the Moses illusion is just one persistent example from a class of mental phenomena that could be called knowledge neglect. This is a term used by a couple of authors that we'll cite later in the episode. But knowledge neglect, in simplified terms, is when you behave as if you don't know something even though you definitely do know it. And the Moses illusion is of course an example of knowledge neglect, because the problem isn't that people think Moses was the biblical character who built the ark. You can know that it was Noah, not Moses.
If you're asked directly, you'll get the answer right, but you don't notice the problem when it's phrased in a question like this. And of course it's not just Moses and Noah. There are plenty of other sentences in studies that have shown the same thing, though it is interesting that Moses and Noah are sort of the perfect example of it. I think there might be particular characteristics of these two names and characters that make people especially prone to the mix-up in this case, though it is true for lots of other types of, you know, words and objects.

Well, speaking of that, let's do a quick breakdown, especially for folks who are not that up on Moses and Noah, just to give a little basic information about each of them. Give me the Magic: The Gathering card on each one.

Okay, let's start with Noah, certainly the older of the two, the first in chronological order. So Noah is written as an antediluvian patriarch in Jewish, Christian, and Islamic traditions.
The basic story: God grows sick of humanity, so he tells Noah to round up his family and two of every animal and get them on a big old boat, the ark, the first of two arks we're going to discuss here, so they alone can survive the global flood that's about to happen.

Yeah. Now, one interesting variation, and I think most people probably wouldn't even, their brains wouldn't go this far into the question, it is actually more complicated than two of every kind, because it also says in the Noah's ark story that I think they're supposed to bring more of certain types of animals, like certain clean animals, and just two of the unclean animals or something. But yeah, it gets a little more complicated.

Right, I mean, it's all kinds of animal management.

Yeah, and I would love to see somebody fail the test of the Noah illusion, the Moses illusion, here by going into a lot of detail about the, you know, the actual biblical text while still failing. I think that's right. Well, it was fourteen of every kind of clean animal! Alright.
Well, anyway, Noah. Strengths: megaproject management and animal handling, obviously. Weakness: alcoholism. That's a major part of the story, right? Actors of note who have portrayed him, and this is not a complete list, but these are the main ones: John Huston, Russell Crowe, David Threlfall, this is the guy on Shameless, he also played Dr. John Dee in Elizabeth: The Golden Age, Jon Voight, and David Rintoul. David Rintoul is the guy who played Aerys Targaryen on the Game of Thrones show.

Oh, interesting. Wait, Aerys, the Mad King?

I believe so. That's the main Aerys, right?

Okay, yeah. Well, I guess for some reason I thought there was another one. I am wrong.

Okay, so I've got a really funny story about Jon Voight playing Noah. I remember seeing this one. It was made for TV, I think it came out when I was in like middle school, and it is not at all faithful to the Bible. It's a, let's say, very Hollywooded-up version of the Noah's Ark story. Jon Voight does play Noah, and the ark is attacked by pirates.

What?

Yeah, it's attacked by, like, Waterworld pirates.
I mean, it might as well be Dennis Hopper and the Smokers, but it's actually, I think they get attacked by pirates led by the biblical character Lot.

Okay, alright. Well, if Lot is in the Bible, at least they're playing around with it. Was this brought up at all when Darren Aronofsky was being criticized for the plot of his Noah movie, which has, like, giants and Nephilim in it?

Oh, I kind of liked his Noah movie. It was way more faithful to... I think it included stuff from non-canonical ancient texts, but it was actually inspired by ancient texts.

Okay, alright. I still haven't seen it. It's been on the list for a while. All right, let's talk about Moses real quick.

Okay, so Moses comes later. He's an Old Testament prophet, a central figure in the narrative of the Exodus. In the account, he helps the Jewish people in their liberation from Egyptian captivity, and following the Ten Plagues of Egypt, he assists them in the Exodus. And he also is involved with an ark, but it's the Ark of the Covenant, which we've discussed on the show before.
Not a boat, but a golden vessel that contains sacred items.

Yeah. I would assume that the words are related, because they're both, like, a container of a kind, like a big box.

Okay. So Moses. His strengths: community organizing, of course, and sorcery. His weaknesses, this is kind of interesting, I guess, because it's either not obeying God in everything or obeying God in everything, depending on who you ask.

Right. I mean, if you ask God, he would say, well, he didn't obey me in everything. That's why he didn't get to go into the Promised Land. But especially modern critics are like, it seems like he maybe followed the letter of the law a little bit too seriously. I seem to recall at one point him commanding the death penalty for a dude who was working on the Sabbath. That seems a little harsh.

Yeah, it seems a little harsh. Okay, so actors of note who have portrayed Moses: well, Charlton Heston, obviously, Burt Lancaster, Mel Brooks, Ben Kingsley, Val Kilmer, though that may have just been a voice role, and Christian Bale.
Now, 337 00:18:10,359 --> 00:18:12,800 Speaker 1: the last one is interesting, because as I was looking 338 00:18:12,800 --> 00:18:14,679 Speaker 1: at these actors, one of the interesting things is, 339 00:18:14,720 --> 00:18:18,959 Speaker 1: even though they're basically interchangeable, like the same, um, you know, 340 00:18:19,240 --> 00:18:21,200 Speaker 1: in most of these cases, you're dealing with the same 341 00:18:21,200 --> 00:18:23,480 Speaker 1: white dude that could play either of these characters in 342 00:18:23,480 --> 00:18:27,840 Speaker 1: a big Hollywood production. Um. But it's interesting that I 343 00:18:27,840 --> 00:18:31,840 Speaker 1: don't think anyone has actually played both Moses and Noah, 344 00:18:32,040 --> 00:18:36,720 Speaker 1: though Christian Bale reportedly came very close, because Darren Aronofsky 345 00:18:36,920 --> 00:18:40,040 Speaker 1: originally wanted Christian Bale to play the title role in 346 00:18:40,160 --> 00:18:44,080 Speaker 1: his Noah film, but scheduling conflicts prohibited that from happening. 347 00:18:44,320 --> 00:18:47,320 Speaker 1: Oh, he couldn't because he was filming, like, Terminator with 348 00:18:47,400 --> 00:18:50,679 Speaker 1: McG or whatever. Yeah. I don't know, but, um, 349 00:18:50,760 --> 00:18:55,000 Speaker 1: imagine if Bale had played both Noah and Moses. 350 00:18:55,080 --> 00:18:57,720 Speaker 1: What would that have meant for the Moses illusion? Would 351 00:18:57,720 --> 00:19:00,200 Speaker 1: it just destroy our 352 00:19:00,200 --> 00:19:04,359 Speaker 1: semantic understanding of reality? Maybe there's a secret council, 353 00:19:04,400 --> 00:19:07,760 Speaker 1: like no Hollywood actor can play both of these 354 00:19:07,840 --> 00:19:11,240 Speaker 1: roles, because it will totally tear our understanding 355 00:19:11,240 --> 00:19:15,080 Speaker 1: of facts and fiction apart. I could see that.
I mean, so, 356 00:19:15,160 --> 00:19:17,200 Speaker 1: I think what some of the authors here are proposing 357 00:19:17,600 --> 00:19:20,560 Speaker 1: is that it's not just that 358 00:19:20,680 --> 00:19:23,640 Speaker 1: Moses and Noah are words that kind of sound similar. 359 00:19:23,640 --> 00:19:26,440 Speaker 1: They've got some similar consonants and, uh, the same 360 00:19:26,520 --> 00:19:29,639 Speaker 1: number of syllables, similar vowel sounds. That's all true, and 361 00:19:29,680 --> 00:19:32,920 Speaker 1: that does seem to matter. But it's also very important 362 00:19:32,960 --> 00:19:36,480 Speaker 1: that they are semantically related, that they are both characters 363 00:19:36,560 --> 00:19:39,800 Speaker 1: from the Torah, from the Old Testament, and that sort 364 00:19:39,840 --> 00:19:41,800 Speaker 1: of links them together. And I think the more you 365 00:19:41,840 --> 00:19:44,679 Speaker 1: could do to link them even further together and associate 366 00:19:44,720 --> 00:19:47,199 Speaker 1: them in our minds, like, yes, having one actor 367 00:19:47,200 --> 00:19:49,600 Speaker 1: play both, I think that would actually probably make people 368 00:19:49,640 --> 00:19:53,800 Speaker 1: even more susceptible. Yeah. Um, I was thinking about this too, 369 00:19:53,840 --> 00:19:56,840 Speaker 1: like obviously we've already touched on a few extra examples 370 00:19:56,840 --> 00:19:58,720 Speaker 1: of this. But I was trying to come up with 371 00:19:58,720 --> 00:20:01,280 Speaker 1: other examples that would play on the same energy here, 372 00:20:01,640 --> 00:20:03,919 Speaker 1: and one that came to mind would be, uh, if 373 00:20:03,960 --> 00:20:05,960 Speaker 1: we were to look to Chinese mythology, if we were 374 00:20:06,000 --> 00:20:08,800 Speaker 1: to say, hey, how did the Yellow Emperor decide how 375 00:20:08,840 --> 00:20:11,280 Speaker 1: to order the animals of the zodiac?
And you might 376 00:20:11,480 --> 00:20:13,960 Speaker 1: respond with, oh, well, there's this cool little story about 377 00:20:14,080 --> 00:20:17,080 Speaker 1: a race for the animals, etcetera. Um, but it wasn't 378 00:20:17,080 --> 00:20:19,119 Speaker 1: the Yellow Emperor. It was the Jade Emperor, who's an 379 00:20:19,160 --> 00:20:23,440 Speaker 1: even more primordial god ruler than the Yellow Emperor. Um. 380 00:20:23,560 --> 00:20:25,200 Speaker 1: So I don't know, that seems like it could 381 00:20:25,560 --> 00:20:29,520 Speaker 1: work in a similar 382 00:20:29,520 --> 00:20:33,520 Speaker 1: way to the Moses and Noah illusion. Or how about 383 00:20:33,560 --> 00:20:35,560 Speaker 1: this: in Return of the Jedi, what was Jango Fett 384 00:20:35,600 --> 00:20:38,480 Speaker 1: swallowed by? Oh, I see. For some reason, I 385 00:20:38,480 --> 00:20:40,560 Speaker 1: feel like that one doesn't work, because 386 00:20:40,600 --> 00:20:42,840 Speaker 1: as soon as you say the word Jango, like, 387 00:20:42,920 --> 00:20:45,520 Speaker 1: people's alarms go off, and, like, wait a minute, what 388 00:20:45,520 --> 00:20:48,280 Speaker 1: are we talking about? Yeah, yeah, yeah. Well, I 389 00:20:48,320 --> 00:20:51,399 Speaker 1: don't know, maybe it would. Okay, here's one for Avatar: The 390 00:20:51,440 --> 00:20:54,479 Speaker 1: Last Airbender fans out there. Um, we're hearing from several 391 00:20:54,520 --> 00:20:58,520 Speaker 1: of them. Which nation was the avatar Appa born into? 392 00:20:59,200 --> 00:21:01,040 Speaker 1: I don't know if that one works or not. But of 393 00:21:01,080 --> 00:21:03,520 Speaker 1: course Aang is the last airbender, and he is the avatar; 394 00:21:03,600 --> 00:21:06,399 Speaker 1: uh, Appa is the sky 395 00:21:06,480 --> 00:21:09,280 Speaker 1: bison that he rides on. Ah, I see. So I 396 00:21:09,280 --> 00:21:12,320 Speaker 1: don't know, Aang, Appa.
Maybe that works, not sure. Well, 397 00:21:12,359 --> 00:21:21,120 Speaker 1: that went over my head anyway. So you might think, well, 398 00:21:21,240 --> 00:21:23,240 Speaker 1: now that we have told you there is such a 399 00:21:23,280 --> 00:21:25,840 Speaker 1: thing as the Moses illusion, you know you would 400 00:21:25,880 --> 00:21:29,280 Speaker 1: never fall for it, right? Because you will 401 00:21:29,359 --> 00:21:32,840 Speaker 1: now, always having this knowledge in your mind, notice when 402 00:21:32,880 --> 00:21:35,640 Speaker 1: there are substitutions of this kind in a question 403 00:21:35,680 --> 00:21:38,440 Speaker 1: or a sentence. But it turns out that's not necessarily true. 404 00:21:38,720 --> 00:21:42,080 Speaker 1: Uh, so there was this original research from nineteen eighty one, 405 00:21:42,119 --> 00:21:44,359 Speaker 1: but there have been a bunch of studies in the 406 00:21:44,359 --> 00:21:48,920 Speaker 1: decades since then replicating the original finding and further probing 407 00:21:49,000 --> 00:21:51,720 Speaker 1: the effect to figure out what's going on in our brains. 408 00:21:52,320 --> 00:21:54,800 Speaker 1: Uh, so I wanted to talk about some typical findings. 409 00:21:55,040 --> 00:21:58,159 Speaker 1: First of all, some things that were summarized 410 00:21:58,280 --> 00:22:00,960 Speaker 1: in a few literature reviews I was looking at. One 411 00:22:01,080 --> 00:22:03,800 Speaker 1: was in a book chapter by Elizabeth J. Marsh and 412 00:22:04,040 --> 00:22:09,360 Speaker 1: Sharda Umanath, in a book called Processing Inaccurate Information, 413 00:22:09,520 --> 00:22:12,879 Speaker 1: published by MIT Press.
Now, that book sounds 414 00:22:12,880 --> 00:22:16,160 Speaker 1: like a scream, but their chapter is called "Knowledge Neglect: 415 00:22:16,240 --> 00:22:19,800 Speaker 1: Failures to Notice Contradictions with Stored Knowledge," and we'll revisit 416 00:22:19,840 --> 00:22:22,480 Speaker 1: this chapter a few times later in the episode. But 417 00:22:22,840 --> 00:22:27,000 Speaker 1: they summarize some things about the Moses illusion. Uh, so 418 00:22:27,160 --> 00:22:29,480 Speaker 1: they say that most of the time people will fall 419 00:22:29,600 --> 00:22:32,439 Speaker 1: for the Moses illusion even though they actually know the 420 00:22:32,480 --> 00:22:36,360 Speaker 1: difference between Moses and Noah, as demonstrated with later interrogation. 421 00:22:36,480 --> 00:22:39,320 Speaker 1: So you can ask people questions like, who built the 422 00:22:39,480 --> 00:22:41,639 Speaker 1: ark, or who took the animals onto the ark, and 423 00:22:41,680 --> 00:22:43,960 Speaker 1: they'll get the answer right, but they still fail to 424 00:22:44,000 --> 00:22:46,479 Speaker 1: notice that it's Moses in the question. And this can 425 00:22:46,520 --> 00:22:50,800 Speaker 1: be accomplished with other similar switcheroos. I actually included, Rob, a 426 00:22:50,880 --> 00:22:53,560 Speaker 1: list for you to look at of questions like this. One 427 00:22:53,600 --> 00:22:57,080 Speaker 1: I like is, um, what did Goldilocks eat at the 428 00:22:57,119 --> 00:22:59,920 Speaker 1: Three Little Pigs' house? And a lot of people will 429 00:23:00,040 --> 00:23:02,640 Speaker 1: answer porridge, even though you can later ask 430 00:23:02,720 --> 00:23:06,040 Speaker 1: them, like, hey, whose house did Goldilocks go into, the 431 00:23:06,040 --> 00:23:08,080 Speaker 1: three bears' or the Three Little Pigs'? And they of 432 00:23:08,080 --> 00:23:10,360 Speaker 1: course know that it was the bears.
Now that one's 433 00:23:10,400 --> 00:23:14,280 Speaker 1: interesting, because for me anyway, there's an associated 434 00:23:14,320 --> 00:23:18,119 Speaker 1: mental image of the bears or the pigs. Uh, they 435 00:23:18,200 --> 00:23:20,960 Speaker 1: look rather different, and ultimately they have 436 00:23:21,000 --> 00:23:25,000 Speaker 1: different functions in the stories, whereas Moses and Noah are 437 00:23:25,080 --> 00:23:27,879 Speaker 1: more interchangeable, the same sort of character 438 00:23:28,000 --> 00:23:30,440 Speaker 1: and of course the same species. Because the pigs 439 00:23:30,480 --> 00:23:32,360 Speaker 1: are there to be the victims of the big bad 440 00:23:32,359 --> 00:23:36,040 Speaker 1: wolf story and to get eaten, and the bears are 441 00:23:36,080 --> 00:23:37,960 Speaker 1: there to, I don't know what, just hang out in 442 00:23:37,960 --> 00:23:40,280 Speaker 1: their house, I guess, right. But I can still imagine 443 00:23:40,280 --> 00:23:43,600 Speaker 1: someone falling for this, or, you know, 444 00:23:44,040 --> 00:23:47,280 Speaker 1: erring in answering this question, because in a way, again, 445 00:23:47,320 --> 00:23:49,119 Speaker 1: you're racing to the finish line. You're picking up 446 00:23:49,160 --> 00:23:52,840 Speaker 1: on, you know, the basics of the question, even 447 00:23:52,840 --> 00:23:56,320 Speaker 1: though you're skipping over this 448 00:23:56,520 --> 00:24:00,240 Speaker 1: misinformation that's embedded in the middle of it. Right. Though 449 00:24:00,240 --> 00:24:02,639 Speaker 1: it's interesting that you mentioned racing to get to the answer.
450 00:24:02,680 --> 00:24:05,359 Speaker 1: I do think you're basically right about that, except it 451 00:24:05,400 --> 00:24:08,479 Speaker 1: doesn't really seem that time is a factor here, because 452 00:24:08,680 --> 00:24:12,240 Speaker 1: giving people extra or even unlimited time to think about 453 00:24:12,240 --> 00:24:16,040 Speaker 1: the question does not eliminate the effect. So 454 00:24:16,080 --> 00:24:18,119 Speaker 1: it doesn't seem to result from people being in a 455 00:24:18,200 --> 00:24:20,680 Speaker 1: hurry in terms of time, though I think you could 456 00:24:20,720 --> 00:24:23,240 Speaker 1: still think about it as people being in a hurry 457 00:24:23,320 --> 00:24:25,800 Speaker 1: in terms of just, like, wanting to get to the 458 00:24:25,840 --> 00:24:28,359 Speaker 1: part where they answer the question. I don't know, maybe 459 00:24:28,359 --> 00:24:30,760 Speaker 1: that could be, like, self-imposed time limits, even if 460 00:24:30,760 --> 00:24:35,840 Speaker 1: they're not imposed by somebody externally trying to rush you through. Now, also, 461 00:24:35,880 --> 00:24:39,200 Speaker 1: in a typical setup for these Moses illusion experiments, readers 462 00:24:39,200 --> 00:24:43,680 Speaker 1: will be warned that some questions will contain incorrect presuppositions, 463 00:24:43,680 --> 00:24:45,639 Speaker 1: so it's not just like a trick question where they 464 00:24:45,640 --> 00:24:48,880 Speaker 1: don't know this is coming. They'll be told, okay, some 465 00:24:48,960 --> 00:24:51,760 Speaker 1: of these questions will be valid questions, in which case 466 00:24:51,840 --> 00:24:55,080 Speaker 1: you should just answer them, but other questions will have 467 00:24:55,280 --> 00:24:58,800 Speaker 1: incorrect presuppositions, and when you come across one of those, 468 00:24:58,880 --> 00:25:01,920 Speaker 1: you should note that the question is not valid.
Now, 469 00:25:02,200 --> 00:25:04,399 Speaker 1: the interesting thing is, I would think something like that 470 00:25:04,440 --> 00:25:07,400 Speaker 1: would almost completely erase the effect, because you're putting people 471 00:25:07,440 --> 00:25:10,760 Speaker 1: on guard to be, like, interrogating the questions. But it doesn't. 472 00:25:11,000 --> 00:25:13,040 Speaker 1: You can put people on guard like that and they 473 00:25:13,080 --> 00:25:17,119 Speaker 1: still fall for the Moses illusion in these experiments. It 474 00:25:17,200 --> 00:25:19,639 Speaker 1: does seem to be a very robust effect, like a 475 00:25:19,720 --> 00:25:23,400 Speaker 1: substantial number of people will fail to detect errors in questions, 476 00:25:23,520 --> 00:25:26,160 Speaker 1: even though they later showed that they possessed the knowledge 477 00:25:26,240 --> 00:25:30,520 Speaker 1: to answer them correctly. Uh, the exact percentages of the effect, 478 00:25:30,600 --> 00:25:33,400 Speaker 1: though, vary a good bit. From that chapter by 479 00:25:33,720 --> 00:25:37,720 Speaker 1: Marsh and Umanath, they write, quote: Overall, the Moses illusion 480 00:25:37,840 --> 00:25:41,760 Speaker 1: is robust, with readers answering from fourteen percent to forty 481 00:25:41,840 --> 00:25:45,679 Speaker 1: percent to fifty two percent to seventy seven percent of 482 00:25:45,760 --> 00:25:49,520 Speaker 1: distorted questions, depending on the particular experiments. So they're citing 483 00:25:49,520 --> 00:25:52,040 Speaker 1: a number of different results there. The fourteen percent was 484 00:25:52,160 --> 00:25:59,160 Speaker 1: by van Jaarsveld, Dijkstra, and Hermans; the forty percent was Hannon and Daneman; 485 00:25:59,240 --> 00:26:04,359 Speaker 1: the fifty two percent was Erickson and Mattson in nineteen eighty one; 486 00:26:04,560 --> 00:26:08,959 Speaker 1: and the seventy seven percent was Barton and Sanford.
And I would imagine 487 00:26:08,960 --> 00:26:11,480 Speaker 1: these differences have a lot to do with, like, 488 00:26:11,480 --> 00:26:15,200 Speaker 1: what exact types of warnings you're giving people ahead of time, 489 00:26:15,240 --> 00:26:18,560 Speaker 1: what exact examples are used. As we've said, 490 00:26:18,720 --> 00:26:22,120 Speaker 1: you know, it's clear that different questions are more 491 00:26:22,240 --> 00:26:24,879 Speaker 1: prone than others. Like, I think more people would probably 492 00:26:24,920 --> 00:26:28,320 Speaker 1: fall for the Moses-Noah confusion than for the Three 493 00:26:28,320 --> 00:26:31,440 Speaker 1: Little Pigs-three bears confusion. Yeah, I have to say, 494 00:26:31,480 --> 00:26:34,040 Speaker 1: some of the examples that you included on a 495 00:26:34,080 --> 00:26:36,200 Speaker 1: list here, it's interesting to run through this, because 496 00:26:36,560 --> 00:26:40,359 Speaker 1: even though I'm not encountering them as actual questions like 497 00:26:40,400 --> 00:26:42,159 Speaker 1: someone in one of these studies would be, 498 00:26:42,760 --> 00:26:44,399 Speaker 1: I can certainly pick up on the ones that I 499 00:26:44,800 --> 00:26:48,679 Speaker 1: feel like would have been more likely to fool me, like, 500 00:26:48,720 --> 00:26:51,720 Speaker 1: for instance, what kind of tree did Lincoln chop down? What 501 00:26:51,840 --> 00:26:55,680 Speaker 1: kind of tree did Washington chop down? Um.
Like, I can 502 00:26:55,720 --> 00:26:58,840 Speaker 1: imagine myself, sort of, this being a story I'm not 503 00:26:59,000 --> 00:27:01,680 Speaker 1: tremendously invested in, but I have a version of it 504 00:27:01,720 --> 00:27:06,080 Speaker 1: stored away I can instantly skip to, or even not instantly, 505 00:27:06,080 --> 00:27:08,000 Speaker 1: but even with some thought, be like, I think, yeah, 506 00:27:08,080 --> 00:27:10,680 Speaker 1: cherry tree, cherry tree, that's the one, you know, even 507 00:27:10,720 --> 00:27:13,600 Speaker 1: if it said Lincoln. Yeah, even if it said Lincoln, because 508 00:27:13,640 --> 00:27:17,959 Speaker 1: also, I don't know, Lincoln... something about, like, there are stories 509 00:27:18,000 --> 00:27:19,680 Speaker 1: about him, you know, we also have sort of tall 510 00:27:19,720 --> 00:27:22,919 Speaker 1: tales about him and his exploits, and, um, 511 00:27:22,960 --> 00:27:26,680 Speaker 1: there's one about him answering a duel. 512 00:27:26,760 --> 00:27:28,879 Speaker 1: Somebody challenged him to a duel and he says, well, 513 00:27:28,880 --> 00:27:31,639 Speaker 1: I get to choose the place and the weapon. So 514 00:27:31,680 --> 00:27:35,400 Speaker 1: I choose, uh, what, sledgehammers and five feet of water 515 00:27:35,520 --> 00:27:38,320 Speaker 1: or something? The idea being that he's tall and the other person 516 00:27:38,440 --> 00:27:39,879 Speaker 1: was short, something like that. I have no idea if 517 00:27:39,920 --> 00:27:42,440 Speaker 1: that's even a legitimate story, but I have it 518 00:27:42,520 --> 00:27:44,879 Speaker 1: in my head. So I have an image of Lincoln 519 00:27:45,280 --> 00:27:49,080 Speaker 1: holding some sort of long-handled tool, so it fits 520 00:27:49,119 --> 00:27:52,639 Speaker 1: nicely into the story, like I can easily overlay 521 00:27:52,680 --> 00:27:56,040 Speaker 1: one over the other. Yeah.
One of the examples that 522 00:27:56,119 --> 00:27:59,119 Speaker 1: I feel extremely confident that I would not fall for 523 00:27:59,440 --> 00:28:01,960 Speaker 1: is the one of, what is the name of the 524 00:28:02,080 --> 00:28:07,199 Speaker 1: Mexican dip made with mashed artichokes? I definitely... I mean, 525 00:28:07,320 --> 00:28:10,040 Speaker 1: I just know, artichokes? No, that is not what it is. 526 00:28:10,480 --> 00:28:13,520 Speaker 1: You don't mash artichokes, do you? I mean, I haven't 527 00:28:13,520 --> 00:28:19,440 Speaker 1: seen it. You could make an artichoke paste, but artichoke guacamole? 528 00:28:19,600 --> 00:28:23,320 Speaker 1: That sounds gross. And yet artichoke dip is amazing, 529 00:28:23,440 --> 00:28:28,959 Speaker 1: but artichoke guacamole just doesn't sound right. But anyway, 530 00:28:29,280 --> 00:28:32,440 Speaker 1: so Marsh and Umanath also note that, um, 531 00:28:32,680 --> 00:28:36,520 Speaker 1: error detection is lower when the items that are 532 00:28:36,520 --> 00:28:39,200 Speaker 1: swapped are similar in a couple of ways. We've already 533 00:28:39,240 --> 00:28:42,440 Speaker 1: mentioned these, but they reiterate that it helps when there's 534 00:28:42,480 --> 00:28:46,040 Speaker 1: phonological similarity. So, do the words sound close to each other? 535 00:28:46,080 --> 00:28:49,720 Speaker 1: I feel like, uh, avocados and artichokes, like, they have 536 00:28:49,960 --> 00:28:52,400 Speaker 1: some similar vowel sounds, and they start with the same letter, 537 00:28:52,520 --> 00:28:55,920 Speaker 1: but they sound different enough to me that I'd notice immediately.
538 00:28:56,000 --> 00:28:58,800 Speaker 1: I think somehow, like, the hard k sound coming towards 539 00:28:58,840 --> 00:29:01,000 Speaker 1: the end of the word artichoke, but coming 540 00:29:01,080 --> 00:29:04,040 Speaker 1: towards the beginning of, or I guess in the middle 541 00:29:04,080 --> 00:29:06,600 Speaker 1: of, avocado. Somehow that makes a big difference in my brain. 542 00:29:08,400 --> 00:29:11,320 Speaker 1: And then, of course, as we've been saying, semantic similarity: 543 00:29:11,360 --> 00:29:14,719 Speaker 1: are the concepts somehow similar or related? Would we put 544 00:29:14,800 --> 00:29:17,080 Speaker 1: them in a kind of meaning nexus together in 545 00:29:17,200 --> 00:29:20,000 Speaker 1: the brain? And of course it's notable that 546 00:29:20,240 --> 00:29:23,160 Speaker 1: the Moses versus Noah one meets both of the criteria. 547 00:29:23,240 --> 00:29:26,520 Speaker 1: They sound similar and they're related. So anyway, it's just 548 00:29:26,680 --> 00:29:30,200 Speaker 1: this interesting fact about our brains, that something about being 549 00:29:30,440 --> 00:29:33,400 Speaker 1: asked a question like this, trying to process a 550 00:29:33,480 --> 00:29:36,720 Speaker 1: sentence like the questions in these studies, causes us to 551 00:29:36,920 --> 00:29:40,160 Speaker 1: ignore the fact that the contents of the sentence conflict 552 00:29:40,240 --> 00:29:42,680 Speaker 1: with things that we know to be true. And I 553 00:29:42,760 --> 00:29:45,520 Speaker 1: wanted to mention one other study I was looking at. 554 00:29:45,760 --> 00:29:50,080 Speaker 1: This one is by H. C. Bottoms, Andrea Eslick, and 555 00:29:50,120 --> 00:29:54,360 Speaker 1: Elizabeth J.
Marsh, published in the journal Memory, called 556 00:29:54,480 --> 00:29:58,080 Speaker 1: "Memory and the Moses Illusion: Failures to Detect Contradictions with 557 00:29:58,160 --> 00:30:02,800 Speaker 1: Stored Knowledge Yield Negative Memorial Consequences." Now, we can revisit 558 00:30:03,160 --> 00:30:04,880 Speaker 1: some of the things in this more as we go on, 559 00:30:05,080 --> 00:30:07,000 Speaker 1: but I just wanted to note a few things that 560 00:30:07,120 --> 00:30:10,080 Speaker 1: they bring up. Uh, so, first of all, they note 561 00:30:10,120 --> 00:30:14,160 Speaker 1: some other previous findings in their introduction. One is that, 562 00:30:14,560 --> 00:30:18,600 Speaker 1: um, error detection improves, so people are less likely to 563 00:30:18,680 --> 00:30:21,920 Speaker 1: fall for the Moses illusion, when the error appears in 564 00:30:22,000 --> 00:30:25,600 Speaker 1: what they call the cleft phrase, or the main focus, 565 00:30:25,760 --> 00:30:28,040 Speaker 1: of the sentence. So there are ways that you can 566 00:30:28,080 --> 00:30:31,080 Speaker 1: basically ask the same question but just sort of rearrange 567 00:30:31,200 --> 00:30:35,560 Speaker 1: the words to make people more likely to notice the problem. So, 568 00:30:35,640 --> 00:30:38,280 Speaker 1: if you take the sentence, how many animals of each 569 00:30:38,400 --> 00:30:41,040 Speaker 1: kind did Moses take on the ark, the word Moses 570 00:30:41,200 --> 00:30:45,280 Speaker 1: is kind of syntactically de-emphasized in that sentence, you know, 571 00:30:45,400 --> 00:30:47,520 Speaker 1: it's not, like, the main focus of the way the 572 00:30:47,600 --> 00:30:50,800 Speaker 1: sentence is phrased. You can reorient the words to 573 00:30:51,000 --> 00:30:53,720 Speaker 1: make Moses more prominent in the sentence, in which case 574 00:30:53,760 --> 00:30:56,720 Speaker 1: people are more likely to catch the problem.
Yeah, like, 575 00:30:56,800 --> 00:30:58,960 Speaker 1: I also feel like having the word show up so 576 00:30:59,120 --> 00:31:02,160 Speaker 1: late in the sentence... like, you're always 577 00:31:02,280 --> 00:31:05,600 Speaker 1: predicting where sentences are going, you know? Yes, so you've 578 00:31:05,640 --> 00:31:07,680 Speaker 1: kind of already filled it in to a certain extent, 579 00:31:07,840 --> 00:31:10,160 Speaker 1: like, you know who we're talking about, uh, 580 00:31:10,280 --> 00:31:13,280 Speaker 1: even if you end up using the wrong name. Um, yeah, 581 00:31:13,480 --> 00:31:16,280 Speaker 1: I think you're exactly right about that. Like, once 582 00:31:16,360 --> 00:31:18,000 Speaker 1: you've heard, I don't know, you get like four or 583 00:31:18,040 --> 00:31:20,280 Speaker 1: five words into the sentence, you sort of 584 00:31:20,400 --> 00:31:22,480 Speaker 1: already know what it's going to be, and 585 00:31:22,640 --> 00:31:25,280 Speaker 1: you're just sort of, like, okay, mostly ignoring 586 00:31:25,360 --> 00:31:28,160 Speaker 1: the words that come after that. Another thing that they 587 00:31:28,240 --> 00:31:32,040 Speaker 1: point out that's interesting is that error detection improves when 588 00:31:32,280 --> 00:31:36,480 Speaker 1: questions appear in a difficult-to-read font. And they 589 00:31:36,520 --> 00:31:40,160 Speaker 1: say this is because it reduces processing fluency, which in 590 00:31:40,240 --> 00:31:44,640 Speaker 1: turn makes material seem less familiar and less true. And 591 00:31:44,720 --> 00:31:47,200 Speaker 1: this was found by Song and Schwarz in two thousand 592 00:31:47,200 --> 00:31:49,400 Speaker 1: and eight. And this, of course, comes back to 593 00:31:49,440 --> 00:31:53,840 Speaker 1: our old friend.
Processing fluency, a cognitive factor that I 594 00:31:53,920 --> 00:31:57,320 Speaker 1: believe is one of the most underappreciated influences on our 595 00:31:57,400 --> 00:32:00,520 Speaker 1: thoughts and beliefs and behavior. We talked about it in 596 00:32:00,560 --> 00:32:04,280 Speaker 1: our episode on the illusory truth effect. Basically, processing fluency 597 00:32:04,360 --> 00:32:08,160 Speaker 1: means, how easy is it for this stimulus to be 598 00:32:08,280 --> 00:32:11,200 Speaker 1: processed by the brain? And it came up 599 00:32:11,240 --> 00:32:13,960 Speaker 1: in the illusory truth effect episode because, remember, the 600 00:32:14,040 --> 00:32:18,280 Speaker 1: illusory truth effect is where statements you've encountered before seem 601 00:32:18,440 --> 00:32:22,080 Speaker 1: more true than statements that are new to you. And 602 00:32:22,280 --> 00:32:26,080 Speaker 1: one possible explanation for this is that familiar statements are 603 00:32:26,160 --> 00:32:29,360 Speaker 1: easier for the brain to process than unfamiliar ones are, 604 00:32:29,720 --> 00:32:33,320 Speaker 1: and at some level, the brain makes an equivalence between 605 00:32:33,480 --> 00:32:37,040 Speaker 1: that processing fluency, how easy it is to process this 606 00:32:37,160 --> 00:32:42,200 Speaker 1: incoming sentence because it's familiar, and factual trustworthiness. They actually 607 00:32:42,280 --> 00:32:44,160 Speaker 1: have nothing to do with one another, but the brain 608 00:32:44,280 --> 00:32:48,280 Speaker 1: maybe uses a little bit of a shortcut there. So are 609 00:32:48,320 --> 00:32:51,520 Speaker 1: you saying that in the future, for our shared notes, Joe, 610 00:32:51,640 --> 00:32:55,000 Speaker 1: we should use Chiller font instead of whatever we're 611 00:32:55,040 --> 00:32:58,080 Speaker 1: using now?
Yeah, would that make it less... 612 00:32:58,240 --> 00:33:00,720 Speaker 1: I mean, I think that would generally slow us down 613 00:33:00,880 --> 00:33:02,840 Speaker 1: and make it harder to do the podcast, but it 614 00:33:02,960 --> 00:33:05,840 Speaker 1: also might make it less likely that we would just, 615 00:33:06,000 --> 00:33:10,240 Speaker 1: like, flub words here and there, because it would be 616 00:33:10,400 --> 00:33:13,960 Speaker 1: a, like, really effortful, laborious process to get through every 617 00:33:14,000 --> 00:33:17,000 Speaker 1: single thought, which, you know, sometimes it is anyway, but 618 00:33:17,160 --> 00:33:20,400 Speaker 1: that's on us. Um, but anyway, so Song and 619 00:33:20,480 --> 00:33:22,560 Speaker 1: Schwarz here in two thousand and eight found that simply 620 00:33:22,640 --> 00:33:25,160 Speaker 1: by making statements harder to read, so you put them in... 621 00:33:25,480 --> 00:33:28,120 Speaker 1: you said Chiller, I was thinking Papyrus. I don't know 622 00:33:28,240 --> 00:33:31,040 Speaker 1: what actual font they used, but it would just 623 00:33:31,160 --> 00:33:34,080 Speaker 1: make people more likely to spot errors in the questions 624 00:33:34,200 --> 00:33:37,200 Speaker 1: instead of just rolling right over them without noticing. And, 625 00:33:37,680 --> 00:33:41,280 Speaker 1: you know, that makes sense to me. Yeah, yeah, it does. 626 00:33:41,360 --> 00:33:44,320 Speaker 1: It is interesting that that's how our brains work, though. Yeah, 627 00:33:44,720 --> 00:33:46,960 Speaker 1: it is sort of counterintuitive at the same time, like, 628 00:33:47,080 --> 00:33:49,880 Speaker 1: you might just assume that if something's harder to read, 629 00:33:50,080 --> 00:33:52,680 Speaker 1: you would be less likely to catch errors in it.
630 00:33:52,800 --> 00:33:54,920 Speaker 1: But yeah, I think there's some kind of process where 631 00:33:54,960 --> 00:33:57,440 Speaker 1: it's like slowing you down. It's not allowing you to 632 00:33:57,600 --> 00:34:01,240 Speaker 1: just, like, skip over the parts that seem like, yeah, yeah, okay, 633 00:34:01,520 --> 00:34:04,040 Speaker 1: Moses, whatever. It's like a bit of food that's 634 00:34:04,080 --> 00:34:06,880 Speaker 1: extra chewy, so you're going to really taste this, you're 635 00:34:06,920 --> 00:34:08,600 Speaker 1: really going to get a feel for the texture. There's 636 00:34:08,600 --> 00:34:11,120 Speaker 1: no just wolfing this down. Yeah. Now, in the 637 00:34:11,200 --> 00:34:13,680 Speaker 1: study by Bottoms et al., they were looking at the 638 00:34:13,760 --> 00:34:17,799 Speaker 1: question of whether participants can detect errors in questions better 639 00:34:18,200 --> 00:34:21,840 Speaker 1: if there are just more errors overall in the sample 640 00:34:21,880 --> 00:34:23,480 Speaker 1: of questions. So, if I give you a bunch of 641 00:34:23,600 --> 00:34:27,040 Speaker 1: questions and, like, I don't know, seventy percent of them contain 642 00:34:27,239 --> 00:34:30,080 Speaker 1: errors of this kind in them, are people more likely 643 00:34:30,200 --> 00:34:32,840 Speaker 1: to catch them? And it looks like the answer is yes. Like, 644 00:34:32,960 --> 00:34:35,200 Speaker 1: if you've got people on guard because there 645 00:34:35,239 --> 00:34:38,799 Speaker 1: were just constantly problems with these questions, their guard goes 646 00:34:38,880 --> 00:34:41,200 Speaker 1: up and they do seem to make the Moses illusion 647 00:34:41,480 --> 00:34:44,439 Speaker 1: mistake less often.
And it strikes me that that could 648 00:34:44,560 --> 00:34:48,440 Speaker 1: be, possibly, or at least partially, because once you start, 649 00:34:48,760 --> 00:34:52,400 Speaker 1: you know, showing people questions where most of them contain 650 00:34:52,480 --> 00:34:55,160 Speaker 1: a problem, or even just a large minority of them 651 00:34:55,239 --> 00:34:59,440 Speaker 1: contain a problem, people probably start, uh, interacting with the 652 00:34:59,520 --> 00:35:03,759 Speaker 1: questions less as questions, becoming less focused on just 653 00:35:03,920 --> 00:35:07,000 Speaker 1: getting the answer, and start looking at them more like 654 00:35:07,120 --> 00:35:09,840 Speaker 1: a puzzle where you're trying to parse the sentence 655 00:35:10,040 --> 00:35:13,000 Speaker 1: very clearly. Yeah, yeah. It's like, how is this trying 656 00:35:13,080 --> 00:35:16,840 Speaker 1: to trick me? Yeah. But then there's one kind of 657 00:35:16,920 --> 00:35:21,440 Speaker 1: scary implication from this paper. The authors write, quote: More generally, 658 00:35:21,560 --> 00:35:25,800 Speaker 1: the failure to detect errors had negative memorial consequences, increasing 659 00:35:25,880 --> 00:35:29,720 Speaker 1: the likelihood that errors were used to answer later general 660 00:35:29,880 --> 00:35:34,640 Speaker 1: knowledge questions. Methodological implications of this finding are discussed, as 661 00:35:34,680 --> 00:35:38,560 Speaker 1: it suggests that typical analyses likely underestimate the size of 662 00:35:38,600 --> 00:35:43,480 Speaker 1: the Moses illusion. Overall, answering distorted questions can yield errors 663 00:35:43,640 --> 00:35:47,480 Speaker 1: in the knowledge base. More importantly, prior knowledge does not 664 00:35:47,680 --> 00:35:52,000 Speaker 1: protect against these negative memorial consequences. And Robert, I think 665 00:35:52,040 --> 00:35:53,360 Speaker 1: you had a note about that.
We can talk a 666 00:35:53,400 --> 00:35:55,720 Speaker 1: little bit more about that in a bit, but yeah, basically, 667 00:35:55,800 --> 00:36:00,320 Speaker 1: there is some evidence that just steamrolling over an 668 00:36:00,440 --> 00:36:04,279 Speaker 1: incorrect fact in a sentence, even when you know otherwise, 669 00:36:04,480 --> 00:36:10,000 Speaker 1: can later damage your ability to recall that fact correctly. Yeah, yeah, 670 00:36:10,080 --> 00:36:13,759 Speaker 1: so yeah, as we'll discuss here, it's 671 00:36:13,760 --> 00:36:16,120 Speaker 1: not just a situation where oh, well this is a quirk. 672 00:36:16,239 --> 00:36:18,279 Speaker 1: This is interesting. The brain does this. I mean, it 673 00:36:18,480 --> 00:36:22,719 Speaker 1: is that, but it has greater implications. Yeah. 674 00:36:23,200 --> 00:36:25,080 Speaker 1: Now I want to go back on the other side 675 00:36:25,120 --> 00:36:28,920 Speaker 1: and say that when we encounter things like this, you know, 676 00:36:29,040 --> 00:36:32,160 Speaker 1: illusions that humans often fall for. When you read about 677 00:36:32,200 --> 00:36:35,719 Speaker 1: a certain type of, I don't know, cognitive bias or 678 00:36:36,000 --> 00:36:39,400 Speaker 1: something, I think our tendency is often to at 679 00:36:39,440 --> 00:36:42,800 Speaker 1: first react like, wow, our dumb brains, we're so stupid. 680 00:36:42,960 --> 00:36:45,680 Speaker 1: But I think there's another way to think about it, 681 00:36:46,400 --> 00:36:50,080 Speaker 1: and that's this. How amazing is it that we have 682 00:36:50,480 --> 00:36:54,279 Speaker 1: such a powerful command of language based reasoning that we 683 00:36:54,360 --> 00:36:59,160 Speaker 1: can answer questions even though key elements of the sentence 684 00:36:59,280 --> 00:37:01,759 Speaker 1: do not match with our knowledge base.
I mean, think 685 00:37:01,760 --> 00:37:05,920 Speaker 1: about the trouble that a computer would run into trying 686 00:37:06,000 --> 00:37:09,400 Speaker 1: to do the same thing. Like, while it's an interesting 687 00:37:09,520 --> 00:37:12,720 Speaker 1: case of an illusion, failing to notice facts that conflict 688 00:37:12,760 --> 00:37:16,440 Speaker 1: with our existing knowledge, it's also a demonstration of an 689 00:37:16,520 --> 00:37:21,520 Speaker 1: absolutely amazing capacity for language comprehension, even when there are 690 00:37:21,680 --> 00:37:24,959 Speaker 1: severe errors in the questions or sentences that we're trying 691 00:37:25,000 --> 00:37:29,160 Speaker 1: to comprehend. Like somehow our brains are so good at 692 00:37:29,239 --> 00:37:32,640 Speaker 1: getting what seems to be the gist, the intended global 693 00:37:32,800 --> 00:37:36,000 Speaker 1: meaning of a sentence, even when pivotal items in that 694 00:37:36,160 --> 00:37:38,960 Speaker 1: sentence are wrong and should be pointing you off in 695 00:37:39,040 --> 00:37:43,319 Speaker 1: the wrong direction and making you totally confused. Yeah, yeah, um, 696 00:37:44,080 --> 00:37:45,560 Speaker 1: you know, I can't help but be reminded in all 697 00:37:45,640 --> 00:37:48,399 Speaker 1: this of the drawing of the bicycle that we've touched 698 00:37:48,480 --> 00:37:50,960 Speaker 1: on before. I mean, it's different. We're not 699 00:37:51,040 --> 00:37:53,800 Speaker 1: dealing with language, we're dealing with, uh, like a 700 00:37:53,880 --> 00:37:56,520 Speaker 1: mental image. Like we all think we have the mental 701 00:37:56,600 --> 00:37:59,080 Speaker 1: image of a bicycle pretty firm in our heads, and 702 00:37:59,200 --> 00:38:01,040 Speaker 1: yet when put to the test, when asked to 703 00:38:01,200 --> 00:38:05,439 Speaker 1: draw a bicycle, um, we're often floored. You know.
Yeah, 704 00:38:05,600 --> 00:38:08,160 Speaker 1: that was a different one of our Cognitive Illusions episodes. 705 00:38:08,239 --> 00:38:10,840 Speaker 1: That was the illusion of explanatory depth. Yeah, the 706 00:38:10,920 --> 00:38:15,000 Speaker 1: issue where people tend to think that they 707 00:38:15,160 --> 00:38:18,440 Speaker 1: understand how something works until they're asked to explain it. 708 00:38:18,840 --> 00:38:21,640 Speaker 1: So somehow the brain has a way of representing a 709 00:38:21,760 --> 00:38:25,640 Speaker 1: sort of Potemkin comprehension, you know, that it puts 710 00:38:25,719 --> 00:38:27,880 Speaker 1: up this facade of yeah, you know how that works. 711 00:38:28,200 --> 00:38:30,000 Speaker 1: I know the parts of 712 00:38:30,040 --> 00:38:32,440 Speaker 1: a bicycle. I know all the parts of a can opener. 713 00:38:32,520 --> 00:38:34,880 Speaker 1: I could make one, basically. But then if you are 714 00:38:34,920 --> 00:38:37,479 Speaker 1: asked to like explain the steps of how it works 715 00:38:37,600 --> 00:38:41,680 Speaker 1: or draw all the parts, you're like, uh, yeah. I 716 00:38:41,760 --> 00:38:44,839 Speaker 1: thought about this a lot watching the Outlander TV show, 717 00:38:44,920 --> 00:38:47,160 Speaker 1: about the time traveler who goes back in time and she's 718 00:38:47,440 --> 00:38:50,120 Speaker 1: recreating various things that she knows about from the future, 719 00:38:50,440 --> 00:38:52,279 Speaker 1: and like, God, how many of us, you know, if we 720 00:38:52,360 --> 00:38:53,640 Speaker 1: were to do that, if we 721 00:38:53,680 --> 00:38:55,920 Speaker 1: were to go back in time, we might tell somebody 722 00:38:55,960 --> 00:38:58,800 Speaker 1: about all these marvelous things like oh yeah, penicillin and 723 00:38:59,400 --> 00:39:01,360 Speaker 1: uh, you know, bicycles and whatnot, and it'd be like, 724 00:39:01,440 --> 00:39:02,719 Speaker 1: oh
great, how did it work? And it'd be like, 725 00:39:02,920 --> 00:39:05,240 Speaker 1: uh, yeah, no idea. I have some vague, 726 00:39:05,680 --> 00:39:07,239 Speaker 1: I have some of the facts, 727 00:39:07,320 --> 00:39:10,520 Speaker 1: but not near enough to reproduce anything that I'm talking about. 728 00:39:14,920 --> 00:39:17,719 Speaker 1: Thank you, thank you, thank you. Coming back to this 729 00:39:17,840 --> 00:39:21,040 Speaker 1: thing about how the Moses illusion is and 730 00:39:21,200 --> 00:39:24,880 Speaker 1: could be looked at as an example of how amazingly 731 00:39:25,040 --> 00:39:29,200 Speaker 1: adaptive at comprehension our brains are: I actually found a book 732 00:39:29,280 --> 00:39:32,960 Speaker 1: chapter discussing this very aspect of the effect. So the 733 00:39:33,040 --> 00:39:36,720 Speaker 1: authors here were Heekyeong Park and Lynn M. Reder. 734 00:39:37,400 --> 00:39:40,480 Speaker 1: Uh, and this was a chapter in a book, and 735 00:39:40,600 --> 00:39:42,759 Speaker 1: the chapter was called The Moses Illusion. I think it 736 00:39:42,840 --> 00:39:45,399 Speaker 1: was published in two thousand four. And so they're talking 737 00:39:45,400 --> 00:39:49,000 Speaker 1: about different potential explanations for the Moses illusion, what's going 738 00:39:49,040 --> 00:39:51,440 Speaker 1: on in the brain, and they conclude, or 739 00:39:51,440 --> 00:39:54,200 Speaker 1: at least they argue, that the most likely explanation for 740 00:39:54,880 --> 00:39:57,799 Speaker 1: what's going on when we fall for this is something 741 00:39:57,880 --> 00:40:00,840 Speaker 1: they call the partial match hypothesis. So I just 742 00:40:00,920 --> 00:40:03,479 Speaker 1: want to read from their conclusion that's along the lines 743 00:40:03,520 --> 00:40:06,719 Speaker 1: of what we've just been talking about. Quote.
Research on 744 00:40:06,800 --> 00:40:10,240 Speaker 1: the Moses illusion demonstrates that people have difficulty in detecting 745 00:40:10,360 --> 00:40:14,480 Speaker 1: distortions or inaccuracies when a distorted element is semantically related 746 00:40:14,520 --> 00:40:17,719 Speaker 1: to the theme of the sentence. Why should our cognitive 747 00:40:17,760 --> 00:40:21,200 Speaker 1: system be so tolerant of distortions and find it so 748 00:40:21,320 --> 00:40:24,919 Speaker 1: difficult to do careful matches to memory? It might seem 749 00:40:25,000 --> 00:40:27,560 Speaker 1: that partial matching is a less than ideal way to 750 00:40:27,600 --> 00:40:32,080 Speaker 1: process information. However, the partial match process is not only 751 00:40:32,239 --> 00:40:35,640 Speaker 1: common and normal, but also a necessary mechanism of our 752 00:40:35,719 --> 00:40:41,080 Speaker 1: cognitive system. This partial match process enables useful communication and comprehension. 753 00:40:41,560 --> 00:40:44,680 Speaker 1: Very few things that we see or hear will perfectly 754 00:40:44,800 --> 00:40:48,560 Speaker 1: match the representation that we already have stored in memory. 755 00:40:49,080 --> 00:40:51,560 Speaker 1: In order to answer questions, we need to be able 756 00:40:51,640 --> 00:40:55,120 Speaker 1: to use an acceptable match. In order to understand a 757 00:40:55,160 --> 00:40:58,080 Speaker 1: new situation and map it onto something we have already 758 00:40:58,120 --> 00:41:02,680 Speaker 1: seen or done, we must accept slight variations. Every day, 759 00:41:02,760 --> 00:41:06,480 Speaker 1: at many levels, we accept slight distortions without even noticing 760 00:41:06,560 --> 00:41:10,120 Speaker 1: the process. Occasionally we notice a distortion and choose to 761 00:41:10,239 --> 00:41:14,279 Speaker 1: ignore it, but more frequently we do not even realize 762 00:41:14,360 --> 00:41:18,960 Speaker 1: that distortions have occurred.
A rigid comprehension system would have 763 00:41:19,040 --> 00:41:22,600 Speaker 1: a difficult time. Indeed, many of our cognitive operations are 764 00:41:22,680 --> 00:41:27,560 Speaker 1: driven by familiarity based heuristics rather than careful matching operations. 765 00:41:27,920 --> 00:41:30,880 Speaker 1: The Moses illusion is an example of how the adaptive 766 00:41:31,000 --> 00:41:35,840 Speaker 1: human cognitive system works. Everyday cognitive processing must be based 767 00:41:35,880 --> 00:41:39,640 Speaker 1: on simple heuristics, such as matching sets of features, rather 768 00:41:39,719 --> 00:41:44,360 Speaker 1: than exact matches, as very few tasks require exact matches. 769 00:41:44,880 --> 00:41:49,759 Speaker 1: Sentences do not match stored information. Faces change, voices may 770 00:41:49,880 --> 00:41:54,240 Speaker 1: change slightly, even our pets and friends change over time. Therefore, 771 00:41:54,280 --> 00:41:57,279 Speaker 1: it makes sense that people do use partial matches in 772 00:41:57,360 --> 00:42:00,920 Speaker 1: the normal course of matching to memory. Partial matching is 773 00:42:01,040 --> 00:42:04,200 Speaker 1: inevitable because it is the most efficient way for memory 774 00:42:04,239 --> 00:42:07,040 Speaker 1: to operate given the nature of the environment in which 775 00:42:07,080 --> 00:42:09,799 Speaker 1: we live. And so, yeah, this really makes me think 776 00:42:09,840 --> 00:42:12,160 Speaker 1: along the lines of what we were just saying a 777 00:42:12,239 --> 00:42:15,080 Speaker 1: few minutes ago. Like the Moses illusion is kind of 778 00:42:15,200 --> 00:42:18,200 Speaker 1: funny when you notice yourself doing it, but it's also 779 00:42:18,600 --> 00:42:22,960 Speaker 1: kind of a superpower. Yeah. Like, imagine if 780 00:42:23,400 --> 00:42:25,640 Speaker 1: you went to a video store, which we still have 781 00:42:25,760 --> 00:42:28,000 Speaker 1: one in Atlanta.
Imagine you went there and you were 782 00:42:28,040 --> 00:42:30,640 Speaker 1: to say, um, yeah, I'm looking for a particular movie. 783 00:42:30,920 --> 00:42:33,960 Speaker 1: It starred Anthony Hopkins and it had a puppet 784 00:42:34,000 --> 00:42:37,160 Speaker 1: in it. And instead of being able to piece that 785 00:42:37,239 --> 00:42:40,200 Speaker 1: together and tell you which movie you're talking about, what 786 00:42:40,320 --> 00:42:42,440 Speaker 1: if they were to say, okay, keep listing, I need 787 00:42:42,480 --> 00:42:44,680 Speaker 1: you to list the entire cast. I need all of 788 00:42:44,760 --> 00:42:47,879 Speaker 1: the details. We have to make an exact match here. Or yeah, 789 00:42:47,880 --> 00:42:50,640 Speaker 1: imagine somebody comes into the video store and they say, 790 00:42:50,719 --> 00:42:53,920 Speaker 1: I'm looking for The Godfather Two, and they say, sorry, 791 00:42:54,000 --> 00:42:57,000 Speaker 1: we don't have that. What they actually have is The 792 00:42:57,040 --> 00:43:00,680 Speaker 1: Godfather, colon, Part Two. Oh man, that's not 793 00:43:00,840 --> 00:43:03,680 Speaker 1: completely unbelievable, and not with our video store, but just 794 00:43:03,760 --> 00:43:08,160 Speaker 1: sort of like the cliche video store. You mean The 795 00:43:08,280 --> 00:43:13,759 Speaker 1: Godfather Part Two? Philistine.
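[The exact-match-versus-partial-match contrast from that video store bit can be sketched in a few lines of Python. The catalog, the typo-laden query, and the use of the standard library's difflib are all hypothetical illustrations, not anything from the episode or the chapter it quotes.]

```python
import difflib

# A rigid clerk requires a letter-perfect title; a partial matcher
# scores similarity and returns the closest title above a cutoff.
catalog = ["The Godfather", "The Godfather: Part II", "Magic", "Ben-Hur"]
query = "The Godfathr: Part II"  # note the typo: "Godfathr"

# Rigid comprehension: exact match only.
exact_match = query in catalog  # False: "sorry, we don't have that"

# Partial matching: rank catalog titles by similarity to the query
# and keep the single best one above the default 0.6 cutoff.
close = difflib.get_close_matches(query, catalog, n=1, cutoff=0.6)
best_title = close[0] if close else None
```

[Despite the misspelling, `best_title` comes back as "The Godfather: Part II": the partial match tolerates the distortion the way the quoted chapter describes.]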
I mean, that's a kind 796 00:43:13,800 --> 00:43:15,680 Speaker 1: of silly example, but I think the authors of this 797 00:43:15,840 --> 00:43:19,520 Speaker 1: chapter are exactly right that basically every single moment 798 00:43:19,600 --> 00:43:23,800 Speaker 1: of our lives, we are testing reality against our memories, 799 00:43:23,880 --> 00:43:25,719 Speaker 1: and we have to do so in a fast and 800 00:43:25,840 --> 00:43:28,600 Speaker 1: loose way, and our ability to do so in a 801 00:43:28,680 --> 00:43:31,719 Speaker 1: fast and loose way, without relying on every detail to 802 00:43:31,800 --> 00:43:35,880 Speaker 1: be an exact correct match, is what allows us 803 00:43:35,960 --> 00:43:39,560 Speaker 1: to live adaptively, to sort of like be thinking creatures, 804 00:43:40,800 --> 00:43:44,800 Speaker 1: rather than just looking for exact matches between the current case you're observing 805 00:43:45,040 --> 00:43:47,680 Speaker 1: and what's stored in your memory. Like I made the 806 00:43:47,719 --> 00:43:50,799 Speaker 1: comparison to a computer earlier today. I guess we're more 807 00:43:50,880 --> 00:43:54,080 Speaker 1: familiar now with more adaptive types of computer functions that are 808 00:43:54,120 --> 00:43:56,680 Speaker 1: based on like AI or like huge amounts of machine 809 00:43:56,760 --> 00:43:58,879 Speaker 1: learning or something like that. It makes me think about 810 00:43:58,920 --> 00:44:02,640 Speaker 1: like the early old days of dealing with, you know, 811 00:44:02,719 --> 00:44:07,640 Speaker 1: computer programming, where like if you slightly misspelled something, like, you know, um, 812 00:44:07,920 --> 00:44:10,680 Speaker 1: you're playing Zork or something and you type like 813 00:44:10,880 --> 00:44:13,680 Speaker 1: "wolk north," w-o-l-k, it's going 814 00:44:13,719 --> 00:44:16,600 Speaker 1: to be like, that is not a valid action.
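[That strict-parser-versus-fuzzy-matcher contrast can also be sketched in Python. The verb list, the one-typo tolerance, and the parse helper are all made up for illustration; the real Zork parser worked differently.]

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

VERBS = ["walk", "take", "open", "look"]

def parse(command: str, max_typos: int = 1) -> str:
    """Return the recognized verb, tolerating up to max_typos edits."""
    verb = command.split()[0]
    if verb in VERBS:  # the strict, exact-match path
        return verb
    # The fuzzy path: accept the nearest known verb if it is close enough.
    best = min(VERBS, key=lambda v: edit_distance(verb, v))
    if edit_distance(verb, best) <= max_typos:
        return best
    return "that is not a valid action"
```

[With this sketch, a strict parser would reject "wolk north" outright, while the fuzzy version recovers "walk" from the one-letter slip, which is roughly what modern typo-tolerant search is doing at much larger scale.]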
Like yeah, 815 00:44:16,640 --> 00:44:20,640 Speaker 1: it's amazing nowadays, just like how much thumb fumbling 816 00:44:20,719 --> 00:44:23,480 Speaker 1: I can put into typing something in search and it 817 00:44:23,600 --> 00:44:26,080 Speaker 1: still knows what I'm talking about. I'm still able 818 00:44:26,120 --> 00:44:27,560 Speaker 1: to floor it every now and then, because I'll get 819 00:44:27,600 --> 00:44:31,319 Speaker 1: really reckless, and it'll just have no clue. 820 00:44:31,440 --> 00:44:34,279 Speaker 1: But more often than not, it'll guess what 821 00:44:34,400 --> 00:44:37,080 Speaker 1: I'm going for. But that is amazing because that is 822 00:44:37,280 --> 00:44:40,879 Speaker 1: the input receiver, whatever, you know, this piece 823 00:44:40,920 --> 00:44:44,200 Speaker 1: of technology. It's called AI because it's becoming more like 824 00:44:44,320 --> 00:44:47,920 Speaker 1: our brains. It's becoming usefully sloppy and loose in 825 00:44:48,000 --> 00:44:51,000 Speaker 1: the way our brains are. Now, I guess we could 826 00:44:51,000 --> 00:44:53,600 Speaker 1: talk about a couple of other possible examples of knowledge 827 00:44:53,680 --> 00:44:57,239 Speaker 1: neglect or implications of knowledge neglect. One that I came 828 00:44:57,280 --> 00:44:59,640 Speaker 1: across that I thought was pretty funny is something that 829 00:45:00,840 --> 00:45:04,680 Speaker 1: seems fairly narrow, but it's known as the yolk phenomenon. Uh, 830 00:45:04,800 --> 00:45:07,600 Speaker 1: so it goes like this. Apparently it was originally described in 831 00:45:07,680 --> 00:45:12,040 Speaker 1: an article in the Psychological Review by Gregory Kimble and 832 00:45:12,280 --> 00:45:15,480 Speaker 1: Lawrence Perlmuter.
Uh, this was in the year ninety, 833 00:45:15,520 --> 00:45:18,640 Speaker 1: if I didn't already say that. But it consists of 834 00:45:18,960 --> 00:45:22,680 Speaker 1: asking somebody a list of questions, and it's designed 835 00:45:22,719 --> 00:45:25,560 Speaker 1: to produce a certain answer. So you say, what do 836 00:45:25,640 --> 00:45:28,279 Speaker 1: we call the tree that grows from acorns? And you 837 00:45:28,400 --> 00:45:30,400 Speaker 1: say, an oak. And then you say, what do you 838 00:45:30,480 --> 00:45:33,880 Speaker 1: call a funny story? Joke. What's the sound made by 839 00:45:33,920 --> 00:45:37,560 Speaker 1: a frog? Croak. What's another word for a cape? Cloak. 840 00:45:38,000 --> 00:45:40,200 Speaker 1: What do we call the white of an egg? And 841 00:45:40,400 --> 00:45:44,840 Speaker 1: most people say yolk, um, which is obviously wrong. And 842 00:45:44,880 --> 00:45:47,520 Speaker 1: people are not actually confused about what the white of an egg 843 00:45:47,600 --> 00:45:49,960 Speaker 1: is called. But it seems like instead the 844 00:45:50,000 --> 00:45:52,920 Speaker 1: implication is that there's a certain kind of pattern seeking 845 00:45:53,080 --> 00:45:57,440 Speaker 1: that overtakes semantic processing here, like the brain starts to 846 00:45:57,520 --> 00:46:01,719 Speaker 1: conclude while you're answering these questions, because of the established 847 00:46:01,800 --> 00:46:05,680 Speaker 1: pattern, that rhyming is more important than the actual meaning 848 00:46:05,719 --> 00:46:08,320 Speaker 1: of the word that rhymes. And you know it rhymes. 849 00:46:08,400 --> 00:46:12,279 Speaker 1: Right, exactly. It's the rhyme-as-reason effect, sort of. I mean, uh, 850 00:46:13,160 --> 00:46:15,160 Speaker 1: which I think we talked about in our episode 851 00:46:15,320 --> 00:46:19,000 Speaker 1: on antimetabole.
But I was wondering, I wonder how 852 00:46:19,120 --> 00:46:21,920 Speaker 1: many items in a list like this it takes before 853 00:46:22,040 --> 00:46:25,680 Speaker 1: the majority of respondents will give the yolk-type answer, 854 00:46:25,760 --> 00:46:27,759 Speaker 1: will ignore the known meaning of a word and just 855 00:46:27,840 --> 00:46:31,759 Speaker 1: supply the nonsensical rhyming match. I don't know. I feel 856 00:46:31,760 --> 00:46:34,160 Speaker 1: like I'm very susceptible to this one, because I 857 00:46:34,400 --> 00:46:37,719 Speaker 1: recently was trying to do a recipe and it got 858 00:46:37,800 --> 00:46:40,160 Speaker 1: kind of confusing, and I had a moment where I 859 00:46:40,239 --> 00:46:42,839 Speaker 1: had to ask myself, wait, which part is the yolk 860 00:46:42,880 --> 00:46:47,400 Speaker 1: and which is the white? Um, it was only a momentary lapse, 861 00:46:47,480 --> 00:46:49,160 Speaker 1: but there were a lot of things going on. There 862 00:46:49,200 --> 00:46:51,280 Speaker 1: was a lot. I was like having to take them apart, 863 00:46:51,360 --> 00:46:52,360 Speaker 1: you know. It was one of those where you have to have 864 00:46:52,440 --> 00:46:54,800 Speaker 1: the egg whites in one bowl and the yolks in 865 00:46:54,880 --> 00:46:56,600 Speaker 1: the other. And I was making a soufflé, 866 00:46:56,680 --> 00:47:00,479 Speaker 1: that's what it was, a complicated dish. Yeah, 867 00:47:00,640 --> 00:47:02,719 Speaker 1: and I had not had coffee yet either, 868 00:47:02,800 --> 00:47:05,840 Speaker 1: so I had that going for me. Um, it was successful. 869 00:47:05,920 --> 00:47:07,680 Speaker 1: But yeah, there was that moment where I'm like, okay, 870 00:47:07,880 --> 00:47:09,839 Speaker 1: I have to have so many egg whites and then 871 00:47:09,880 --> 00:47:13,440 Speaker 1: a different number of yolks, and which ones are which? Now,
872 00:47:13,960 --> 00:47:16,719 Speaker 1: uh, so I would totally fall for this. I mean, 873 00:47:16,800 --> 00:47:19,439 Speaker 1: did you succeed? Did it rise? Yeah, it rose. 874 00:47:19,520 --> 00:47:21,360 Speaker 1: It was good. Yeah. I don't think I want to 875 00:47:21,360 --> 00:47:23,719 Speaker 1: put it in regular weekly rotation, but it was 876 00:47:23,800 --> 00:47:25,840 Speaker 1: good for a special treat. I feel like the 877 00:47:25,960 --> 00:47:29,000 Speaker 1: soufflé, that is just one of the most notoriously tricky 878 00:47:29,239 --> 00:47:31,919 Speaker 1: dishes for people who aren't, I guess, like working 879 00:47:32,000 --> 00:47:34,560 Speaker 1: in, you know, kitchens or bakeries every day. Yeah, it 880 00:47:34,640 --> 00:47:36,400 Speaker 1: was still, it was tricky for me 881 00:47:36,440 --> 00:47:38,360 Speaker 1: even though I went with what seemed like 882 00:47:38,400 --> 00:47:41,480 Speaker 1: a very simple recipe that didn't steer me too wrong, 883 00:47:41,600 --> 00:47:45,600 Speaker 1: but still I got lost a little bit for a moment. Well, 884 00:47:45,640 --> 00:47:48,120 Speaker 1: I'm impressed. So I was reading through this 885 00:47:48,200 --> 00:47:51,480 Speaker 1: book chapter as well, um, on knowledge neglect, by Marsh 886 00:47:51,640 --> 00:47:55,400 Speaker 1: and Umanath, and uh, yeah, this was very interesting. 887 00:47:55,560 --> 00:47:58,640 Speaker 1: Um, so yeah. They point to a couple of other misconceptions. 888 00:47:58,680 --> 00:48:01,800 Speaker 1: I don't think we've mentioned these on the episode 889 00:48:01,840 --> 00:48:04,920 Speaker 1: thus far, but one of them was Toronto is the 890 00:48:04,960 --> 00:48:09,040 Speaker 1: capital of Canada, and a blow to the head cures amnesia, 891 00:48:09,160 --> 00:48:11,200 Speaker 1: which I guess is like a TV, you know, cartoon 892 00:48:11,320 --> 00:48:14,000 Speaker 1: kind of a thing.
But these are all like examples 893 00:48:14,080 --> 00:48:16,160 Speaker 1: of misconceptions that you might have in your head that 894 00:48:16,280 --> 00:48:18,960 Speaker 1: are not true. They point out that, you know, 895 00:48:19,080 --> 00:48:23,319 Speaker 1: try as we might, misconceptions are impossible to avoid, and uh, 896 00:48:23,960 --> 00:48:27,520 Speaker 1: your best hope, if you can't avoid hearing misconceptions altogether, 897 00:48:27,600 --> 00:48:30,400 Speaker 1: which again is probably impossible, uh, is to have them 898 00:48:30,480 --> 00:48:34,200 Speaker 1: immediately corrected. But that would be difficult. Like you'd have 899 00:48:34,280 --> 00:48:38,319 Speaker 1: to have like a standing conversation with somebody who would 900 00:48:38,400 --> 00:48:41,239 Speaker 1: not fall for your misconceptions, you know, or you'd have 901 00:48:41,320 --> 00:48:45,040 Speaker 1: to just be constantly, uh, like with paranoia, just 902 00:48:45,440 --> 00:48:48,799 Speaker 1: fact checking everything you come across. Otherwise some of them 903 00:48:48,840 --> 00:48:51,919 Speaker 1: are going to get past your guard and they're 904 00:48:51,920 --> 00:48:54,840 Speaker 1: not going to be instantly corrected. And then they're just 905 00:48:54,960 --> 00:48:57,120 Speaker 1: kind of in there. Like even if 906 00:48:57,160 --> 00:49:00,040 Speaker 1: you hear otherwise later, you might still fall back to 907 00:49:00,160 --> 00:49:03,200 Speaker 1: the earlier misconception. Yeah, or it's just something 908 00:49:03,239 --> 00:49:05,480 Speaker 1: that doesn't come up in daily life, you know, so 909 00:49:05,600 --> 00:49:07,440 Speaker 1: there's never been an opportunity for it to 910 00:49:07,480 --> 00:49:11,440 Speaker 1: be corrected.
I'm reminded of that episode of This American 911 00:49:11,520 --> 00:49:14,239 Speaker 1: Life where they started off by talking about this 912 00:49:14,360 --> 00:49:17,800 Speaker 1: particular individual who had just grown up thinking 913 00:49:17,920 --> 00:49:21,320 Speaker 1: that unicorns existed, like it had never been corrected, 914 00:49:21,800 --> 00:49:24,160 Speaker 1: and so she just had that misconception in her head 915 00:49:24,280 --> 00:49:27,200 Speaker 1: until finally she's at a party and there's 916 00:49:27,320 --> 00:49:30,200 Speaker 1: a conversation, like just random chatter about, hey, what 917 00:49:30,239 --> 00:49:32,800 Speaker 1: are your favorite animals or something, and she mentions 918 00:49:32,880 --> 00:49:36,960 Speaker 1: the unicorn and there's like this awkward silence. So why 919 00:49:37,000 --> 00:49:39,360 Speaker 1: would that be all that awkward? I mean, would she 920 00:49:39,520 --> 00:49:42,759 Speaker 1: like the unicorn, which is real? Well, I think it was, 921 00:49:42,920 --> 00:49:45,200 Speaker 1: it was probably, if I'm remembering it correctly, 922 00:49:45,520 --> 00:49:47,560 Speaker 1: there's a certain bit of ambiguity where people are like, 923 00:49:47,680 --> 00:49:50,160 Speaker 1: is she joking? Or, oh my goodness, she's not joking. 924 00:49:50,239 --> 00:49:52,799 Speaker 1: She thinks they're real. But it also makes all of us, 925 00:49:52,840 --> 00:49:56,320 Speaker 1: I think, wonder what misconceptions do we have 926 00:49:57,080 --> 00:49:59,600 Speaker 1: just rattling around in our brain right now?
We have 927 00:49:59,719 --> 00:50:01,719 Speaker 1: no idea, but they're just ready to go at 928 00:50:01,719 --> 00:50:03,759 Speaker 1: any moment, you know. They can be loaded into the 929 00:50:03,840 --> 00:50:08,440 Speaker 1: torpedo tube of conversation or podcasting or the next job interview, 930 00:50:08,800 --> 00:50:11,040 Speaker 1: just ready to go, and you have no idea. 931 00:50:11,360 --> 00:50:13,440 Speaker 1: I'd say one of the most common edits I have 932 00:50:13,640 --> 00:50:15,680 Speaker 1: to make to this show before we release it is 933 00:50:15,719 --> 00:50:18,160 Speaker 1: I realize that I just sort of said something that 934 00:50:18,280 --> 00:50:21,160 Speaker 1: I knew was true. And then later I'm listening back 935 00:50:21,200 --> 00:50:23,160 Speaker 1: to it, I'm like, wait a minute, I don't think 936 00:50:23,200 --> 00:50:26,520 Speaker 1: that's right. Yeah, yeah, I've definitely done that before. 937 00:50:27,120 --> 00:50:28,600 Speaker 1: But well, I mean when I said it, I wasn't 938 00:50:28,600 --> 00:50:31,920 Speaker 1: even wondering, you know, just now. Now, the authors here, 939 00:50:32,000 --> 00:50:34,160 Speaker 1: they touch on, of course, the fact that 940 00:50:34,280 --> 00:50:37,640 Speaker 1: prior knowledge seems like it should be able to protect us, uh, 941 00:50:37,800 --> 00:50:40,440 Speaker 1: you know, and yet, quote, surprisingly, the effects of 942 00:50:40,480 --> 00:50:43,480 Speaker 1: exposure to misconceptions are not limited to cases where people 943 00:50:43,480 --> 00:50:45,840 Speaker 1: are ignorant of the true state of the world. We 944 00:50:45,920 --> 00:50:49,080 Speaker 1: touched on that already. Um. Another great example they 945 00:50:49,480 --> 00:50:52,319 Speaker 1: bring out is: a plane crashed; where did they bury 946 00:50:52,400 --> 00:50:56,480 Speaker 1: the survivors?
Okay, which, you know, obviously you're not going 947 00:50:56,560 --> 00:50:59,000 Speaker 1: to bury survivors; you're going to bury the dead. 948 00:50:59,080 --> 00:51:01,320 Speaker 1: But again, this is another question where you've kind of 949 00:51:01,360 --> 00:51:04,239 Speaker 1: filled in all the blanks, you know, uh, by 950 00:51:04,280 --> 00:51:07,920 Speaker 1: the time "the survivors" arrives as the last word in the sentence, uh, 951 00:51:08,000 --> 00:51:10,120 Speaker 1: you fall for it, right? So it's not like 952 00:51:10,239 --> 00:51:12,680 Speaker 1: you think that the survivors get buried, but you could 953 00:51:12,719 --> 00:51:15,080 Speaker 1: be trying to answer the question just because, like, that's 954 00:51:15,120 --> 00:51:18,400 Speaker 1: gone straight past you. Yeah. And they really drive home 955 00:51:18,400 --> 00:51:21,000 Speaker 1: in this that knowledge neglect isn't just a momentary lapse 956 00:51:21,080 --> 00:51:24,600 Speaker 1: in memory, but rather something with real consequences for memory. 957 00:51:24,640 --> 00:51:27,640 Speaker 1: If you don't recognize the error, the error can become 958 00:51:27,760 --> 00:51:32,000 Speaker 1: encoded into your memory, into your worldview, as fact. Uh, 959 00:51:32,320 --> 00:51:36,880 Speaker 1: and because that error was recently encountered, it's more easily accessed. 960 00:51:37,520 --> 00:51:39,839 Speaker 1: So again we have to remember that items in our 961 00:51:39,880 --> 00:51:42,600 Speaker 1: memory are not made of stone; they're made of clay. 962 00:51:43,000 --> 00:51:46,640 Speaker 1: Merely accessing them can change them. And our most accessed 963 00:51:46,719 --> 00:51:50,160 Speaker 1: memories are the most changed memories of all, the ones 964 00:51:50,200 --> 00:51:52,719 Speaker 1: we can trust the least.
Um, so, an error that 965 00:51:52,800 --> 00:51:55,839 Speaker 1: pops to mind quickly is more likely to be thought 966 00:51:55,840 --> 00:51:59,200 Speaker 1: of as fact. Not, oh, I heard once that X, 967 00:51:59,239 --> 00:52:01,600 Speaker 1: I'm not sure about X, but I think X, but 968 00:52:01,760 --> 00:52:04,439 Speaker 1: rather just: X is true, X is the answer. Yeah, 969 00:52:04,480 --> 00:52:06,239 Speaker 1: so I guess this is connecting back to 970 00:52:06,480 --> 00:52:09,040 Speaker 1: that finding we talked about earlier, that, you know, um, 971 00:52:09,440 --> 00:52:14,840 Speaker 1: even against your existing prior knowledge, like misconceptions or 972 00:52:15,000 --> 00:52:17,680 Speaker 1: errors that get by you unnoticed in one of these 973 00:52:17,760 --> 00:52:21,239 Speaker 1: Moses illusion type sentences can later damage your ability to 974 00:52:21,360 --> 00:52:25,200 Speaker 1: remember the actual fact of that sentence correctly. Um, it 975 00:52:25,280 --> 00:52:28,640 Speaker 1: can undermine your knowledge that it was in fact Noah, potentially. 976 00:52:29,200 --> 00:52:31,400 Speaker 1: And this makes me think about the broader phenomenon of how 977 00:52:32,480 --> 00:52:34,520 Speaker 1: people who are really trying to argue a point will 978 00:52:34,600 --> 00:52:40,200 Speaker 1: often structure sentences to try to get something past you 979 00:52:40,520 --> 00:52:43,840 Speaker 1: really quickly in the non-pivotal part of the sentence.
980 00:52:43,960 --> 00:52:46,360 Speaker 1: It's almost like we have an intuitive grasp of the 981 00:52:46,480 --> 00:52:50,560 Speaker 1: Moses illusion type thing, where, like, I don't know, 982 00:52:50,600 --> 00:52:53,120 Speaker 1: you see people like arguing about politics on TV 983 00:52:53,400 --> 00:52:55,880 Speaker 1: or something, and so one person will pose a 984 00:52:55,960 --> 00:52:59,719 Speaker 1: question to the other person, and the pivotal part 985 00:52:59,760 --> 00:53:02,040 Speaker 1: of the sentence that's supposed to be in dispute maybe 986 00:53:02,239 --> 00:53:04,480 Speaker 1: is one part of the sentence, but then in 987 00:53:04,560 --> 00:53:06,920 Speaker 1: a different part of the sentence, there's also like a 988 00:53:07,080 --> 00:53:10,719 Speaker 1: disputable claim that's just like shoved in there and goes 989 00:53:10,800 --> 00:53:13,319 Speaker 1: by real quick. Right, right. Yeah, if you end up 990 00:53:13,320 --> 00:53:16,480 Speaker 1: with a statement that has some mistruths sort of 991 00:53:16,560 --> 00:53:19,360 Speaker 1: sprinkled in there that are not key to, like, 992 00:53:19,480 --> 00:53:22,760 Speaker 1: the main, you know, talking point, or even the main untruth, 993 00:53:22,880 --> 00:53:25,759 Speaker 1: you know, that can often be the nefarious thing too. 994 00:53:25,840 --> 00:53:30,480 Speaker 1: It's like you catch the larger, um, misconception or lie 995 00:53:30,680 --> 00:53:32,360 Speaker 1: in the statement, but then there are other lies in 996 00:53:32,440 --> 00:53:34,320 Speaker 1: there that you're not paying attention to because of the 997 00:53:34,360 --> 00:53:36,440 Speaker 1: big one. Now, the authors here, they point out that 998 00:53:36,520 --> 00:53:39,279 Speaker 1: improved monitoring can help. You know, this is stuff like 999 00:53:39,360 --> 00:53:42,759 Speaker 1: we're talking about, like putting things in a different font, etcetera.
1000 00:53:43,760 --> 00:53:47,560 Speaker 1: But drawing attention to errors can have the opposite effect, 1001 00:53:47,840 --> 00:53:51,279 Speaker 1: increasing suggestibility, which is weird; they refer to it as 1002 00:53:51,280 --> 00:53:56,080 Speaker 1: an ironic effect. Plus, many manipulations designed to promote 1003 00:53:56,160 --> 00:53:58,879 Speaker 1: monitoring may actually fail to do so, and they say 1004 00:53:58,880 --> 00:54:02,680 Speaker 1: it's difficult to predict which manipulations will actually work. So again, 1005 00:54:02,719 --> 00:54:05,520 Speaker 1: there's no like one guide, like, here are 1006 00:54:05,560 --> 00:54:07,960 Speaker 1: the three steps you need to take to 1007 00:54:08,160 --> 00:54:11,359 Speaker 1: keep this misinformation from leaking into your brain. I think 1008 00:54:11,360 --> 00:54:13,640 Speaker 1: a lot of what I take away from this is that, 1009 00:54:14,120 --> 00:54:18,040 Speaker 1: I don't know, being well informed is an ongoing process 1010 00:54:18,200 --> 00:54:21,000 Speaker 1: that lasts your entire life. And it's not a question 1011 00:54:21,120 --> 00:54:23,680 Speaker 1: of like just getting the right facts in the bank 1012 00:54:23,840 --> 00:54:27,879 Speaker 1: one time and then you're set. You know? Yeah, there's 1013 00:54:27,880 --> 00:54:30,200 Speaker 1: a lot of upkeep involved and a lot of just 1014 00:54:30,239 --> 00:54:33,040 Speaker 1: continual pruning, and not just new weeds, weeds that have 1015 00:54:33,120 --> 00:54:36,120 Speaker 1: been in there your whole life sometimes, or seeds, right? 1016 00:54:36,160 --> 00:54:40,600 Speaker 1: At the very least. Yeah.
The authors, they 1017 00:54:40,640 --> 00:54:43,120 Speaker 1: also drive home that ultimately we know a lot more 1018 00:54:43,160 --> 00:54:48,040 Speaker 1: about how people come to misremember events versus misremember facts, 1019 00:54:48,640 --> 00:54:53,360 Speaker 1: especially when the errors involved contradict stored knowledge. 1020 00:54:53,440 --> 00:54:56,040 Speaker 1: So, you know, again we get into 1021 00:54:56,080 --> 00:54:59,000 Speaker 1: the complexity of memory, the different types of memory that 1022 00:54:59,120 --> 00:55:01,840 Speaker 1: we have going on in the brain, and we 1023 00:55:01,920 --> 00:55:04,640 Speaker 1: still have a lot more to learn about just how 1024 00:55:04,880 --> 00:55:07,920 Speaker 1: this all comes together. Yeah. Now, you know, here's a 1025 00:55:08,000 --> 00:55:10,080 Speaker 1: question that comes to mind. I wonder if anyone 1026 00:55:10,200 --> 00:55:13,800 Speaker 1: has constructed a Moses illusion statement using Bilbo and Frodo. 1027 00:55:14,480 --> 00:55:18,800 Speaker 1: Oh yes, that might work. So, like, what was Bilbo 1028 00:55:18,960 --> 00:55:22,319 Speaker 1: carrying into the fires of Mount Doom? Yeah, that sort 1029 00:55:22,360 --> 00:55:23,680 Speaker 1: of thing. I don't know, of course, I guess 1030 00:55:23,680 --> 00:55:26,120 Speaker 1: you'd want to try and construct it 1031 00:55:26,280 --> 00:55:28,000 Speaker 1: right so that you get Bilbo there at the very 1032 00:55:28,200 --> 00:55:30,600 Speaker 1: end or Frodo at the very end, depending on how 1033 00:55:30,640 --> 00:55:35,000 Speaker 1: you're messing around with it. Hmm, who was 1034 00:55:35,160 --> 00:55:39,919 Speaker 1: the dragon whose lair was infiltrated by Frodo Baggins? Yeah, yeah, 1035 00:55:40,040 --> 00:55:42,279 Speaker 1: that sort of thing might work.
Yeah, I'd say 1036 00:55:42,320 --> 00:55:45,879 Speaker 1: Bilbo and Frodo are even closer together than Noah and Moses. Yeah, 1037 00:55:45,920 --> 00:55:50,600 Speaker 1: I mean, certainly, they actually overlap, as 1038 00:55:50,640 --> 00:55:53,279 Speaker 1: opposed to being separated by long stretches of time. 1039 00:55:54,400 --> 00:55:59,560 Speaker 1: Very, very similar characters, actually related, right? They are related? Yeah. Yeah, 1040 00:55:59,760 --> 00:56:03,200 Speaker 1: I think they would work. What, uncle? Great uncle? Uncle. 1041 00:56:03,280 --> 00:56:05,920 Speaker 1: I always forget what happened to Frodo's parents. I've 1042 00:56:05,960 --> 00:56:09,439 Speaker 1: read it and I still forget it. I'm gonna say uncle. 1043 00:56:09,520 --> 00:56:12,839 Speaker 1: All the Hobbits are cousins. Yeah, they're all related. Actually, yes. 1044 00:56:14,200 --> 00:56:16,200 Speaker 1: All right, well, there you have it. We'd love to 1045 00:56:16,239 --> 00:56:18,040 Speaker 1: hear from everybody about this, because of course this just 1046 00:56:18,160 --> 00:56:20,319 Speaker 1: touches on how our brains work, how 1047 00:56:20,480 --> 00:56:23,960 Speaker 1: our brains work with new information, be it 1048 00:56:24,320 --> 00:56:29,000 Speaker 1: accurate or a misconception. So I think 1049 00:56:29,040 --> 00:56:32,160 Speaker 1: everybody out there has something to share. Which of these 1050 00:56:32,960 --> 00:56:35,880 Speaker 1: Moses illusions worked the most on you? Which ones have 1051 00:56:35,920 --> 00:56:38,120 Speaker 1: worked on you in the past? We'd love to 1052 00:56:38,160 --> 00:56:40,600 Speaker 1: hear from you. All right. If you want to check 1053 00:56:40,640 --> 00:56:42,759 Speaker 1: out other episodes of Stuff to Blow Your Mind, you 1054 00:56:42,840 --> 00:56:44,600 Speaker 1: know where to find it.
You can find the Stuff 1055 00:56:44,640 --> 00:56:46,520 Speaker 1: to Blow Your Mind podcast feed wherever you get 1056 00:56:46,680 --> 00:56:49,360 Speaker 1: your podcasts, and we'll have core episodes of Stuff to 1057 00:56:49,400 --> 00:56:52,319 Speaker 1: Blow Your Mind on Tuesdays and Thursdays. You've got listener mail 1058 00:56:52,360 --> 00:56:55,400 Speaker 1: on Mondays, we've got the Artifact on Wednesdays, 1059 00:56:55,440 --> 00:56:57,919 Speaker 1: you've got Weird House Cinema on Fridays, and a vault 1060 00:56:57,920 --> 00:57:01,279 Speaker 1: episode on the weekends. Huge thanks as always to our 1061 00:57:01,280 --> 00:57:04,520 Speaker 1: excellent audio producer Seth Nicholas Johnson. If you would like 1062 00:57:04,600 --> 00:57:06,640 Speaker 1: to get in touch with us with feedback on this 1063 00:57:06,800 --> 00:57:09,560 Speaker 1: episode or any other, to suggest a topic for the future, 1064 00:57:09,640 --> 00:57:11,560 Speaker 1: or just to say hello, you can email us at 1065 00:57:11,719 --> 00:57:22,120 Speaker 1: contact at Stuff to Blow Your Mind dot com. Stuff 1066 00:57:22,160 --> 00:57:24,320 Speaker 1: to Blow Your Mind is a production of iHeartRadio. 1067 00:57:24,720 --> 00:57:26,680 Speaker 1: For more podcasts from iHeartRadio, visit the 1068 00:57:26,720 --> 00:57:29,600 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen 1069 00:57:29,600 --> 00:57:30,520 Speaker 1: to your favorite shows.