Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb and I'm Joe McCormick, and it's Saturday, time to go into the old vault. This episode originally aired in July, and this was part one of our discussion of the illusory truth effect, one of the many biases that, unfortunately, affect all of our brains and make it harder for us to know what's true.

Yeah, this is one that's, you know, going to shake some of your foundation stones. I think this one's going to make you rethink the way you interact with the world and how you interact with truth. So, you know, it's maybe a harrowing journey at times, but I think you're going to emerge on the other end stronger. And part two will come a week from today. Your today, not our today, as we record this.

Welcome to Stuff to Blow Your Mind from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb and I'm Joe McCormick. And today we're going to be talking about one of our favorite subjects: our tendency to believe things that aren't true. Now, Robert, I wonder, is there a false factoid or claim that you just always find yourself recalling as true, even though you've checked it before and discovered it to be false in the past?

Yeah, this is an interesting question, because I feel like there are things that come up in research all the time, certainly key things over the years, you know, as we research and write about different topics, where I have to correct myself, where I think I knew something and then I'm like, oh, now that I actually do the research, that's not a fact. And then the same goes for false beliefs, beliefs that creep in, the sort of Mandela effect type of scenario. For instance, there was a time when I thought Gene Wilder was dead prior to his actual death. And he is actually, really dead now.
So I thought he was dead before he was dead, exactly. And I think it was just a combination of: he was not as active anymore, and I wasn't really keeping up with the Gene Wilder filmography and, like, current events related to Gene Wilder. And maybe I picked up on some news piece at some point, and somehow he got clicked over to the dead category. And then when I found out he was alive, it was really like he came back to life. And I had the same thing happen literally just the other day with a stand-up comedian, Larry Miller.

I don't think I remember who that is.

Oh, he had a kind of, you know, dry observational comedy. I think he still acts, but he would show up on, say, Night Court, I think.

Oh, wait a minute, he's in some Christopher Guest movies.

He may have been, yeah. But for some reason, years ago, I got it into my head that he had passed away, and so occasionally I would think of Larry Miller and be like, oh yeah, I remember Larry Miller, too bad he passed. And then I actually looked him up the other day, and it turns out he has not passed away. He's still very much alive and active. And I was just living in this fantasy world of dead Larry Millers.

You know, I have false beliefs that recur with much more significance. Like, I keep remembering, maybe it's just because I was told this all the time when I was a kid, that vitamin C supplements will ward off colds. That is not experimentally proven. That is, like, not a finding of science. And yet, if I haven't checked in a while, it just seeps right back in, like, yes, that is true, vitamin C will keep colds away.

Well, it's easy to fall into the trap. I do this all the time with various vitamins and supplements, where I'm like, I don't know if it works, it probably doesn't work, but I'm going to go ahead and take it just in case, because it's vitamin C.
You know, what's the harm there?

It's kind of like believing in God just in case he exists. Believing in vitamins.

Yeah, but then you end up with, like, a weird sort of vitamin tentacle coming out of your neck, and you didn't see that coming, did ya?

That's fake news, Joe. Vitamin C will not cause a tentacle to grow out of your neck. But now you've heard it, it's true.

You know, I feel like there are things that have popped up where I'll think, well, I've always heard X, but I've never actually looked it up. And that's where the problem seeps in, you know, where I just think I know something, but I'm not sure, but I don't care enough to actually investigate. There is one possible example that comes up, and that is, of course, the idea that George Washington Carver invented peanut butter.

He had something to do with peanuts.

He did, yeah. He is a famous inventor, an important African American inventor. And I just didn't know a lot about him, and I had always heard the peanut butter thing, but I didn't actually research it until I helped my son with a class project about him earlier this year. And then I was able to definitively, you know, check that one off on the mental list, like, okay, this one is false. He did not invent peanut butter, but he did do stuff with peanuts.

Did do stuff with peanuts, but not peanut butter. Okay.

You know, a huge place where you can see false beliefs persisting is in people's beliefs about, sort of, political facts or sociological data. A very, very common one is people's beliefs about crime. I think it's because, like, crime is one of those sensational types of subjects, and makes people think about violence, images of blood they see on the news, and stuff like that.
You know, in a poll conducted by Pew in the fall of two thousand sixteen, get this: fifty-seven percent, so a majority, of people who had voted or planned to vote in 2016 said that crime had gotten worse in the United States since two thousand eight.

By every objective measure, exactly the opposite is true.

That's just not true. FBI statistics, based off of, like, nationwide police reports, found that violent crime and property crime in the United States fell nineteen percent and twenty-three percent, respectively, since two thousand eight. And so you think, okay, well, if that's just police reports, maybe fewer people are reporting crimes to the police, right? But also, the U.S. Department of Justice Bureau of Justice Statistics does direct annual surveys of more than ninety thousand households to ask about rates of crime that might not be reported to police, and quote, the BJS data show that violent crime and property crime rates fell twenty-six percent and twenty-two percent, respectively, since two thousand eight. So a majority of people are believing something that, by every measure we know, is not true. Crime has gone down, and yet a majority of people believe it has gone up.

And it's not hard to see why that might be, when you consider, like, the political messaging of certain politicians. You could also, of course, think about just people's negativity bias, right, the tendency to believe things are worse than they are in the broad sense, or mean world syndrome: looking at dangerous stuff happening on the news and thus having an overrepresentation of it in your mind. But I think we would be wrong to ignore the effects of hearing specific politicians...

Well, for example, in twenty sixteen specifically, it was Donald Trump talking a lot about how crime is through the roof, right?

And, to your point, we can look to statistics on this. This is not something that is just, you know, in the ether. We have hard data.
It's not a matter of opinion. It's just, like, every measure we have says that's not correct.

But what about other beliefs? I mean, that's certainly not in isolation. There are lots of cases where there are widespread beliefs in things that are just simply, factually not true.

Yeah, I'll run through a few here that range in topic. For instance, here's a nice science-related one to kick off with. In a two thousand fifteen Pew survey, only thirty percent of Americans knew that water boils at a lower temperature at higher altitudes. Thirty-nine percent said it would boil at the same temperature in Denver and in L.A., again, Denver being at a far higher altitude, and others had it reversed.

So the majority, something like two-thirds of people, were just flat wrong.

Yes. And to put that in perspective with another science fact, most Americans in this two thousand fifteen survey correctly identified the Earth's inner layer, the core, as its hottest part, and nearly as many, too, knew that uranium was needed to make nuclear energy and nuclear weapons.

Well, should we be comforted by the fact that that's what people know? Like, they don't know much about water, but they know about nuclear weapons.

Well, I mean, on one hand, nuclear weapons is, and was, more in the news. And then, on the other hand, like, the inside of the Earth is more engaging, and also completely unpoliticized.

Well, water is not political. The boiling point of water is not politicized.

But it's also not very sexy, I guess. Like, it's one of those things where unless you're actively moving from low to high altitudes, or, you know, living part of your time in Denver and part of your time in L.A., I guess it's very possible to live your entire life without really having any real-world experience with the difference. Though I do feel like if you read enough baking manuals, it comes up.

Yeah, but maybe you just don't remember which way it goes.
I guess. You know, one of the ones you've got here that has come up many times in my life, it's come up enough that I know the right answer now, is the misconception that you can see the Great Wall of China from space.

Oh yeah, this is one that, I have to admit, I think I used to adhere to, again without really referencing it, because it just had that kind of truthiness to it, right? And you want it to be real, the idea that this epic structure built, you know, long ago is visible from space. But it's actually been disproven multiple times. It's only visible from low orbit under very ideal conditions, and it's not visible from the Moon at all, because that's another version of it, that the Great Wall of China can be seen from the Moon.

Yeah, I guess there are nuances to the word visible, what's meant by it, but in the normal sense that you would mean, it's not visible from space.

Correct. Now, a two thousand sixteen YouGov poll, this is a UK group, they looked at how belief in Pizzagate...

That thing, yeah.

...how that shakes out across different voter groups. Now, that was this vast conspiracy theory people had about how there was a pizza restaurant in Washington, D.C. that was, like, running child slavery rings, that was linked to the Democratic Party.

Yeah, it had to do with an idea that Clinton campaign emails supposedly talked about human trafficking and pedophilia. And according to this particular poll, seventeen percent of polled Clinton voters believed that this was the case, this was a reality, and an even larger share of Trump voters did. And then there's another classic they looked at, to put this in perspective: the idea that President Barack Obama was born in Kenya.
Kind of alarmingly enough, across both groups of voters, both the Clinton voters and the Trump voters, they found thirty-six percent believed it, despite the fact that that, too, has been debunked time and time again.

Yeah, I mean, it's crazy that these types of beliefs can catch on so well, especially since we understand very well the way that political ideology and tribal thinking affect the way we form opinions. Obviously, our opinions are deeply informed by what people we view as our in-group believe, and so we want to be in line with the in-group, and stuff like that. But you also can't really ignore the fact that these are things that, if you pay attention to certain sources, you're going to be hearing over and over and over again, and what effect that might have. Because it's, you know, widely accepted folk wisdom that if you repeat a lie enough, people start to believe that it's the truth.

Right. That's one of the things... I mean, I don't know, you've said it enough times that I'm already convinced.

Exactly. I mean, almost going along with this, there's a quote that often gets sourced to the Nazi propaganda minister Joseph Goebbels. Some versions of the quote say something like: if you repeat a lie often enough, people will believe it, and you will even come to believe it yourself. I couldn't find any evidence that Goebbels actually said that. It seems to be a misattribution, but it's sort of a paraphrase of similar ideas that are, you know, within that frame of thinking. Like, Adolf Hitler himself wrote in Mein Kampf, quote: "The most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly and with unflagging attention. It must confine itself to a few points and repeat them over and over. Here, as so often in the world, persistence is the first and most important requirement for success."
So just the fact that Hitler said it obviously shouldn't make us think, well, you know, he's right.

With Hitler, though, if Hitler was good at anything, it was getting lots of people to believe lies.

Certainly. So I think, because this is such an important issue, and because widespread misconceptions are so common, and because they can, in fact, especially in some political circumstances, be so destructive, and because the repetition of lies and false statements, at every scale of existence, you know, in mass media and in our personal, private lives, is so common, I think it's worth looking at the actual empirical case. Is this true, the idea that repeating statements over and over actually changes what we believe? It's one of those things that, you know, sounds so commonsensical, you just assume it's true. But according to the logic we're using now, those are exactly the kinds of statements that maybe we should be careful about.

Yeah, and I think this is an important topic for everybody. I don't care who you voted for in any previous elections, or which political party in your given system you adhere to. I think if you're listening to this show especially, you want to think for yourself. You want to reduce the amount of manipulation that's going on with your own view of reality. And that's what we're going to discuss here today. We're going to discuss the degree to which false information can manipulate our view of reality, and ultimately, what are some of the things we can do to hold on to our individuality in all of this.

Exactly. So this is going to be the first of a two-part episode where we explore the liar's best trick: the question of repetition and exposure in forming our beliefs and changing our attitudes. So that's going to be the jumping-off point for today's episode.
Does exposure and repetition, hearing a claim and hearing it repeated, actually have the power to change our beliefs? Or is that just unverified folk wisdom?

Yeah, and of course, it goes well beyond politics. It also gets into marketing. You know, we've touched on the manipulative nature of marketing and advertisement on the show before, and it's one of the things you always come back to. It's always about messaging, right? Like, what is the message of the product? What's the message of the ad campaign? And how do they just continue to hammer that home?

Why do brands have slogans? Yeah, why don't they just tell you a positive message about the brand that's different every time? Why do they tell you the same message in the same words in every commercial?

Yeah, why did those fabulous horror trailers from the nineteen seventies say the name of the film eighteen times? Don't go in the basement! Don't go in the basement! No one under seventeen will be admitted.

Yeah, it's all kind of part of the same situation.

All right, well, we're going to take a quick break, and when we get back, we will dive into the research and the history of psychology about repetition and exposure.

Thank you, thank you. All right, we're back. So the first question we're going to be looking at today is whether anyone has actually studied this question, this question of whether exposing people to a claim and then repeating the claim makes them believe it, whether anybody has studied that in a controlled scientific context. And the answer is a resounding yes. There are, I think, dozens of studies on this subject in various forms.
Probably the flagship study on this, the first big one that everybody cites, that really got people into the subject, that got the ball rolling on it, was from nineteen seventy-seven, and it was by Lynn Hasher, David Goldstein, and Thomas Toppino, and it was called "Frequency and the Conference of Referential Validity," in the Journal of Verbal Learning and Verbal Behavior. And that was, as I said, in nineteen seventy-seven. So the authors start out in the study by talking about how most studies of memory that take place in the lab involve useless or meaningless information units. So researchers, for example, might try to see how well subjects remember a phrase like, I just made this up: "The purple donkey was made of soft-spoken muscular elves." Like, can you remember that word for word?

"The purple donkey was made by muscular elves."

Now you're close. Not "made of."

Yeah, it makes a lot more sense to say "made by."

Wow. But see, I've already messed up the origin story of the purple donkey.

Statements like this have no importance in the real world, and part of what they were talking about is that we're testing memory for things that don't have any validity in reality. So the authors write that they're curious about what kind of processing subjects do with information units that might have validity in the real world. For example, factual statements like, quote, "the total population of Greenland is about fifty thousand," which at the time of the study it was. I checked.

Though, that seems like a lot more people than should be in Greenland, right? I was surprised by that.

Well, I would agree, based on, albeit, a limited amount of information that I've read and viewed about Greenland. You know, typically, in my experience, Greenland shows up in nature documentaries, and of course you're going to see rather barren locations in those films.

Well, Greenlanders out there in the audience...
Let us know if you're listening. What's life like up there in Greenland? I'm interested now. But anyway, back to it. So, yeah, the population of Greenland at the time was about fifty thousand. And so statements like this both refer to something that could be true or false in the real world, and are also things that people are probably uncertain about. Like, do you know what the actual population of Greenland is? I didn't know before I looked it up. And so we know that the statement is either true or false, but we aren't sure whether it's true or false.

And of course, to go back to a previous episode, you're kind of anchoring my expectations by throwing that out. Without any population data in my head about Greenland, that's suddenly all I have to go on.

Oh yeah, that's interesting also. So, like, it could be three thousand, or it could be, like, a million, and either way you're sort of moving your guess range toward fifty thousand.

So the thing they point out is, even though most people don't know what the population of Greenland is, we're often willing, and to some extent able, to make guesses as to whether statements like this are true. So where does this semantic knowledge come from? When we feel like we have knowledge to offer a guess about what the population of Greenland is, even when we don't really know, what is it that allows us to judge these questions? And the authors note that frequency is a really powerful variable in all kinds of judgments we make about the world. So they hypothesize that, quote, "frequency might also serve as the major access route that plausible statements have into our pool of general knowledge." So the idea is that we build our knowledge base based on how frequently we are exposed to ideas. You hear an idea a lot, and that gets reinforced in the knowledge base. You've never heard an idea before, or you don't hear it a lot, it doesn't get reinforced, and it doesn't exist in the knowledge base.

So here's the experimental part.
The researchers came up with a list of a hundred and forty true and false statements, crafted so they all sound plausible.

Yeah, they could be true, you know. The average person would be unsure whether or not they're true.

And the statements were on all kinds of subjects, like geography, arts and literature, history, sports, current events, science. A few examples of true statements included things like "Cairo, Egypt has a larger population than Chicago, Illinois," and "French horn players get cash bonuses to stay in the U.S. Army."

True? Well, I'd say I should have joined the army after all. See, I was a French horn player, really, in high school.

Yeah, I didn't know that. What's it like playing the French horn?

It's just, you know, there's a lot of spit and a lot of shoving your hand up horns. That's it. Otherwise, it's like playing a trumpet.

Now, see, I actually played trumpet, and it's a lot less fun than what you're describing.

Yeah, I mean, I don't know. There is an elegance to the way you hold it, and again, you have your hand inside the horn. I don't know. Being a trumpet player, to me, always felt like being a person who's complaining at full volume.

Yeah, yeah. There's more of an outward stance with the trumpet, right? You're blasting outward. But with the French horn, it's more like you're playing music into yourself.

That's quite beautiful. More beautiful than any music I ever played on the French horn. So we've got to get back to the study.

Okay, so that's supposedly true: French horn players at the time got cash bonuses to stay in the U.S. Army. Examples of false statements were things like "the People's Republic of China was founded in nineteen forty-seven," it was actually nineteen forty-nine, or "the capybara is the largest of the marsupials." That's not true. The largest marsupial is the red kangaroo. The largest known in the fossil record is this thing called the extinct Diprotodon.
Now, wait, the capybara is a rodent, right? How is the capybara a marsupial? I didn't even look it up. I only know that because I go to a lot of zoos these days. But it is a mammal. It's a rodent.

It is a mammal, and it is a rodent, not a marsupial.

Well, there we go. So it's wrong in multiple ways.

So you've got this big list they came up with of true and false statements, and all of them should sound plausible to the average person, but most people are not going to be likely to know for sure whether they're true, unless they just happen to have some special random knowledge or expertise. And so the researchers held three sessions with participants, each separated by two weeks, and in each of the sessions, the participants were played back a tape of a selection of sixty recorded statements from that list, and the subjects were asked to judge how confident they were that the statements were true. And this was on a scale of one to seven, with, like, four being uncertain, five being possibly true, six being probably true, seven being definitely true. And in each session, some of the statements were true, some were false. But here's where the real magic happened. At the second session and the third session, each time, subjects got a mix of new true and false statements that they had never seen before, plus true and false statements that they had already seen in the previous sessions. So while most of the claims they saw were new, a minority got repeated each time. And what the researchers found was that, whether a statement was true or false, the more times the students saw it, the more they believed it.

So again, this would be the principle in action. The more they're hearing this false fact, the more they're coming to believe that it is true.
Yeah, even in this constrained, kind of weird experimental context, where they're aware that some of these facts are going to be false. It's not like they're being told this persuasively by a person trying to convince them. They're just reading this from a list of statements that are known to be either true or false. I mean, there's no persuasive aspect to this at all, right?

Right. These are not politically charged, or really charged by worldview at all. They're just plain, neutral statements that really have very little interest to most people, also, probably.

But what happened was, whether the statement was true or false, people believed it more if they saw it more times. So I've got a little chart in here of what happened with the false statements. You can have a look at it, Robert. As you can see, the new false statements, the false statements people saw for the very first time, hovered around, you know, like four or four point one across all three sessions. That would correspond to people saying they're uncertain: I don't know, I don't know whether French horn players get a cash bonus for staying in the army and playing into themselves and sticking the hand up the horn. But in the second session, the repeated false statements jumped up from about four point one to about four point five, and then in the third session, up again to about four point seven. And we only saw what happened with two sessions of repetition. Who knows what might have happened if you had continued adding more sessions? So just seeing a statement more than once appeared to make it more believable, even though it wasn't true. And the pattern was roughly the same for true statements, which isn't all that surprising, since the experiment was based on, you know, statements that people didn't know whether they were true or false to begin with.
So the authors wrote in conclusion, quote: "The present research has demonstrated that the repetition of a plausible statement increases a person's belief in the referential validity or truth of that statement."

I don't know why they had to say "referential validity." They could have just said "truth."

That's some science writing for you.

..."in the truth of that statement." And indeed, the present experiment appears to lend empirical support to the idea that, quote, if people are told something often enough, they'll believe it. So yeah, this is the first real study to find this. And there are a few other things the authors thought were worth considering. The fact that this effect was displayed in statements from a big, broad pool of different types of subject matter suggests this is not extremely context-dependent, right? It's not just going to be political beliefs that are subject to this. It seems to be all different kinds of statements in all different kinds of domains. Another thing they noted was that the effect was present for true statements and false statements. Either way, if students saw the claims more often, they believed them with greater confidence. But another takeaway is that the effect is not huge for false statements. Three exposures was roughly enough to get you from "I'm uncertain" to "it's possibly true."

But as we mentioned before, this is just two or three sessions, right?

Right. Who knows what would have happened if maybe you had done this more times in a row, or if there had been other factors affecting whether people were likely to believe these things to begin with, say, if they had valences to the person's political identity or something like that.

Yeah, you know, as we were researching this and discussing it, I couldn't help but think of notable examples of false stories about, generally, like, celebrities from the past. And I'm not going to mention any of them specifically.
Why not?

Well, because, you know, they all tend to be a bit crude. There are several of them about, like, tearing down various, sort of, you know, pretty-boy rockers or actors from the past. The interesting thing about them is these are generally, like, pre-internet stories that had to circulate by word of mouth. Or I think in one case there was talk of, like, a whole bunch of faxes going out in Hollywood, where someone basically just wanted to take somebody down because they didn't like them.

I remember, I think, like, ninth grade, starting to hear this bizarre story about Richard Gere.

Yeah, that's the main one I'm thinking of. And I think it basically comes down to: Richard Gere is a handsome, successful guy, and for a lot of people, you want to, like, really, you know, knock him down.

Or not you.

Yeah. And so when you encounter a bit of slander or libel like that, or just a ridiculous story, you're going to be more inclined to believe it if you kind of want it to be true, right? Or you're like, yeah, screw that guy, I'm gonna go ahead and believe this. Or even if I don't believe it, I'm going to pass it on.

But either way, whether or not you're predisposed to believe it's true, it looks like this initial study at least provides evidence that you would be more disposed to believe it's true in either case. Like, whatever your starting point is, it's gonna nudge you up.

Like, if nothing else, it becomes word association. Like, if you're not really a fan of, say, Richard Gere's work, and you can't name your favorite Richard Gere film off the top of your head, that might be the primary keyword that pops up when you hear his name.

Yeah, it could be. Yeah.
So this bizarre effect that we're talking about, where hearing a fact repeated, even if you've got no good reason to believe it's true, just hearing it repeated causes you to be more likely to believe it. This came to be known first as the truth effect, and then later on, probably a better title, the illusory truth effect.

I think we should use the second one, because otherwise that makes it sound true.

Yeah, it just makes it sound like, if you repeat something, then it is true. There is no question anymore.

So the basic version of the illusory truth effect is, quote, "people are more likely to judge repeated statements as true compared to new statements." And another way of putting it is that, all other things being equal, you're more likely to believe a claim if you've heard it before than one you haven't heard before, and the more times you hear the claim, the more likely you are to believe it. But so far we've just talked about one study, right? This one nineteen seventy-seven study, what we can call the Star Wars study, if you want.

The Star Wars study. Here, it's a fairly small sample, just one study. If you want to be skeptical and rigorous, maybe especially because this backs up folk wisdom, which is always something you should be careful about, we should see if the effect has been replicated by other researchers. And boy howdy, it has.

That's right. This next one comes to us from nineteen seventy-nine, in the Journal of Experimental Psychology: Human Learning and Memory, the work of Frederick T. Bacon.

Yeah, this was called "Credibility of Repeated Statements: Memory for Trivia." So Bacon was trying to replicate this effect. He performed additional experiments to test the previous team's conclusions and add some nuance.
So in his first experiment, he got ninety 573 00:30:52,280 --> 00:30:55,200 Speaker 1: eight undergrads, and they had two sessions in which they 574 00:30:55,200 --> 00:30:57,840 Speaker 1: were asked to rate sentences as true or false, with 575 00:30:58,000 --> 00:31:01,200 Speaker 1: three weeks between the two sessions. And Bacon found that 576 00:31:01,240 --> 00:31:05,480 Speaker 1: the repetition illusory truth effect was modulated by whether the 577 00:31:05,560 --> 00:31:10,280 Speaker 1: subjects consciously believed that a sentence had been repeated. That is, 578 00:31:10,520 --> 00:31:13,920 Speaker 1: if they remembered that they had seen the sentence last time, 579 00:31:14,280 --> 00:31:16,840 Speaker 1: they were more inclined to believe it. If they believed 580 00:31:16,880 --> 00:31:19,360 Speaker 1: they were seeing a sentence for the first time, they 581 00:31:19,360 --> 00:31:21,760 Speaker 1: were less likely to believe it. And this was true 582 00:31:21,840 --> 00:31:25,640 Speaker 1: regardless of the statements themselves. I can't help but think 583 00:31:25,680 --> 00:31:28,840 Speaker 1: of our modern version of this with the Facebook feed, right, 584 00:31:29,200 --> 00:31:32,520 Speaker 1: because inevitably, if you're a Facebook user, or perhaps 585 00:31:32,560 --> 00:31:34,800 Speaker 1: if you're a Twitter user or on some other social media, 586 00:31:35,040 --> 00:31:37,200 Speaker 1: you're scrolling down, right, and there are a lot 587 00:31:37,240 --> 00:31:39,640 Speaker 1: of sentences coming at you. Some you just kind of 588 00:31:39,640 --> 00:31:41,640 Speaker 1: read in passing, some maybe you don't read at all. 589 00:31:42,720 --> 00:31:45,560 Speaker 1: But are you actually stopping to really think about what 590 00:31:45,680 --> 00:31:49,400 Speaker 1: a particular headline or paragraph is saying, 591 00:31:49,880 --> 00:31:51,920 Speaker 1: or is it just kind of scrolling in the background 592 00:31:51,920 --> 00:31:54,160 Speaker 1: of your mind? Yeah. And the result of this one 593 00:31:54,160 --> 00:31:57,640 Speaker 1: experiment here would seem to indicate, if it has validity, 594 00:31:57,640 --> 00:31:59,840 Speaker 1: it would mean the ones you stop and pay attention 595 00:31:59,880 --> 00:32:02,280 Speaker 1: to and make a memory about are the ones 596 00:32:02,320 --> 00:32:05,880 Speaker 1: you're more likely to believe later on. But then also, 597 00:32:06,560 --> 00:32:08,840 Speaker 1: in another experiment, he had a group of sixty four 598 00:32:08,880 --> 00:32:12,480 Speaker 1: undergrads, and he replicated the illusory truth effect and found 599 00:32:12,480 --> 00:32:15,800 Speaker 1: that students believed repeated statements to be more credible even 600 00:32:15,840 --> 00:32:19,120 Speaker 1: if the students were informed that the statements were being repeated. 601 00:32:19,400 --> 00:32:22,360 Speaker 1: So you can directly tell somebody, hey, I know I 602 00:32:22,440 --> 00:32:25,000 Speaker 1: just asked you if it was true or false that 603 00:32:25,120 --> 00:32:28,200 Speaker 1: zebras can automatically detach their own tongues and fling the 604 00:32:28,200 --> 00:32:31,640 Speaker 1: tongues at attacking hyenas. I asked you that same thing 605 00:32:31,680 --> 00:32:33,960 Speaker 1: three weeks ago.
It may or may not be true, 606 00:32:34,840 --> 00:32:38,240 Speaker 1: and even in this case, repeating the statement still makes 607 00:32:38,280 --> 00:32:40,440 Speaker 1: them judge it to be more true than statements they're 608 00:32:40,440 --> 00:32:43,000 Speaker 1: seeing for the first time. So you can warn people 609 00:32:43,080 --> 00:32:45,480 Speaker 1: that something fishy is going on and they still fall 610 00:32:45,560 --> 00:32:48,880 Speaker 1: for it. So you could straight up share 611 00:32:48,880 --> 00:32:54,000 Speaker 1: a piece of just undeniably fake news on social media 612 00:32:54,280 --> 00:32:56,800 Speaker 1: and say, hey, guys, this is fake news. 613 00:32:56,840 --> 00:33:00,040 Speaker 1: This has been totally debunked. You can look it 614 00:33:00,120 --> 00:33:03,320 Speaker 1: up on Snopes, etcetera. And that's still not going to 615 00:33:03,360 --> 00:33:06,000 Speaker 1: completely disarm the piece that you're sharing. Well, I would say, yes, we will 616 00:33:06,000 --> 00:33:08,680 Speaker 1: talk more 617 00:33:08,720 --> 00:33:11,160 Speaker 1: about that in the second episode, where this kind of 618 00:33:11,160 --> 00:33:14,800 Speaker 1: thing comes into conflict with real world beliefs. And just 619 00:33:14,840 --> 00:33:16,800 Speaker 1: to be clear, I made up that zebra thing; 620 00:33:16,800 --> 00:33:19,320 Speaker 1: that wasn't from the Bacon study. I thought that would 621 00:33:19,360 --> 00:33:22,280 Speaker 1: be clear. But that's, number one, not true. Number two, 622 00:33:22,480 --> 00:33:24,200 Speaker 1: as far as I know, not one of the examples 623 00:33:24,200 --> 00:33:26,160 Speaker 1: Bacon used. Right. Well, I'm sorry you had to drag 624 00:33:26,240 --> 00:33:28,680 Speaker 1: zebras into all this, Joe. Well, you know, I like 625 00:33:28,760 --> 00:33:32,800 Speaker 1: the idea of a weaponized tongue that's gonna go beyond 626 00:33:32,840 --> 00:33:36,200 Speaker 1: the X-Men. Surely that exists in reality. Well yes, 627 00:33:36,280 --> 00:33:38,520 Speaker 1: but not with zebras. No. No, I guess that's 628 00:33:38,560 --> 00:33:41,840 Speaker 1: amphibians and stuff. Okay, okay. So back to the study. 629 00:33:42,000 --> 00:33:45,080 Speaker 1: So Bacon says in his abstract, quote, it was further 630 00:33:45,200 --> 00:33:49,800 Speaker 1: determined that statements that contradicted early ones were rated as 631 00:33:49,920 --> 00:33:55,720 Speaker 1: relatively true if misclassified as repetitions, but that statements judged 632 00:33:55,760 --> 00:33:59,440 Speaker 1: to be changed were rated as relatively false. So even 633 00:33:59,520 --> 00:34:02,840 Speaker 1: if you only misremember that you saw something before, you're more 634 00:34:02,920 --> 00:34:05,920 Speaker 1: likely to believe it's true. It's kind of odd. That 635 00:34:06,040 --> 00:34:09,960 Speaker 1: makes you wonder, what's the 636 00:34:10,000 --> 00:34:13,080 Speaker 1: initial stimulus that caused you to misremember that you had 637 00:34:13,120 --> 00:34:16,080 Speaker 1: seen it before? Well, as we've discussed on the show before, 638 00:34:16,120 --> 00:34:19,200 Speaker 1: I mean, there are multiple ways that false memories can 639 00:34:19,200 --> 00:34:22,640 Speaker 1: be encoded. Oh yeah, absolutely.
And so Bacon 640 00:40:22,760 --> 00:40:26,520 Speaker 1: concludes that basically, people are predisposed to believe statements that 641 00:40:26,600 --> 00:40:31,320 Speaker 1: affirm existing knowledge and to disbelieve statements that contradict existing knowledge. 642 00:40:31,400 --> 00:40:34,840 Speaker 1: That's not all that unusual, right? But it's specifically 643 00:40:34,920 --> 00:40:38,040 Speaker 1: the repetition effect that seems to be playing a role here. 644 00:40:38,600 --> 00:40:40,799 Speaker 1: Let's take a look at another study. How about nineteen eighty 645 00:40:41,320 --> 00:40:45,800 Speaker 1: two: Marian Schwartz, Repetition and Rated Truth Value of Statements, 646 00:40:45,840 --> 00:40:49,680 Speaker 1: from the American Journal of Psychology. So Schwartz here 647 00:40:49,920 --> 00:40:53,239 Speaker 1: conducted two experiments on what psychologists were by this time 648 00:40:53,280 --> 00:40:56,120 Speaker 1: calling the truth effect, what we're calling the illusory truth 649 00:40:56,160 --> 00:40:59,560 Speaker 1: effect. So, experiment one, you get a group of 650 00:40:59,560 --> 00:41:02,160 Speaker 1: subjects and they rate claims on a seven point truth 651 00:41:02,239 --> 00:41:04,239 Speaker 1: value scale, just like in the first study, the Star 652 00:41:04,280 --> 00:41:07,760 Speaker 1: Wars study, the seventy seven study. And a different 653 00:41:07,760 --> 00:41:10,640 Speaker 1: group of subjects rated the same statements on a seven 654 00:41:10,680 --> 00:41:13,960 Speaker 1: point scale of how familiar they were with the statements 655 00:41:14,000 --> 00:41:17,560 Speaker 1: before the experiment started. How familiar are you with this? 656 00:41:18,400 --> 00:41:23,200 Speaker 1: Repetition increased both ratings. So both pre-experimental familiarity as 657 00:41:23,200 --> 00:41:26,319 Speaker 1: well as the perceived truth value, they both went up 658 00:41:26,320 --> 00:41:29,799 Speaker 1: when people saw them more than once. That's not surprising. 659 00:41:29,840 --> 00:41:33,400 Speaker 1: Again, the replication, and then also the fact that you 660 00:41:33,440 --> 00:41:35,440 Speaker 1: have seen something before will tend to make you more 661 00:41:35,480 --> 00:41:39,000 Speaker 1: familiar with it. Then you've got another experiment here. The second 662 00:41:39,040 --> 00:41:42,440 Speaker 1: one replicated the illusory truth effect again, and found that it 663 00:41:42,480 --> 00:41:45,879 Speaker 1: didn't matter whether you mixed up repeated statements that people 664 00:41:45,920 --> 00:41:49,160 Speaker 1: had seen before with new statements or only showed them 665 00:41:49,239 --> 00:41:53,799 Speaker 1: repeated statements. Either way, belief in repeated statements went up. 666 00:41:53,840 --> 00:41:55,640 Speaker 1: And this was done so that they could rule out 667 00:41:55,640 --> 00:41:57,879 Speaker 1: the possibility, they're thinking, you know, maybe it's only by 668 00:41:57,920 --> 00:42:01,479 Speaker 1: contrast with new and unfamiliar statements that repeated ones seem 669 00:42:01,520 --> 00:42:04,360 Speaker 1: more credible. That is not the case. Either way you 670 00:42:04,400 --> 00:42:06,839 Speaker 1: do it, if you've seen it before, you believe it more.
671 00:36:07,480 --> 00:36:09,920 Speaker 1: And so this study was taken as evidence that the 672 00:36:09,960 --> 00:36:13,640 Speaker 1: feeling of familiarity with an idea might be an important part, 673 00:36:13,719 --> 00:36:16,160 Speaker 1: or even the most important part, of how we judge 674 00:36:16,200 --> 00:36:19,520 Speaker 1: something as true or plausible. But we should shift to 675 00:36:19,560 --> 00:36:24,680 Speaker 1: asking the question of why. Why would increasing familiarity with 676 00:36:24,719 --> 00:36:28,640 Speaker 1: a statement through repetition make it seem more true to us? 677 00:36:29,320 --> 00:36:32,000 Speaker 1: It makes me think about this passage from Wittgenstein in 678 00:36:32,080 --> 00:36:35,480 Speaker 1: his Philosophical Investigations, about how absurd it would be to 679 00:36:35,560 --> 00:36:39,560 Speaker 1: use repetition of a mental representation as evidence that the 680 00:36:39,600 --> 00:36:43,800 Speaker 1: representation is correct. He writes, quote, for example, I don't 681 00:36:43,800 --> 00:36:46,480 Speaker 1: know if I've remembered the time of departure of a 682 00:36:46,560 --> 00:36:49,520 Speaker 1: train right, and to check it, I call to mind 683 00:36:49,600 --> 00:36:52,759 Speaker 1: how a page of the timetable looked. Isn't it the 684 00:36:52,800 --> 00:36:56,360 Speaker 1: same here? No, for this process has got to produce 685 00:36:56,360 --> 00:36:59,840 Speaker 1: a memory which is actually correct. If the mental image 686 00:36:59,840 --> 00:37:03,640 Speaker 1: of the timetable could not itself be tested for correctness, 687 00:37:03,680 --> 00:37:06,600 Speaker 1: how could it confirm the correctness of the first memory? 688 00:37:07,160 --> 00:37:10,080 Speaker 1: As if someone were to buy several copies of the 689 00:37:10,080 --> 00:37:13,520 Speaker 1: morning paper to assure himself that what it said was true. 690 00:37:14,040 --> 00:37:16,000 Speaker 1: And that's kind of what we're doing. Like, he's 691 00:37:16,000 --> 00:37:18,440 Speaker 1: talking about mental images, but the general point is a 692 00:37:18,440 --> 00:37:22,040 Speaker 1: good one. We're essentially buying several copies of the same 693 00:37:22,120 --> 00:37:26,880 Speaker 1: newspaper to increase our belief that what the newspaper 694 00:37:26,960 --> 00:37:32,319 Speaker 1: says is actually accurate. Now, one possible interpretation that 695 00:37:32,360 --> 00:37:35,240 Speaker 1: comes to mind is just, like, the idea of, say, 696 00:37:35,600 --> 00:37:39,600 Speaker 1: picking out stepping stones to cross a creek. Right, 697 00:37:40,160 --> 00:37:42,480 Speaker 1: you step to one stone and it doesn't slip out 698 00:37:42,520 --> 00:37:44,719 Speaker 1: from underneath you, and so you use that to 699 00:37:45,640 --> 00:37:47,879 Speaker 1: make your way across the other stones and hopefully make 700 00:37:47,880 --> 00:37:50,680 Speaker 1: it across the entire creek without getting your feet wet 701 00:37:50,760 --> 00:37:53,920 Speaker 1: or falling in and being swept downstream to 702 00:37:53,960 --> 00:37:57,920 Speaker 1: the waterfall. So to what extent are we just, like, 703 00:37:58,040 --> 00:38:02,600 Speaker 1: trusting anything that hasn't resulted in catastrophe thus far?
Well, 704 00:38:02,800 --> 00:38:05,200 Speaker 1: I would say that it would make more sense for 705 00:38:05,280 --> 00:38:10,000 Speaker 1: that to be true with sort of embodied, physical, experiential 706 00:38:10,120 --> 00:38:12,520 Speaker 1: knowledge about the world than it would for that to 707 00:38:12,560 --> 00:38:15,279 Speaker 1: apply to semantic knowledge of 708 00:38:15,400 --> 00:38:17,959 Speaker 1: things people tell us. Or maybe our brains just aren't 709 00:38:18,000 --> 00:38:22,080 Speaker 1: good at differentiating semantic knowledge that's imparted through words 710 00:38:22,160 --> 00:38:25,239 Speaker 1: from that embodied kind. You know, maybe somebody saying all those stones will hold 711 00:38:25,280 --> 00:38:27,400 Speaker 1: you up is encoded by the brain in sort of 712 00:38:27,400 --> 00:38:30,160 Speaker 1: the same way as testing out one stone at a time. 713 00:38:31,040 --> 00:38:33,440 Speaker 1: Yeah, I don't know. So this is what we 714 00:38:33,440 --> 00:38:35,319 Speaker 1: should explore for the rest of the episode, I think: 715 00:38:35,680 --> 00:38:40,520 Speaker 1: why should repeatedly exposing ourselves to the same information increase 716 00:38:40,560 --> 00:38:43,440 Speaker 1: our confidence in it if we didn't have good reasons 717 00:38:43,480 --> 00:38:45,359 Speaker 1: to believe it the first time? It's clear that this 718 00:38:45,400 --> 00:38:48,040 Speaker 1: is what's happening, but why does it happen this way? 719 00:38:48,200 --> 00:38:49,920 Speaker 1: All right, we'll take one more break, and when we 720 00:38:50,000 --> 00:38:55,000 Speaker 1: come back, we'll jump into this. Alright, we're back. 721 00:38:55,480 --> 00:39:00,000 Speaker 1: So we're asking this question of why repeatedly exposing ourselves 722 00:39:00,080 --> 00:39:04,120 Speaker 1: to the same information would increase our confidence if we 723 00:39:04,160 --> 00:39:07,200 Speaker 1: didn't have good reasons to believe the information the first time. 724 00:39:07,520 --> 00:39:10,360 Speaker 1: It's clear from several experiments that this is what happens 725 00:39:10,400 --> 00:39:13,160 Speaker 1: in our brains. If a statement is repeated, we 726 00:39:13,239 --> 00:39:16,319 Speaker 1: believe it more. But why do our brains work that way? 727 00:39:16,320 --> 00:39:20,920 Speaker 1: It doesn't necessarily make sense. Yeah. And one possible interpretation 728 00:39:20,920 --> 00:39:22,840 Speaker 1: that came to mind is, of course, we've touched on 729 00:39:22,880 --> 00:39:26,719 Speaker 1: this before, that we're all social animals. Yeah, so 730 00:39:26,880 --> 00:39:29,319 Speaker 1: I've wondered if this is a byproduct of 731 00:39:29,320 --> 00:39:31,840 Speaker 1: the drive to fit in with a given group or tribe, 732 00:39:32,280 --> 00:39:35,560 Speaker 1: that there's ultimately a survival advantage in getting along with 733 00:39:35,560 --> 00:39:38,480 Speaker 1: the group. And so does that bleed over into highly 734 00:39:38,520 --> 00:39:42,279 Speaker 1: repeated or highly circulated lies or untruths? So basically, like, 735 00:39:42,400 --> 00:39:45,280 Speaker 1: if there is a lie going around in the group, 736 00:39:45,400 --> 00:39:47,600 Speaker 1: you'll get along with the group better if you just 737 00:39:47,680 --> 00:39:50,600 Speaker 1: accept the lie.
Yeah. And certainly, 738 00:39:50,600 --> 00:39:52,520 Speaker 1: after looking at more of the research, I'm not arguing 739 00:39:52,520 --> 00:39:56,560 Speaker 1: that that is the core mechanism involved here. It's 740 00:39:56,680 --> 00:39:58,600 Speaker 1: worth exploring, though. But I do 741 00:39:58,920 --> 00:40:01,040 Speaker 1: wonder to what extent it's playing a role. 742 00:40:01,080 --> 00:40:03,360 Speaker 1: Because we all have our groups that 743 00:40:03,400 --> 00:40:05,920 Speaker 1: we are involved in, our friends, our family, 744 00:40:06,000 --> 00:40:08,680 Speaker 1: our work groups, our social media groups, the 745 00:40:08,680 --> 00:40:12,120 Speaker 1: sort of echo chambers that we find online. And 746 00:40:12,440 --> 00:40:14,640 Speaker 1: does it make you more susceptible to the lie just 747 00:40:14,680 --> 00:40:20,440 Speaker 1: because there is this ingrained need to fit in 748 00:40:20,600 --> 00:40:23,879 Speaker 1: with that group, to share the same values, and, 749 00:40:23,960 --> 00:40:27,080 Speaker 1: to put it in the prehistoric framework, to 750 00:40:27,480 --> 00:40:30,040 Speaker 1: continue to have access to the fire and 751 00:40:30,200 --> 00:40:32,560 Speaker 1: the feast? Yeah, I think that's a possibility 752 00:40:32,560 --> 00:40:35,560 Speaker 1: worth exploring. Let's take a look at it. Okay. Well, 753 00:40:35,360 --> 00:40:37,200 Speaker 1: I started looking into this a little bit, and I 754 00:40:37,680 --> 00:40:42,200 Speaker 1: ran across a paper titled The Evolution of 755 00:40:42,320 --> 00:40:45,880 Speaker 1: Misbelief, from two thousand nine. This is published in Behavioral 756 00:40:45,920 --> 00:40:48,800 Speaker 1: and Brain Sciences, and it was by Ryan T. McKay 757 00:40:49,000 --> 00:40:53,960 Speaker 1: and Daniel Dennett. Daniel Dennett! All right, so they approach 758 00:40:54,040 --> 00:40:56,920 Speaker 1: the following, I guess you could call it a paradox, 759 00:40:56,920 --> 00:41:00,080 Speaker 1: in the paper. Given that we evolved to thrive in 760 00:41:00,239 --> 00:41:02,919 Speaker 1: a fact based world, what other kind of world could 761 00:41:02,920 --> 00:41:05,320 Speaker 1: there be? Exactly. Yeah, I mean, we're dealing with 762 00:41:05,480 --> 00:41:09,000 Speaker 1: actual reality here. But given that we've evolved to 763 00:41:09,040 --> 00:41:13,279 Speaker 1: thrive in this world, shouldn't true beliefs be adaptive and 764 00:41:13,400 --> 00:41:18,200 Speaker 1: misbeliefs be maladaptive? It's clear that in many cases, probably 765 00:41:18,239 --> 00:41:21,600 Speaker 1: most cases, that is the way things are. Right. Believing 766 00:41:21,640 --> 00:41:23,600 Speaker 1: that you are able to fly off the edge of 767 00:41:23,600 --> 00:41:27,000 Speaker 1: a cliff is not good for you. Believing that polar 768 00:41:27,040 --> 00:41:30,560 Speaker 1: bears want to cuddle with you is not advantageous. Holding 769 00:41:30,600 --> 00:41:34,200 Speaker 1: false beliefs like this doesn't work out well for people. Yeah, 770 00:41:34,200 --> 00:41:38,640 Speaker 1: those are reckless and dangerous misbeliefs; clearly, 771 00:41:38,880 --> 00:41:40,920 Speaker 1: if you've reached the point where you're believing that, 772 00:41:41,320 --> 00:41:43,640 Speaker 1: you're going to go extinct.
So it's obvious that there 773 00:41:43,719 --> 00:41:45,640 Speaker 1: is going to be at least some kind of major 774 00:41:45,719 --> 00:41:49,760 Speaker 1: selection pressure for shaping brains that believe 775 00:41:49,920 --> 00:41:54,320 Speaker 1: mostly true things, unless there are cases where the benefit of believing something 776 00:41:54,320 --> 00:41:59,879 Speaker 1: that's false outweighs the drawbacks, essentially. So 777 00:42:00,320 --> 00:42:03,920 Speaker 1: here's what they wrote, quote: On this assumption, our beliefs 778 00:42:03,920 --> 00:42:06,880 Speaker 1: about the world are essentially tools that enable us to 779 00:42:07,040 --> 00:42:11,919 Speaker 1: act effectively in the world. Moreover, to be reliable, such 780 00:42:11,920 --> 00:42:14,879 Speaker 1: tools must be produced in us, it is assumed, by 781 00:42:14,920 --> 00:42:18,880 Speaker 1: systems designed by evolution to be truth-aiming, and hence, 782 00:42:19,239 --> 00:42:24,400 Speaker 1: barring miracles, these systems must be designed to generate grounded beliefs. 783 00:42:24,440 --> 00:42:28,200 Speaker 1: A system for generating ungrounded but mostly true beliefs would 784 00:42:28,239 --> 00:42:31,960 Speaker 1: be an oracle, as impossible as a perpetual motion machine. 785 00:42:32,280 --> 00:42:33,879 Speaker 1: I like that. Yeah. So there's got to be, like, 786 00:42:33,920 --> 00:42:37,720 Speaker 1: a grounding procedure through which we can discover true beliefs 787 00:42:37,800 --> 00:42:39,919 Speaker 1: if we're going to have them. Otherwise we're just talking 788 00:42:39,920 --> 00:42:43,000 Speaker 1: about magic. But we have to account for these varying 789 00:42:43,080 --> 00:42:46,120 Speaker 1: levels of misbelief and self deception in the human experience. 790 00:42:46,440 --> 00:42:49,640 Speaker 1: They write: If evolution has designed us to appraise the 791 00:42:49,640 --> 00:42:52,920 Speaker 1: world accurately and to form true beliefs, how are we 792 00:42:53,000 --> 00:42:56,640 Speaker 1: to account for the routine exceptions to this rule, instances 793 00:42:56,680 --> 00:43:00,759 Speaker 1: of misbelief? Most of us, at times, believe propositions that 794 00:43:00,920 --> 00:43:04,279 Speaker 1: end up being disproved. Many of us produce beliefs that 795 00:43:04,320 --> 00:43:07,480 Speaker 1: others consider obviously false to begin with, and some of 796 00:43:07,520 --> 00:43:11,320 Speaker 1: us form beliefs that are not just manifestly but bizarrely false. 797 00:43:11,800 --> 00:43:15,880 Speaker 1: How can this be? Are all these misbeliefs just accidents, 798 00:43:15,920 --> 00:43:20,480 Speaker 1: incidences of pathology or breakdown, or at best undesirable but 799 00:43:20,560 --> 00:43:24,920 Speaker 1: tolerable byproducts? Might some of them, contra the default presumption, 800 00:43:25,320 --> 00:43:29,200 Speaker 1: be adaptive in and of themselves? I like this distinction 801 00:43:29,239 --> 00:43:31,520 Speaker 1: they're making. I think this is actually useful. So they're 802 00:43:31,520 --> 00:43:35,839 Speaker 1: breaking misbeliefs down into two basic categories. Right, 803 00:43:36,280 --> 00:43:39,600 Speaker 1: right. One: those resulting from a 804 00:43:39,680 --> 00:43:43,440 Speaker 1: breakdown in the normal functioning of the belief formation system.
805 00:43:43,680 --> 00:43:48,279 Speaker 1: This would be delusions, malfunctions, so things like face blindness 806 00:43:48,520 --> 00:43:51,720 Speaker 1: or Cotard syndrome. Okay, this is when the brain 807 00:43:52,000 --> 00:43:56,239 Speaker 1: is creating incorrect beliefs because it's not working right; it's 808 00:43:56,280 --> 00:43:58,879 Speaker 1: not doing what it's supposed to be doing. But then 809 00:43:58,920 --> 00:44:02,600 Speaker 1: the second category are those that are arising in the 810 00:44:02,640 --> 00:44:06,680 Speaker 1: normal course of that system's operations, so beliefs based on 811 00:44:06,800 --> 00:44:10,000 Speaker 1: incomplete or inaccurate information. This would be 812 00:44:10,000 --> 00:44:12,640 Speaker 1: a case of manufacture. And we'll get into examples of 813 00:44:12,640 --> 00:44:14,520 Speaker 1: this in a second. There could be tons of examples. 814 00:44:14,520 --> 00:44:16,200 Speaker 1: One that comes to my mind that would be an 815 00:44:16,200 --> 00:44:18,919 Speaker 1: example of this would be optical illusions. When 816 00:44:18,920 --> 00:44:21,840 Speaker 1: you witness an optical illusion, you have a false belief 817 00:44:21,960 --> 00:44:23,840 Speaker 1: that has been generated by your brain. But it's not 818 00:44:23,880 --> 00:44:27,280 Speaker 1: because your brain is doing anything wrong. It's just because 819 00:44:27,360 --> 00:44:30,200 Speaker 1: it's being exploited by a situation that's not part 820 00:44:30,200 --> 00:44:33,879 Speaker 1: of what it normally needs to do. Right. Yeah, 821 00:44:34,040 --> 00:44:36,319 Speaker 1: they point out that it's easy to think of 822 00:44:36,360 --> 00:44:39,080 Speaker 1: these in light of an artifact. Is it failing 823 00:44:39,160 --> 00:44:41,279 Speaker 1: due to a limitation in the design in a way 824 00:44:41,280 --> 00:44:44,480 Speaker 1: that is culpable or tolerable? Examples here being, say, a 825 00:44:44,560 --> 00:44:47,680 Speaker 1: clock that doesn't keep good time versus a 826 00:44:47,719 --> 00:44:50,800 Speaker 1: toaster oven that doesn't keep time at all. You can't 827 00:44:50,800 --> 00:44:53,680 Speaker 1: expect the toaster oven to keep time unless it's 828 00:44:53,680 --> 00:44:55,680 Speaker 1: got a timer, right? Well, yes, that's true. Yes, 829 00:44:55,960 --> 00:44:58,280 Speaker 1: I would have said a purple donkey built by muscular 830 00:44:58,280 --> 00:45:02,160 Speaker 1: elves that doesn't keep time, because you wouldn't even expect that. True? 831 00:45:02,280 --> 00:45:04,520 Speaker 1: Yes. Now, but it gets more complicated when you 832 00:45:04,520 --> 00:45:08,839 Speaker 1: go into the biological realm, because what counts as immune function or dysfunction 833 00:45:09,280 --> 00:45:13,040 Speaker 1: with, say, a pathogen infection? Ultimately, the 834 00:45:13,080 --> 00:45:16,160 Speaker 1: immune system errs by defending the body against, say, a 835 00:45:16,239 --> 00:45:20,760 Speaker 1: transplanted organ: the body 836 00:45:20,840 --> 00:45:23,560 Speaker 1: is going to attempt to reject that 837 00:45:23,680 --> 00:45:28,000 Speaker 1: heart transplant, even though the heart transplant 838 00:45:28,080 --> 00:45:30,399 Speaker 1: could save the patient.
So it seems like in order 839 00:45:30,440 --> 00:45:34,279 Speaker 1: to understand this, you almost have to understand the context. Right, right. 840 00:45:34,320 --> 00:45:37,600 Speaker 1: They invoke the work of Ruth Garrett Millikan, 841 00:45:38,120 --> 00:45:40,520 Speaker 1: in saying that we can't look to an organ's 842 00:45:40,560 --> 00:45:43,360 Speaker 1: current properties or dispositions; we have to look to its history. 843 00:45:43,640 --> 00:45:46,319 Speaker 1: That makes sense to me. Organ transplants, of course, are 844 00:45:46,400 --> 00:45:49,200 Speaker 1: not part of our evolutionary history. So this is just 845 00:45:49,239 --> 00:45:52,799 Speaker 1: the body functioning normally in rejecting the invader heart. Right. 846 00:45:52,840 --> 00:45:56,000 Speaker 1: The body is not malfunctioning; it is doing what it's supposed 847 00:45:56,040 --> 00:45:58,000 Speaker 1: to do. We're just throwing a situation at it that 848 00:45:58,040 --> 00:46:01,160 Speaker 1: it's not prepared to deal with. Yeah. So that brings 849 00:46:01,200 --> 00:46:04,959 Speaker 1: us to the more human examples, you know, lies and 850 00:46:05,080 --> 00:46:07,920 Speaker 1: so forth. Oh, that's interesting. So a lie 851 00:46:08,000 --> 00:46:11,000 Speaker 1: could be, like, a thing that our bodies were not 852 00:46:11,120 --> 00:46:14,239 Speaker 1: really prepared to deal with very well, which is 853 00:46:14,280 --> 00:46:17,719 Speaker 1: weird to think of because of how common lies are. Yeah. 854 00:46:17,719 --> 00:46:20,160 Speaker 1: They write: However adaptive it may be for us to 855 00:46:20,200 --> 00:46:23,719 Speaker 1: believe truly, it may be adaptive for other parties if 856 00:46:23,760 --> 00:46:26,840 Speaker 1: we believe falsely. Now, of course, this is something, 857 00:46:26,880 --> 00:46:29,080 Speaker 1: just to interject here, I think this is something 858 00:46:29,120 --> 00:46:33,040 Speaker 1: that we ultimately see holds true with other animals, 859 00:46:33,040 --> 00:46:36,080 Speaker 1: like the role of deception, 860 00:46:36,080 --> 00:46:40,240 Speaker 1: certainly in hunting and defense, and even in acquiring mates. 861 00:46:40,960 --> 00:46:44,520 Speaker 1: They continue: An evolutionary arms race of deceptive ploys and 862 00:46:44,560 --> 00:46:49,160 Speaker 1: counterploys may thus ensue. In some cases, the other parties 863 00:46:49,200 --> 00:46:52,560 Speaker 1: in question may not even be animate agents, but cultural 864 00:46:52,600 --> 00:46:55,920 Speaker 1: traits or systems. Although such cases are interesting in their 865 00:46:55,920 --> 00:46:59,480 Speaker 1: own right, the adaptive misbeliefs we pursue in this article 866 00:46:59,760 --> 00:47:03,440 Speaker 1: are beneficial to their consumers. Misbeliefs that evolved to the 867 00:47:03,480 --> 00:47:07,200 Speaker 1: detriment of their believers are not our quarries. So they 868 00:47:07,280 --> 00:47:10,400 Speaker 1: stress the difference between beliefs and what they refer to 869 00:47:10,440 --> 00:47:14,719 Speaker 1: as aliefs. For instance, if 870 00:47:14,719 --> 00:47:17,600 Speaker 1: I'm freaked out by tall buildings, I might not believe 871 00:47:17,680 --> 00:47:19,840 Speaker 1: that I'm going to fall off, but I might a- 872 00:47:19,960 --> 00:47:22,680 Speaker 1: lieve that I'm going to fall off. Alieve, as 873 00:47:22,719 --> 00:47:26,279 Speaker 1: in, like, a-lieve? Yes.
Yeah, and in this 874 00:47:26,320 --> 00:47:28,440 Speaker 1: case it seems to be something that is a 875 00:47:28,440 --> 00:47:31,960 Speaker 1: tolerated side effect of an imperfect system. But it's not 876 00:47:32,080 --> 00:47:33,960 Speaker 1: McKay and Dennett who end up bringing up the 877 00:47:34,000 --> 00:47:39,600 Speaker 1: illusory truth effect, but psychologist Pascal Boyer, in commentary 878 00:47:39,719 --> 00:47:43,680 Speaker 1: on the paper. This particular paper from McKay 879 00:47:43,680 --> 00:47:46,000 Speaker 1: and Dennett, by the way, is available online. I'll try to 880 00:47:46,000 --> 00:47:47,520 Speaker 1: include a link to it on the landing page for 881 00:47:47,560 --> 00:47:52,840 Speaker 1: this episode. But in his commentary, Boyer writes: Dramatic memory 882 00:47:52,880 --> 00:47:56,640 Speaker 1: distortions seem to influence belief fixation. For instance, in the 883 00:47:56,680 --> 00:48:00,200 Speaker 1: illusory truth effect, statements read several times are more likely 884 00:48:00,560 --> 00:48:04,080 Speaker 1: rated as true than statements read only once. People who 885 00:48:04,160 --> 00:48:07,440 Speaker 1: repeatedly imagine performing a particular action may end up believing 886 00:48:07,440 --> 00:48:09,879 Speaker 1: they actually performed it. Oh yeah, this is something I've 887 00:48:09,920 --> 00:48:12,279 Speaker 1: read before. Yeah, so if you just, like, 888 00:48:12,320 --> 00:48:15,440 Speaker 1: have people walk through a task in their mind and 889 00:48:15,440 --> 00:48:17,960 Speaker 1: then ask them later if they remember doing it, a 890 00:48:17,960 --> 00:48:21,120 Speaker 1: lot of times they remember physically acting it out. Yeah, 891 00:48:21,160 --> 00:48:24,160 Speaker 1: I've certainly had this occur with me. Like, there'll be 892 00:48:24,200 --> 00:48:26,160 Speaker 1: something I need to do and I'm thinking about doing it, 893 00:48:26,440 --> 00:48:28,960 Speaker 1: and then I can't remember if I actually carried it out. 894 00:48:29,840 --> 00:48:33,680 Speaker 1: And this is called imagination inflation. He writes: 895 00:48:34,160 --> 00:48:37,720 Speaker 1: Misinformation paradigms show that most people are vulnerable to memory 896 00:48:37,760 --> 00:48:42,400 Speaker 1: revision when plausible information is implied by experimenters. In social 897 00:48:42,920 --> 00:48:45,799 Speaker 1: contagion protocols, people tend to believe they actually saw what 898 00:48:45,960 --> 00:48:48,920 Speaker 1: is in fact suggested by the confederate with whom they 899 00:48:48,920 --> 00:48:51,920 Speaker 1: watched a video. So he's just listing lots of 900 00:48:51,960 --> 00:48:55,160 Speaker 1: the ways that we end up with false beliefs. 901 00:48:55,200 --> 00:48:59,120 Speaker 1: There's a plethora of examples of mechanisms for putting false 902 00:48:59,120 --> 00:49:01,800 Speaker 1: beliefs in our brain. Right. Yeah, I know there's a 903 00:49:01,840 --> 00:49:05,080 Speaker 1: lot of territory covered in this paper and in the attached responses, 904 00:49:05,120 --> 00:49:08,480 Speaker 1: but I keep coming back to the sort of 905 00:49:08,640 --> 00:49:11,759 Speaker 1: key reason that I sought it out. Like, when 906 00:49:11,840 --> 00:49:14,640 Speaker 1: is self-deception helpful? Is it necessary for the 907 00:49:14,680 --> 00:49:18,840 Speaker 1: deception of others?
It doesn't quite seem to be. Like, 908 00:49:19,160 --> 00:49:21,200 Speaker 1: you don't have to believe the lie yourself to tell 909 00:49:21,200 --> 00:49:25,200 Speaker 1: someone else the lie, regardless of what telling the lie 910 00:49:25,239 --> 00:49:29,200 Speaker 1: repeatedly might do to you. Well, so Boyer's skeptical 911 00:49:29,239 --> 00:49:31,560 Speaker 1: of the idea. Right. So is he basically saying, like, 912 00:49:31,719 --> 00:49:36,080 Speaker 1: you don't want to overstate the adaptiveness of believing lies? 913 00:49:36,719 --> 00:49:39,360 Speaker 1: Yeah, he argues that memory need only be 914 00:49:39,400 --> 00:49:43,200 Speaker 1: as good as the advantage in decision making it affords. Okay, 915 00:49:43,200 --> 00:49:46,840 Speaker 1: so he's essentially going for the byproduct thing for 916 00:49:47,160 --> 00:49:50,360 Speaker 1: most misbeliefs. He's saying, like, look, you know, memory 917 00:49:50,440 --> 00:49:53,920 Speaker 1: needs to do certain things, and in the course of 918 00:49:53,960 --> 00:49:56,440 Speaker 1: doing those things, it may generate some false beliefs. We 919 00:49:56,480 --> 00:50:01,439 Speaker 1: don't have to assume that those false beliefs themselves are beneficial. Right. Yeah, 920 00:50:01,520 --> 00:50:03,680 Speaker 1: and to come back to McKay and Dennett, they 921 00:50:03,680 --> 00:50:06,400 Speaker 1: point out that natural selection doesn't seem to care about truth. 922 00:50:06,880 --> 00:50:09,640 Speaker 1: It only cares about reproductive success, so there are 923 00:50:09,680 --> 00:50:12,520 Speaker 1: various cases where a particular false belief or misbelief is 924 00:50:12,560 --> 00:50:16,760 Speaker 1: seemingly adaptive. You believe in a nonexistent fire god, okay, 925 00:50:16,840 --> 00:50:20,880 Speaker 1: but say that its laws inhibit the kind of overt selfish behavior that 926 00:50:21,320 --> 00:50:23,319 Speaker 1: gets you in trouble and doesn't work out for you 927 00:50:23,360 --> 00:50:25,960 Speaker 1: in the long run. So in that case, you have 928 00:50:26,000 --> 00:50:28,880 Speaker 1: an adaptive misbelief. Now, if the fire god were ever to 929 00:50:28,880 --> 00:50:32,279 Speaker 1: actually appear, then this would be an adaptive belief. But 930 00:50:32,320 --> 00:50:34,480 Speaker 1: then there are arguably a whole host of other false 931 00:50:34,520 --> 00:50:38,880 Speaker 1: ideas that seem adaptive: positive self-deceptions about ability, the 932 00:50:38,880 --> 00:50:42,360 Speaker 1: placebo effect, for instance. They bring up the self 933 00:50:42,440 --> 00:50:47,160 Speaker 1: theories of intelligence, the entity and incremental views of intelligence. 934 00:50:47,200 --> 00:50:49,560 Speaker 1: This being, like, am I born with a certain 935 00:50:49,920 --> 00:50:52,760 Speaker 1: intellect, or do I develop it over time? 936 00:50:52,960 --> 00:50:57,720 Speaker 1: And how those different core beliefs can affect your effectiveness 937 00:50:57,880 --> 00:51:00,319 Speaker 1: in life. Like, do I mean, oh, I have to work 938 00:51:00,360 --> 00:51:03,040 Speaker 1: really hard in order to stay on top of 939 00:51:03,080 --> 00:51:06,120 Speaker 1: this, or is it a situation where, oh, I'm brilliant, 940 00:51:06,160 --> 00:51:08,440 Speaker 1: I can accomplish anything? And of course, I think 941 00:51:08,480 --> 00:51:11,800 Speaker 1: you can argue for pitfalls on both sides.
And of 942 00:51:11,840 --> 00:51:14,440 Speaker 1: course there's always the optimal margin of illusion in play, 943 00:51:15,000 --> 00:51:18,200 Speaker 1: which comes to us from Roy F. Baumeister. 944 00:51:18,239 --> 00:51:21,320 Speaker 1: And you know, ultimately, crazy overconfidence, 945 00:51:21,400 --> 00:51:23,560 Speaker 1: as we discussed, is going to lead to extinction. Right, 946 00:51:23,640 --> 00:51:25,880 Speaker 1: you don't want to cuddle the polar bear. Right, cuddling 947 00:51:25,880 --> 00:51:28,560 Speaker 1: the polar bear, thinking you can fly, these are 948 00:51:28,560 --> 00:51:31,400 Speaker 1: going to lead to you falling off the side of a 949 00:51:31,400 --> 00:51:34,520 Speaker 1: mountain or winding up in a polar bear's tummy. Yeah. Now, 950 00:51:34,560 --> 00:51:38,440 Speaker 1: I could certainly understand the idea of socially adaptive misbeliefs. 951 00:51:38,480 --> 00:51:41,799 Speaker 1: I think those things definitely do exist, and 952 00:51:41,800 --> 00:51:45,040 Speaker 1: in some cases there might be some overlap with the 953 00:51:45,080 --> 00:51:48,240 Speaker 1: types of things that get repeated so often. Like, reasons 954 00:51:48,280 --> 00:51:53,359 Speaker 1: for believing untrue things can also compound each other. I mean, 955 00:51:53,360 --> 00:51:56,160 Speaker 1: I'm about to explain why I think false beliefs gained 956 00:51:56,160 --> 00:51:59,879 Speaker 1: through exposure and repetition are not adaptive in themselves, 957 00:52:00,040 --> 00:52:02,360 Speaker 1: but you can have more than one reason for believing 958 00:52:02,480 --> 00:52:07,120 Speaker 1: something that's untrue. Think about objectively untrue statements that get repeated, 959 00:52:07,120 --> 00:52:09,440 Speaker 1: as we were talking about earlier, in a political context. 960 00:52:09,800 --> 00:52:12,680 Speaker 1: The evidence shows that we believe them partially because of 961 00:52:12,680 --> 00:52:16,399 Speaker 1: how often they're repeated, but there's also social cognition and 962 00:52:16,440 --> 00:52:19,920 Speaker 1: also identity protective cognition. In other words, we tend to 963 00:52:19,920 --> 00:52:22,640 Speaker 1: believe things that members of our political tribe and social 964 00:52:22,680 --> 00:52:26,480 Speaker 1: in-groups say, and for social cohesion reasons, that 965 00:52:26,600 --> 00:52:30,080 Speaker 1: is adaptive for us. We also believe things that validate 966 00:52:30,120 --> 00:52:32,320 Speaker 1: our sense of personal identity. But I think it's pretty 967 00:52:32,320 --> 00:52:35,480 Speaker 1: clear that these types of effects can work in 968 00:52:35,520 --> 00:52:41,520 Speaker 1: a nasty, perverse tag team format, boosting and complementing one another. 969 00:52:42,040 --> 00:52:45,240 Speaker 1: But even if we put aside these complementary effects, 970 00:52:45,239 --> 00:52:49,680 Speaker 1: put aside social and identity protective cognition, put those 971 00:52:49,719 --> 00:52:53,040 Speaker 1: aside and just focus on the explanation for the illusory 972 00:52:53,080 --> 00:52:56,880 Speaker 1: truth effect and repetition.
There's a really interesting thing that 973 00:52:57,239 --> 00:52:59,560 Speaker 1: comes out, and this is based on the idea of 974 00:52:59,640 --> 00:53:03,799 Speaker 1: processing fluency, which is a concept 975 00:53:03,920 --> 00:53:06,239 Speaker 1: that is way more interesting than the name would lead 976 00:53:06,239 --> 00:53:10,480 Speaker 1: you to believe. So the dominant explanation for the illusory truth 977 00:53:10,480 --> 00:53:15,239 Speaker 1: effect in the psychology literature, which we're about to get into, 978 00:53:15,280 --> 00:53:18,480 Speaker 1: fits into this byproduct category that we were just 979 00:53:18,520 --> 00:53:20,759 Speaker 1: talking about. Based on all I read, it seems the 980 00:53:20,800 --> 00:53:24,759 Speaker 1: informed majority opinion of psychologists is that the illusion of 981 00:53:24,800 --> 00:53:27,520 Speaker 1: truth that we get from exposure and repetition is an 982 00:53:27,600 --> 00:53:33,719 Speaker 1: unfortunate byproduct of a generally useful cognitive heuristic. Now, a heuristic, 983 00:53:33,760 --> 00:53:36,279 Speaker 1: as we've talked about before, is a mental shortcut. It's 984 00:53:36,280 --> 00:53:39,840 Speaker 1: a fast and cheap trick that the brain uses to 985 00:53:40,000 --> 00:53:42,840 Speaker 1: arrive at a judgment or produce some kind of result 986 00:53:42,920 --> 00:53:46,160 Speaker 1: without using too much effort. And it's worth driving 987 00:53:46,160 --> 00:53:50,000 Speaker 1: home that our brains need fast and cheap tricks. Brains 988 00:53:50,040 --> 00:53:53,160 Speaker 1: are very energy hungry. Yeah, there's only 989 00:53:53,239 --> 00:53:56,160 Speaker 1: so much power to go around there, so 990 00:53:56,200 --> 00:53:58,640 Speaker 1: it kind of has to hold everything together with a 991 00:53:58,680 --> 00:54:02,359 Speaker 1: bunch of tricks. Yeah. So it works something like this. 992 00:54:03,200 --> 00:54:05,800 Speaker 1: Let's go with it. Assume that, on balance, 993 00:54:06,080 --> 00:54:10,040 Speaker 1: true statements get uttered more often than lies. As cynical 994 00:54:10,080 --> 00:54:12,680 Speaker 1: as we like to be, that's probably true, right? True 995 00:54:12,680 --> 00:54:16,520 Speaker 1: statements are generally more useful to people. Also, there's a 996 00:54:16,560 --> 00:54:20,480 Speaker 1: sort of convergence effect, where there's only one way for 997 00:54:20,520 --> 00:54:23,120 Speaker 1: a true statement to be true, but there are lots 998 00:54:23,160 --> 00:54:26,160 Speaker 1: of different ways to say a lie about the subject 999 00:54:26,200 --> 00:54:28,880 Speaker 1: of that statement. So, like, true statements on a subject 1000 00:54:28,960 --> 00:54:32,359 Speaker 1: are going to be more consistent, usually, than lies about 1001 00:54:32,360 --> 00:54:35,600 Speaker 1: the subject, because a lie about the subject could be anything. Well, 1002 00:54:35,640 --> 00:54:40,319 Speaker 1: and also lies, in large part, have to 1003 00:54:40,360 --> 00:54:44,080 Speaker 1: be believable. Like, think about the various true statements and 1004 00:54:44,200 --> 00:54:46,640 Speaker 1: false statements that might be uttered during the 1005 00:54:46,640 --> 00:54:49,839 Speaker 1: course of a given day at work. Say somebody asks, hey, 1006 00:54:49,840 --> 00:54:52,320 Speaker 1: where's the bathroom? You know, it's a new building.
Well, 1007 00:54:52,400 --> 00:54:54,759 Speaker 1: then they're gonna probably say, oh, it's over there, 1008 00:54:54,800 --> 00:54:57,080 Speaker 1: and they're probably going to tell you the truth. 1009 00:54:57,320 --> 00:55:00,239 Speaker 1: It generally does not serve people well to lie about 1010 00:55:00,280 --> 00:55:02,479 Speaker 1: the location of the bathroom, right, because you're gonna find 1011 00:55:02,480 --> 00:55:04,239 Speaker 1: out, and then you're gonna say, hey, why did you 1012 00:55:04,239 --> 00:55:06,600 Speaker 1: tell me the bathroom is over there and not over there? 1013 00:55:06,640 --> 00:55:11,480 Speaker 1: Are you insane? But then some of the false statements 1014 00:55:11,480 --> 00:55:14,359 Speaker 1: you're liable to hear might be, hey, 1015 00:55:14,719 --> 00:55:16,239 Speaker 1: I don't know, let's see, have you started on that 1016 00:55:16,280 --> 00:55:19,040 Speaker 1: report yet? It's due on Friday. And they'll say, oh yeah, 1017 00:55:19,040 --> 00:55:20,440 Speaker 1: I've got it, I've got it taken care of. I'll 1018 00:55:20,440 --> 00:55:22,680 Speaker 1: get it to you on Friday. You know, there 1019 00:55:22,800 --> 00:55:24,279 Speaker 1: are a lot of statements like that that 1020 00:55:24,600 --> 00:55:26,879 Speaker 1: ultimately you can't really check in on; you're 1021 00:55:26,880 --> 00:55:28,640 Speaker 1: just gonna have to take their word for it. And 1022 00:55:28,760 --> 00:55:31,719 Speaker 1: that kind of lie, yeah, you'll never find out, you know. 1023 00:55:32,239 --> 00:55:34,879 Speaker 1: Yeah, exactly. Or, I can't come into work today because 1024 00:55:34,880 --> 00:55:36,719 Speaker 1: I'm sick. Well, all right, you know, we're not 1025 00:55:36,760 --> 00:55:39,279 Speaker 1: going to ask for a doctor's note. You might be lying, 1026 00:55:39,280 --> 00:55:41,680 Speaker 1: you might not, but it's just kind of a gimme 1027 00:55:42,080 --> 00:55:44,719 Speaker 1: in that situation. Yeah. That's another reason that we're more 1028 00:55:44,760 --> 00:55:48,239 Speaker 1: likely to be exposed to true statements generally, or at 1029 00:55:48,320 --> 00:55:51,240 Speaker 1: least that we're more likely to detect true statements generally, 1030 00:55:51,560 --> 00:55:55,440 Speaker 1: because false statements are harder to verify, usually by design 1031 00:55:55,480 --> 00:55:57,480 Speaker 1: of the person making them. So you find 1032 00:55:57,480 --> 00:56:00,200 Speaker 1: yourself in an environment that's mostly built out of 1033 00:56:00,560 --> 00:56:05,440 Speaker 1: true statements and believable lies. Right. So, on this assumption, 1034 00:56:06,120 --> 00:56:08,880 Speaker 1: you know, you're in a hurry, and your brain 1035 00:56:09,640 --> 00:56:13,000 Speaker 1: is not designed to consume infinite energy. It wants 1036 00:56:13,080 --> 00:56:15,360 Speaker 1: to try to be efficient. You don't have time to 1037 00:56:15,440 --> 00:56:18,239 Speaker 1: evaluate all claims rigorously. I mean, no matter how 1038 00:56:18,360 --> 00:56:22,399 Speaker 1: skeptical you want to be, we can confirm this: eventually, 1039 00:56:22,520 --> 00:56:25,040 Speaker 1: you are just not going to have time to look 1040 00:56:25,080 --> 00:56:27,480 Speaker 1: into everything you believe. You're just gonna have to take 1041 00:56:27,520 --> 00:56:30,480 Speaker 1: somebody's word for it.
It's not practical to try to 1042 00:56:30,560 --> 00:56:34,080 Speaker 1: live by verifying every single belief. Oh yeah, I mean, 1043 00:56:34,120 --> 00:56:37,400 Speaker 1: you've just got to have something firm underneath 1044 00:56:37,440 --> 00:56:39,480 Speaker 1: your feet in order to proceed. Oh yeah, you've got 1045 00:56:39,480 --> 00:56:41,480 Speaker 1: to have a bedrock. But then, 1046 00:56:41,520 --> 00:56:43,800 Speaker 1: I mean, you take somebody's word on where the 1047 00:56:43,840 --> 00:56:47,719 Speaker 1: bathroom is; you're not gonna try to fact check them. 1048 00:56:47,800 --> 00:56:49,719 Speaker 1: You know, well, I guess you will, by trying to 1049 00:56:49,719 --> 00:56:52,040 Speaker 1: go there. But other things like that, mundane things 1050 00:56:52,080 --> 00:56:54,520 Speaker 1: people tell you throughout the day, you're just gonna have 1051 00:56:54,600 --> 00:56:57,959 Speaker 1: to believe them. It doesn't make any 1052 00:56:57,960 --> 00:57:00,560 Speaker 1: sense to try to verify all of it, because you 1053 00:57:00,600 --> 00:57:05,360 Speaker 1: don't have time. So therefore, an easy shortcut for assuming 1054 00:57:05,400 --> 00:57:08,200 Speaker 1: that a statement is more likely true is: have I 1055 00:57:08,280 --> 00:57:12,520 Speaker 1: heard this statement before? Statements that get uttered more often 1056 00:57:12,520 --> 00:57:16,560 Speaker 1: are more likely to be from that class of true statements. Okay, 1057 00:57:16,640 --> 00:57:19,200 Speaker 1: I can roll with that. Now, there's another type of 1058 00:57:19,240 --> 00:57:23,240 Speaker 1: parallel thinking that says, you know, also, 1059 00:57:23,520 --> 00:57:26,480 Speaker 1: it's actually more difficult to disbelieve something than it is 1060 00:57:26,480 --> 00:57:30,040 Speaker 1: to believe it. And I don't know if this 1061 00:57:30,120 --> 00:57:32,360 Speaker 1: is really confirmed or if this is just one theory 1062 00:57:32,440 --> 00:57:35,240 Speaker 1: about how the information processing in the brain works, but, 1063 00:57:35,600 --> 00:57:39,440 Speaker 1: just as a quick tangent, there is a model of 1064 00:57:39,480 --> 00:57:42,400 Speaker 1: thinking that says, okay, 1065 00:57:42,520 --> 00:57:44,720 Speaker 1: to hear a statement and say I believe it is 1066 00:57:44,760 --> 00:57:47,880 Speaker 1: just one step in the brain. To hear a statement 1067 00:57:47,880 --> 00:57:50,880 Speaker 1: and reject it as false is a two step procedure, 1068 00:57:50,920 --> 00:57:53,440 Speaker 1: where first you have to hear it and believe it 1069 00:57:53,520 --> 00:57:56,120 Speaker 1: to understand it, and then you have to go back 1070 00:57:56,200 --> 00:57:59,040 Speaker 1: and revise what you just did and say, but it's 1071 00:57:59,080 --> 00:58:02,200 Speaker 1: not true. Yeah. It's ultimately like a king sitting down 1072 00:58:02,440 --> 00:58:05,400 Speaker 1: at a banquet table, right? Is the king to simply 1073 00:58:06,000 --> 00:58:09,840 Speaker 1: eat every food item on the plate and 1074 00:58:09,880 --> 00:58:11,840 Speaker 1: trust that he's not going to 1075 00:58:11,920 --> 00:58:15,440 Speaker 1: be poisoned, or is he going to independently test each thing?
1076 00:58:15,600 --> 00:58:18,480 Speaker 1: Have the food taster come up, transfer 1077 00:58:18,600 --> 00:58:21,840 Speaker 1: this goblet of wine into the rhinoceros horn, etcetera. Hold 1078 00:58:21,840 --> 00:58:24,640 Speaker 1: the magic crystal over this plate of beans. And you know, 1079 00:58:24,680 --> 00:58:27,680 Speaker 1: another thing that came to mind was some of the 1080 00:58:27,720 --> 00:58:30,840 Speaker 1: discussions we've had in the past about consciousness and imagination 1081 00:58:30,880 --> 00:58:34,600 Speaker 1: as a simulation engine, that we use our imagination to 1082 00:58:35,400 --> 00:58:40,440 Speaker 1: mentally simulate possible outcomes so that we can best choose 1083 00:58:40,480 --> 00:58:42,760 Speaker 1: how we're going to react to the world. And when 1084 00:58:42,760 --> 00:58:44,920 Speaker 1: I'm presented with something that might be a lie or 1085 00:58:45,240 --> 00:58:47,840 Speaker 1: some sort of untruth or a bit of 1086 00:58:47,840 --> 00:58:52,280 Speaker 1: misinformation, I still can't help but imagine it, right? 1087 00:58:52,320 --> 00:58:55,240 Speaker 1: I'm having to create a mental picture of it. 1088 00:58:55,280 --> 00:58:58,760 Speaker 1: In a sense, you're kind of believing it for the moment. Yeah, yeah, 1089 00:58:58,800 --> 00:59:01,440 Speaker 1: because I have to simulate it in my head. And in 1090 00:59:01,760 --> 00:59:03,760 Speaker 1: cases of people who can form mental pictures, you have 1091 00:59:03,800 --> 00:59:06,600 Speaker 1: to form those mental pictures. And, you know, 1092 00:59:06,600 --> 00:59:09,360 Speaker 1: I imagine a lot of what shakes out 1093 00:59:09,360 --> 00:59:13,360 Speaker 1: afterward has to do with an individual's particular worldview. But 1094 00:59:13,920 --> 00:59:16,560 Speaker 1: I wonder if in some cases it's like a type 1095 00:59:16,560 --> 00:59:19,400 Speaker 1: one error in cognition, you know, a false positive: 1096 00:59:19,880 --> 00:59:24,240 Speaker 1: I'm imagining this as 1097 00:59:24,280 --> 00:59:27,200 Speaker 1: a possible outcome, and then maybe I'm more inclined to 1098 00:59:27,280 --> 00:59:29,720 Speaker 1: believe it just so that I can keep it from 1099 00:59:29,760 --> 00:59:32,600 Speaker 1: harming me. Yeah. I think that's a very reasonable 1100 00:59:32,640 --> 00:59:35,280 Speaker 1: way of imagining it. But so here's where we get 1101 00:59:35,320 --> 00:59:37,840 Speaker 1: into the final part of our discussion today, which is 1102 00:59:37,880 --> 00:59:41,360 Speaker 1: the idea of what I mentioned a minute ago, processing fluency. 1103 00:59:41,800 --> 00:59:45,280 Speaker 1: So processing fluency just means how easy it is to 1104 00:59:45,400 --> 00:59:50,280 Speaker 1: process incoming information. And you wouldn't believe the research on 1105 00:59:50,360 --> 00:59:53,560 Speaker 1: how many of our decisions and mental outcomes seem to 1106 00:59:53,600 --> 00:59:56,880 Speaker 1: be based, at least in part, on processing fluency. The 1107 00:59:57,000 --> 01:00:01,520 Speaker 1: brain really, really likes things to be easy. It really 1108 01:00:01,640 --> 01:00:04,440 Speaker 1: likes things to go smooth, to not be 1109 01:00:04,520 --> 01:00:09,120 Speaker 1: too difficult.
So, to start off, based on existing research, 1110 01:00:09,160 --> 01:00:12,000 Speaker 1: it definitely seems true that people have an easier time 1111 01:00:12,520 --> 01:00:17,400 Speaker 1: processing statements and information they've heard before. In fact, Robert, 1112 01:00:17,480 --> 01:00:20,680 Speaker 1: you probably know this from direct experience. Like, a familiar 1113 01:00:20,760 --> 01:00:23,600 Speaker 1: statement, when used in the context of a sentence or 1114 01:00:23,600 --> 01:00:27,920 Speaker 1: an argument, is processed quite smoothly, but a new, unfamiliar 1115 01:00:28,120 --> 01:00:31,200 Speaker 1: statement in the same context often causes you to say, wait, 1116 01:00:31,280 --> 01:00:33,000 Speaker 1: hold on, back up, I need to wrap my head 1117 01:00:33,000 --> 01:00:38,040 Speaker 1: around this. Familiar is easy, unfamiliar is difficult. But how 1118 01:00:38,040 --> 01:00:41,880 Speaker 1: would you test whether the ease of processing information was 1119 01:00:41,880 --> 01:00:45,760 Speaker 1: actually affecting our judgment of the truth of a statement? 1120 01:00:46,200 --> 01:00:48,040 Speaker 1: And I want to get into a couple of quick, 1121 01:00:48,120 --> 01:00:51,840 Speaker 1: really interesting studies on this that are so simple and 1122 01:00:51,880 --> 01:00:56,960 Speaker 1: so brilliant. So Reber and Schwarz did a study in 1123 01:00:57,080 --> 01:01:01,320 Speaker 1: Consciousness and Cognition called Effects of Perceptual Fluency on Judgments 1124 01:01:01,320 --> 01:01:04,400 Speaker 1: of Truth, and they took true or false statements, kind 1125 01:01:04,400 --> 01:01:06,480 Speaker 1: of like in the studies we've seen before, of the 1126 01:01:06,600 --> 01:01:10,200 Speaker 1: variety Osorno is in Chile, or Greenland has roughly 1127 01:01:10,240 --> 01:01:14,200 Speaker 1: fifty thousand inhabitants. And they presented those statements to people, and 1128 01:01:14,240 --> 01:01:17,400 Speaker 1: the main independent variable was that they presented the statements 1129 01:01:17,400 --> 01:01:21,160 Speaker 1: either against a white background in a high contrast, easy 1130 01:01:21,200 --> 01:01:24,520 Speaker 1: to read color, or in a low contrast, hard to 1131 01:01:24,560 --> 01:01:27,920 Speaker 1: read color. And apparently that made all the difference in 1132 01:01:27,920 --> 01:01:30,080 Speaker 1: the world. The idea is that the hard to read 1133 01:01:30,160 --> 01:01:34,840 Speaker 1: one has low processing fluency, it's difficult, and the easy 1134 01:01:34,880 --> 01:01:38,720 Speaker 1: to read one has high processing fluency, it's easy to process. 1135 01:01:39,200 --> 01:01:41,720 Speaker 1: And they found that this made a big difference in 1136 01:01:42,000 --> 01:01:46,120 Speaker 1: what people believed was true or false. Quote: 1137 01:01:46,280 --> 01:01:50,240 Speaker 1: Moderately visible statements were judged as true at chance level, 1138 01:01:50,560 --> 01:01:54,520 Speaker 1: whereas highly visible statements were judged as true significantly above 1139 01:01:54,640 --> 01:01:59,880 Speaker 1: chance level. We conclude that perceptual fluency affects judgments of truth. 1140 01:02:00,400 --> 01:02:02,840 Speaker 1: This is another one that makes sense from a marketing standpoint, right? 1141 01:02:02,880 --> 01:02:07,520 Speaker 1: Just make your message very clear, very easily absorbed, 1142 01:02:07,720 --> 01:02:10,320 Speaker 1: and people will begin to buy into it. Oh.
Oh, absolutely, 1143 01:02:10,360 --> 01:02:13,120 Speaker 1: and this has actually been studied in marketing and consumer 1144 01:02:13,120 --> 01:02:16,640 Speaker 1: preference. Like, there is one study from Novemsky et al., 1145 01:02:16,720 --> 01:02:18,960 Speaker 1: published in two thousand seven in the Journal of Marketing 1146 01:02:19,000 --> 01:02:22,360 Speaker 1: Research, that, in short, found that consumers more often 1147 01:02:22,400 --> 01:02:26,640 Speaker 1: tend to choose brands that represent ease and fluency. Like, say, 1148 01:02:26,680 --> 01:02:29,680 Speaker 1: if the information about a brand is easy to read, 1149 01:02:29,960 --> 01:02:32,680 Speaker 1: consumers are more likely to choose that brand; that's the 1150 01:02:32,680 --> 01:02:35,040 Speaker 1: one they want. So that makes me wonder why Coca 1151 01:02:35,040 --> 01:02:36,880 Speaker 1: Cola is written in cursive. It's just like you would 1152 01:02:36,920 --> 01:02:39,120 Speaker 1: want it in very clear, bold letters. Well, didn't 1153 01:02:39,120 --> 01:02:42,000 Speaker 1: they try to change the can at one point? I haven't 1154 01:02:42,040 --> 01:02:46,080 Speaker 1: really looked at a can recently; maybe it's not in cursive anymore. 1155 01:02:46,200 --> 01:02:49,240 Speaker 1: You know, actually you might have two things in conflict, right? 1156 01:02:49,560 --> 01:02:52,200 Speaker 1: So you could have in conflict: if you've 1157 01:02:52,240 --> 01:02:55,000 Speaker 1: got an old logo that people are familiar with, but 1158 01:02:55,040 --> 01:02:57,800 Speaker 1: it's hard to read, the hard to read part 1159 01:02:58,160 --> 01:03:01,280 Speaker 1: might be undercutting their preference for it, but the fact 1160 01:03:01,280 --> 01:03:04,000 Speaker 1: that it's familiar might be boosting their preference for it. 1161 01:03:04,000 --> 01:03:06,040 Speaker 1: If you try to change it to something that's easier 1162 01:03:06,040 --> 01:03:10,280 Speaker 1: to read, the change might introduce more difficulty in processing 1163 01:03:10,400 --> 01:03:13,680 Speaker 1: than the ease of reading would improve processing. Yeah, that 1164 01:03:13,720 --> 01:03:15,680 Speaker 1: makes sense. All right, so I want to cite one 1165 01:03:15,680 --> 01:03:18,480 Speaker 1: more study, a study by Christian Unkelbach in two 1166 01:03:18,520 --> 01:03:22,840 Speaker 1: thousand seven from the Journal of Experimental Psychology: Learning, Memory, 1167 01:03:22,840 --> 01:03:26,920 Speaker 1: and Cognition. And Unkelbach does an interesting thing in 1168 01:03:26,920 --> 01:03:30,040 Speaker 1: the study, where he's got a hypothesis he wants to test. 1169 01:03:30,120 --> 01:03:34,400 Speaker 1: He writes, quote, "I argue that experienced fluency is used 1170 01:03:34,440 --> 01:03:36,960 Speaker 1: as a cue in judgments of truth according to the 1171 01:03:37,040 --> 01:03:43,240 Speaker 1: cue's ecological validity," meaning, like, its successfulness in the real world. Quote: 1172 01:03:43,640 --> 01:03:46,840 Speaker 1: "That is, the truth effect occurs because repetition leads to 1173 01:03:47,000 --> 01:03:50,640 Speaker 1: more fluent processing of a statement, and people have learned 1174 01:03:50,960 --> 01:03:55,040 Speaker 1: that the experience of processing fluency correlates positively with the 1175 01:03:55,080 --> 01:03:57,440 Speaker 1: truth of a statement." So this is sort of what 1176 01:03:57,440 --> 01:04:00,080 Speaker 1: we were talking about earlier.
It's a heuristic: you know 1177 01:04:00,160 --> 01:04:02,720 Speaker 1: you're more likely to encounter true statements in the wild. 1178 01:04:03,080 --> 01:04:07,360 Speaker 1: People learn this through experience, and then they use 1179 01:04:07,360 --> 01:04:11,800 Speaker 1: the cue of processing fluency as the 1180 01:04:11,920 --> 01:04:15,280 Speaker 1: judge of whether something is familiar or not. And if 1181 01:04:15,320 --> 01:04:19,439 Speaker 1: it's familiar and they get that processing fluency bump, it's 1182 01:04:19,440 --> 01:04:22,200 Speaker 1: easy to process, then they're more likely to believe it's 1183 01:04:22,200 --> 01:04:24,760 Speaker 1: true, because that's what has worked for them in the past. 1184 01:04:25,200 --> 01:04:27,640 Speaker 1: And if this is true, Unkelbach says, I bet 1185 01:04:27,720 --> 01:04:29,760 Speaker 1: I could reverse it with a little bit of training. 1186 01:04:30,280 --> 01:04:33,080 Speaker 1: And he does. He's got an experiment with a 1187 01:04:33,120 --> 01:04:37,040 Speaker 1: training phase; he actually does three different experiments, and essentially 1188 01:04:37,040 --> 01:04:40,000 Speaker 1: what he does is that he trains people in a 1189 01:04:40,040 --> 01:04:43,560 Speaker 1: scenario where things that are easier to process, either because 1190 01:04:43,720 --> 01:04:47,640 Speaker 1: of being easier to read or because of repetition and familiarity, 1191 01:04:47,960 --> 01:04:51,360 Speaker 1: either way, those things are more correlated 1192 01:04:51,520 --> 01:04:54,560 Speaker 1: with the statement being false. And when people get trained 1193 01:04:54,600 --> 01:04:57,920 Speaker 1: in sessions like that, they lose the effect. So the 1194 01:04:58,480 --> 01:05:00,600 Speaker 1: good takeaway there is that if he's correct, it would 1195 01:05:00,600 --> 01:05:03,720 Speaker 1: probably also mean that your susceptibility to the illusory truth 1196 01:05:03,760 --> 01:05:07,320 Speaker 1: effect is dependent on what kind of environment you've trained in, 1197 01:05:07,400 --> 01:05:11,520 Speaker 1: and that you could potentially untrain yourself on it. But 1198 01:05:11,600 --> 01:05:13,480 Speaker 1: that would be hard to do, because we all live 1199 01:05:13,520 --> 01:05:15,200 Speaker 1: in this world all the time, where most of the 1200 01:05:15,200 --> 01:05:18,000 Speaker 1: time people are telling us true things, right? And again, 1201 01:05:18,040 --> 01:05:20,040 Speaker 1: the brain is still going to need all of these 1202 01:05:20,080 --> 01:05:23,480 Speaker 1: shortcuts in order to function properly. Yeah, exactly. But you 1203 01:05:23,520 --> 01:05:26,120 Speaker 1: could just be using the opposite shortcut. Like, if you 1204 01:05:26,200 --> 01:05:28,240 Speaker 1: live in a world where people lie to you all 1205 01:05:28,320 --> 01:05:31,320 Speaker 1: the time, Unkelbach's results here would suggest that you 1206 01:05:31,320 --> 01:05:34,160 Speaker 1: would eventually adapt to this, and you would instead become 1207 01:05:34,200 --> 01:05:37,080 Speaker 1: exactly the opposite: new claims you've never heard before would 1208 01:05:37,080 --> 01:05:39,520 Speaker 1: seem more true to you, and repeated claims that you're 1209 01:05:39,520 --> 01:05:42,080 Speaker 1: familiar with would seem like lies to you. Okay, so 1210 01:05:42,120 --> 01:05:44,880 Speaker 1: there's hope for us after all.
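As a rough sketch of the learning story Unkelbach is testing, here is a toy model in Python. The environment probabilities and the simple frequency-counting learner are assumptions made up for illustration, not his actual training procedure.

import random

random.seed(1)

def learn_cue(p_true_given_fluent, n_training_trials=400):
    """Estimate P(true | fluent) by counting outcomes across a training phase."""
    hits = sum(random.random() < p_true_given_fluent
               for _ in range(n_training_trials))
    return hits / n_training_trials

# Everyday environment: fluent, familiar statements are usually true.
# Reversed environment: training pairs fluency with falsehood instead.
environments = {
    "fluency usually signals truth": 0.8,
    "fluency usually signals falsehood": 0.2,
}

for name, p in environments.items():
    estimate = learn_cue(p)
    verdict = "true" if estimate > 0.5 else "false"
    print(f"{name}: learned P(true|fluent) = {estimate:.2f}, "
          f"so a new fluent statement gets judged {verdict}")

The point of the toy: the same learner, fed a reversed training environment, ends up judging fluent statements false, which mirrors the reversal described above.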
Yeah, I mean, but we 1211 01:05:44,920 --> 01:05:46,600 Speaker 1: can't expect to live in a world like that, and 1212 01:05:46,640 --> 01:05:48,400 Speaker 1: we don't want to live in a world like 1213 01:05:48,400 --> 01:05:50,560 Speaker 1: that. Like, you don't want to train your brain 1214 01:05:51,040 --> 01:05:53,800 Speaker 1: to live in a world where everything is assumed to 1215 01:05:53,840 --> 01:05:58,640 Speaker 1: be a lie. And surely somebody has considered 1216 01:05:58,640 --> 01:06:02,200 Speaker 1: exploring this in fiction. Yeah, it would 1217 01:06:02,240 --> 01:06:04,040 Speaker 1: be a delicate affair to really put it together 1218 01:06:04,040 --> 01:06:06,640 Speaker 1: and make it work on paper. But it's a world 1219 01:06:06,640 --> 01:06:08,040 Speaker 1: that I don't want to live in, but I kind 1220 01:06:08,040 --> 01:06:10,680 Speaker 1: of want to visit fictionally. Oh yeah, I'd go there 1221 01:06:10,720 --> 01:06:12,520 Speaker 1: with you. That's a good one to come 1222 01:06:12,560 --> 01:06:15,120 Speaker 1: back to. But just as a quick note before we 1223 01:06:15,160 --> 01:06:18,760 Speaker 1: close out today, I think this idea of processing fluency 1224 01:06:18,840 --> 01:06:22,600 Speaker 1: is a really interesting one. There's tons of research on it. Like, 1225 01:06:22,640 --> 01:06:27,560 Speaker 1: there is a study I found by Sascha Topolinski 1226 01:06:27,560 --> 01:06:31,120 Speaker 1: in Cognition and Emotion about how processing fluency affects how 1227 01:06:31,160 --> 01:06:34,600 Speaker 1: funny we find jokes. Apparently, if a joke is 1228 01:06:34,640 --> 01:06:38,240 Speaker 1: easier to process, if we've got high processing fluency on the joke, 1229 01:06:38,680 --> 01:06:41,200 Speaker 1: we think it's funnier. I guess it just, like, feels 1230 01:06:41,200 --> 01:06:44,000 Speaker 1: good to get it with less effort or something. 1231 01:06:44,640 --> 01:06:48,760 Speaker 1: So there were multiple experiments, but basically, here, let 1232 01:06:48,760 --> 01:06:50,280 Speaker 1: me give you a quick preview. I'm gonna 1233 01:06:50,280 --> 01:06:53,440 Speaker 1: say a word, Robert: peanuts. Do you like that word 1234 01:06:53,440 --> 01:06:56,760 Speaker 1: when you think about it? Peanuts? Peanuts. It's pretty good. 1235 01:06:56,840 --> 01:06:59,520 Speaker 1: It's not the funniest word, it's no "cheese," but 1236 01:06:59,680 --> 01:07:01,520 Speaker 1: I like it. Okay, I just said that word. 1237 01:07:01,600 --> 01:07:04,040 Speaker 1: So one example of this type of study would be: 1238 01:07:04,240 --> 01:07:07,760 Speaker 1: if you prime somebody with significant nouns from the punch 1239 01:07:07,840 --> 01:07:10,760 Speaker 1: line of a joke fifteen minutes or even up to 1240 01:07:11,000 --> 01:07:14,080 Speaker 1: just one minute before you tell them the joke, people 1241 01:07:14,160 --> 01:07:18,320 Speaker 1: find the joke more hilarious. However, if you tell them 1242 01:07:18,360 --> 01:07:21,720 Speaker 1: a significant noun from the punch line immediately before the joke, 1243 01:07:21,800 --> 01:07:23,880 Speaker 1: they find the joke less funny, and the author thinks 1244 01:07:23,920 --> 01:07:26,080 Speaker 1: this is probably 1245 01:07:26,120 --> 01:07:28,680 Speaker 1: because if you tell them right before the joke, it 1246 01:07:28,800 --> 01:07:35,200 Speaker 1: sort of spoils the punch line.
But: knock, knock. Who's there? Cash. Cash who? No thanks, 1247 01:07:35,200 --> 01:07:39,280 Speaker 1: I prefer peanuts. Ah, see, it works. It's not even a good one, 1248 01:07:39,720 --> 01:07:42,960 Speaker 1: not even... but you already established peanuts, so it helped, right? 1249 01:07:43,160 --> 01:07:45,280 Speaker 1: I tried to let a minute or so elapse there. 1250 01:07:45,880 --> 01:07:48,120 Speaker 1: I don't know if it worked. Well, it's also complicated, 1251 01:07:48,120 --> 01:07:50,280 Speaker 1: because we did bring up peanuts and peanut butter earlier 1252 01:07:50,280 --> 01:07:52,720 Speaker 1: in the episode. I didn't even think about that. But 1253 01:07:52,960 --> 01:07:56,320 Speaker 1: this actually... I am not a student of stand-up 1254 01:07:56,360 --> 01:08:00,320 Speaker 1: comedy by any stretch of the imagination, but I've watched 1255 01:08:00,400 --> 01:08:02,960 Speaker 1: enough stand-up to see that common 1256 01:08:03,080 --> 01:08:05,640 Speaker 1: structural tool that they use where you have the call 1257 01:08:05,720 --> 01:08:08,320 Speaker 1: back to a previous joke, and they'll often do it 1258 01:08:08,680 --> 01:08:10,520 Speaker 1: right at the end, and then it's "good night, everybody." 1259 01:08:10,600 --> 01:08:13,760 Speaker 1: That's the high note. And it's not even necessarily 1260 01:08:14,000 --> 01:08:17,439 Speaker 1: like a callback to the funniest moment 1261 01:08:17,479 --> 01:08:19,400 Speaker 1: in the bit or the funniest bit in 1262 01:08:19,640 --> 01:08:22,120 Speaker 1: the stand-up performance, but just the fact that they've 1263 01:08:22,439 --> 01:08:26,439 Speaker 1: brought your mind back to it. Yeah, it generates laughter, 1264 01:08:26,560 --> 01:08:28,800 Speaker 1: and it's the moment to end the show on. Yeah. 1265 01:08:28,840 --> 01:08:32,400 Speaker 1: The theory is that it's very satisfying to have 1266 01:08:32,479 --> 01:08:35,280 Speaker 1: a joke where you've been primed for the 1267 01:08:35,320 --> 01:08:39,120 Speaker 1: punch line already, because it's so much easier to get 1268 01:08:39,160 --> 01:08:42,720 Speaker 1: the punch line quickly and have that experience of familiarity in 1269 01:08:42,760 --> 01:08:45,719 Speaker 1: the aha moment, because when you say a word 1270 01:08:46,080 --> 01:08:48,280 Speaker 1: and then you say the word again later, the second 1271 01:08:48,280 --> 01:08:50,720 Speaker 1: time you hear the word, you've been primed; you know, 1272 01:08:50,760 --> 01:08:53,400 Speaker 1: it's more fluent. So yeah, I think that may very 1273 01:08:53,400 --> 01:08:56,160 Speaker 1: well be going on with callbacks. Another part of the 1274 01:08:56,200 --> 01:08:59,160 Speaker 1: same study was that, like the studies we've been seeing before, 1275 01:08:59,240 --> 01:09:02,200 Speaker 1: jokes presented in an easy to read font were rated 1276 01:09:02,240 --> 01:09:04,880 Speaker 1: as funnier than jokes presented in a really hard to 1277 01:09:04,880 --> 01:09:08,759 Speaker 1: read font. That's kind of not surprising. But processing fluency 1278 01:09:08,880 --> 01:09:11,479 Speaker 1: plays into all this stuff. Like, there is research 1279 01:09:11,520 --> 01:09:16,439 Speaker 1: about how opinions that are repeated more often, even just 1280 01:09:16,560 --> 01:09:19,679 Speaker 1: by a single person in a group, come to seem 1281 01:09:19,720 --> 01:09:22,080 Speaker 1: more prevalent in the group.
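Here is a minimal sketch, with invented names and counts, of how tallying raw exposures instead of distinct sources could inflate an opinion's apparent prevalence; the ten-person example that follows plays out the same way.

from collections import Counter, defaultdict

# Hypothetical conversation log of (speaker, opinion). Only Jeff repeats himself.
log = [("Jeff", "Aliens is overrated")] * 5 + [
    ("Ann", "Aliens holds up"),
    ("Bo", "Aliens holds up"),
]

# A fluency-style listener tallies raw exposures...
exposures = Counter(opinion for _, opinion in log)
# ...while a careful listener tallies distinct people per opinion.
holders = defaultdict(set)
for speaker, opinion in log:
    holders[opinion].add(speaker)

total = sum(exposures.values())
for opinion, count in exposures.items():
    print(f"{opinion!r}: {count / total:.0%} of what you heard, "
          f"held by {len(holders[opinion])} distinct speaker(s)")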
So you've got ten people 1282 01:09:22,080 --> 01:09:24,680 Speaker 1: standing around, and then you've just got Jeff over here, and 1283 01:09:24,760 --> 01:09:27,679 Speaker 1: Jeff keeps saying the same opinion over and over again. 1284 01:09:27,720 --> 01:09:30,439 Speaker 1: Even if you're aware it's just Jeff saying it, in 1285 01:09:30,479 --> 01:09:32,760 Speaker 1: the end, if he does that, you will think that 1286 01:09:32,760 --> 01:09:36,000 Speaker 1: that opinion is more prevalent in the entire group, that 1287 01:09:36,080 --> 01:09:38,400 Speaker 1: more people hold it. Well, that would make sense. You 1288 01:09:38,439 --> 01:09:41,160 Speaker 1: have one person in a group who, say, continually trashes 1289 01:09:41,240 --> 01:09:44,120 Speaker 1: on the movie Aliens. Oh no, why would that happen? 1290 01:09:44,120 --> 01:09:45,760 Speaker 1: I don't know, but let's say it happens. You know, 1291 01:09:45,800 --> 01:09:47,320 Speaker 1: I could see where it could reach the point where 1292 01:09:47,360 --> 01:09:48,760 Speaker 1: you're kind of like, I don't really know how I 1293 01:09:48,760 --> 01:09:51,240 Speaker 1: feel about Aliens now, because I sure do hear 1294 01:09:51,320 --> 01:09:54,640 Speaker 1: Jeff talking trash about it all the time. Or 1295 01:09:54,760 --> 01:09:56,880 Speaker 1: you could walk away from it being like, man, I 1296 01:09:56,880 --> 01:09:59,840 Speaker 1: don't understand all these people who hate Aliens, even though 1297 01:09:59,880 --> 01:10:02,439 Speaker 1: it's just one person. Yeah, that seems to be 1298 01:10:02,479 --> 01:10:05,120 Speaker 1: something that would go on. Processing fluency also appears to 1299 01:10:05,160 --> 01:10:07,559 Speaker 1: have something to do with aesthetic pleasure. There's been a 1300 01:10:07,560 --> 01:10:10,760 Speaker 1: lot of research and theory about this, that a 1301 01:10:10,800 --> 01:10:14,920 Speaker 1: major component of what feels aesthetically pleasing to us is 1302 01:10:14,960 --> 01:10:19,040 Speaker 1: based on what's easy to process. Another part is that 1303 01:10:19,120 --> 01:10:22,599 Speaker 1: processing fluency apparently affects how credible a face looks. 1304 01:10:23,439 --> 01:10:26,040 Speaker 1: So, are you going to believe somebody? Well, it 1305 01:10:26,080 --> 01:10:29,320 Speaker 1: turns out if their face is easier to process, especially 1306 01:10:29,320 --> 01:10:31,800 Speaker 1: because you've seen it a bunch of times before, you're 1307 01:10:31,840 --> 01:10:33,880 Speaker 1: more likely to believe them, even if they're not famous 1308 01:10:33,920 --> 01:10:36,439 Speaker 1: and they're not somebody you know. They're not, like, somebody 1309 01:10:36,439 --> 01:10:39,320 Speaker 1: you've had experience with, whose 1310 01:10:39,360 --> 01:10:42,639 Speaker 1: credibility you can judge, just random faces shown to you in different 1311 01:10:42,640 --> 01:10:45,280 Speaker 1: sessions of an experiment. If you've seen them before, they're 1312 01:10:45,280 --> 01:10:48,600 Speaker 1: more credible. Of course, that reminds me of various experiments 1313 01:10:49,040 --> 01:10:52,559 Speaker 1: over the years involving the believability of people with beards. 1314 01:10:52,880 --> 01:10:56,960 Speaker 1: Are people with facial hair or beards harder to process?
They might be. 1315 01:10:57,080 --> 01:10:59,040 Speaker 1: I have not looked into it recently, so I 1316 01:10:59,040 --> 01:11:01,760 Speaker 1: don't know if there are any recent studies that 1317 01:11:01,840 --> 01:11:04,200 Speaker 1: crack this nut. But there have been 1318 01:11:04,200 --> 01:11:05,880 Speaker 1: studies in the past that make 1319 01:11:05,880 --> 01:11:08,439 Speaker 1: the argument that, yes, with an individual with a beard, you're 1320 01:11:08,479 --> 01:11:11,120 Speaker 1: going to have a little more distrust towards them. Well, 1321 01:11:11,160 --> 01:11:13,920 Speaker 1: obviously nobody should trust me. Well, no, we trust you 1322 01:11:13,960 --> 01:11:17,320 Speaker 1: because we know you, Joe. Yeah? Do you really? Do 1323 01:11:17,360 --> 01:11:19,639 Speaker 1: you ever really know someone? Well, I'll tell you one 1324 01:11:19,640 --> 01:11:23,400 Speaker 1: thing I know, and that's peanuts. Ah, you got me, 1325 01:11:23,520 --> 01:11:25,320 Speaker 1: you got me there. You made me laugh, and my 1326 01:11:25,439 --> 01:11:29,280 Speaker 1: joke didn't make you laugh. Okay, okay, so we gotta wrap 1327 01:11:29,360 --> 01:11:31,960 Speaker 1: up there. We've gone long here, but we'll be 1328 01:11:32,000 --> 01:11:34,679 Speaker 1: back in the next episode to explore more recent findings 1329 01:11:34,720 --> 01:11:37,000 Speaker 1: and some of the ways that the illusory truth effect 1330 01:11:37,040 --> 01:11:41,559 Speaker 1: really does matter in our political and social world. 1331 01:11:41,600 --> 01:11:44,439 Speaker 1: But the main takeaways I would say today are that 1332 01:11:44,720 --> 01:11:48,479 Speaker 1: the illusory truth effect is real: exposure and repetition really 1333 01:11:48,520 --> 01:11:52,080 Speaker 1: do change our beliefs. The illusory truth effect is small, 1334 01:11:52,600 --> 01:11:56,759 Speaker 1: meaning it doesn't automatically overwhelm other criteria in our decision 1335 01:11:56,800 --> 01:11:59,599 Speaker 1: making and judgment. In fact, in many cases it appears 1336 01:11:59,600 --> 01:12:02,439 Speaker 1: that whether or not a statement is actually true is 1337 01:12:02,520 --> 01:12:04,880 Speaker 1: more important to our judgment than whether or not it's 1338 01:12:04,920 --> 01:12:07,519 Speaker 1: repeated or made easier to read, or any of these 1339 01:12:07,520 --> 01:12:12,240 Speaker 1: other processing fluency boosts. But on average, over lots of repetitions, 1340 01:12:12,240 --> 01:12:14,240 Speaker 1: it's easy to see how this could have a big effect, 1341 01:12:14,320 --> 01:12:17,320 Speaker 1: especially when you bring it back to propaganda purposes, on 1342 01:12:17,640 --> 01:12:20,759 Speaker 1: things we believe as a society, things that shift voting 1343 01:12:20,800 --> 01:12:24,200 Speaker 1: patterns in small but significant ways, and stuff like that. Yeah, 1344 01:12:24,200 --> 01:12:26,760 Speaker 1: that's the key: that it's not occurring within a vacuum. 1345 01:12:27,080 --> 01:12:30,080 Speaker 1: It's affecting and being affected by 1346 01:12:30,080 --> 01:12:34,920 Speaker 1: all these other mental processes and factors that are 1347 01:12:34,960 --> 01:12:38,679 Speaker 1: affecting our decision making and worldview. Totally. But we will 1348 01:12:38,800 --> 01:12:42,160 Speaker 1: get more into that in our next episode.
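As a back-of-the-envelope illustration of that small-effect, big-aggregate point, here is a toy calculation in Python. The baseline, the per-repetition shift, and the population size are invented numbers, and the linear model is a deliberate oversimplification rather than anything from the literature.

# Toy numbers: a tiny per-repetition nudge applied across a large audience.
population = 1_000_000
baseline_believers = 0.30      # assumed share who already believe the claim
shift_per_repetition = 0.005   # assumed half a point of belief per exposure

for repetitions in (1, 5, 20):
    share = min(1.0, baseline_believers + shift_per_repetition * repetitions)
    print(f"after {repetitions:2d} repetitions: "
          f"{share:.1%} believe, about {int(share * population):,} people")

A half-point nudge looks like nothing for one person hearing one claim once, but repeated twenty times across a million people it moves tens of thousands of believers, which is the propaganda worry in miniature.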
In the meantime, 1349 01:12:42,240 --> 01:12:44,040 Speaker 1: be sure to check out all the episodes of Stuff 1350 01:12:44,080 --> 01:12:45,960 Speaker 1: to Blow Your Mind at Stuff to Blow Your Mind 1351 01:12:46,040 --> 01:12:48,200 Speaker 1: dot com. That is the mothership. That is where you 1352 01:12:48,200 --> 01:12:51,400 Speaker 1: will find everything, as well as links out to our 1353 01:12:51,479 --> 01:12:54,360 Speaker 1: various social media accounts. If you want to support the show, 1354 01:12:54,560 --> 01:12:57,880 Speaker 1: we always urge you to leave us a positive review, 1355 01:12:58,040 --> 01:13:01,160 Speaker 1: leave us some stars or whatever the rating system is. 1356 01:13:01,280 --> 01:13:04,519 Speaker 1: Just rate and review us wherever possible. Big thanks as 1357 01:13:04,560 --> 01:13:08,160 Speaker 1: always to our excellent audio producers Alex Williams and Tari Harrison. 1358 01:13:08,479 --> 01:13:09,920 Speaker 1: If you would like to get in touch with us 1359 01:13:09,920 --> 01:13:11,800 Speaker 1: to let us know your feedback on this episode or 1360 01:13:11,800 --> 01:13:14,479 Speaker 1: any other, or to let us know a topic you'd 1361 01:13:14,479 --> 01:13:16,080 Speaker 1: like us to cover in a future episode, you can 1362 01:13:16,120 --> 01:13:18,960 Speaker 1: always email us at blow the mind at how stuff 1363 01:13:19,000 --> 01:13:30,640 Speaker 1: works dot com. For more on this and thousands of 1364 01:13:30,640 --> 01:13:41,400 Speaker 1: other topics, visit how stuff works dot com.