Hey, you, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and it's Saturday, time for a vault episode. This episode originally published July twelfth, and this is part two of our exploration of the illusory truth effect. That's right. In this one I will land the plane for you and hopefully give you some tools that you might be able to employ to fight the power of the illusory truth effect. Or at least, that's the intention. All right, let's dive right in.

Welcome to Stuff to Blow Your Mind from HowStuffWorks.com.

Hey, you, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and we're back with part two of our exploration of the illusory truth effect, probably the liar's best trick. If you haven't heard our last episode, you should probably go back and listen to that first. But if you have, let's do a quick recap of what we talked about last time. We discussed all of the research on this thing that has long been part of folk wisdom: if you say something and repeat it, and repeat it, and repeat it, people become, over time, more likely to believe that thing, and that is thoroughly validated by experimental research. Right. And we also talked a little bit about why it would even make sense that we would come to believe things that were not true about the world we live in just because they were repeated. Yeah, and the explanation we ultimately ended up on last time, the one that seems to be favored by most of the psychologists who study this, is based in the idea of processing fluency. One researcher we talked about last time came to believe it was because of conditioning based on real-world effects, but for whatever reason, we tend to associate things that are easy to process, things with high processing fluency, with truth. So if something's easy to read, we think it's more true.
Or if something is an idea we've seen or heard or encountered before, because it's easier to process thanks to familiarity, we believe it's more likely to be true than if we were encountering it for the first time. But of course, in all of this, extreme implausibility is going to be a boundary condition that kicks in. So this is like the "Ted Cruz is the Zodiac Killer" level of implausibility. Just because the ages don't match up, right? Well, it's just kind of like, all right, I'm not believing that, that sounds ridiculous. But some people do believe it, so your boundary condition may not be where somebody else's boundary condition is. Well, the boundary conditions will vary from individual to individual.

So the question that we should address to start off this one is this: in the last episode, we discussed how this effect has been thoroughly validated in the lab, but does it work in the real world, and is it really all that powerful? A lot of researchers seem to assume that, surely, if you already know something about a subject, repetition of a contradictory false statement wouldn't actually undermine your real knowledge, would it? Surely, they would tend to assume, this illusory truth effect only works for statements that we're uncertain about to begin with, and statements that seem highly plausible. Like, if you didn't know anything about either Ted Cruz or the Zodiac Killer, you might just say, all right, maybe that's possible, whereas an individual who has read multiple books on the Zodiac Killer would say, no, that doesn't match up, that is just ridiculous. Yeah. So that's the assumption, but unfortunately some more recent research has really turned that assumption on its head. So I want to talk about an important recent study on the illusory truth effect that comes bearing bad news.
The study is from the Journal of Experimental Psychology: General, by Fazio, Brashier, Payne, and Marsh, and it's called "Knowledge Does Not Protect Against Illusory Truth." So they pointed out that the illusory truth effect we talked about last time, based on processing fluency, is widely accepted and well established, but it had previously been thought that this effect was constrained by a few things.

Now, one constraint shown to actually exist in the literature is recollection of the quality of the source of the information. So previous studies have shown that if you specifically remember where a statement came from, and you consider the source of the statement a dishonest or untrustworthy source, that can produce kind of a reverse truth effect, where repetition of a statement known to come from a liar or an untrustworthy source causes us to disbelieve it. So this sounds like it should be good news, right? Right. Yeah, it ultimately comes down to the question: did I hear that on the radio, or did I see it on a T-shirt? Yeah, or was this the cover of the National Enquirer? Like, you remember that's where it came from, and you know that's an untrustworthy source, so it actually has the reverse effect. You hear it repeated and it makes you go, no, no, no, that's not true at all.

But this isn't as much of a protection as we'd think, because honestly, how well do you remember the exact source of every bit of semantic knowledge in your head? Why, no, Bat Boy did not come from the New York Times. But there are lots of other things in your head that did come from the cover of the National Enquirer, and you don't remember that that's where they came from. I guarantee it. You've stood in line at the grocery store.
Well, if it's a story about some particular aged celebrity's brave last days or sad last days, it probably came from the Enquirer. But yes, there are probably some stories in there that I would not definitely be able to pin down to the Enquirer versus other sources. Robert, I see right through your bravado. Some Enquirer stories have gotten through to you.

Yeah. Other studies have backed this up. After just a period of a few weeks, what may have once been stored in the brain as a false claim from an untrustworthy source could potentially, over time, become just a familiar statement that I remember, which of course, once it's familiar, translates into it being more likely to be a true fact. There was at least one study that looked into this, by Begg, Anas, and Farinacci, called "Dissociation of Processes in Belief: Source Recollection, Statement Familiarity, and the Illusion of Truth," and basically they found that when the source of a claim is not super memorable as unreliable, familiarity can be more important than truth or reliability. Okay, so it's not necessarily, like, a magazine that has a negative reputation in your mind, but it's not something that's completely reputable either; it just kind of falls in between. Or even if it has a negative reputation and it's just not all that memorable, you can lose track of where it came from, and it will suffer from the illusory truth effect. This can happen even when you should have remembered that it came from an untrustworthy source. There are exceptions when the source is really memorable, but a lot of times it doesn't protect you.

Now, the second assumption about constraints on the illusory truth effect is about knowledge. Right. We've all got knowledge already in our heads, and the idea is that pre-existing knowledge will protect against the effect. And this is what came under scrutiny in this particular study by Fazio and her co-authors.
So, despite being an assumption repeated again and again in the illusory truth literature, very few of the studies actually bothered to test whether knowledge protects people. It was just sort of asserted to be true as if it were obvious. And the few that did bother to test it in any way generally did so by testing how the effect presented in people who claimed subject-area expertise. These studies yielded contradictory results, but here are a couple of examples. Srull, in nineteen eighty-three, found that if you rate yourself as an expert on cars — Robert, would you rate yourself as an expert on cars? No, but some people would. Some people around the office, yeah. Car experts, Srull found, suffered smaller illusory truth effects than non-experts on car trivia. So that would suggest, okay, knowledge gives you a little bit of an edge; you're not as susceptible as amateurs. And then Parks and Toth in two thousand six had people rate claims about known versus unknown consumer brands, and the illusory truth effect was bigger for statements about brands that people were unfamiliar with. That makes sense. So, like, if you didn't already know anything about the brand, you were more susceptible to the illusory truth effect on statements about the brand. Yeah, that makes perfect sense.

On the other hand, Arkes, Hackett, and Boehm, in nineteen eighty-nine, found the opposite: the higher a person rated their expertise in a subject, the more susceptible they were to the illusory truth effect in that subject area. Makes you wonder if there's, like, some kind of insecurity or identity-protective thing going on there. Yeah, like, I don't want to be wrong, so I'm just gonna nod my head in that situation; I don't want to look bad, I've already staked my reputation on being a car expert. Also, Boehm found that psychology majors showed a larger illusory truth effect on psychology statements than non-majors.
But there are some issues with these studies. So Fazio and her co-authors point out that these types of tests don't actually manipulate direct knowledge of whether the statements are true or false, just sort of the perception of related knowledge. So they wanted to test this directly. They created a big list of statements, like we've seen in these other tests, where you'll have true statements and false statements, and they based this on existing lists of facts that had been shown in previous studies to be either generally known or generally unknown. And this created four categories of statements: you've got known truths, unknown truths, known falsehoods, and unknown falsehoods.

Here are some examples. You've got a known truth, quote: the cyclops is a legendary one-eyed giant of Greek mythology. Robert? Checks out. Okay, how about: the Pacific Ocean is the largest ocean in the world. Checks out. Then you go into known falsehoods: the minotaur is the legendary one-eyed giant of Greek mythology. Absolutely not. The Atlantic Ocean is the largest ocean in the world. Most people are expected to know that these are not true statements. Then you've got unknown stuff. Here's an example of an unknown truth: Billy the Kid's real last name. What was it? It's Bonney. And an unknown falsehood: Billy the Kid's real last name is Garrett. Yeah, that would have been a toss-up for me, because I did not know Billy the Kid's last name. I thought maybe it was Kid, you know, as in Kid Rock — Kid Rock's first name is Billy, and "the" is like his middle name. So there you go.

So, Experiment One, using this set of statements with forty students. In the first phase, subjects were shown a subset of statements from the list, of all four types, and they were just asked to judge how interesting the statements were. You know, that sounds like a really fun task, right? Billy the Kid's last name is Bonney — how interesting was that? I guess more interesting than some names? Yeah?
Maybe, I guess. I don't know, I didn't find that one that interesting. I guess "Bonney" sounds kind of pretty; it sounds maybe a little odd for what, based on the photos, seems to be kind of an ugly-looking, you know, Western outlaw. It makes me think of a Robert Burns kind of poem thing, bonnie glens, whereas "Garrett," you know, has kind of a guttural sound to it. Yeah, right. Okay.

So then you've got the second phase. This happened immediately after the first phase. Students were given another subset of statements from the list, again all four types of statements, and they were warned that some statements were true and some were false. They were also warned that they would see some repeats from the list that they had just reviewed for how interesting they were. And then they rated the claims on a scale of one to six for how true they were. There was also, at the end, an open-ended knowledge check, with open-ended questions like: what is the world's largest ocean? What is the one-eyed monster of Greek myth? This was to strengthen the experimenters' picture of the individual knowledge of each participant.

So then you've got the results. First of all, the original findings of the illusory truth effect were replicated: repeated statements got higher truth ratings than new statements that the students had never seen before. But also, quite surprisingly, knowledge did not seem to prevent the illusory truth effect. Statements about both previously known and previously unknown facts were rated more true if they were repeated than if they were new. In other words, repetition increased perceived truthfulness, even for contradictions of facts that you know. So I want to quote from the authors, quote: reading a statement like "A sari is the name of the short pleated skirt worn by Scots"
increased participants' later belief that that statement was true, even if they could correctly answer the question "What is the name of the short pleated skirt worn by Scots?" Isn't that bizarre? So, like, you ask somebody what the short pleated skirt worn by Scots is called, and they answer "kilt." But if you show them the phrase "a sari is the name of the short pleated skirt worn by Scots," and then show them the phrase again later, they will take the repeated phrase as evidence that the statement is more true than if they were seeing the statement for the first time. Again, it comes back to the shortcuts that our brains make. How weird is that? That's bizarre. I mean, again, it's kind of a reminder that human culture and human language just complicate everything. Yeah, it's crazy.

So, again, they also found that the repetition effect emerged for truths. So it wasn't just false statements, it was true statements too: whether it's true or false, if you repeat it, people believe it more. So the takeaway from this first experiment is: whether a statement is true or false, and whether you already know better or not, if somebody repeats the statement to you, on average, you're more likely to believe it.

And then the second part of their study was kind of interesting. So they're discussing their own finding, and they say, quote: the data suggest a counterintuitive relationship between fluency — remember, that's processing fluency, how easy it is to process information — and knowledge. Prior work assumes that people only rely on fluency if knowledge retrieval is unsuccessful, i.e., if participants lack relevant knowledge or fail to search memory at all. Experiment One demonstrated that the reverse may be true. Perhaps people retrieve their knowledge only if fluency is absent. So to test this out, they did a second experiment: they repeated a modified version of the first experiment to test it.
They believe the results indicate that people sometimes use a fluency-conditional model, which means they rely on fluency even if knowledge is available to them: you start with fluency, and if fluency fails, you fall back on what you actually know. We shouldn't over-interpret it, but in a limited way, there may be processes in the brain that say, I'm going to go for what feels easy before I even check my memory to see what I know. Which kind of lines up with the mind's tendency to want to offload memory to people and gadgets. Like, do I have to remember that anymore if the machine is going to do it, or my spouse is going to do it? And the brain says, no, I think we can completely prune that section.

Here's a question: how often have you used a calculator to do math that you could easily do yourself? You know what I mean — not problems that would be really hard, but something that, if it took you ten seconds, you could probably solve in your head. Yeah, I do that in Dungeons and Dragons sometimes, when we get into hit points and whatnot. You know, I could certainly do it easily, either in my mind or with pen and paper real quick, but I'll go ahead and type it into my calculator just to get it done. I've done the same thing too. It's weird; it's a little disturbing. Or search engines, you know — just throwing in a mathematical equation, something really simple, such as determining how old a particular actor is, or how old they would have been during a certain movie. I feel like I do that all the time. Like, you're saying you do that even though you could easily know the answer if you checked your own memory? Mm. I feel like I do that less with search engines; I definitely do the calculator thing.
Yeah, not so much that I would remember, say, how old Robert De Niro was during The Godfather Part II, but I would suddenly wonder how old he was, and so do the simple mathematical scenario of, you know, subtracting one year from the other. Let's plant a lie in everybody's mind right now: Robert De Niro was four hundred and twenty-three years old when he did The Godfather Part II. And now you'll remember that. Well, that's implausible; that's the implausibility barrier in action. Oh yeah, maybe I should have done something else. Yeah, we'll come back to that.

But anyway, so the conclusion of this experiment by Fazio and co-authors is that, quote, participants demonstrated knowledge neglect, or the failure to rely on stored knowledge in the face of fluent processing experiences. So they'd rather go for what was easy to process than for the correct answer based on their own knowledge. At the same time, it's really important to note that this doesn't happen every time; it doesn't happen with every person, it doesn't happen with every question, and it doesn't necessarily happen with huge effects. The effect is relatively small. This was actually pointed out pretty well in a BBC article by Tom Stafford. He pointed out that while repeated exposure to statements increases their believability, the biggest influence on whether a statement was rated true or not was whether it was actually true. So the illusory truth effect is valid, and it does change the averages of the answers, but it's not like it's the only thing that matters, and it doesn't overpower our real knowledge about the truth. It's just weird that it has some effect in the face of actual knowledge we have, when actual knowledge should mean it has no effect. Does that make sense? Yeah. Again, I just come back to the fact that the mind is going to offload whatever information it can, or whatever processing it can.
Yeah, those lazy brains of ours. Okay, well, we should take a quick break, and then when we come back, we will discuss more recent research on the illusory truth effect and some related concepts, and what it means for our lives.

All right, we're back. So, we've discussed the subject of false memories before, and the many ways in which false memories can form. Psychologist Daniel Schacter identified seven, in fact, in his work The Seven Sins of Memory: transience, absent-mindedness, blocking, misattribution, suggestibility, bias, and persistence. And I like to think of it this way: memory is not something that is carved in stone, but rather something that is sculpted from clay, and the clay of memory remains malleable every time we retrieve it from the drawer and handle it. As psychologist Pascal Boyer, whom we referenced in our last episode, pointed out, examples of this range from word-list recall intrusions in experiments to therapy-induced imaginings of past lives and/or ritual abuse, which we've discussed on the show before in past episodes. So, memory retrieval is a very delicate stage. There's actually a line from the television series The Expanse that I think captures this perfectly. The character Miller, played by Thomas Jane, sums it up rather perfectly: you know, every time you remember something, your mind changes it a little, until your best and worst memories are your biggest illusions.

So, in the two thousand eleven paper "Remembering Makes Evidence Compelling: Retrieval from Memory Can Give Rise to the Illusion of Truth," from Jason D. Ozubko and Jonathan Fugelsang, the authors conclude that, quote, memory retrieval is a powerful method for increasing the perceived validity of statements, and the subsequent illusion of truth, and that the illusion of truth is a robust effect that can be observed even without directly polling the factual statements in question.
Whoa. So this is sort of the same effect, but not with statements coming in from the outside. Right. So they conducted a 257-person study, all individuals from the University of Waterloo. So, you know, it's a relatively small study, and they admit that they, quote, may have made it particularly difficult to observe any differences between our control condition and our experimental conditions. So, as always, more studies are required. But here's how it shakes out, quote: if this account is correct, the current work demonstrates that information retrieved from memory can not only be viewed as relatively more important than more-difficult-to-retrieve information, but can also be viewed as more important than information that is explicitly provided. In particular, information that is retrieved from memory may actually be more fluently processed in general than information that is directly perceived.

So the idea here is that the repetition entailed in memory retrieval need not come from an external source; it can be internal, in the form of memory retrieval, which is, quote, naturally more familiar and fluent than information that is perceived. Wow, that is profound, actually. Like, the idea that the haze of your memories is sometimes greater evidence, to your own mind, than what's in front of your eyes right now. Yeah, and it means that for the lie or the untruth to resonate, it only needs to be memorable, like something that you'll continually retrieve. Oh yeah, and that serves as a form of repetition. Oh, and this is so true of so many of these lies that get repeated so often in public conversations: it's the really memorable, weird, outlandish ones that stick around. I think about how, in the last episode, we talked about the belief that's still so common that Barack Obama was born in Kenya.
Yes, there's no evidence of it, and it's such a weird thing to suggest that it sticks in people's brains. Right. Yeah, and then you keep coming back to it, you keep rethinking it. I guess we just made you think of it again. Yeah, that's the horrible thing about this. We'll have to have a discussion about that at the end of the episode.

Another way of looking at it is this. If you're a regular listener to this podcast, and I were to remind you in every episode that Joe drinks a full cup of coffee every morning before he gets out of bed — that's not true, that's a lie that I just made up — but if I repeated it in every episode, even if Joe said it's a lie, you're hearing it enough that the repetition is going to potentially influence you. And it's also a perfectly reasonable lie, right? Like, if you said, oh, that's actually what I do, nobody would think you weird or anything. Right, though it'd be kind of weird that I drank it without getting out of bed. Well, I assume somebody brings it to you. I mean, I didn't say that you had the coffee machine set up on the nightstand. A coffee robot that pours coffee on my face every morning.

But what if, instead of saying this lie every episode, just once I told everybody that Joe McCormick, before he gets out of bed in the morning, shoots back three six-hour energy drinks, one after the other? No! Why did you do that to me, Robert? But that's potentially more memorable, because it's a little strange, it's maybe a little more funny, and therefore it's exactly the kind of untruth that might pop up again. Like, you're thinking of Joe, you're hearing Joe talk, and you're like, oh yeah, Joe, shooting back six-hour energy drinks first thing in the morning. I don't do that either! Come on. But yeah, I totally see your point, and I think you're absolutely correct.
So what they're saying here is essentially that there is an illusion of truth effect not just for statements you hear from the outside, but for your own memories. Every time you go back and check in with a memory, you're reinforcing it and making it seem more true, even if you didn't necessarily believe it to be true in the first place. Yeah, and you know, they don't really get into this, but it also makes me think of, like, negative things people might have said to you in the past. You know, some criticism that is not accurate, but it steams you, and then you end up reflecting on it, perhaps even traumatically, and then it makes you more susceptible to its power. Well, yeah, I mean, as always, you have that fear that all criticisms of you are accurate.

Now I'd like to turn to another paper here, this one with the title "Making Up History: False Memories of Fake News Stories," and this is from Europe's Journal of Psychology from two thousand twelve. And again, it's worth noting this is a two thousand twelve paper, so it predates the more recent usage and politicization of the term fake news. So in this one, they wanted to see if false news stories that were familiar would result in the creation of false memories of having heard the story outside of the experiment. So they had a small study here: forty-four undergraduate psychology students, participating in exchange for course credit. They exposed the participants to false news stories that they portrayed as true, and then, five weeks later, the participants were found to be more likely to rate the false news pieces as true than test subjects who had only just been exposed to the stories. The author writes: these results suggest that repeating false claims will not only increase their believability but also result in source monitoring errors.
So again, we get back into this situation where you have this headline or this news story popping around in your head, but you ask yourself, where did I hear that? Was it a talk show, a radio talk show? Was it the BBC? Was it a verified news source in my Facebook feed, or just some dubious bit of news that was kind of passing through? Oh, and by the way, the author — not authors — on that particular paper is Danielle C. Polage.

Yeah, this really makes me think about — I don't know, I wonder how the Internet has changed the way we think about sources of information. Like, has the Internet, and say, social media feeds, made us more scrupulous about the sources of information, or less scrupulous? I don't know. Or maybe it's had a divergent effect on different people. Well, I think you do have sort of two different timelines going on there, because I feel like, on one hand, you have the industry responding — you have Facebook, for instance, responding to criticisms and an overall need for better sourcing and attribution of publication sources. And then also, I think every individual is probably going through this situation where perhaps they're more trusting, and then they realize, oh, I really need to be better about seeing where I'm getting my information, and then they have to self-correct.

Now, there's another paper that gets into some of this, and this is a forthcoming paper from the Journal of Experimental Psychology: General. We should just note for the listener that this is a forthcoming paper, so take it with a grain of salt; it has not yet fully passed all of the pre-publication review procedures. But it's been put out there, and people have been talking about it. Yeah, it's titled "Prior Exposure Increases Perceived Accuracy of Fake News," and key here in all of this is, quote, fluency via prior exposure.
They say that even a single exposure increases subsequent perceptions of accuracy. Quote: moreover, this illusory truth effect for fake news headlines occurs despite a low level of overall believability, and even when the stories are labeled as contested by fact-checkers or are inconsistent with the reader's political ideology. Also key here is the extreme implausibility that we've been discussing, you know, this boundary condition on the illusory truth effect: only a small degree of potential plausibility is sufficient for repetition to increase perceived accuracy. How small? Well, I imagine that's going to vary from individual to individual. Right, we come back to what we mentioned earlier, that my boundary condition is not gonna be the same as yours. Yeah. Yeah, that's a weird thing to wonder about. So, like, you might say that, for one person, if you showed them a headline about Bat Boy, it wouldn't even register as possibly true to begin with, so they're never gonna believe it's more likely to be true later. But somebody else might. And a lot of those other types of headlines — just, like, weird, you know, kind of nasty rumors about celebrities or politicians — a lot of those that are slightly more plausible than, say, Bat Boy are probably gonna stick in a lot of people's minds.

I think about the way that news feed algorithms keep popular stories in front of your eyes on social media. If you keep coming back and scrolling, the most popular fake news stories do tend to show up again and again and again. Yeah, and then hopefully people are shooting them down again. But even then, it's going to have a limited effect, based on this particular study here. Yeah, so it's worth remembering that these effects are small, but small effects can add up. Quick example: one of the fake headlines that they looked at here was this ridiculous story, and it's totally untrue. Originally, five percent believed it was true.
The second time people saw it, ten percent believed it was true. So that might sound small, but aggregated over whole populations, with lots of manipulative false stories and lies, this kind of thing could have huge effects. It could swing an election in a country. It could tip public opinion on an issue from a minority opinion to a majority opinion. It could have real effects in the world. Yeah, you're gonna have more than one of these going on at a given time; some of them are gonna catch on, some of them are not. But adding them all together, they could have an effect.

So I think maybe we should transition to talk about what we should do, both as receivers of information trying to figure out what's true, and as purveyors of information who, you know, have public conversations. What should we do in order to try to avoid creating widespread misbeliefs, knowing what we know now? Well, let's receive an advertisement and then come right back with an answer to that question.

All right, we're back. So one of the first questions I think we should ask is: what can you do about this? Say you've listened to these past couple of episodes and you're like, wow, so I accept that I'm susceptible to the illusory truth effect. I know that being exposed to an untrue statement, or hearing an untrue statement repeated, is probably going to make me more likely to believe it. How can I protect myself against it? Especially given that we've seen all these studies showing that various things apparently don't protect you, or don't necessarily protect you. Knowing otherwise isn't even necessarily going to protect you. And I've felt that before, Robert. I don't know about you, but there are cases where I'm confident that I actually know what's true — I've done the research, I know what reality is — and yet seeing a lie that exists in contradiction to what I know, over and over and over again, actually does work on me.
I can feel it working on me. 576 00:31:52,760 --> 00:31:56,200 Speaker 1: I can feel doubts setting in. When I see a 577 00:31:56,360 --> 00:31:59,239 Speaker 1: lie repeated with great frequency, I start to wonder, like, 578 00:31:59,840 --> 00:32:01,920 Speaker 1: is it true? I mean, I've checked it out before and 579 00:32:01,960 --> 00:32:04,440 Speaker 1: there's nothing to it. But maybe I missed something, 580 00:32:04,560 --> 00:32:08,440 Speaker 1: maybe the maybe there's some new information I'm not privy to. Yeah, 581 00:32:08,480 --> 00:32:10,840 Speaker 1: So I really do feel it working on me, even 582 00:32:10,920 --> 00:32:14,360 Speaker 1: though you know I'm somewhat aware of this, and so 583 00:32:14,480 --> 00:32:16,560 Speaker 1: it can be difficult. It can be hard to know 584 00:32:16,640 --> 00:32:19,160 Speaker 1: what to do to protect yourself. But here's one thing 585 00:32:19,200 --> 00:32:22,480 Speaker 1: I want to offer as a as a general rule: 586 00:32:22,560 --> 00:32:26,120 Speaker 1: A huge red flag for judging a statement's truth or 587 00:32:26,120 --> 00:32:29,760 Speaker 1: falsehood is I feel like I've heard that somewhere before. 588 00:32:30,560 --> 00:32:33,040 Speaker 1: And I do this. I'm you know, I I fall 589 00:32:33,120 --> 00:32:35,120 Speaker 1: prey to this. I do it all the time. Actually, 590 00:32:35,120 --> 00:32:38,040 Speaker 1: in a conversation, I think something's true because I have 591 00:32:38,080 --> 00:32:41,520 Speaker 1: exactly that feeling. I feel like I've heard this somewhere before. 592 00:32:42,760 --> 00:32:46,160 Speaker 1: I would say, if it feels familiar, but you can't 593 00:32:46,240 --> 00:32:50,120 Speaker 1: recall why it's true, and you can't recall the source 594 00:32:50,200 --> 00:32:52,960 Speaker 1: of where you heard it, you are in the danger zone. 595 00:32:53,120 --> 00:32:55,200 Speaker 1: That is the red that is the red zone for 596 00:32:55,320 --> 00:32:59,160 Speaker 1: repeating and reinforcing a false belief. So I think maybe 597 00:32:59,200 --> 00:33:02,720 Speaker 1: we should try a little experiment. Let's do it. Let's 598 00:33:02,760 --> 00:33:04,960 Speaker 1: repeat something a bunch of times and see if it 599 00:33:05,000 --> 00:33:08,520 Speaker 1: sets in. So here's the phrase: if it feels familiar, 600 00:33:08,720 --> 00:33:12,840 Speaker 1: check the facts. If it feels familiar, check the facts. 601 00:33:13,280 --> 00:33:16,840 Speaker 1: If it feels familiar, check the facts. If it feels familiar, 602 00:33:17,080 --> 00:33:20,200 Speaker 1: check the facts. It feels familiar, check the facts. Death 603 00:33:20,240 --> 00:33:23,200 Speaker 1: to Videodrome, long live the New Flesh. All right, well, 604 00:33:23,240 --> 00:33:25,320 Speaker 1: we've we've we've done it, Joe, I think we've 605 00:33:25,480 --> 00:33:28,360 Speaker 1: we've won. No, we haven't won yet. There's actually there's 606 00:33:28,360 --> 00:33:31,240 Speaker 1: some more stuff we got to talk about. Uh So. 607 00:33:31,520 --> 00:33:33,520 Speaker 1: One of the other studies we looked at was just 608 00:33:34,200 --> 00:33:37,760 Speaker 1: a study in Political Communication in twenty sixteen by Emily Thorson 609 00:33:37,920 --> 00:33:43,959 Speaker 1: called Belief Echoes: The Persistent Effects of Corrected Misinformation. And 610 00:33:44,000 --> 00:33:47,000 Speaker 1: this was a study where they did three experiments.
Thorson 611 00:33:47,080 --> 00:33:51,040 Speaker 1: writes that they showed that the effects of exposure to negative political information 612 00:33:51,400 --> 00:33:55,120 Speaker 1: persist even after people are informed that the information was 613 00:33:55,160 --> 00:33:56,840 Speaker 1: not true. So this goes along with some of the 614 00:33:56,880 --> 00:33:59,760 Speaker 1: fake news stuff we were just talking about. And Thorson 615 00:33:59,800 --> 00:34:04,120 Speaker 1: called these beliefs that persist after being discredited quote belief echoes. 616 00:34:04,920 --> 00:34:08,080 Speaker 1: So she writes, quote, belief echoes occur even when the 617 00:34:08,160 --> 00:34:13,600 Speaker 1: misinformation is corrected immediately, the gold standard of journalistic fact checking. 618 00:34:14,000 --> 00:34:18,040 Speaker 1: The existence of belief echoes raises ethical concerns about journalists' 619 00:34:18,040 --> 00:34:22,120 Speaker 1: and fact-checking organizations' efforts to publicly correct false claims. 620 00:34:22,680 --> 00:34:26,960 Speaker 1: So dang. So even correcting a lie tends to increase 621 00:34:27,000 --> 00:34:29,640 Speaker 1: people's belief in the lie. What can you do then? 622 00:34:29,960 --> 00:34:32,200 Speaker 1: I know. I mean, this is on top of the 623 00:34:32,239 --> 00:34:37,279 Speaker 1: reality that in some cases, corrections are not going to 624 00:34:37,880 --> 00:34:42,359 Speaker 1: resonate as as as much as the original, uh, lie 625 00:34:42,520 --> 00:34:47,000 Speaker 1: or the original bit of of unfactual information. Well, yeah, 626 00:34:47,080 --> 00:34:50,279 Speaker 1: very often a lie is interesting and the correction is 627 00:34:50,280 --> 00:34:52,839 Speaker 1: not interesting. Yeah. Yeah, the correction's page two, but the 628 00:34:52,840 --> 00:34:55,359 Speaker 1: the original, that's the headline on page one. Yeah. So 629 00:34:55,400 --> 00:34:58,719 Speaker 1: there was an article in the Columbia Journalism Review by 630 00:34:58,719 --> 00:35:02,480 Speaker 1: the Dartmouth political scientist Brendan Nyhan. It was called Building 631 00:35:02,520 --> 00:35:05,880 Speaker 1: a Better Correction. Now, this is not necessarily responding to 632 00:35:05,920 --> 00:35:08,600 Speaker 1: the exact same research we've been talking about, but it 633 00:35:08,640 --> 00:35:12,600 Speaker 1: addresses the fact that journalistic fact checking, corrections and so 634 00:35:12,760 --> 00:35:17,839 Speaker 1: forth can be insufficiently effective at correcting false beliefs, and 635 00:35:17,960 --> 00:35:20,360 Speaker 1: it does end up coming up with a few recommendations 636 00:35:20,400 --> 00:35:24,120 Speaker 1: based on Nyhan's research and other people's research in recent years. 637 00:35:24,800 --> 00:35:29,319 Speaker 1: Number one is, of course, identify sources that speak against 638 00:35:29,400 --> 00:35:32,960 Speaker 1: their ideological interests. So apparently people are more likely to 639 00:35:33,040 --> 00:35:35,960 Speaker 1: accept a correction on a false belief or a widely 640 00:35:36,040 --> 00:35:40,680 Speaker 1: repeated lie, if that correction comes from somebody who who 641 00:35:40,680 --> 00:35:43,799 Speaker 1: it's against their political interests to to discredit it, does 642 00:35:43,840 --> 00:35:46,560 Speaker 1: that make sense?
So in the political sphere, if it 643 00:35:46,840 --> 00:35:50,080 Speaker 1: is a misconception that's widely held on the right, you 644 00:35:50,080 --> 00:35:52,279 Speaker 1: need to get somebody from the right to discredit it. 645 00:35:52,440 --> 00:35:54,279 Speaker 1: If it's widely held on the left, you need to 646 00:35:54,320 --> 00:35:57,319 Speaker 1: get somebody from the left to discredit it. Right, So like, 647 00:35:57,360 --> 00:36:00,640 Speaker 1: if if the correction is pandas are not the most 648 00:36:00,680 --> 00:36:03,040 Speaker 1: awesome animal on the planet, it's going to carry more 649 00:36:03,080 --> 00:36:07,799 Speaker 1: weight if Panda Weekly runs that correction as opposed to, 650 00:36:08,280 --> 00:36:12,040 Speaker 1: you know, Grizzly Bears Monthly. Exactly correct. So the second 651 00:36:12,040 --> 00:36:15,360 Speaker 1: point coming from the research is don't just assert that 652 00:36:15,440 --> 00:36:20,640 Speaker 1: a false claim is false; give an alternative causal account. So 653 00:36:20,680 --> 00:36:23,359 Speaker 1: you give a different explanation. To read a quote from 654 00:36:23,400 --> 00:36:27,080 Speaker 1: the article, quote: In the fictitious scenario used in one study, 655 00:36:27,160 --> 00:36:30,279 Speaker 1: for example, respondents who were told of the presence of 656 00:36:30,400 --> 00:36:35,000 Speaker 1: volatile materials at the scene of a suspicious fire continued 657 00:36:35,080 --> 00:36:38,360 Speaker 1: to blame the materials even after being told the initial 658 00:36:38,400 --> 00:36:42,640 Speaker 1: report was mistaken. So you tell them there's volatile materials there, 659 00:36:42,760 --> 00:36:45,480 Speaker 1: there was a fire. What caused the fire? Oh, those 660 00:36:45,560 --> 00:36:48,920 Speaker 1: volatile materials weren't actually there. People say, oh, it was 661 00:36:49,000 --> 00:36:52,440 Speaker 1: caused by the volatile materials. So the only way to 662 00:36:52,520 --> 00:36:56,160 Speaker 1: persuade people against that seemed to be to give them 663 00:36:56,200 --> 00:36:59,840 Speaker 1: another explanation of what caused the fire. So you don't say, no, 664 00:37:00,040 --> 00:37:02,960 Speaker 1: those materials weren't actually there. You say they weren't there 665 00:37:03,040 --> 00:37:06,880 Speaker 1: and the fire was caused by arson. If that's true. Obviously, 666 00:37:06,920 --> 00:37:09,440 Speaker 1: like you wouldn't want to make up fake alternative accounts, 667 00:37:09,480 --> 00:37:13,200 Speaker 1: but like, this is how you correct a misperception with 668 00:37:13,239 --> 00:37:16,280 Speaker 1: the truth is you give them the alternative causal account 669 00:37:16,320 --> 00:37:19,040 Speaker 1: that is true. And then finally, this is a big one, 670 00:37:19,560 --> 00:37:22,360 Speaker 1: don't state the correction as the negation of the lie. 671 00:37:22,880 --> 00:37:26,719 Speaker 1: Instead state the true fact that stands in contradiction of 672 00:37:26,760 --> 00:37:28,839 Speaker 1: the lie. Yeah, if you're having to say I am 673 00:37:28,840 --> 00:37:31,200 Speaker 1: not a crook, you're kind of saying I am a crook. 674 00:37:31,800 --> 00:37:33,799 Speaker 1: Instead you say I am a good person. Yeah, yeah, 675 00:37:33,840 --> 00:37:36,760 Speaker 1: if that's true. I mean, good people don't usually 676 00:37:36,760 --> 00:37:40,920 Speaker 1: say I'm a good person. Yeah.
So, but an example 677 00:37:40,960 --> 00:37:42,840 Speaker 1: would be from the thing we used at the beginning 678 00:37:42,880 --> 00:37:45,960 Speaker 1: of the last episode about this widespread belief that crime 679 00:37:46,040 --> 00:37:48,640 Speaker 1: has gone up in the United States since two thousand eight. 680 00:37:48,920 --> 00:37:51,600 Speaker 1: That's not true at all. Crime has gone down. So 681 00:37:52,200 --> 00:37:55,120 Speaker 1: you shouldn't say it's not true that crime has gone 682 00:37:55,160 --> 00:37:56,919 Speaker 1: up, because a lot of times people are just gonna 683 00:37:56,920 --> 00:37:59,879 Speaker 1: remember crime has gone up. Instead, what you should say, 684 00:38:00,200 --> 00:38:02,799 Speaker 1: and we've been violating this all this time here, what 685 00:38:02,880 --> 00:38:05,600 Speaker 1: you should say is crime has gone down since two 686 00:38:05,520 --> 00:38:09,560 Speaker 1: thousand eight. State the true fact, don't negate the lie. Okay, 687 00:38:09,719 --> 00:38:11,520 Speaker 1: and we have something we can chant to make this 688 00:38:11,840 --> 00:38:13,640 Speaker 1: really take hold in everybody's mind. I don't know. I 689 00:38:13,680 --> 00:38:15,919 Speaker 1: don't want to make you uncomfortable. I want to chant. 690 00:38:16,040 --> 00:38:18,680 Speaker 1: Let's chant. Okay. So here's here's the way I'd put it. 691 00:38:19,040 --> 00:38:21,919 Speaker 1: You won't kill a lie by repeating it; instead, say 692 00:38:21,920 --> 00:38:25,880 Speaker 1: what's true. You won't kill a lie by repeating it. Instead, 693 00:38:26,000 --> 00:38:29,000 Speaker 1: say what's true. You won't kill a lie by repeating it. 694 00:38:29,080 --> 00:38:32,680 Speaker 1: Instead, say what's true. Death to Videodrome. No, you 695 00:38:32,719 --> 00:38:35,680 Speaker 1: won't kill a lie by repeating it. Instead, say what's true. 696 00:38:36,040 --> 00:38:37,880 Speaker 1: I feel like if we could have made it rhyme, 697 00:38:38,280 --> 00:38:41,359 Speaker 1: it would have helped. Oh, maybe too late. It does 698 00:38:41,400 --> 00:38:43,560 Speaker 1: feel kind of creepy to chant, and that gets into 699 00:38:43,600 --> 00:38:45,200 Speaker 1: a thing that I did want to talk about at 700 00:38:45,200 --> 00:38:48,160 Speaker 1: the end here. That's frustrating because I wonder if there 701 00:38:48,239 --> 00:38:53,360 Speaker 1: is sometimes a sort of perverse system widely spreading bad beliefs, 702 00:38:53,440 --> 00:38:57,560 Speaker 1: essentially because people who are willing to lie and spread 703 00:38:57,600 --> 00:39:02,640 Speaker 1: malicious misinformation are also more willing to blatantly use proven 704 00:39:02,680 --> 00:39:07,399 Speaker 1: manipulation techniques like repetition and chanting and illusory truth, while 705 00:39:07,560 --> 00:39:09,719 Speaker 1: I I feel like more often people who want to 706 00:39:09,760 --> 00:39:12,440 Speaker 1: spread the truth and want to spread true messages are 707 00:39:12,520 --> 00:39:16,759 Speaker 1: more hesitant to use blatantly manipulative types of rhetoric and communication. 708 00:39:16,800 --> 00:39:18,960 Speaker 1: I mean, I don't want to say like I'm so good, 709 00:39:19,000 --> 00:39:22,360 Speaker 1: but like I don't want to give people misinformation, but 710 00:39:22,480 --> 00:39:24,799 Speaker 1: also in trying to help them with that stuff.
I 711 00:39:24,840 --> 00:39:28,040 Speaker 1: was just saying, like I felt very uncomfortable, like chanting 712 00:39:28,040 --> 00:39:29,960 Speaker 1: a phrase over and over again, even though I knew 713 00:39:29,960 --> 00:39:34,080 Speaker 1: it would be effective, right. I mean, generally speaking, individuals 714 00:39:34,080 --> 00:39:37,080 Speaker 1: who are very serious about journalism, they're going to want 715 00:39:37,120 --> 00:39:41,720 Speaker 1: to adhere to the standards of their industry and maybe 716 00:39:41,719 --> 00:39:44,840 Speaker 1: not, you know, fall back on, you know, tribal chants 717 00:39:45,280 --> 00:39:50,200 Speaker 1: about about something, because they feel they feel so obviously manipulative, 718 00:39:50,200 --> 00:39:53,440 Speaker 1: and they feel that way because they work. I mean, 719 00:39:53,719 --> 00:39:55,319 Speaker 1: this is kind of like a whole, this is a book, 720 00:39:55,360 --> 00:39:57,799 Speaker 1: a whole other area of discussion. But you know, I can't 721 00:39:57,800 --> 00:40:00,400 Speaker 1: help but think in terms of the clickbait and 722 00:40:00,440 --> 00:40:03,680 Speaker 1: the ease of publication and distribution. I mean, naturally, this 723 00:40:03,719 --> 00:40:06,080 Speaker 1: isn't something that's going to apply to individuals who, via 724 00:40:06,160 --> 00:40:10,480 Speaker 1: celebrity and or political power, already reach a wide audience. 725 00:40:10,960 --> 00:40:15,239 Speaker 1: But you know, any wild conspiracy theory or accusation can 726 00:40:15,320 --> 00:40:18,440 Speaker 1: can penetrate a lot deeper, seemingly, these days than in 727 00:40:18,480 --> 00:40:21,120 Speaker 1: pre internet days. We talked earlier about some of the 728 00:40:21,160 --> 00:40:25,040 Speaker 1: celebrity urban myths from decades past and about how, to 729 00:40:25,120 --> 00:40:26,879 Speaker 1: really get going, they had to, you had to have 730 00:40:26,960 --> 00:40:30,759 Speaker 1: just the right celebrity um urban legend and it had 731 00:40:30,800 --> 00:40:33,600 Speaker 1: to had to spread by word of mouth or maybe 732 00:40:33,600 --> 00:40:37,080 Speaker 1: a you know, a concentrated effort to send faxes 733 00:40:37,120 --> 00:40:40,360 Speaker 1: across Hollywood potentially. I don't even know if that's true 734 00:40:40,400 --> 00:40:44,479 Speaker 1: in the Richard Gere case, but be a repeated false 735 00:40:44,480 --> 00:40:47,080 Speaker 1: story. Exactly. Yeah, it's that's one of those situations out there 736 00:40:47,080 --> 00:40:50,960 Speaker 1: where, I think, correct me if I'm wrong, 737 00:40:51,000 --> 00:40:53,400 Speaker 1: but I don't think anyone's ever really been able to 738 00:40:53,440 --> 00:40:56,680 Speaker 1: get to the bottom of like where the urban legend 739 00:40:56,680 --> 00:41:01,120 Speaker 1: even really emerged from. Um. But nowadays, like the bar 740 00:41:01,120 --> 00:41:04,040 Speaker 1: for publication is a lot lower, and we're we're having 741 00:41:04,040 --> 00:41:06,279 Speaker 1: we're currently in a time where we seem to be 742 00:41:06,800 --> 00:41:09,920 Speaker 1: correcting and figuring out, well, how do we manage this 743 00:41:10,080 --> 00:41:14,759 Speaker 1: just plethora of of of publications of varying, um, you know, 744 00:41:15,600 --> 00:41:21,279 Speaker 1: you know, ethical solidity. Yeah, but that's just one part 745 00:41:21,280 --> 00:41:24,200 Speaker 1: of the issue obviously. Well it's a really difficult time.
Yeah, 746 00:41:24,239 --> 00:41:27,480 Speaker 1: Our media landscape is is difficult. I don't know what 747 00:41:27,560 --> 00:41:30,200 Speaker 1: to what to do, like what the best way to 748 00:41:30,239 --> 00:41:35,080 Speaker 1: address the wide spread of misinformation through social media and 749 00:41:35,120 --> 00:41:37,920 Speaker 1: the internet is. I mean, you can't, like, you know, 750 00:41:38,280 --> 00:41:40,279 Speaker 1: you don't want to become a censor and lock it 751 00:41:40,280 --> 00:41:42,720 Speaker 1: down and say, well, I will decide what's true and false. 752 00:41:42,760 --> 00:41:44,719 Speaker 1: I'll shut you down. You'd want there to be an 753 00:41:44,800 --> 00:41:46,840 Speaker 1: organic way where people would would, I don't know, have 754 00:41:46,960 --> 00:41:51,000 Speaker 1: the tools to tell between truth and falsehood themselves. Yeah, 755 00:41:51,560 --> 00:41:53,799 Speaker 1: you know. And then one of the issues too for 756 00:41:53,920 --> 00:41:57,799 Speaker 1: us is that we we sometimes discuss the theories and 757 00:41:57,880 --> 00:42:02,560 Speaker 1: hypotheses that that are not true or are disproven over time. 758 00:42:02,920 --> 00:42:05,400 Speaker 1: This is exactly something I wanted to talk about at 759 00:42:05,400 --> 00:42:07,480 Speaker 1: the end of the episode today. It's a very frustrating 760 00:42:07,480 --> 00:42:11,919 Speaker 1: takeaway from this conversation we've had, um, that there could 761 00:42:11,920 --> 00:42:15,480 Speaker 1: be negative effects from discussing what's wrong with bad ideas 762 00:42:15,480 --> 00:42:18,600 Speaker 1: and false claims, because that's something we love to do, we 763 00:42:18,680 --> 00:42:21,200 Speaker 1: love to do on this show. For example, we just 764 00:42:21,280 --> 00:42:24,560 Speaker 1: did an episode about the ancient aliens hypothesis, something that, 765 00:42:24,800 --> 00:42:26,080 Speaker 1: I don't want to speak for both of us, but I 766 00:42:26,120 --> 00:42:28,560 Speaker 1: think neither of us thinks there's any good evidence to 767 00:42:28,560 --> 00:42:31,160 Speaker 1: believe is true. I do not believe there there is, 768 00:42:31,239 --> 00:42:34,600 Speaker 1: so we we put no stock whatsoever in this hypothesis. 769 00:42:34,880 --> 00:42:36,959 Speaker 1: It's the belief that ancient aliens came to the earth. 770 00:42:37,000 --> 00:42:40,120 Speaker 1: All of the evidence is either really bad overinterpretation 771 00:42:40,239 --> 00:42:44,200 Speaker 1: or outright fraud, and yet it's fascinating to understand this 772 00:42:44,320 --> 00:42:48,040 Speaker 1: widely held, unfounded belief, to understand where it came from, 773 00:42:48,120 --> 00:42:50,920 Speaker 1: why people believe it, to talk about the real facts 774 00:42:51,000 --> 00:42:54,040 Speaker 1: and the real knowledge that undermine the existing claims in 775 00:42:54,120 --> 00:42:57,320 Speaker 1: this belief structure, uh, to think about what good evidence 776 00:42:57,360 --> 00:43:00,440 Speaker 1: there could be for past alien contact, if there, if 777 00:43:00,440 --> 00:43:03,040 Speaker 1: it did exist. Yeah, it's it's kind of like trying 778 00:43:03,120 --> 00:43:05,359 Speaker 1: to imagine how a dragon would work based on real 779 00:43:05,400 --> 00:43:07,520 Speaker 1: world biology.
Yeah, you know, like you don't want to 780 00:43:07,560 --> 00:43:10,080 Speaker 1: advocate that dragons are real, but it is fun to 781 00:43:10,080 --> 00:43:12,600 Speaker 1: to take it apart and say, well, if they were real, 782 00:43:12,719 --> 00:43:14,520 Speaker 1: this is how it would work, and your discussion of 783 00:43:14,600 --> 00:43:17,839 Speaker 1: that should be based on real biology, and so all 784 00:43:17,880 --> 00:43:20,640 Speaker 1: this stuff. This is all stuff that I really enjoy 785 00:43:20,760 --> 00:43:22,960 Speaker 1: and I think is very valuable. But it makes me 786 00:43:23,040 --> 00:43:26,839 Speaker 1: wonder if even by having that kind of discussion, some 787 00:43:26,960 --> 00:43:30,560 Speaker 1: people are more likely to, you know, months, years down 788 00:43:30,600 --> 00:43:33,960 Speaker 1: the road later, remember as true the claims that we 789 00:43:34,040 --> 00:43:37,399 Speaker 1: examine in order to criticize and understand where they come 790 00:43:37,440 --> 00:43:40,279 Speaker 1: from in the episode. I don't know if there's any 791 00:43:40,280 --> 00:43:42,560 Speaker 1: way around that. Like, I don't think it's reasonable to 792 00:43:42,600 --> 00:43:45,120 Speaker 1: say we should live in a world where nobody ever 793 00:43:45,320 --> 00:43:50,000 Speaker 1: examines or talks about why widely held untrue beliefs are held. 794 00:43:50,000 --> 00:43:52,920 Speaker 1: That just doesn't seem reasonable. I think we learn almost 795 00:43:52,960 --> 00:43:55,800 Speaker 1: as much about the world and about ourselves from critically 796 00:43:55,800 --> 00:43:59,200 Speaker 1: studying the false misbeliefs we hold as we do from, say, 797 00:43:59,239 --> 00:44:02,240 Speaker 1: reading a list of objectively true statements about the world. 798 00:44:02,360 --> 00:44:07,560 Speaker 1: It's not like studying false beliefs is uninformative. It's very informative. Yeah. 799 00:44:07,600 --> 00:44:11,200 Speaker 1: And in some cases it's it's about not repeating history, right, 800 00:44:11,280 --> 00:44:14,680 Speaker 1: not being doomed to repeat history. Um, when we when 801 00:44:14,719 --> 00:44:17,840 Speaker 1: we've talked about eugenics, for instance, on the show, uh, 802 00:44:18,000 --> 00:44:21,719 Speaker 1: you know that there's some horrible ideas wrapped up in eugenics, 803 00:44:21,719 --> 00:44:24,520 Speaker 1: but it is it is worth remembering. It's it's it's 804 00:44:24,520 --> 00:44:26,920 Speaker 1: worth knowing how we got there. Yeah. We we had 805 00:44:26,920 --> 00:44:29,200 Speaker 1: that discussion with Carl Zimmer a while back that talked 806 00:44:29,239 --> 00:44:31,560 Speaker 1: about that, and that's an important part of the history 807 00:44:31,600 --> 00:44:34,200 Speaker 1: of the study of inheritance. If you just ignore it 808 00:44:34,280 --> 00:44:37,600 Speaker 1: and say we never will talk about that anymore, um, 809 00:44:37,840 --> 00:44:40,200 Speaker 1: you you do a disservice to, like, the you know, 810 00:44:40,239 --> 00:44:41,960 Speaker 1: the memory of all the evil that was done in 811 00:44:42,000 --> 00:44:44,000 Speaker 1: its name. And yeah, like you're saying, you open 812 00:44:44,080 --> 00:44:46,520 Speaker 1: yourself to not being aware of the really bad paths 813 00:44:46,600 --> 00:44:49,480 Speaker 1: people can go down. Yeah. Now, now, of course, obviously 814 00:44:49,560 --> 00:44:52,759 Speaker 1: ancient aliens is less high stakes than that.
But but 815 00:44:52,840 --> 00:44:54,719 Speaker 1: still I think some of the same 816 00:44:54,719 --> 00:44:57,439 Speaker 1: principles apply. And then then again, at the same time, 817 00:44:57,480 --> 00:44:59,759 Speaker 1: I like, I don't want to deny this research. I 818 00:45:00,000 --> 00:45:03,360 Speaker 1: acknowledge it seems very true that bringing up a statement, 819 00:45:03,560 --> 00:45:06,800 Speaker 1: even to discredit the statement or even to criticize the statement, 820 00:45:07,280 --> 00:45:12,080 Speaker 1: can have the negative side effect of many people increasing 821 00:45:12,120 --> 00:45:14,759 Speaker 1: their belief in that statement later on, just because it 822 00:45:14,920 --> 00:45:17,080 Speaker 1: sticks somewhere in the back of their mind. They don't 823 00:45:17,120 --> 00:45:19,919 Speaker 1: remember the original context in which it came up, which 824 00:45:19,960 --> 00:45:23,480 Speaker 1: was a context of criticism or a context of debunking, and 825 00:45:23,560 --> 00:45:26,920 Speaker 1: so people just kind of they think, oh, maybe there 826 00:45:26,960 --> 00:45:29,239 Speaker 1: is something to that. I've heard that somewhere before. It 827 00:45:29,280 --> 00:45:32,360 Speaker 1: feels kind of familiar. Yeah, well, I guess one 828 00:45:32,480 --> 00:45:35,000 Speaker 1: argument one could make then would be, hey, 829 00:45:35,040 --> 00:45:37,400 Speaker 1: if you're going to cover ancient aliens, then you also 830 00:45:37,480 --> 00:45:39,400 Speaker 1: have to make sure that you cover, in 831 00:45:39,400 --> 00:45:43,400 Speaker 1: an ancient aliens free way, like how life actually emerges 832 00:45:44,040 --> 00:45:48,920 Speaker 1: on Earth, and we've certainly discussed evolution on the show before. 833 00:45:49,600 --> 00:45:53,160 Speaker 1: So I think we're we're mostly there. Well, I'm not 834 00:45:53,200 --> 00:45:56,280 Speaker 1: worried that we have a deficiency of saying true things, 835 00:45:56,800 --> 00:45:58,680 Speaker 1: but I wonder what we can do about the fact 836 00:45:58,800 --> 00:46:01,759 Speaker 1: that these types of discussions of bad ideas, which 837 00:46:01,800 --> 00:46:05,120 Speaker 1: are really important and interesting to have, can also have 838 00:46:05,200 --> 00:46:09,280 Speaker 1: these negative side effects. I don't think I know quite 839 00:46:09,280 --> 00:46:13,239 Speaker 1: what the answer is yet. Obviously it will depend a 840 00:46:13,239 --> 00:46:15,920 Speaker 1: lot on the context of the idea. Oh yes, certainly, 841 00:46:15,960 --> 00:46:18,799 Speaker 1: and then this would actually be a great a great 842 00:46:18,840 --> 00:46:21,800 Speaker 1: topic to hear back from listeners on. Really, yeah, help 843 00:46:21,880 --> 00:46:24,319 Speaker 1: me out of this dilemma. I feel stuck. I don't 844 00:46:24,360 --> 00:46:26,920 Speaker 1: think I can live in a world where false beliefs 845 00:46:26,920 --> 00:46:30,399 Speaker 1: and bad ideas can never be spoken of. That would 846 00:46:30,440 --> 00:46:32,960 Speaker 1: sort of, it would rob intellectual life of so much 847 00:46:33,000 --> 00:46:35,319 Speaker 1: of its richness. You would prevent us from gaining all 848 00:46:35,360 --> 00:46:38,040 Speaker 1: these insights about our culture and our minds. At the 849 00:46:38,080 --> 00:46:41,759 Speaker 1: same time, I don't want to spread bad beliefs. I 850 00:46:41,800 --> 00:46:44,160 Speaker 1: don't know what to do about that.
Well, it remains 851 00:46:44,160 --> 00:46:46,880 Speaker 1: an open question for now then. And in the meantime, 852 00:46:47,000 --> 00:46:49,080 Speaker 1: if you want to check out other episodes of Stuff 853 00:46:49,080 --> 00:46:50,480 Speaker 1: to Blow Your Mind, head on over to stuff to 854 00:46:50,480 --> 00:46:52,439 Speaker 1: Blow your Mind dot com. That's the mothership. That's where 855 00:46:52,440 --> 00:46:54,200 Speaker 1: you will find them, as well as links out to 856 00:46:54,200 --> 00:46:56,880 Speaker 1: our various social media accounts. And if you want to 857 00:46:56,880 --> 00:46:58,839 Speaker 1: help the show, you want to support the show, rate 858 00:46:58,880 --> 00:47:01,120 Speaker 1: and review us wherever you have the ability to 859 00:47:01,160 --> 00:47:04,040 Speaker 1: do so. Huge thanks as always to our wonderful audio 860 00:47:04,080 --> 00:47:07,120 Speaker 1: producers Alex Williams and Tari Harrison. If you would like 861 00:47:07,160 --> 00:47:09,399 Speaker 1: to get in touch with us directly, to to get 862 00:47:09,400 --> 00:47:11,880 Speaker 1: me out of my dilemma from this episode, or to 863 00:47:12,640 --> 00:47:15,040 Speaker 1: suggest a topic for a future episode, to give feedback 864 00:47:15,080 --> 00:47:16,960 Speaker 1: on this episode or any other, or just to say hi, 865 00:47:17,080 --> 00:47:19,240 Speaker 1: let us know where you listen from. You can email 866 00:47:19,320 --> 00:47:22,000 Speaker 1: us at blow the Mind at how stuff works dot 867 00:47:22,040 --> 00:47:34,000 Speaker 1: com. For more on this and thousands of other topics, 868 00:47:34,120 --> 00:47:58,400 Speaker 1: visit how stuff works dot com