1 00:00:05,040 --> 00:00:08,640 Speaker 1: Why do your brains sometimes make things up entirely? What 2 00:00:08,720 --> 00:00:11,680 Speaker 1: does this have to do with Supreme Court Justice William 3 00:00:11,760 --> 00:00:15,240 Speaker 1: Douglas sitting in a wheelchair and claiming that he was 4 00:00:15,320 --> 00:00:19,840 Speaker 1: just kicking football field goals, or a blind person who 5 00:00:19,920 --> 00:00:23,520 Speaker 1: insists she can see? And what does any of this 6 00:00:23,600 --> 00:00:26,840 Speaker 1: have to do with whether Nelson Mandela did or did 7 00:00:26,880 --> 00:00:30,000 Speaker 1: not die in the nineteen eighties, and whether the cartoon 8 00:00:30,120 --> 00:00:34,839 Speaker 1: character Curious George had a tail, or the exact lines 9 00:00:35,120 --> 00:00:39,239 Speaker 1: said in Star Wars or Casablanca, or the spelling of 10 00:00:39,600 --> 00:00:44,199 Speaker 1: Oscar Mayer Wieners or the Berenstain Bears, or the narrative 11 00:00:44,560 --> 00:00:51,800 Speaker 1: that we tell ourselves about our lives. Welcome to Inner Cosmos 12 00:00:51,840 --> 00:00:54,560 Speaker 1: with me, David Eagleman. I'm a neuroscientist and an author 13 00:00:54,600 --> 00:00:58,320 Speaker 1: at Stanford, and in these episodes we sail deeply into 14 00:00:58,360 --> 00:01:02,200 Speaker 1: our three pound universe to understand why and how our 15 00:01:02,240 --> 00:01:17,520 Speaker 1: lives look the way they do. Today's episode is about confabulation. 16 00:01:18,360 --> 00:01:22,080 Speaker 1: That's when the brain makes something up entirely. But it's 17 00:01:22,120 --> 00:01:27,880 Speaker 1: different from lying. Lying is purposeful deception. You know the truth, 18 00:01:27,959 --> 00:01:31,000 Speaker 1: but you squelch it and make up something in its place. 19 00:01:31,080 --> 00:01:35,280 Speaker 1: We all know what lying is, but confabulation is a 20 00:01:35,319 --> 00:01:39,080 Speaker 1: different beast.
It's where your brain cooks up something that 21 00:01:39,240 --> 00:01:43,600 Speaker 1: is not true, but you believe it entirely. How does 22 00:01:43,640 --> 00:01:47,760 Speaker 1: confabulation happen? How frequently does it happen? Is it seen 23 00:01:48,400 --> 00:01:51,880 Speaker 1: not just in patients with brain damage, but do we 24 00:01:51,960 --> 00:01:54,520 Speaker 1: all do this to some degree? And what does this 25 00:01:54,640 --> 00:01:58,680 Speaker 1: tell us about memory and truth telling and the interpretation 26 00:01:58,800 --> 00:02:04,120 Speaker 1: of your life as a story that sometimes changes retrospectively? 27 00:02:06,560 --> 00:02:10,359 Speaker 1: So to set the table, picture this: Alexander, a fifty 28 00:02:10,360 --> 00:02:13,840 Speaker 1: eight year old man, sits comfortably in a hospital room, 29 00:02:14,360 --> 00:02:18,760 Speaker 1: chatting with a neurologist. Alexander describes his morning in detail, 30 00:02:18,800 --> 00:02:21,760 Speaker 1: the breakfast he had, the news that he read, the 31 00:02:22,040 --> 00:02:25,040 Speaker 1: friend that he bumped into on his way here. His 32 00:02:25,120 --> 00:02:29,400 Speaker 1: voice is confident, the details are specific. But there's a problem. 33 00:02:29,560 --> 00:02:33,560 Speaker 1: None of it happened. This man, who's a former school teacher, 34 00:02:33,600 --> 00:02:37,800 Speaker 1: suffered brain damage years ago. His memory is profoundly impaired. 35 00:02:38,120 --> 00:02:41,200 Speaker 1: He can't form new memories. Every day is a blank slate, 36 00:02:41,480 --> 00:02:44,640 Speaker 1: but he doesn't seem aware of this. Instead, his brain 37 00:02:44,800 --> 00:02:50,560 Speaker 1: fills in the gaps, fabricating a seamless, believable reality. And 38 00:02:50,560 --> 00:02:53,040 Speaker 1: he's not lying, not in the way we usually think 39 00:02:53,040 --> 00:02:57,760 Speaker 1: about lying. He fully believes the story he's telling.
Why, 40 00:02:57,840 --> 00:03:00,400 Speaker 1: when the brain is faced with missing information, does 41 00:03:00,440 --> 00:03:03,400 Speaker 1: it sometimes just make things up? So let's zoom in 42 00:03:03,480 --> 00:03:05,720 Speaker 1: on the issue at the center of all this, which 43 00:03:05,760 --> 00:03:09,480 Speaker 1: is our memory systems. We tend to think of memory 44 00:03:09,560 --> 00:03:15,119 Speaker 1: as a recording device, something that stores our experiences faithfully 45 00:03:15,600 --> 00:03:17,360 Speaker 1: and plays them back on demand. 46 00:03:17,919 --> 00:03:19,079 Speaker 2: But over the past 47 00:03:18,760 --> 00:03:22,520 Speaker 1: century, psychology and neuroscience tell us something very different. 48 00:03:22,840 --> 00:03:25,160 Speaker 2: Memory isn't like a video camera. 49 00:03:25,800 --> 00:03:30,639 Speaker 1: It's more like a patchwork quilt stitched together from fragments 50 00:03:30,639 --> 00:03:35,760 Speaker 1: of past experience and guesses and expectations. Most of the 51 00:03:35,760 --> 00:03:41,440 Speaker 1: time this works just fine. But when memory fails, maybe 52 00:03:41,480 --> 00:03:45,560 Speaker 1: because of injury or aging, the brain doesn't always leave 53 00:03:45,640 --> 00:03:49,440 Speaker 1: a void. Sometimes it fills in the blanks, often with 54 00:03:49,640 --> 00:03:53,200 Speaker 1: details that are completely false. And that's what confabulation is. 55 00:03:53,840 --> 00:03:57,400 Speaker 1: Some forms of it are dramatic, as in cases of 56 00:03:57,440 --> 00:04:00,720 Speaker 1: brain injury, which I'll tell you more about. But milder 57 00:04:00,840 --> 00:04:04,640 Speaker 1: versions happen to all of us, like when we misremember 58 00:04:04,760 --> 00:04:09,360 Speaker 1: childhood events, when we confidently recall things that never happened, 59 00:04:09,840 --> 00:04:14,600 Speaker 1: when we rewrite history without realizing it.
So in today's episode, 60 00:04:14,720 --> 00:04:17,720 Speaker 1: we're going to dive deep into the world of confabulation. 61 00:04:18,279 --> 00:04:23,000 Speaker 1: We'll explore cases where brain injury leads to striking, almost 62 00:04:23,040 --> 00:04:29,080 Speaker 1: cinematic fabrications, patients who invent entire days, and blind people 63 00:04:29,080 --> 00:04:33,520 Speaker 1: who insist they can see, and split brain patients whose 64 00:04:33,640 --> 00:04:37,760 Speaker 1: minds generate explanations out of thin air. And next we'll 65 00:04:37,800 --> 00:04:41,240 Speaker 1: turn the lens on ourselves. How often do we confabulate 66 00:04:41,320 --> 00:04:42,360 Speaker 1: without realizing it? 67 00:04:42,800 --> 00:04:45,280 Speaker 2: How reliable are our own memories? 68 00:04:45,480 --> 00:04:47,760 Speaker 1: And what does this all tell us about the nature 69 00:04:48,160 --> 00:04:54,440 Speaker 1: of reality and history and our sense of self? Okay, 70 00:04:54,720 --> 00:04:58,200 Speaker 1: so confabulation is most obvious and most striking in people 71 00:04:58,200 --> 00:05:03,159 Speaker 1: who have injuries to their brains. The fabrications, the stories 72 00:05:03,160 --> 00:05:07,280 Speaker 1: they make up, can be detailed and totally convincing, and 73 00:05:07,320 --> 00:05:10,280 Speaker 1: they fully believe the stories they tell. Their brains are 74 00:05:10,320 --> 00:05:15,320 Speaker 1: damaged in ways that impair memory retrieval, but their brains 75 00:05:15,640 --> 00:05:18,240 Speaker 1: just won't admit to the gaps. Instead, they fill those in. 76 00:05:18,279 --> 00:05:23,080 Speaker 1: So let's take an example. The neurologist Oliver Sacks described 77 00:05:23,120 --> 00:05:27,160 Speaker 1: a patient that he called mister Thompson.
Now, mister Thompson 78 00:05:27,200 --> 00:05:32,200 Speaker 1: had severe amnesia due to a condition known as Korsakoff syndrome, 79 00:05:32,200 --> 00:05:35,880 Speaker 1: which is caused by chronic alcoholism, which leads to a 80 00:05:35,920 --> 00:05:40,320 Speaker 1: deficiency in thiamine, which damages particular circuits in the brain. Now, 81 00:05:40,520 --> 00:05:44,360 Speaker 1: mister Thompson couldn't form new memories, and yet rather than 82 00:05:44,720 --> 00:05:48,839 Speaker 1: expressing confusion or admitting that he couldn't remember, he constantly 83 00:05:48,839 --> 00:05:53,360 Speaker 1: invented new realities. Every few minutes, mister Thompson would introduce 84 00:05:53,400 --> 00:05:57,760 Speaker 1: himself as someone different, sometimes a shopkeeper, sometimes a businessman, 85 00:05:57,920 --> 00:06:01,320 Speaker 1: sometimes a priest. When someone entered the room, he would 86 00:06:01,360 --> 00:06:05,960 Speaker 1: confabulate an entire backstory for them on the spot, convinced 87 00:06:05,960 --> 00:06:08,440 Speaker 1: that he had known them for years, and the moment 88 00:06:08,480 --> 00:06:11,960 Speaker 1: they left and returned, he had forgotten everything and would 89 00:06:12,040 --> 00:06:15,320 Speaker 1: create an entirely new identity for them. 90 00:06:15,880 --> 00:06:16,840 Speaker 2: Why did this happen? 91 00:06:17,400 --> 00:06:22,000 Speaker 1: His brain, unable to retrieve the real past, improvised. It 92 00:06:22,040 --> 00:06:25,000 Speaker 1: was like his mind refused to accept a blank space 93 00:06:25,080 --> 00:06:30,040 Speaker 1: where memory should be, so it generated plausible but false alternatives. 94 00:06:30,839 --> 00:06:34,800 Speaker 1: And confabulation doesn't just happen in Korsakoff syndrome.
We see 95 00:06:34,800 --> 00:06:38,120 Speaker 1: it in many conditions, and each one gives us a 96 00:06:38,320 --> 00:06:44,360 Speaker 1: different window into the mind's drive to create coherence. One 97 00:06:44,520 --> 00:06:48,480 Speaker 1: extraordinary example comes from people who are blind, but they 98 00:06:48,560 --> 00:06:51,920 Speaker 1: don't know it, and they deny it. This condition is known 99 00:06:51,960 --> 00:06:56,080 Speaker 1: as Anton's syndrome. This happens when damage to the visual 100 00:06:56,080 --> 00:06:58,560 Speaker 1: cortex makes a person unable to see, but their 101 00:06:58,400 --> 00:07:00,400 Speaker 2: brain still believes they can. 102 00:07:00,839 --> 00:07:03,960 Speaker 1: So imagine there's a person named Dina, and she has 103 00:07:04,000 --> 00:07:07,880 Speaker 1: Anton syndrome, so she's blind. You walk into the room 104 00:07:07,920 --> 00:07:10,280 Speaker 1: and you say something like, oh, this is a nice room. 105 00:07:10,600 --> 00:07:14,720 Speaker 1: How would you describe this? And Dina will say something like, yeah, 106 00:07:14,720 --> 00:07:18,040 Speaker 1: this is nice. The room is bright, there are yellow 107 00:07:18,080 --> 00:07:20,560 Speaker 1: curtains over there, there's a red chair in the corner, 108 00:07:21,200 --> 00:07:23,360 Speaker 1: even though that's not what the room looks like at all, 109 00:07:23,720 --> 00:07:26,040 Speaker 1: and in fact, the room is in total darkness.
Anyway, 110 00:07:26,680 --> 00:07:28,880 Speaker 1: you might ask her to do something like can you 111 00:07:28,920 --> 00:07:31,800 Speaker 1: turn on the lamp, and she'll reach out with total 112 00:07:31,800 --> 00:07:34,200 Speaker 1: certainty about where the lamp is, even though there's no 113 00:07:34,320 --> 00:07:37,320 Speaker 1: lamp there, and of course she'll fail to reach the 114 00:07:37,400 --> 00:07:40,960 Speaker 1: switch that she thinks is there, but rather than acknowledging that, 115 00:07:41,280 --> 00:07:45,600 Speaker 1: her brain will generate explanations like, oh, I just miscalculated 116 00:07:45,600 --> 00:07:48,040 Speaker 1: the distance. And if you ask her to try again, 117 00:07:48,160 --> 00:07:50,280 Speaker 1: she might say, oh, you know, my arm hurts too 118 00:07:50,360 --> 00:07:54,440 Speaker 1: much to keep trying again. Dina isn't trying to lie 119 00:07:54,520 --> 00:07:59,400 Speaker 1: to you. Her brain believes it. It's fabricating reality in 120 00:07:59,440 --> 00:08:03,480 Speaker 1: real time to compensate for the missing information, and this 121 00:08:03,560 --> 00:08:11,520 Speaker 1: is a fundamental lesson. The brain prioritizes coherence over accuracy. Now, 122 00:08:11,560 --> 00:08:14,880 Speaker 1: one interesting feature of confabulations is that they tend to 123 00:08:14,920 --> 00:08:19,560 Speaker 1: contain a kernel of truth, so Dina might remember a 124 00:08:19,720 --> 00:08:24,360 Speaker 1: room like the one she's describing. One hypothesis about confabulations 125 00:08:24,920 --> 00:08:27,760 Speaker 1: is that they're memories in the brain, but they're not 126 00:08:27,920 --> 00:08:30,880 Speaker 1: built from the right pieces given the situation at hand.
127 00:08:31,480 --> 00:08:33,800 Speaker 1: So one way to see this is that a patient 128 00:08:33,880 --> 00:08:37,839 Speaker 1: will confabulate if they're asked a question like where are 129 00:08:37,840 --> 00:08:38,520 Speaker 1: you right now? 130 00:08:38,720 --> 00:08:39,880 Speaker 2: Or how did you get here? 131 00:08:40,600 --> 00:08:43,160 Speaker 1: But if you ask them something about which they don't 132 00:08:43,200 --> 00:08:47,479 Speaker 1: have any pre existing knowledge, like who is Queen Schmegeggy, 133 00:08:48,160 --> 00:08:49,520 Speaker 1: they will say that they don't know. 134 00:08:50,200 --> 00:08:50,719 Speaker 2: It's not that 135 00:08:50,640 --> 00:08:54,560 Speaker 1: they're creating a fictional answer for things out of the blue. Instead, 136 00:08:54,920 --> 00:08:59,080 Speaker 1: it's that somehow their confabulated reality is growing 137 00:08:58,679 --> 00:09:01,280 Speaker 2: from the seeds of memories that they have had. 138 00:09:01,800 --> 00:09:04,560 Speaker 1: So the hypothesis is that the problem in the network 139 00:09:04,640 --> 00:09:08,640 Speaker 1: is that they're not inhibiting irrelevant memories. 140 00:09:09,200 --> 00:09:10,679 Speaker 2: There are some pretty clever ways to 141 00:09:10,640 --> 00:09:12,439 Speaker 1: study this, and I'll link these in the show notes, 142 00:09:12,480 --> 00:09:15,679 Speaker 1: but the bottom line is that when I ask you 143 00:09:15,720 --> 00:09:19,720 Speaker 1: a question about your life right now, your brain kindles 144 00:09:19,800 --> 00:09:23,880 Speaker 1: lots and lots of possible pathways, and then certain brain 145 00:09:24,000 --> 00:09:26,640 Speaker 1: areas like the orbitofrontal cortex 146 00:09:26,720 --> 00:09:29,080 Speaker 2: squelch the activity of most 147 00:09:28,800 --> 00:09:32,679 Speaker 1: of the pathways that aren't relevant in the current circumstances.
148 00:09:33,400 --> 00:09:36,959 Speaker 1: But if your orbitofrontal cortex is damaged, it can't 149 00:09:37,000 --> 00:09:41,000 Speaker 1: suppress the irrelevant memories, and therefore those can come to 150 00:09:41,040 --> 00:09:41,480 Speaker 1: the top. 151 00:09:41,880 --> 00:09:42,080 Speaker 2: Now. 152 00:09:42,120 --> 00:09:44,280 Speaker 1: That might make it sound like this only happens when 153 00:09:44,280 --> 00:09:47,000 Speaker 1: there's damage to a particular part of the brain like 154 00:09:47,000 --> 00:09:50,439 Speaker 1: the orbitofrontal cortex, but confabulation can also pop 155 00:09:50,520 --> 00:09:54,439 Speaker 1: up when people get damage to bits of their thalamus 156 00:09:54,559 --> 00:09:57,920 Speaker 1: or the hypothalamus. Why? All these areas are parts of 157 00:09:57,960 --> 00:10:01,600 Speaker 1: a pathway called the circuit of Papez, and this whole 158 00:10:01,640 --> 00:10:07,520 Speaker 1: circuit is involved in selecting relevant memories versus irrelevant ones. 159 00:10:08,320 --> 00:10:12,560 Speaker 1: The key surprising lesson here is that your current situation, 160 00:10:12,640 --> 00:10:15,200 Speaker 1: what you're looking at right now, doesn't just trigger a 161 00:10:15,320 --> 00:10:20,319 Speaker 1: particular memory, but instead tickles a whole world of possible memories, 162 00:10:20,720 --> 00:10:22,960 Speaker 1: which then other parts of the brain go through a 163 00:10:23,000 --> 00:10:25,880 Speaker 1: lot of trouble to squish down as they're looking 164 00:10:25,679 --> 00:10:27,080 Speaker 2: for the right one.
165 00:10:27,120 --> 00:10:29,640 Speaker 1: And if you have less of that squishing, if you 166 00:10:29,679 --> 00:10:32,560 Speaker 1: have more noise in the system, then you get a 167 00:10:32,679 --> 00:10:35,160 Speaker 1: memory popping up that has nothing to do with your 168 00:10:35,240 --> 00:10:39,320 Speaker 1: current situation but feels every bit as real to you 169 00:10:39,920 --> 00:10:43,280 Speaker 1: as any other memory. I'll give you another example of confabulation. 170 00:10:43,480 --> 00:10:48,120 Speaker 1: In nineteen seventy four, the Supreme Court Justice William Douglas 171 00:10:48,480 --> 00:10:51,200 Speaker 1: had a stroke that made him paralyzed on his left 172 00:10:51,280 --> 00:10:55,360 Speaker 1: side and confined him to a wheelchair. Now, despite this, 173 00:10:55,960 --> 00:10:59,760 Speaker 1: Douglas insisted on being discharged from the hospital, claiming that 174 00:10:59,800 --> 00:11:03,679 Speaker 1: he was perfectly fine. He dismissed reports of his paralysis 175 00:11:03,720 --> 00:11:07,240 Speaker 1: as a myth, and when he was met with skepticism, 176 00:11:07,280 --> 00:11:10,480 Speaker 1: he even invited reporters to join him on a hike, 177 00:11:11,040 --> 00:11:14,800 Speaker 1: a suggestion that was widely seen as absurd. He went 178 00:11:14,880 --> 00:11:16,640 Speaker 1: so far as to say that he had just been 179 00:11:16,760 --> 00:11:21,440 Speaker 1: kicking football field goals with his paralyzed leg. Because of 180 00:11:21,480 --> 00:11:26,520 Speaker 1: this detachment from reality, Douglas was ultimately pressured to retire from his 181 00:11:26,640 --> 00:11:30,360 Speaker 1: position on the Supreme Court. But what he was experiencing 182 00:11:30,480 --> 00:11:34,160 Speaker 1: is called anosognosia, which is a condition where a 183 00:11:34,200 --> 00:11:39,000 Speaker 1: person is completely unaware of their body's impairment.
People will 184 00:11:39,320 --> 00:11:43,880 Speaker 1: adamantly deny their paralysis, not out of deception, but because 185 00:11:43,920 --> 00:11:48,520 Speaker 1: their brain genuinely believes that they can move normally. Douglas 186 00:11:48,559 --> 00:11:53,120 Speaker 1: fabricated because of his brain's drive to construct a coherent narrative, 187 00:11:53,640 --> 00:11:56,960 Speaker 1: and this is wild to witness. For example, imagine you 188 00:11:57,000 --> 00:11:59,760 Speaker 1: meet a person who is paralyzed on one side and 189 00:11:59,800 --> 00:12:04,160 Speaker 1: they have anosognosia. So you gently ask the 190 00:12:04,200 --> 00:12:07,760 Speaker 1: person to put both hands on an imaginary steering wheel 191 00:12:07,760 --> 00:12:09,920 Speaker 1: in front of them. So she puts one hand on 192 00:12:10,000 --> 00:12:12,480 Speaker 1: the steering wheel, and if you ask her why only 193 00:12:12,520 --> 00:12:16,640 Speaker 1: one hand, she will insist that both hands are in place. 194 00:12:17,240 --> 00:12:19,680 Speaker 2: So you might come up with an idea and you ask 195 00:12:19,480 --> 00:12:23,559 Speaker 1: her to clap her hands, so she'll just move one hand. 196 00:12:23,840 --> 00:12:26,040 Speaker 2: But she will claim to have clapped. 197 00:12:26,480 --> 00:12:28,720 Speaker 1: If you point out that there was no sound and 198 00:12:28,760 --> 00:12:31,880 Speaker 1: you ask her to try again, she might simply refuse 199 00:12:31,920 --> 00:12:34,640 Speaker 1: and give an excuse like she just doesn't feel like it. 200 00:12:35,280 --> 00:12:38,600 Speaker 2: This is just like Dina, who lost her vision but still 201 00:12:38,320 --> 00:12:42,120 Speaker 1: insists she can see even as she struggles to navigate 202 00:12:42,160 --> 00:12:45,880 Speaker 1: the room without bumping into things.
Dina might attribute her 203 00:12:45,920 --> 00:12:51,600 Speaker 1: difficulties to poor balance or misplaced furniture, rather than acknowledging 204 00:12:51,640 --> 00:12:56,320 Speaker 1: her blindness. The key insight about anosognosia is 205 00:12:56,320 --> 00:13:00,480 Speaker 1: that people like Justice Douglas are not lying. Their brains 206 00:13:00,520 --> 00:13:07,160 Speaker 1: are unconsciously generating explanations that maintain a coherent sense of reality, 207 00:13:07,679 --> 00:13:28,679 Speaker 1: even when that reality is fundamentally flawed. You can also 208 00:13:28,800 --> 00:13:32,120 Speaker 1: see confabulation in a very different situation: after a 209 00:13:32,160 --> 00:13:35,439 Speaker 1: person has undergone a split brain surgery. 210 00:13:35,480 --> 00:13:36,400 Speaker 2: Okay, so what is that? 211 00:13:36,679 --> 00:13:39,240 Speaker 1: The two hemispheres of the brain are linked by a 212 00:13:39,760 --> 00:13:43,079 Speaker 1: super highway of neurons. This is called the corpus callosum. 213 00:13:43,440 --> 00:13:46,240 Speaker 1: This is the bridge that connects the two halves of 214 00:13:46,240 --> 00:13:49,160 Speaker 1: the brain, and a split brain surgery is when the 215 00:13:49,280 --> 00:13:52,840 Speaker 1: super highway gets cut with a scalpel such that the 216 00:13:52,840 --> 00:13:56,439 Speaker 1: two halves of the brain, the two hemispheres, are now 217 00:13:56,480 --> 00:14:01,080 Speaker 1: operating independently. Usually these work in concert. Now you might 218 00:14:01,120 --> 00:14:04,320 Speaker 1: ask, why would anyone ever have that surgery? It was 219 00:14:04,360 --> 00:14:08,680 Speaker 1: for people with severe epilepsy, where the seizures would spread 220 00:14:08,720 --> 00:14:11,520 Speaker 1: from one hemisphere to the other.
So the idea was 221 00:14:11,559 --> 00:14:15,719 Speaker 1: that by destroying the road between the hemispheres, you disallow 222 00:14:15,840 --> 00:14:19,160 Speaker 1: the spread. So neurosurgeons started down this road in the 223 00:14:19,240 --> 00:14:22,880 Speaker 1: nineteen sixties, but they stopped once they realized that the 224 00:14:22,960 --> 00:14:27,440 Speaker 1: surgeries were revealing something astonishing. So let's say someone has 225 00:14:27,520 --> 00:14:30,360 Speaker 1: just had this surgery, and what you do is you 226 00:14:30,480 --> 00:14:35,040 Speaker 1: now show a picture, like a snowy scene, but you 227 00:14:35,200 --> 00:14:38,840 Speaker 1: place that so only the right hemisphere can see that picture. 228 00:14:39,640 --> 00:14:42,400 Speaker 1: And there are several objects laid out on the table. 229 00:14:42,960 --> 00:14:46,280 Speaker 1: One of them is, let's say, a snow shovel, and 230 00:14:46,320 --> 00:14:49,640 Speaker 1: so the left hand, which is controlled by the right hemisphere, 231 00:14:50,040 --> 00:14:54,240 Speaker 1: picks up the snow shovel. But when you verbally ask 232 00:14:54,440 --> 00:14:58,280 Speaker 1: the left hemisphere why the hand just chose a snow shovel, 233 00:14:59,200 --> 00:15:03,200 Speaker 1: the person will confabulate. They'll make up a story. They'll say, oh, 234 00:15:03,320 --> 00:15:07,160 Speaker 1: my hand picked up that snow shovel because I was recently 235 00:15:07,200 --> 00:15:10,120 Speaker 1: doing some gardening and this reminds me of it. So 236 00:15:10,360 --> 00:15:15,400 Speaker 1: their left hemisphere has no access to the right hemisphere's information, 237 00:15:15,840 --> 00:15:19,680 Speaker 1: but rather than admitting confusion, it just makes up a 238 00:15:19,720 --> 00:15:23,320 Speaker 1: story to explain the action. I'll put a link about 239 00:15:23,360 --> 00:15:26,240 Speaker 1: split brain studies in the show notes.
But what this 240 00:15:26,400 --> 00:15:30,320 Speaker 1: tells us is that our brain's storytelling function is not 241 00:15:30,400 --> 00:15:34,040 Speaker 1: just about narrating past events. It's active even in the 242 00:15:34,040 --> 00:15:38,440 Speaker 1: present moment. It shapes our reality as we go. In 243 00:15:38,480 --> 00:15:42,600 Speaker 1: all of these cases, from mister Thompson's endless reinventions of 244 00:15:42,680 --> 00:15:46,160 Speaker 1: himself to the blind woman who insists she can see, 245 00:15:46,560 --> 00:15:50,040 Speaker 1: to the split brain patients who fabricate explanations for their 246 00:15:50,080 --> 00:15:53,400 Speaker 1: own actions, the brain is doing what it does best, 247 00:15:53,560 --> 00:15:57,920 Speaker 1: creating a cohesive narrative. It just so happens that sometimes 248 00:15:58,200 --> 00:16:01,960 Speaker 1: the facts don't cooperate. But these extreme cases lead us 249 00:16:02,000 --> 00:16:06,440 Speaker 1: to a bigger question. If confabulation happens in people with 250 00:16:06,560 --> 00:16:08,120 Speaker 1: brain damage, 251 00:16:07,880 --> 00:16:09,120 Speaker 2: what about the rest of us? 252 00:16:09,680 --> 00:16:14,480 Speaker 1: How reliable are our everyday memories and how often do 253 00:16:14,560 --> 00:16:17,080 Speaker 1: we unknowingly rewrite our past? 254 00:16:17,560 --> 00:16:17,760 Speaker 2: Well? 255 00:16:17,840 --> 00:16:20,960 Speaker 1: We do it far more often than we realize. False 256 00:16:21,000 --> 00:16:24,440 Speaker 1: memories don't just happen in injured brains. They happen all 257 00:16:24,520 --> 00:16:28,600 Speaker 1: the time in all of us. Our memories feel solid 258 00:16:28,600 --> 00:16:32,800 Speaker 1: and clear and trustworthy, but in reality they're full of fabrications. 259 00:16:33,360 --> 00:16:35,360 Speaker 2: The normal brain, just like the injured
260 00:16:35,040 --> 00:16:41,960 Speaker 1: brain, fills in gaps, and typically we don't notice. So 261 00:16:42,040 --> 00:16:45,560 Speaker 1: let's start in the nineteen eighties when Nelson Mandela died 262 00:16:45,600 --> 00:16:49,560 Speaker 1: in prison. Millions of people remember hearing about this and 263 00:16:49,640 --> 00:16:53,000 Speaker 1: reading the headlines or seeing the news stories, so all 264 00:16:53,080 --> 00:16:55,600 Speaker 1: of them were surprised when it was announced in twenty 265 00:16:55,640 --> 00:17:00,000 Speaker 1: thirteen that Nelson Mandela had just died, out of prison, 266 00:17:00,160 --> 00:17:03,200 Speaker 1: in his home, after having been released in nineteen ninety, 267 00:17:03,600 --> 00:17:08,119 Speaker 1: becoming President of South Africa, earning a worldwide reputation. Wait, 268 00:17:08,200 --> 00:17:13,240 Speaker 1: what? Hadn't he passed away three decades before? Weirdly, so 269 00:17:13,320 --> 00:17:15,119 Speaker 1: many people had that story wrong. 270 00:17:15,160 --> 00:17:16,720 Speaker 2: They thought he died in the eighties. 271 00:17:17,119 --> 00:17:19,920 Speaker 1: This is now known in the psychology literature as 272 00:17:20,040 --> 00:17:24,560 Speaker 1: the Mandela effect. They truly thought they had heard that 273 00:17:24,640 --> 00:17:27,520 Speaker 1: he had died in prison. We've all had the experience 274 00:17:27,560 --> 00:17:30,320 Speaker 1: of being convinced that something happened in a certain way, 275 00:17:30,760 --> 00:17:33,960 Speaker 1: only to later discover we were completely wrong. But it's 276 00:17:34,000 --> 00:17:36,800 Speaker 1: called the Mandela effect when it's not just you, but 277 00:17:36,920 --> 00:17:40,439 Speaker 1: thousands or millions of other people come to believe the 278 00:17:40,520 --> 00:17:43,520 Speaker 1: same false memory. There are so many examples of this. 279 00:17:44,080 --> 00:17:48,679 Speaker 1: Here's one.
Everyone seems to remember Darth Vader saying Luke, 280 00:17:49,040 --> 00:17:52,840 Speaker 1: I am your father, when in fact that's not the line. 281 00:17:53,240 --> 00:17:57,920 Speaker 2: The line is no, I am your father. 282 00:17:58,960 --> 00:18:01,800 Speaker 1: What? Where did the Luke come from in everyone's memory? 283 00:18:01,840 --> 00:18:04,760 Speaker 1: Why do we all misremember that? And how about the 284 00:18:04,760 --> 00:18:08,160 Speaker 1: movie Casablanca. Even if you've never seen it, you probably 285 00:18:08,200 --> 00:18:11,880 Speaker 1: know the most famous line where Ingrid Bergman says play 286 00:18:11,880 --> 00:18:15,679 Speaker 1: it again, Sam, except that she never says it. 287 00:18:16,359 --> 00:18:19,800 Speaker 2: The line in the movie is play it once, Sam, 288 00:18:20,040 --> 00:18:21,920 Speaker 2: for old times' sake. I don't know what you mean, 289 00:18:21,960 --> 00:18:28,639 Speaker 2: Miss Ilsa. Play it, Sam. Play As Time Goes By. 290 00:18:28,800 --> 00:18:32,080 Speaker 1: So why does everyone believe that the line was play 291 00:18:32,080 --> 00:18:35,760 Speaker 1: it again, Sam? And here's another one. You probably remember 292 00:18:35,880 --> 00:18:38,800 Speaker 1: the Disney movie Snow White and the Seven Dwarfs when 293 00:18:38,800 --> 00:18:42,280 Speaker 1: the Witch says mirror, mirror on the wall, who's the 294 00:18:42,320 --> 00:18:45,240 Speaker 1: fairest one of all? So you might be surprised to 295 00:18:45,280 --> 00:18:47,160 Speaker 1: know that she actually says 296 00:18:47,560 --> 00:18:52,280 Speaker 2: Magic mirror on the wall. Who is the fairest one 297 00:18:52,320 --> 00:18:52,600 Speaker 2: of all? 298 00:18:53,440 --> 00:18:57,239 Speaker 1: But for some reason everyone started saying mirror, mirror, and 299 00:18:57,280 --> 00:19:01,240 Speaker 1: that's now how we all misremember it.
The thing is that 300 00:19:01,320 --> 00:19:03,760 Speaker 1: if you had a false memory about what was said 301 00:19:03,800 --> 00:19:07,320 Speaker 1: in these movies, that memory felt completely real to you, 302 00:19:07,400 --> 00:19:10,640 Speaker 1: as though you had seen that scene. The Mandela effect 303 00:19:10,640 --> 00:19:13,520 Speaker 1: tells us that memory is not an accurate recording of 304 00:19:13,520 --> 00:19:16,719 Speaker 1: the past, but a flexible process, and often it can 305 00:19:16,760 --> 00:19:22,520 Speaker 1: be socially influenced and collectively shaped. And the Mandela effect 306 00:19:22,520 --> 00:19:24,640 Speaker 1: applies in all sensory domains. 307 00:19:24,720 --> 00:19:26,760 Speaker 2: Take your visual memory. 308 00:19:27,080 --> 00:19:30,960 Speaker 1: Tons of people recall the Monopoly man as having a monocle, 309 00:19:31,560 --> 00:19:34,560 Speaker 1: but he doesn't and he never did. Here's a question, 310 00:19:35,000 --> 00:19:39,080 Speaker 1: did the cartoon monkey Curious George have a tail or 311 00:19:39,119 --> 00:19:42,760 Speaker 1: not have a tail? Most people remember that he did, 312 00:19:42,920 --> 00:19:45,679 Speaker 1: but that's a false memory. If you look at the books, 313 00:19:45,960 --> 00:19:49,760 Speaker 1: Curious George has no tail at all. Okay, here's another one. 314 00:19:49,840 --> 00:19:52,600 Speaker 1: Think of Pikachu and his tail. He actually does have 315 00:19:52,640 --> 00:19:55,520 Speaker 1: a tail. The tail is yellow, but does it have 316 00:19:55,720 --> 00:19:58,840 Speaker 1: a black tip? Most people, if you press them on it, 317 00:19:58,880 --> 00:20:01,600 Speaker 1: will say the tail has a black tip.
Even though 318 00:20:01,600 --> 00:20:05,359 Speaker 1: his tail is completely yellow, his ears have black tips, 319 00:20:05,400 --> 00:20:09,399 Speaker 1: and somehow millions of people misremember it as being on 320 00:20:09,520 --> 00:20:11,840 Speaker 1: his tail. And I'll tell you another version of the 321 00:20:11,840 --> 00:20:14,879 Speaker 1: Mandela effect that comes about for a slightly different reason. 322 00:20:15,520 --> 00:20:19,879 Speaker 1: Think about Oscar Mayer Wieners. You've certainly seen the ads 323 00:20:19,880 --> 00:20:23,000 Speaker 1: and maybe even the Oscar Mayer truck driving around. Here's 324 00:20:23,040 --> 00:20:27,800 Speaker 1: the question, how is Mayer spelled? The large majority of 325 00:20:27,840 --> 00:20:31,119 Speaker 1: people will swear that it is spelled m e y 326 00:20:31,320 --> 00:20:35,720 Speaker 1: e r, but in fact it's m a y e r. 327 00:20:36,359 --> 00:20:38,359 Speaker 1: But m e y e r is a much more 328 00:20:38,400 --> 00:20:42,040 Speaker 1: common spelling, and so we misremember it. And here's another 329 00:20:42,080 --> 00:20:45,200 Speaker 1: example of that same thing. You may remember a children's 330 00:20:45,200 --> 00:20:50,040 Speaker 1: book series called The Berenstain Bears. Essentially, everyone remembers this 331 00:20:50,080 --> 00:20:55,120 Speaker 1: as being spelled Berenstein, s t e i n, when in 332 00:20:55,160 --> 00:21:00,440 Speaker 1: fact the last five letters are s t a i n. It looks 333 00:21:00,480 --> 00:21:04,480 Speaker 1: like Berenstain Bears. When this is pointed out to people, 334 00:21:04,560 --> 00:21:07,400 Speaker 1: they generally don't believe this until they go and pull 335 00:21:07,440 --> 00:21:09,560 Speaker 1: the book off their shelf and take a look. 336 00:21:10,320 --> 00:21:11,359 Speaker 2: I think that both these
337 00:21:11,240 --> 00:21:15,200 Speaker 1: Examples about Oscar Mayer and Berenstain Bears are interesting because 338 00:21:15,200 --> 00:21:18,920 Speaker 1: they're shaped not necessarily by other people's memories, but instead 339 00:21:18,920 --> 00:21:23,560 Speaker 1: by your expectations given the particulars of the language. In 340 00:21:23,600 --> 00:21:26,360 Speaker 1: other words, some ways of spelling things are so much 341 00:21:26,440 --> 00:21:29,560 Speaker 1: more common that you come to believe that's what you 342 00:21:29,680 --> 00:21:33,240 Speaker 1: had seen, and like the other versions of the Mandela effect, 343 00:21:33,600 --> 00:21:36,320 Speaker 1: you really, really are certain that this is how you 344 00:21:36,440 --> 00:21:39,920 Speaker 1: saw it. And when you see the actual thing written down, 345 00:21:40,440 --> 00:21:44,159 Speaker 1: it's hard to reconcile the certainty of your memory against 346 00:21:44,240 --> 00:21:47,440 Speaker 1: the reality of what is in front of you. Think 347 00:21:47,440 --> 00:21:51,080 Speaker 1: about all these examples of the Mandela effect, quiz your 348 00:21:51,119 --> 00:21:54,359 Speaker 1: friends on what they believe and remember, and think of 349 00:21:54,400 --> 00:21:57,920 Speaker 1: what else you might have misremembered: the movies, the characters, 350 00:21:58,240 --> 00:22:02,040 Speaker 1: the logos, whatever, and send a note to podcast at Eagleman 351 00:22:02,119 --> 00:22:03,960 Speaker 1: dot com to let me know, because I can't get 352 00:22:04,119 --> 00:22:06,800 Speaker 1: enough of these things. Okay, I think this is one 353 00:22:06,800 --> 00:22:11,840 Speaker 1: of the most fascinating ways to study confabulation in the individual, 354 00:22:11,920 --> 00:22:16,720 Speaker 1: healthy mind: confabulations which we normally never notice or gain 355 00:22:16,960 --> 00:22:20,399 Speaker 1: an awareness of.
And there's a closely related issue about 356 00:22:20,400 --> 00:22:25,080 Speaker 1: the way that our knowledge and beliefs can create unconscious 357 00:22:25,119 --> 00:22:29,760 Speaker 1: distortions about what details we remember. In episode seventy, I 358 00:22:29,800 --> 00:22:33,800 Speaker 1: told you about a nineteen thirties study on a short 359 00:22:33,960 --> 00:22:38,840 Speaker 1: Native American fable called The War of the Ghosts. Participants 360 00:22:38,920 --> 00:22:41,920 Speaker 1: read the story and then they wrote down their recollection 361 00:22:42,040 --> 00:22:45,240 Speaker 1: of it immediately after, and then again a week after, 362 00:22:45,320 --> 00:22:49,280 Speaker 1: and then again three months after. And as people recalled 363 00:22:49,359 --> 00:22:52,680 Speaker 1: the story again and again through time, it turned out 364 00:22:52,960 --> 00:22:57,080 Speaker 1: they smoothed out the details that were inconsistent with their 365 00:22:57,119 --> 00:23:01,280 Speaker 1: own pre-existing knowledge and belief systems. As they recalled 366 00:23:01,359 --> 00:23:04,760 Speaker 1: the story over and over, it became more consistent with 367 00:23:04,880 --> 00:23:08,840 Speaker 1: their worldview. As another example, consider the way we explain 368 00:23:09,119 --> 00:23:12,800 Speaker 1: our own decisions. There are a bunch of studies on this, 369 00:23:12,880 --> 00:23:16,040 Speaker 1: and essentially, when people are asked why they made a 370 00:23:16,080 --> 00:23:21,960 Speaker 1: particular choice, they often confabulate explanations. Their choices may have 371 00:23:21,960 --> 00:23:25,800 Speaker 1: been influenced by subconscious factors they weren't aware of, but 372 00:23:25,840 --> 00:23:30,040 Speaker 1: they'll create a convincing story to explain why they did. 373 00:23:29,880 --> 00:23:32,160 Speaker 2: What they did. Why did you pick that car? 374 00:23:32,320 --> 00:23:32,480 Speaker 1: Oh?
375 00:23:32,560 --> 00:23:33,560 Speaker 2: I like the design? 376 00:23:34,080 --> 00:23:37,320 Speaker 1: In reality, they were subtly influenced by an ad they 377 00:23:37,320 --> 00:23:40,199 Speaker 1: were exposed to. Why do you believe what you believe? 378 00:23:40,680 --> 00:23:44,760 Speaker 1: Because it's logical. In reality, so much of our belief 379 00:23:44,840 --> 00:23:48,840 Speaker 1: is shaped by our culture and by our emotions, and 380 00:23:48,880 --> 00:23:53,040 Speaker 1: our brains fill in justifications later. Why did you break 381 00:23:53,119 --> 00:23:57,440 Speaker 1: up with that person? Well, we just weren't compatible. In reality, 382 00:23:57,440 --> 00:24:00,000 Speaker 1: maybe they could have worked harder, but they have restructured 383 00:24:00,280 --> 00:24:04,280 Speaker 1: painful memories to make the breakup seem inevitable. In other words, 384 00:24:04,280 --> 00:24:08,679 Speaker 1: studies show that with romantic partners, if a relationship goes bad, 385 00:24:09,320 --> 00:24:12,200 Speaker 1: we tend to remember the details of that relationship as 386 00:24:12,320 --> 00:24:16,920 Speaker 1: more negative than they actually were. If a relationship improves, 387 00:24:17,400 --> 00:24:20,840 Speaker 1: then we see it through rose-colored glasses and remember 388 00:24:20,880 --> 00:24:24,480 Speaker 1: things as better than they were. There's a sense in 389 00:24:24,520 --> 00:24:30,840 Speaker 1: which we are always confabulating, turning complex realities into clear stories. 390 00:24:33,040 --> 00:24:36,960 Speaker 1: Now all this may seem innocent enough, but in episode nineteen, 391 00:24:37,040 --> 00:24:40,240 Speaker 1: I dove deep into why this sort of confabulation matters 392 00:24:40,280 --> 00:24:44,400 Speaker 1: so much when it comes to something like eyewitness testimony, 393 00:24:44,640 --> 00:24:47,880 Speaker 1: where there's real-world consequence.
And the issue that comes 394 00:24:47,960 --> 00:24:50,680 Speaker 1: up here again is not just about the fragility of memory, 395 00:24:51,000 --> 00:24:54,440 Speaker 1: but the way that we feel so certain about whatever 396 00:24:54,520 --> 00:24:58,040 Speaker 1: memory gets served up to us, and our memories can 397 00:24:58,080 --> 00:25:02,080 Speaker 1: get manipulated by very subtle cues. For example, in one 398 00:25:02,160 --> 00:25:06,399 Speaker 1: study by my colleague Elizabeth Loftus, participants watched a video 399 00:25:06,440 --> 00:25:10,119 Speaker 1: of a car accident and were asked a question: how 400 00:25:10,200 --> 00:25:12,560 Speaker 1: fast were the cars going when they hit each other? 401 00:25:16,240 --> 00:25:18,879 Speaker 1: Others were asked a slightly different version of the question, 402 00:25:19,480 --> 00:25:22,359 Speaker 1: how fast were the cars going when they smashed into 403 00:25:22,400 --> 00:25:28,280 Speaker 1: each other? The result was that those who heard smashed 404 00:25:28,840 --> 00:25:33,280 Speaker 1: consistently recalled the cars going faster, with some even falsely 405 00:25:33,320 --> 00:25:36,600 Speaker 1: remembering broken glass despite the fact that none was shown. 406 00:25:37,000 --> 00:25:40,800 Speaker 1: So memory isn't stored like a video. It's actively reshaped 407 00:25:40,840 --> 00:25:44,320 Speaker 1: when we recall it, and even a single word can 408 00:25:44,359 --> 00:25:48,960 Speaker 1: alter what we remember. And Loftus did other experiments showing 409 00:25:48,960 --> 00:25:53,159 Speaker 1: how false memories can be injected. She showed that people 410 00:25:53,480 --> 00:25:57,400 Speaker 1: can be led to remember events that never actually happened 411 00:25:57,720 --> 00:26:01,800 Speaker 1: simply through suggestion. In one of her landmark studies.
She 412 00:26:02,040 --> 00:26:07,760 Speaker 1: successfully implanted false childhood memories by asking participants about events 413 00:26:08,000 --> 00:26:10,200 Speaker 1: like the time they were lost in a shopping mall 414 00:26:10,320 --> 00:26:12,959 Speaker 1: or the time they went on a hot air balloon ride, 415 00:26:13,160 --> 00:26:16,000 Speaker 1: even though these things had never occurred, but they were 416 00:26:16,000 --> 00:26:18,639 Speaker 1: described as if they had. Over time, a lot of 417 00:26:18,640 --> 00:26:22,280 Speaker 1: the participants not only accepted these fabricated events as real, 418 00:26:22,760 --> 00:26:25,840 Speaker 1: but they ended up adding more details into the story. 419 00:26:26,119 --> 00:26:28,959 Speaker 1: And in the big picture, in the criminal justice system, 420 00:26:29,040 --> 00:26:32,480 Speaker 1: it's been shown with hundreds of studies that you can 421 00:26:32,600 --> 00:26:39,160 Speaker 1: mislead eyewitnesses by suggestive questioning. You can alter their recollections 422 00:26:39,440 --> 00:26:43,679 Speaker 1: of somebody's appearance or what precisely happened during the crime, 423 00:26:44,400 --> 00:26:47,320 Speaker 1: and this problem rears its head every day with eyewitness 424 00:26:47,400 --> 00:26:51,040 Speaker 1: testimony in the courtroom. The problem is that memory is 425 00:26:51,160 --> 00:26:52,840 Speaker 1: notoriously unreliable. 426 00:26:53,200 --> 00:26:54,200 Speaker 2: And I'm not talking. 427 00:26:53,920 --> 00:26:56,640 Speaker 1: About cases where people are intending to deceive, but more 428 00:26:56,720 --> 00:27:00,439 Speaker 1: generally about the problem that memory is malleable. Even subtle 429 00:27:00,480 --> 00:27:04,919 Speaker 1: suggestions, like the wording during a police lineup, can alter 430 00:27:05,520 --> 00:27:10,720 Speaker 1: what somebody feels they remember entirely.
False details can be inserted, 431 00:27:11,119 --> 00:27:14,840 Speaker 1: and once a memory is altered, it feels just as 432 00:27:14,960 --> 00:27:18,919 Speaker 1: vivid and true as a genuine memory, and this means 433 00:27:19,000 --> 00:27:22,760 Speaker 1: any of us can recall events with absolute confidence. 434 00:27:22,520 --> 00:27:24,359 Speaker 2: While being completely wrong. 435 00:27:24,640 --> 00:27:30,119 Speaker 1: There's this paradox of inaccuracy with high confidence. And the 436 00:27:30,160 --> 00:27:34,800 Speaker 1: reason this matters is because eyewitness accounts are very persuasive 437 00:27:34,880 --> 00:27:37,440 Speaker 1: in courtrooms, even though the research. 438 00:27:37,000 --> 00:27:38,919 Speaker 2: Shows that they are often fiction. 439 00:27:39,440 --> 00:27:42,359 Speaker 1: So please listen to episode nineteen for a deep dive 440 00:27:42,520 --> 00:28:06,639 Speaker 1: on the brain and eyewitness testimony. Zooming back out, why 441 00:28:06,680 --> 00:28:09,640 Speaker 1: does the brain confabulate even in everyday life? Well, 442 00:28:09,720 --> 00:28:12,800 Speaker 1: as I said, unlike a camera that passively records everything 443 00:28:12,800 --> 00:28:16,560 Speaker 1: it sees, memory is more like a story being rewritten 444 00:28:16,600 --> 00:28:19,560 Speaker 1: every time it's told. Each time we recall an event, 445 00:28:19,960 --> 00:28:23,760 Speaker 1: our brain pulls together fragments of information and stitches them 446 00:28:23,800 --> 00:28:28,000 Speaker 1: together into a coherent narrative. Small gaps get filled in 447 00:28:28,000 --> 00:28:30,679 Speaker 1: with assumptions, as we see with the War of the 448 00:28:30,680 --> 00:28:34,639 Speaker 1: Ghosts experiment. Details get smoothed over as we see with 449 00:28:35,119 --> 00:28:39,080 Speaker 1: the eyewitness experiments. New information can get inserted even if 450 00:28:39,080 --> 00:28:42,840 Speaker 1: it wasn't there originally.
Over time, the original memory can 451 00:28:42,880 --> 00:28:46,280 Speaker 1: be completely rewritten, and once we update a memory, we 452 00:28:46,400 --> 00:28:49,640 Speaker 1: forget we ever changed it. The new version feels like 453 00:28:49,720 --> 00:28:53,920 Speaker 1: the original. This is why two people who experience the 454 00:28:54,000 --> 00:28:58,120 Speaker 1: same event can remember it completely differently. Each person's brain 455 00:28:58,480 --> 00:29:01,760 Speaker 1: has reconstructed the experience based on their own biases 456 00:29:02,160 --> 00:29:07,840 Speaker 1: and assumptions and emotions. We've seen how confabulation affects history, 457 00:29:07,960 --> 00:29:13,040 Speaker 1: like the Mandela effect, how it affects society, like eyewitness testimony, 458 00:29:13,040 --> 00:29:16,440 Speaker 1: how it affects your childhood memories. And that leads me 459 00:29:16,800 --> 00:29:20,400 Speaker 1: to always wonder about the small confabulations that we tell 460 00:29:20,440 --> 00:29:23,920 Speaker 1: ourselves every day. It's hard to know the answer to this, 461 00:29:24,080 --> 00:29:28,200 Speaker 1: but how often do you rewrite past decisions, like when 462 00:29:28,200 --> 00:29:33,000 Speaker 1: you rationalize some suboptimal decision that you made by convincing 463 00:29:33,040 --> 00:29:35,960 Speaker 1: yourself that you always wanted the outcome you got? This 464 00:29:36,080 --> 00:29:38,720 Speaker 1: is a post hoc confabulation, a way for the brain 465 00:29:39,000 --> 00:29:42,080 Speaker 1: to maintain a sense of consistency. How often do we 466 00:29:42,160 --> 00:29:46,640 Speaker 1: misremember conversations? You ever had an argument where you and 467 00:29:46,680 --> 00:29:49,479 Speaker 1: the other person are one hundred percent certain about what 468 00:29:49,680 --> 00:29:53,400 Speaker 1: was said, but your memories completely disagree.
I know it's 469 00:29:53,440 --> 00:29:57,200 Speaker 1: always tempting to say the other person is the one confabulating, 470 00:29:57,640 --> 00:30:00,520 Speaker 1: but my hope is that after listening to this episode, 471 00:30:00,600 --> 00:30:04,239 Speaker 1: you might be slightly more willing to revisit this. So 472 00:30:04,440 --> 00:30:07,640 Speaker 1: if confabulation happens to all of us, how can we 473 00:30:07,680 --> 00:30:11,760 Speaker 1: ever trust our memories? The answer isn't to distrust everything, 474 00:30:11,840 --> 00:30:15,640 Speaker 1: but just to develop a tiny bit of skepticism about 475 00:30:15,640 --> 00:30:19,720 Speaker 1: the stories our minds tell us. Your memory feels real, 476 00:30:19,880 --> 00:30:23,080 Speaker 1: but feeling real doesn't make it true. Okay, so we've 477 00:30:23,080 --> 00:30:26,440 Speaker 1: been exploring how memory is a shifting story. But what 478 00:30:26,480 --> 00:30:30,440 Speaker 1: does this mean for how we understand ourselves? One thing 479 00:30:30,480 --> 00:30:36,880 Speaker 1: that's happened lately in neuroscience is implanting false memories in animals, 480 00:30:36,960 --> 00:30:39,640 Speaker 1: let's say rats. So here's how it works. A team 481 00:30:39,760 --> 00:30:43,240 Speaker 1: led by Susumu Tonegawa at MIT puts a rat in 482 00:30:43,320 --> 00:30:45,600 Speaker 1: a box and lets it run around to explore it. 483 00:30:46,080 --> 00:30:48,200 Speaker 1: Then the rat comes out of the box and it 484 00:30:48,320 --> 00:30:51,640 Speaker 1: hangs out, relaxes. And what the researchers now do outside 485 00:30:51,680 --> 00:30:56,360 Speaker 1: the box is they reactivate the neurons that encoded the 486 00:30:56,560 --> 00:31:00,959 Speaker 1: memory of that box. They do this using optogenetics. So 487 00:31:01,000 --> 00:31:04,480 Speaker 1: they reactivate those neurons, and now they deliver a little 488 00:31:04,560 --> 00:31:08,800 Speaker 1: electric shock to the rat's foot.
Okay. Now, later they 489 00:31:08,840 --> 00:31:11,480 Speaker 1: put the rat back in the box, a place where 490 00:31:11,520 --> 00:31:14,520 Speaker 1: the rat had never before been harmed, and the rat 491 00:31:14,560 --> 00:31:20,120 Speaker 1: freezes in fear, behaving as if it remembered being shocked there, 492 00:31:20,200 --> 00:31:23,640 Speaker 1: even though that had never actually happened. So the scientists 493 00:31:23,680 --> 00:31:26,880 Speaker 1: were able to create an entirely false experience, one that 494 00:31:26,920 --> 00:31:31,160 Speaker 1: the rat presumably fully believed to be real. Brains don't 495 00:31:31,200 --> 00:31:35,680 Speaker 1: store perfect representations of reality, but flexible, rewritable narratives. 496 00:31:36,040 --> 00:31:37,040 Speaker 2: So will we one. 497 00:31:36,960 --> 00:31:44,200 Speaker 1: Day implant therapeutic memories to help people overcome PTSD? And 498 00:31:44,360 --> 00:31:48,000 Speaker 1: how would a technology like that blur the line between 499 00:31:48,200 --> 00:31:51,640 Speaker 1: authentic experience and artificial recollection? 500 00:31:52,080 --> 00:31:52,600 Speaker 2: And this, of. 501 00:31:52,560 --> 00:31:56,400 Speaker 1: course reminds us of the film Total Recall with Arnold Schwarzenegger. 502 00:31:56,960 --> 00:32:00,520 Speaker 1: If you haven't seen this movie, the protagonist Doug Quaid 503 00:32:01,000 --> 00:32:06,360 Speaker 1: visits a company called Recall that offers to implant vivid, 504 00:32:06,560 --> 00:32:11,880 Speaker 1: customized memories of adventures that never happened. So Quaid opts 505 00:32:11,880 --> 00:32:15,320 Speaker 1: for the memory of a secret agent mission on Mars, 506 00:32:15,840 --> 00:32:18,840 Speaker 1: only to discover that he might actually be a secret 507 00:32:18,840 --> 00:32:22,200 Speaker 1: agent whose real memories were erased.
This was a very 508 00:32:22,240 --> 00:32:27,040 Speaker 1: pioneering story that played with the tension between authentic experience 509 00:32:27,440 --> 00:32:28,800 Speaker 1: and synthetic memory. 510 00:32:29,400 --> 00:32:32,320 Speaker 2: If you remember something vividly. 511 00:32:32,160 --> 00:32:37,040 Speaker 1: And emotionally and in detail, does it matter whether it actually happened? 512 00:32:38,560 --> 00:32:42,040 Speaker 1: The film asks what if your most cherished memories were 513 00:32:42,120 --> 00:32:46,600 Speaker 1: never real? And neuroscience replies that, for better or worse, 514 00:32:46,920 --> 00:32:51,160 Speaker 1: we're not that far away from creating synthetic memories. And 515 00:32:51,240 --> 00:32:57,840 Speaker 1: in any case, you often create them yourself. And I 516 00:32:57,920 --> 00:33:00,440 Speaker 1: just want to highlight it's not just that individuals 517 00:33:00,480 --> 00:33:06,840 Speaker 1: have unreliable memories. Societies do as well. They collectively misremember 518 00:33:06,880 --> 00:33:13,080 Speaker 1: their past. Historical confabulation shapes our understanding of events, often 519 00:33:13,200 --> 00:33:16,360 Speaker 1: to serve a specific narrative. In episode forty one, I 520 00:33:16,440 --> 00:33:19,800 Speaker 1: talked about the former USSR and how they loved to 521 00:33:20,080 --> 00:33:25,000 Speaker 1: erase political enemies from photographs. For one example of many, 522 00:33:25,480 --> 00:33:29,800 Speaker 1: there's a famous photo which proudly captures Lenin and other 523 00:33:29,920 --> 00:33:34,120 Speaker 1: Soviet leaders in Red Square in Moscow in nineteen nineteen. 524 00:33:34,840 --> 00:33:36,600 Speaker 1: You can see Lenin, and on his left you see 525 00:33:36,680 --> 00:33:40,360 Speaker 1: Leon Trotsky, and on Lenin's right is a man named Kamenev, 526 00:33:40,760 --> 00:33:44,640 Speaker 1: and there's a Bolshevik leader from Georgia in front of them.
Now, 527 00:33:44,680 --> 00:33:46,880 Speaker 1: if you look at a release of this photo some 528 00:33:47,040 --> 00:33:50,920 Speaker 1: years later, the official Soviet version of the photo, you 529 00:33:51,000 --> 00:33:55,040 Speaker 1: see that after Leon Trotsky fell from party favor, he 530 00:33:55,200 --> 00:33:58,480 Speaker 1: was airbrushed out of the photo. In the revised photograph, 531 00:33:58,520 --> 00:34:00,680 Speaker 1: there's just an empty space where he used to be, 532 00:34:01,240 --> 00:34:05,239 Speaker 1: and Kamenev on Lenin's right has disappeared as well, and 533 00:34:05,280 --> 00:34:08,719 Speaker 1: the bearded Bolshevik leader never existed. 534 00:34:08,280 --> 00:34:09,200 Speaker 2: In the photo either. 535 00:34:09,800 --> 00:34:14,799 Speaker 1: This is essentially the photographic version of confabulation, and this 536 00:34:14,840 --> 00:34:19,960 Speaker 1: happens constantly in the retelling of history. As is often said, 537 00:34:20,520 --> 00:34:23,920 Speaker 1: history is the pack of lies told by the winner. 538 00:34:24,600 --> 00:34:27,680 Speaker 1: And as an apropos side note, it's not at all 539 00:34:27,760 --> 00:34:31,799 Speaker 1: clear who first said that quotation. It's commonly associated with 540 00:34:31,920 --> 00:34:35,200 Speaker 1: Napoleon or Churchill, but apparently there are versions of this 541 00:34:35,280 --> 00:34:41,279 Speaker 1: going back to Herodotus. So nations and cultures are constantly 542 00:34:41,440 --> 00:34:46,279 Speaker 1: shaping public memory. History is always being rewritten, and it 543 00:34:46,360 --> 00:34:51,800 Speaker 1: works because, just like individuals, societies need a coherent story 544 00:34:52,160 --> 00:34:57,400 Speaker 1: when reality is messy. History gets edited, sometimes consciously, sometimes 545 00:34:57,440 --> 00:35:06,759 Speaker 1: through the natural distortion of collective memory.
So if our 546 00:35:06,800 --> 00:35:10,640 Speaker 1: memories are fiction, who are we? Just think about the 547 00:35:10,640 --> 00:35:13,960 Speaker 1: way that we tell our life stories. We highlight certain events, 548 00:35:13,960 --> 00:35:17,400 Speaker 1: we downplay others. We add emotional weight to moments that 549 00:35:17,480 --> 00:35:20,760 Speaker 1: might have been minor at the time. We rewrite past 550 00:35:20,800 --> 00:35:24,360 Speaker 1: decisions to make them seem more logical. We are, at 551 00:35:24,440 --> 00:35:27,960 Speaker 1: least to some extent, unreliable narrators. 552 00:35:27,400 --> 00:35:28,320 Speaker 2: Of our own lives. 553 00:35:28,960 --> 00:35:33,160 Speaker 1: So we can think of identity as a living document 554 00:35:33,239 --> 00:35:36,759 Speaker 1: which is constantly being updated. Who you think you are 555 00:35:36,800 --> 00:35:39,719 Speaker 1: today is different from who you thought you were ten 556 00:35:39,800 --> 00:35:43,759 Speaker 1: years ago. Some of that shift comes from the deposition 557 00:35:43,880 --> 00:35:46,880 Speaker 1: of new memories, but some of it comes from the 558 00:35:46,960 --> 00:35:51,080 Speaker 1: subtle confabulations that shape our memories. This may not be 559 00:35:51,200 --> 00:35:56,240 Speaker 1: a flaw but a feature, because a perfect, unchanging memory 560 00:35:56,600 --> 00:36:01,799 Speaker 1: would trap us in the past. Instead, we're rewriting our 561 00:36:01,880 --> 00:36:05,479 Speaker 1: history in real time to fit the narrative of who 562 00:36:05,760 --> 00:36:09,799 Speaker 1: we believe we are. So some argue that confabulation can 563 00:36:09,840 --> 00:36:12,520 Speaker 1: be useful, but it also has a dark side, which 564 00:36:12,560 --> 00:36:17,239 Speaker 1: is overconfidence.
Because we don't realize we're confabulating, we 565 00:36:17,360 --> 00:36:20,920 Speaker 1: assume our memories are true, and that can lead to 566 00:36:20,960 --> 00:36:25,360 Speaker 1: serious problems like false convictions, where innocent people are imprisoned 567 00:36:25,360 --> 00:36:28,120 Speaker 1: because of eyewitnesses who believe they are. 568 00:36:27,960 --> 00:36:28,920 Speaker 2: Telling the truth. 569 00:36:29,719 --> 00:36:34,840 Speaker 1: Also a problem with confabulating brains is misinformation. False memories 570 00:36:34,960 --> 00:36:40,439 Speaker 1: contribute to conspiracy theories and urban legends and historical distortions 571 00:36:40,440 --> 00:36:45,000 Speaker 1: that shape public perception, and confabulation leads all the time 572 00:36:45,080 --> 00:36:50,160 Speaker 1: to personal misunderstandings. How many relationships have been damaged because 573 00:36:50,600 --> 00:36:55,160 Speaker 1: two people remember an argument differently and each is convinced 574 00:36:55,320 --> 00:36:59,240 Speaker 1: that their version is correct? This is why it's always 575 00:36:59,239 --> 00:37:03,960 Speaker 1: a good idea to approach your memory with skepticism and humility. 576 00:37:04,520 --> 00:37:09,640 Speaker 1: Just because we remember something vividly doesn't necessitate that it's true. 577 00:37:09,800 --> 00:37:12,120 Speaker 1: So how do we live with this knowledge? How do 578 00:37:12,160 --> 00:37:15,920 Speaker 1: we best navigate the confabulating brain? It doesn't mean we 579 00:37:15,960 --> 00:37:19,440 Speaker 1: should distrust all our memories. It just means we should 580 00:37:19,480 --> 00:37:23,080 Speaker 1: be more open-eyed about the situation. So instead of 581 00:37:23,160 --> 00:37:28,160 Speaker 1: saying I remember exactly what happened, try saying this is 582 00:37:28,200 --> 00:37:34,200 Speaker 1: how I remember it, but I could be wrong.
As 583 00:37:34,239 --> 00:37:37,360 Speaker 1: we wrap up today's journey through the confabulations of memory, 584 00:37:37,440 --> 00:37:41,760 Speaker 1: let's leave with one thought. Our memories shape our lives, 585 00:37:41,880 --> 00:37:44,719 Speaker 1: our identities, and our understanding of the world. But they 586 00:37:44,760 --> 00:37:49,000 Speaker 1: aren't perfect records. They are ever-changing. They're always evolving, 587 00:37:49,200 --> 00:37:54,680 Speaker 1: just like us. The fact is we humans are storytelling creatures. 588 00:37:55,080 --> 00:37:59,280 Speaker 1: We don't just experience the world, we organize it into narrative. 589 00:37:59,640 --> 00:38:02,040 Speaker 1: In this same way, history can be a story we 590 00:38:02,120 --> 00:38:05,799 Speaker 1: tell about the past. Personal identity is a story we 591 00:38:05,880 --> 00:38:09,960 Speaker 1: tell about ourselves. We began today's podcast with a simple 592 00:38:10,040 --> 00:38:14,480 Speaker 1: but unsettling question, why do brains sometimes make things up? 593 00:38:15,120 --> 00:38:18,239 Speaker 1: Along the way we saw that confabulation isn't just a 594 00:38:18,360 --> 00:38:21,439 Speaker 1: quirk of the damaged brain. It's a part of how 595 00:38:21,480 --> 00:38:26,080 Speaker 1: all brains function. We are all in a sense fiction writers. 596 00:38:26,640 --> 00:38:30,840 Speaker 1: Memory is not a recording device. It's a dynamic creative system. 597 00:38:31,000 --> 00:38:35,360 Speaker 1: Every time we recall an event, our brains reconstruct it, 598 00:38:35,440 --> 00:38:39,600 Speaker 1: sometimes correctly, sometimes with error, and sometimes in ways that 599 00:38:39,640 --> 00:38:43,680 Speaker 1: are more than a little fabricated. The strange part is 600 00:38:43,719 --> 00:38:47,760 Speaker 1: that we trust our memories with absolute conviction. We trust 601 00:38:47,800 --> 00:38:51,960 Speaker 1: them in relationships, we trust them in courtrooms.
We trust 602 00:38:52,040 --> 00:38:55,200 Speaker 1: them to tell us who we are. But if memories 603 00:38:55,280 --> 00:38:59,080 Speaker 1: can change, if our pasts are being subtly rewritten with 604 00:38:59,160 --> 00:39:02,040 Speaker 1: each passing year, what does that mean for the sense 605 00:39:02,080 --> 00:39:02,520 Speaker 1: of self? 606 00:39:02,800 --> 00:39:05,000 Speaker 2: If we are the sum of our. 607 00:39:05,000 --> 00:39:08,480 Speaker 1: Memories, but those memories are fluid, then how stable is 608 00:39:08,520 --> 00:39:12,439 Speaker 1: the person we think we are? We can become too 609 00:39:12,520 --> 00:39:16,799 Speaker 1: confident in our false memories. We can rewrite history to 610 00:39:16,920 --> 00:39:20,520 Speaker 1: suit our needs, and we can create narratives that justify 611 00:39:20,560 --> 00:39:25,160 Speaker 1: our actions, even when those narratives are inaccurate. So the 612 00:39:25,200 --> 00:39:29,320 Speaker 1: next time you remember some episode in your life, pause, 613 00:39:29,760 --> 00:39:32,719 Speaker 1: take a moment to question it. How do I know 614 00:39:32,840 --> 00:39:36,120 Speaker 1: this memory is accurate? Could my brain be filling in 615 00:39:36,200 --> 00:39:40,640 Speaker 1: gaps somewhere? Is this a true recollection, or has it 616 00:39:40,760 --> 00:39:43,920 Speaker 1: been shaped by the stories I've told myself over the years? 617 00:39:44,560 --> 00:39:48,080 Speaker 1: Don't distrust every memory. Instead, this is just an invitation 618 00:39:48,680 --> 00:39:53,160 Speaker 1: to approach memory with humility, to recognize that what feels 619 00:39:53,400 --> 00:39:58,640 Speaker 1: absolutely real might in fact be a creative act. 620 00:39:58,640 --> 00:39:59,440 Speaker 2: Of your brain.
Send me an email at 623 00:40:08,920 --> 00:40:12,759 Speaker 1: podcast at eagleman dot com with questions or discussion or 624 00:40:12,800 --> 00:40:15,680 Speaker 1: your examples of the Mandela effect, and check out and 625 00:40:15,680 --> 00:40:19,239 Speaker 1: subscribe to Inner Cosmos on YouTube for videos of each 626 00:40:19,280 --> 00:40:23,160 Speaker 1: episode and to leave comments. Until next time. I'm David Eagleman, 627 00:40:23,280 --> 00:40:25,160 Speaker 1: and this is Inner Cosmos.