1 00:00:05,040 --> 00:00:08,080 Speaker 1: When you're trying to choose which ice cream flavor to get, 2 00:00:08,280 --> 00:00:12,000 Speaker 1: what's actually happening in your brain? How do you decide? 3 00:00:12,560 --> 00:00:15,160 Speaker 1: It seems pretty easy, but that's only if you don't 4 00:00:15,200 --> 00:00:19,319 Speaker 1: see the storms of activity lighting up the brain. And 5 00:00:19,360 --> 00:00:22,400 Speaker 1: how do you make long-term decisions, like that you're 6 00:00:22,400 --> 00:00:24,040 Speaker 1: not going to eat the ice cream at all, but 7 00:00:24,079 --> 00:00:27,080 Speaker 1: instead you're going to eat some broccoli? And what does 8 00:00:27,120 --> 00:00:30,240 Speaker 1: any of this have to do with the ancient Greeks, 9 00:00:30,360 --> 00:00:35,040 Speaker 1: or what alien hand syndrome is, or what the rules 10 00:00:35,040 --> 00:00:39,479 Speaker 1: should be around how the president can launch the nuclear bomb? 11 00:00:42,720 --> 00:00:45,640 Speaker 1: Welcome to Inner Cosmos with me, David Eagleman. I'm a 12 00:00:45,760 --> 00:00:49,560 Speaker 1: neuroscientist and author at Stanford, and in these episodes we 13 00:00:49,680 --> 00:00:53,040 Speaker 1: dive deeply into our three-pound universe to uncover some 14 00:00:53,120 --> 00:00:57,080 Speaker 1: of the most surprising aspects of our lives. Today's episode 15 00:00:57,080 --> 00:01:07,480 Speaker 1: and next week's are about decision making. Although you generally 16 00:01:07,480 --> 00:01:10,959 Speaker 1: feel like you're just moving through a day, decision making 17 00:01:11,080 --> 00:01:15,000 Speaker 1: lies at the heart of everything we do. The complexity 18 00:01:15,040 --> 00:01:17,920 Speaker 1: of the world presents itself to us, and we are 19 00:01:18,040 --> 00:01:23,240 Speaker 1: constantly weighing alternatives.
If you couldn't do that, you couldn't 20 00:01:23,360 --> 00:01:27,040 Speaker 1: navigate the now, and you couldn't plan out your future. 21 00:01:27,560 --> 00:01:31,360 Speaker 1: So by having a richer understanding of how choices battle 22 00:01:31,400 --> 00:01:33,840 Speaker 1: it out in the brain, we can learn how to 23 00:01:33,920 --> 00:01:36,840 Speaker 1: make better decisions for ourselves. And, as we're going to 24 00:01:36,880 --> 00:01:39,600 Speaker 1: see next week, because this is a two-parter, we'll 25 00:01:39,640 --> 00:01:41,840 Speaker 1: even see how we can use this to build a 26 00:01:41,880 --> 00:01:45,680 Speaker 1: better criminal justice system. So let's start at the very beginning. 27 00:01:45,959 --> 00:01:49,480 Speaker 1: Should you choose the taco or the burrito? Should you 28 00:01:49,520 --> 00:01:52,800 Speaker 1: go for that T-shirt or a more formal shirt? 29 00:01:53,120 --> 00:01:55,680 Speaker 1: Should you take care of this email first or that 30 00:01:55,800 --> 00:01:59,480 Speaker 1: bill payment first? Do you feel like the mint chocolate 31 00:01:59,560 --> 00:02:02,840 Speaker 1: chip or cookies and cream? On any day, you're making 32 00:02:02,880 --> 00:02:07,040 Speaker 1: one thousand little decisions: which phone calls to make, which 33 00:02:07,120 --> 00:02:10,519 Speaker 1: shoes to wear, whether to take the shorter path or 34 00:02:10,560 --> 00:02:14,040 Speaker 1: the faster path, how exactly you should answer a question, 35 00:02:14,600 --> 00:02:17,919 Speaker 1: what you should agree to on your schedule tomorrow. This, 36 00:02:17,960 --> 00:02:20,520 Speaker 1: it turns out, is one of the most central things 37 00:02:20,560 --> 00:02:24,720 Speaker 1: the brain does: make decisions. Your brain is a decision 38 00:02:24,800 --> 00:02:28,280 Speaker 1: making machine. It takes the complexity of the world and 39 00:02:28,320 --> 00:02:32,880 Speaker 1: it squeezes things down to single decisions.
So how do 40 00:02:33,080 --> 00:02:38,520 Speaker 1: brains do this? Now, when economists and psychologists first started 41 00:02:38,520 --> 00:02:42,960 Speaker 1: looking at this question, they assumed that we humans act rationally. 42 00:02:43,040 --> 00:02:45,799 Speaker 1: The idea is we look at our options, we add 43 00:02:45,880 --> 00:02:49,480 Speaker 1: up the pros and cons, and we make the optimal decision. 44 00:02:50,040 --> 00:02:52,840 Speaker 1: That's not necessarily how it goes with our brains. And 45 00:02:52,960 --> 00:02:55,160 Speaker 1: in this episode and the next one, we're going to 46 00:02:55,200 --> 00:02:58,560 Speaker 1: get a really good understanding why. So let's get going 47 00:02:58,600 --> 00:03:02,480 Speaker 1: with these drawings where there are two ways to interpret 48 00:03:02,560 --> 00:03:05,400 Speaker 1: the same image. You've probably seen these things. If you 49 00:03:05,840 --> 00:03:08,520 Speaker 1: look at the picture one way, it looks like a rabbit, 50 00:03:08,560 --> 00:03:11,360 Speaker 1: and if you keep staring at it you can see 51 00:03:11,400 --> 00:03:14,440 Speaker 1: that suddenly it looks like a duck. So you probably 52 00:03:14,440 --> 00:03:15,960 Speaker 1: saw this when you were a kid. But just in 53 00:03:16,000 --> 00:03:18,120 Speaker 1: case you didn't, I'm putting it in the show notes 54 00:03:18,160 --> 00:03:22,200 Speaker 1: at eagleman dot com slash podcast. The point is the 55 00:03:22,400 --> 00:03:26,680 Speaker 1: image is what we call perceptually bistable, which is 56 00:03:26,720 --> 00:03:28,720 Speaker 1: just a way of saying that when you stare at 57 00:03:28,720 --> 00:03:31,760 Speaker 1: the picture, your brain can make one interpretation of what 58 00:03:31,760 --> 00:03:35,839 Speaker 1: you're seeing or the other interpretation, and it switches back 59 00:03:35,880 --> 00:03:38,600 Speaker 1: and forth.
You see the rabbit and then the duck, 60 00:03:38,880 --> 00:03:41,200 Speaker 1: and then the rabbit and then the duck. The critical 61 00:03:41,200 --> 00:03:44,800 Speaker 1: thing to appreciate is that nothing on the page changes, 62 00:03:44,960 --> 00:03:49,240 Speaker 1: so the only thing that's changing is something inside your brain. 63 00:03:49,680 --> 00:03:53,000 Speaker 1: So some years ago I collaborated with my neurosurgery colleagues 64 00:03:53,040 --> 00:03:56,240 Speaker 1: to study this sort of thing. Now, neuroscience, what I 65 00:03:56,280 --> 00:03:59,360 Speaker 1: do, is pretty different from brain surgery, what a neurosurgeon does, 66 00:03:59,520 --> 00:04:01,360 Speaker 1: but happily we tend to be friends and we can 67 00:04:01,360 --> 00:04:05,320 Speaker 1: often help each other. So I accompanied surgeries while my 68 00:04:05,400 --> 00:04:08,600 Speaker 1: colleagues were operating on people's brains. And as you may know, 69 00:04:08,640 --> 00:04:11,040 Speaker 1: one of the incredible things about neurosurgery is that you 70 00:04:11,080 --> 00:04:14,320 Speaker 1: can do this while a patient is awake and talking 71 00:04:14,360 --> 00:04:17,320 Speaker 1: to you. And this is because you can put little 72 00:04:17,480 --> 00:04:21,960 Speaker 1: electrodes, these are small metal wires, into the brain, and 73 00:04:22,000 --> 00:04:25,520 Speaker 1: the brain has no pain receptors, so it doesn't hurt. 74 00:04:25,600 --> 00:04:28,120 Speaker 1: The patient can't feel that. Now, why does the brain 75 00:04:28,160 --> 00:04:31,400 Speaker 1: have no pain receptors? It's generally because nothing ever touches 76 00:04:31,440 --> 00:04:34,919 Speaker 1: your brain.
Your brain, which is about the consistency of jello, 77 00:04:35,200 --> 00:04:38,520 Speaker 1: is well protected in the thick plates of the skull, 78 00:04:38,800 --> 00:04:42,360 Speaker 1: so evolutionarily, there was no point in developing pain receptors 79 00:04:42,360 --> 00:04:47,080 Speaker 1: in the brain anyway. As a consequence, a neurosurgeon only 80 00:04:47,120 --> 00:04:50,359 Speaker 1: needs to use local anesthesia when making a cut on 81 00:04:50,440 --> 00:04:53,480 Speaker 1: the scalp, and then they drill a burr hole into the skull, 82 00:04:53,960 --> 00:04:56,960 Speaker 1: and once the brain is exposed, you can poke things 83 00:04:56,960 --> 00:04:59,680 Speaker 1: into the tissue while you and the patient are having 84 00:04:59,760 --> 00:05:02,920 Speaker 1: a conversation about what it's like. And what you find 85 00:05:03,000 --> 00:05:07,720 Speaker 1: there is that this empire of eighty-six billion neurons 86 00:05:08,120 --> 00:05:12,200 Speaker 1: is constantly storming with activity. Every neuron in your head 87 00:05:12,279 --> 00:05:16,000 Speaker 1: is having these little spikes of electrical voltage known as 88 00:05:16,200 --> 00:05:20,240 Speaker 1: action potentials, and every neuron is firing off some tens 89 00:05:20,400 --> 00:05:22,920 Speaker 1: or hundreds of these spikes every second of your life. 90 00:05:23,560 --> 00:05:28,520 Speaker 1: Every idea you've ever had, every memory you can ever recollect, 91 00:05:28,560 --> 00:05:32,479 Speaker 1: every choice that you've ever contemplated, is written in the 92 00:05:32,600 --> 00:05:37,920 Speaker 1: language of these tiny, mysterious spikes. And using a very, 93 00:05:38,000 --> 00:05:41,400 Speaker 1: very tiny electrode, think of a super small wire that 94 00:05:41,440 --> 00:05:44,560 Speaker 1: you just push into the brain, you can eavesdrop on 95 00:05:44,720 --> 00:05:48,400 Speaker 1: these spikes. Now, let's call the patient on the table Marcia.
96 00:05:48,800 --> 00:05:52,039 Speaker 1: So Marcia is wide awake and talking to me, and 97 00:05:52,120 --> 00:05:55,280 Speaker 1: I can show her this bistable picture of the rabbit 98 00:05:55,400 --> 00:05:59,160 Speaker 1: duck and ask her to tell me what she sees. Now, 99 00:05:59,200 --> 00:06:03,520 Speaker 1: the moment that Marcia switches to rabbit or duck, her 100 00:06:03,560 --> 00:06:07,520 Speaker 1: brain has made a decision. A decision doesn't have to 101 00:06:07,520 --> 00:06:11,479 Speaker 1: be conscious. In this case, it's a perceptual decision by 102 00:06:11,480 --> 00:06:16,080 Speaker 1: her visual system, and the mechanics of this switchover are 103 00:06:16,120 --> 00:06:19,599 Speaker 1: totally hidden under the hood. Now, what's interesting is that 104 00:06:19,960 --> 00:06:23,200 Speaker 1: in theory, a brain should be able to see both 105 00:06:23,240 --> 00:06:25,000 Speaker 1: the rabbit and the duck at the same time, but 106 00:06:25,040 --> 00:06:28,960 Speaker 1: in reality, brains don't do that. What brains do is 107 00:06:29,000 --> 00:06:33,200 Speaker 1: they take ambiguous data and they make a choice. In 108 00:06:33,240 --> 00:06:35,600 Speaker 1: this case, the brain eventually remakes the choice, and it might 109 00:06:35,640 --> 00:06:38,200 Speaker 1: switch back and forth over and over. But the point 110 00:06:38,320 --> 00:06:42,320 Speaker 1: is that our brains are always crushing ambiguity down to 111 00:06:42,640 --> 00:06:46,560 Speaker 1: a choice. So when Marcia's brain lands on the interpretation 112 00:06:46,720 --> 00:06:49,520 Speaker 1: that the drawing is of a duck or a rabbit, 113 00:06:49,960 --> 00:06:53,560 Speaker 1: we can listen through the electrode to the responses from 114 00:06:53,640 --> 00:06:56,960 Speaker 1: a small number of neurons, these cells in her brain.
115 00:06:57,400 --> 00:07:06,800 Speaker 1: Some neurons shift to a higher rate of activity while 116 00:07:06,880 --> 00:07:15,520 Speaker 1: other neurons slow down their activity. And that happens right 117 00:07:15,560 --> 00:07:18,560 Speaker 1: when Marcia says, oh, now I see it as a duck, 118 00:07:19,040 --> 00:07:21,360 Speaker 1: and then when she says, oh, now it's a rabbit again, 119 00:07:21,440 --> 00:07:24,920 Speaker 1: the neurons are changing their activity. Now, the neurons we 120 00:07:25,000 --> 00:07:29,320 Speaker 1: happen to be eavesdropping on are not by themselves responsible 121 00:07:29,360 --> 00:07:33,920 Speaker 1: for the perceptual change. Instead, they're operating in concert with 122 00:07:34,040 --> 00:07:37,240 Speaker 1: billions of other neurons. So the changes we're listening to 123 00:07:37,840 --> 00:07:43,320 Speaker 1: are just pieces of this massive changing pattern taking hold 124 00:07:43,360 --> 00:07:47,080 Speaker 1: across large swaths of brain territory. But the idea is 125 00:07:47,120 --> 00:07:50,640 Speaker 1: that when one pattern wins out over the other in 126 00:07:50,760 --> 00:07:55,440 Speaker 1: Marcia's brain, a decision has been landed upon. Now it's 127 00:07:55,480 --> 00:07:58,520 Speaker 1: a rabbit. Now it's a duck. Now, just as a 128 00:07:58,560 --> 00:08:01,120 Speaker 1: side note, it's not always about the neurons speeding up 129 00:08:01,200 --> 00:08:04,960 Speaker 1: or slowing down their spikes. Sometimes neurons change their pattern 130 00:08:04,960 --> 00:08:08,520 Speaker 1: of activity in more subtle ways. They become synchronized or 131 00:08:08,600 --> 00:08:12,760 Speaker 1: desynchronized with other neurons even while they're maintaining their original pace. 132 00:08:13,080 --> 00:08:16,160 Speaker 1: Now here's the thing.
Although we can look at this 133 00:08:16,240 --> 00:08:20,720 Speaker 1: in Marcia's brain and correlate the changes with seeing the 134 00:08:20,840 --> 00:08:24,200 Speaker 1: rabbit or the duck, the thing to appreciate is that 135 00:08:24,240 --> 00:08:27,840 Speaker 1: your brain makes thousands of decisions every day of your life, 136 00:08:28,200 --> 00:08:31,200 Speaker 1: and that dictates your experience of the world. What are 137 00:08:31,240 --> 00:08:33,800 Speaker 1: you gonna wear today? Who are you going to text today? 138 00:08:34,160 --> 00:08:36,400 Speaker 1: How are you going to interpret what was meant by 139 00:08:36,440 --> 00:08:39,200 Speaker 1: that email? Are you going to exercise today? Or are 140 00:08:39,240 --> 00:08:41,480 Speaker 1: you gonna eat that bag of chips or pass it up? 141 00:08:41,840 --> 00:08:44,960 Speaker 1: We don't typically chew on it, but these small decisions 142 00:08:45,080 --> 00:08:50,400 Speaker 1: underlie every action we take. Who you are emerges from 143 00:08:50,480 --> 00:08:54,800 Speaker 1: the brain-wide battles for dominance that rage in your 144 00:08:54,880 --> 00:08:59,200 Speaker 1: skull every moment of your life. So listening to that 145 00:08:59,640 --> 00:09:08,760 Speaker 1: neural activity in Marcia's head, it's impossible not to be 146 00:09:09,080 --> 00:09:13,520 Speaker 1: in awe, because this is what every decision in the 147 00:09:13,600 --> 00:09:19,000 Speaker 1: history of our species sounded like. Every marriage proposal, every 148 00:09:19,160 --> 00:09:23,840 Speaker 1: declaration of war, every leap of imagination, every mission launched 149 00:09:23,880 --> 00:09:28,960 Speaker 1: into the unknown, every act of kindness, every lie, every breakthrough, 150 00:09:29,000 --> 00:09:33,480 Speaker 1: every decisive moment.
It all happened right here in the 151 00:09:33,600 --> 00:09:38,800 Speaker 1: darkness of the skull, emerging from patterns of activity in 152 00:09:38,960 --> 00:09:43,480 Speaker 1: networks of biological cells. So let's take a closer look 153 00:09:43,480 --> 00:09:46,480 Speaker 1: at what's happening behind the scenes during a decision. So 154 00:09:46,520 --> 00:09:49,600 Speaker 1: imagine you're making a simple choice. You're standing in the 155 00:09:50,080 --> 00:09:53,800 Speaker 1: drinks aisle at the store, and you're trying to decide between 156 00:09:53,840 --> 00:09:56,839 Speaker 1: two flavors of sports drink that you like equally. So 157 00:09:57,160 --> 00:10:00,920 Speaker 1: you're looking at lemon and blueberry. Now, from the outside, 158 00:10:01,080 --> 00:10:03,600 Speaker 1: to someone looking at you standing in the aisle, it doesn't 159 00:10:03,600 --> 00:10:06,040 Speaker 1: look like you're doing much. You're simply stuck there in 160 00:10:06,080 --> 00:10:09,080 Speaker 1: the aisle, looking back and forth between these two options. 161 00:10:09,720 --> 00:10:13,640 Speaker 1: But inside your brain, a simple choice like this has 162 00:10:13,760 --> 00:10:17,319 Speaker 1: unleashed a hurricane of activity. Now keep in mind that 163 00:10:17,400 --> 00:10:21,360 Speaker 1: by itself, no single neuron has much influence. But each 164 00:10:21,480 --> 00:10:24,680 Speaker 1: neuron is connected to thousands of others, and they in 165 00:10:24,720 --> 00:10:27,320 Speaker 1: turn connect to thousands of others and so on. In 166 00:10:27,360 --> 00:10:32,920 Speaker 1: this massive, loopy, intertwining network, they're all releasing chemicals that 167 00:10:33,080 --> 00:10:37,839 Speaker 1: excite or depress each other.
Within this web, a particular 168 00:10:37,920 --> 00:10:42,120 Speaker 1: constellation of neurons represents the lemon drink, and this pattern 169 00:10:42,200 --> 00:10:46,160 Speaker 1: is formed from neurons that mutually excite each other. Now 170 00:10:46,160 --> 00:10:49,439 Speaker 1: they're not necessarily next to each other. They might span 171 00:10:50,080 --> 00:10:54,000 Speaker 1: distant brain regions involved in smell and taste, and vision, 172 00:10:54,440 --> 00:10:58,600 Speaker 1: and your unique history of memories involving lemon sports drinks. 173 00:10:58,920 --> 00:11:01,760 Speaker 1: Each of these neurons by itself has very little to 174 00:11:01,760 --> 00:11:05,320 Speaker 1: do with lemon. In fact, each neuron plays many different 175 00:11:05,400 --> 00:11:10,199 Speaker 1: roles at different times in ever-shifting coalitions. But when 176 00:11:10,240 --> 00:11:14,040 Speaker 1: these neurons all become active collectively at the same time 177 00:11:14,120 --> 00:11:18,280 Speaker 1: in this particular arrangement, that's the lemon sports drink to 178 00:11:18,320 --> 00:11:21,319 Speaker 1: your brain. So as you're standing in front of the drinks, 179 00:11:21,480 --> 00:11:26,280 Speaker 1: this federation of neurons eagerly communicates with one another, like 180 00:11:26,360 --> 00:11:30,320 Speaker 1: the way that dispersed individuals link online. Now, these neurons 181 00:11:30,360 --> 00:11:34,120 Speaker 1: aren't acting alone in their electioneering. At the same time, 182 00:11:34,640 --> 00:11:39,240 Speaker 1: the competing possibility, the blueberry-flavored drink, is represented by 183 00:11:39,240 --> 00:11:43,400 Speaker 1: its own neural coalition, and thousands or millions of neurons 184 00:11:43,400 --> 00:11:46,320 Speaker 1: there are talking with each other and trying to get 185 00:11:46,360 --> 00:11:50,400 Speaker 1: the vote out.
Now, each coalition, lemon and blueberry, tries 186 00:11:50,440 --> 00:11:54,040 Speaker 1: to gain the upper hand by intensifying its own activity 187 00:11:54,080 --> 00:11:59,240 Speaker 1: and suppressing the other's. Neural populations compete against one another, 188 00:11:59,400 --> 00:12:03,360 Speaker 1: just like political parties struggle for dominance. They fight it 189 00:12:03,360 --> 00:12:07,040 Speaker 1: out until one of them triumphs in this winner-take 190 00:12:07,080 --> 00:12:28,240 Speaker 1: all competition, and whichever network wins defines what you do next. Now, 191 00:12:28,280 --> 00:12:30,959 Speaker 1: this is what is so cool about brains as compared 192 00:12:31,000 --> 00:12:33,840 Speaker 1: to, let's say, our digital computers. The brain is a 193 00:12:33,880 --> 00:12:39,360 Speaker 1: machine that runs on conflict between different possibilities, and all 194 00:12:39,400 --> 00:12:43,240 Speaker 1: these possibilities are always trying to outcompete the others, and 195 00:12:43,320 --> 00:12:48,040 Speaker 1: there are always multiple options. Even after you've selected the 196 00:12:48,320 --> 00:12:50,880 Speaker 1: lemon or the blueberry drink, you find yourself in a 197 00:12:50,920 --> 00:12:54,320 Speaker 1: new conflict. Should I drink the whole thing right now? 198 00:12:54,840 --> 00:12:56,720 Speaker 1: Part of you feels like it's a good idea to 199 00:12:56,760 --> 00:12:59,160 Speaker 1: make sure you have plenty of hydration, and at the 200 00:12:59,160 --> 00:13:01,880 Speaker 1: same time, part of you recognizes that it's very 201 00:13:01,920 --> 00:13:04,200 Speaker 1: sugary and you don't want to do too much of that. 202 00:13:04,559 --> 00:13:07,320 Speaker 1: And part of you also knows you're going to have 203 00:13:07,400 --> 00:13:09,040 Speaker 1: to go to the bathroom soon if you drink the 204 00:13:09,080 --> 00:13:11,679 Speaker 1: whole thing.
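The winner-take-all competition described here can be caricatured in a few lines of Python. To be clear, this is only a toy sketch, not a model of real neurons: the coalition names, the self-excitation and mutual-inhibition strengths, the noise, and the decision threshold are all made-up assumptions chosen for illustration.

```python
import random

def winner_take_all(trials=300, seed=7, threshold=20.0, max_steps=10_000):
    """Two coalitions ('lemon' vs 'blueberry') each boost their own
    activity and suppress their rival's until one pattern wins out.
    Every parameter here is invented for illustration."""
    rng = random.Random(seed)
    wins = {"lemon": 0, "blueberry": 0}
    for _ in range(trials):
        a = b = 0.0  # current activity of each coalition
        for _ in range(max_steps):
            # self-excitation, mutual inhibition, plus network noise
            da = 0.1 * a - 0.2 * b + rng.gauss(0.5, 1.0)
            db = 0.1 * b - 0.2 * a + rng.gauss(0.5, 1.0)
            a = max(0.0, a + da)
            b = max(0.0, b + db)
            if max(a, b) >= threshold:  # one coalition has taken over
                break
        wins["lemon" if a > b else "blueberry"] += 1
    return wins

print(winner_take_all())  # the split comes out roughly even, since the setup is symmetric
```

The interesting design point is that the equal outcome isn't programmed in anywhere; it emerges from the fight. Tilting any parameter, say giving one coalition slightly stronger input, biases the outcome without guaranteeing it, which is one way to picture how preferences weight, but don't fully determine, a choice.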
So whether you polish off the whole drink 205 00:13:11,679 --> 00:13:14,559 Speaker 1: now or save some for later is simply a matter 206 00:13:14,640 --> 00:13:17,560 Speaker 1: of the way that the infighting goes as a result 207 00:13:17,640 --> 00:13:21,440 Speaker 1: of the ongoing conflict in the brain. You can argue 208 00:13:21,440 --> 00:13:25,040 Speaker 1: with yourself, you can curse at yourself, you can cajole yourself. 209 00:13:25,520 --> 00:13:29,080 Speaker 1: But who exactly is talking with whom? It's all you, 210 00:13:29,520 --> 00:13:32,120 Speaker 1: but it's different parts of you. Now, as I said, 211 00:13:32,120 --> 00:13:36,080 Speaker 1: we're not usually aware of these internal conflicts, but sometimes 212 00:13:36,120 --> 00:13:39,280 Speaker 1: we can use simple tasks to bring this to the surface. 213 00:13:39,840 --> 00:13:44,080 Speaker 1: For example, there was a simple psychological task introduced in 214 00:13:44,080 --> 00:13:47,120 Speaker 1: the nineteen thirties, which we call the Stroop task, and 215 00:13:47,160 --> 00:13:49,640 Speaker 1: it's very easy. All you do is you look at 216 00:13:49,640 --> 00:13:52,160 Speaker 1: a word printed on a page, and the word might 217 00:13:52,200 --> 00:13:55,640 Speaker 1: be in red ink or blue, or green or yellow, 218 00:13:55,840 --> 00:13:57,920 Speaker 1: and all you need to do is say the color 219 00:13:58,000 --> 00:14:00,280 Speaker 1: of the ink. So if I show you the word 220 00:14:00,679 --> 00:14:03,280 Speaker 1: garage and the ink is yellow, all you do is 221 00:14:03,320 --> 00:14:06,000 Speaker 1: you say yellow. Now that sounds very easy to do. 222 00:14:06,120 --> 00:14:08,959 Speaker 1: But the trick is that the words I show you 223 00:14:09,040 --> 00:14:12,880 Speaker 1: might themselves be color words. Like I show you the 224 00:14:12,960 --> 00:14:17,160 Speaker 1: letters R, E, D, but the ink is in green.
225 00:14:17,720 --> 00:14:20,320 Speaker 1: So when you see this, you're supposed to shout out green, 226 00:14:20,680 --> 00:14:23,440 Speaker 1: but it proves very difficult. Or I show you the 227 00:14:23,480 --> 00:14:27,520 Speaker 1: word purple written in yellow ink, or the word blue 228 00:14:28,040 --> 00:14:31,080 Speaker 1: written in red ink. Check out some examples on the 229 00:14:31,120 --> 00:14:34,160 Speaker 1: show notes. Now, the instructions are very easy. All you're 230 00:14:34,160 --> 00:14:36,360 Speaker 1: supposed to do is say the color of the ink. 231 00:14:36,720 --> 00:14:40,320 Speaker 1: So why is the Stroop task so hard? It's because 232 00:14:40,360 --> 00:14:43,120 Speaker 1: one network in your brain has the task of identifying 233 00:14:43,240 --> 00:14:45,120 Speaker 1: the color of the ink and putting a name to it. 234 00:14:45,360 --> 00:14:48,360 Speaker 1: But at the same time, you have competing networks in 235 00:14:48,400 --> 00:14:51,720 Speaker 1: your brain that are responsible for reading words. And these 236 00:14:51,760 --> 00:14:56,320 Speaker 1: are so proficient that word reading has become a deeply 237 00:14:56,560 --> 00:14:59,960 Speaker 1: ingrained automatic process. And so you can feel the struggle 238 00:15:00,200 --> 00:15:03,440 Speaker 1: as these systems contend with each other, and to get 239 00:15:03,480 --> 00:15:07,520 Speaker 1: the right answer, you have to actively suppress the strong 240 00:15:07,640 --> 00:15:11,320 Speaker 1: impulse to read the word in favor of really concentrating 241 00:15:11,360 --> 00:15:13,880 Speaker 1: on the ink color. So with a simple task like this, 242 00:15:13,960 --> 00:15:18,360 Speaker 1: you can directly experience the conflict. But most of the 243 00:15:18,440 --> 00:15:21,440 Speaker 1: time things are running so smoothly under the hood that 244 00:15:21,480 --> 00:15:24,360 Speaker 1: you don't even realize that you have all these competing networks.
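The interference in the Stroop task can be sketched with a toy model in Python. This is not real reaction-time data: the 500-millisecond baseline, the 150-millisecond interference cost, and the noise are invented numbers, used only to illustrate why incongruent trials (say, the word RED printed in green ink) come out slower than congruent ones (the word RED in red ink).

```python
import random

COLORS = ["red", "blue", "green", "yellow"]

def naming_time_ms(word, ink, rng):
    """Toy time to name the ink color: a baseline, plus an extra cost
    when the automatic word-reading network pushes a competing answer."""
    t = 500 + rng.gauss(0, 30)  # invented baseline, in milliseconds
    if word != ink:             # incongruent trial: reading interferes
        t += 150                # invented cost of suppressing the word
    return t

rng = random.Random(42)
congruent, incongruent = [], []
for _ in range(2000):
    word, ink = rng.choice(COLORS), rng.choice(COLORS)
    bucket = congruent if word == ink else incongruent
    bucket.append(naming_time_ms(word, ink, rng))

avg = lambda xs: sum(xs) / len(xs)
print(f"congruent trials:   {avg(congruent):.0f} ms on average")
print(f"incongruent trials: {avg(incongruent):.0f} ms on average")
```

Running this shows the signature of the effect: the incongruent average sits well above the congruent one, purely because one pathway has to be suppressed before the other can answer.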
245 00:15:24,680 --> 00:15:28,240 Speaker 1: And it's only under very special circumstances that it can 246 00:15:28,280 --> 00:15:31,680 Speaker 1: become easy for us to witness in another person the 247 00:15:31,800 --> 00:15:36,360 Speaker 1: internal conflict between the different parts of the brain. So 248 00:15:36,520 --> 00:15:39,240 Speaker 1: this can happen when a person has a particular kind 249 00:15:39,320 --> 00:15:44,280 Speaker 1: of epilepsy, where the epileptic seizures spread from one hemisphere 250 00:15:44,280 --> 00:15:47,760 Speaker 1: to the other. In these cases, some patients have undergone 251 00:15:48,080 --> 00:15:51,080 Speaker 1: what is called a split-brain surgery, in which the brain's 252 00:15:51,160 --> 00:15:56,120 Speaker 1: two hemispheres get surgically disconnected from each other. So normally 253 00:15:56,480 --> 00:16:00,080 Speaker 1: your two hemispheres are connected by a superhighway of 254 00:16:00,160 --> 00:16:04,000 Speaker 1: nerves called the corpus callosum, and this allows the right 255 00:16:04,040 --> 00:16:07,480 Speaker 1: and left halves to coordinate and to work in concert. So, 256 00:16:07,600 --> 00:16:11,000 Speaker 1: for example, if you are feeling chilly, both of your 257 00:16:11,000 --> 00:16:14,840 Speaker 1: hands cooperate. One holds your jacket hem while the other 258 00:16:14,960 --> 00:16:19,040 Speaker 1: tugs up the zipper. But when the corpus callosum gets cut, 259 00:16:19,480 --> 00:16:23,120 Speaker 1: the two hemispheres aren't talking anymore, and you can get 260 00:16:23,160 --> 00:16:28,120 Speaker 1: a very wild and haunting clinical condition that's called alien 261 00:16:28,400 --> 00:16:31,920 Speaker 1: hand syndrome. So the two hands can act with totally 262 00:16:31,920 --> 00:16:36,960 Speaker 1: different intentions.
So a person with a severed corpus callosum 263 00:16:37,360 --> 00:16:39,920 Speaker 1: begins to zip up a jacket with one hand and 264 00:16:39,960 --> 00:16:43,840 Speaker 1: the other hand, the alien hand, suddenly grabs the zipper 265 00:16:43,920 --> 00:16:47,760 Speaker 1: and pulls it back down. Or the person might reach 266 00:16:47,840 --> 00:16:50,160 Speaker 1: for a piece of pizza with one hand and their 267 00:16:50,200 --> 00:16:52,920 Speaker 1: other hand leaps into action to slap the first hand. 268 00:16:53,640 --> 00:16:57,320 Speaker 1: The normal conflict running in the brain now comes to 269 00:16:57,360 --> 00:17:01,400 Speaker 1: the surface because the two hemispheres are now acting independently 270 00:17:01,440 --> 00:17:04,439 Speaker 1: of one another. They're not working things out under the 271 00:17:04,440 --> 00:17:09,399 Speaker 1: hood anymore. Alien hand syndrome typically fades in the weeks 272 00:17:09,440 --> 00:17:12,520 Speaker 1: after surgery, as the two halves of the brain take 273 00:17:12,520 --> 00:17:16,439 Speaker 1: advantage of any remaining connections to start coordinating again. But 274 00:17:16,520 --> 00:17:19,800 Speaker 1: what it reminds us is that even when we think 275 00:17:19,840 --> 00:17:24,119 Speaker 1: we're being single-minded, our actions result from these immense 276 00:17:24,400 --> 00:17:29,719 Speaker 1: battles that continually rise and fall in the darkness of 277 00:17:29,880 --> 00:17:33,879 Speaker 1: the cranium. Now, to appreciate some of the major competing 278 00:17:33,960 --> 00:17:38,680 Speaker 1: systems in your brain, consider a thought experiment from philosophy known as 279 00:17:38,760 --> 00:17:42,280 Speaker 1: the trolley dilemma. I spoke about this once about sixty 280 00:17:42,320 --> 00:17:45,199 Speaker 1: episodes ago, but if you haven't heard this one, it 281 00:17:45,320 --> 00:17:47,879 Speaker 1: makes an important point. So here's how it goes.
A 282 00:17:48,080 --> 00:17:51,240 Speaker 1: trolley is barreling down a train track and you are 283 00:17:51,280 --> 00:17:54,960 Speaker 1: a bystander, and you notice that there are four workers 284 00:17:55,000 --> 00:17:58,840 Speaker 1: who are doing repairs down the track, and you realize 285 00:17:58,880 --> 00:18:01,600 Speaker 1: that they are going to get hit and killed by 286 00:18:01,680 --> 00:18:04,840 Speaker 1: this trolley. Then you notice that there's a lever right 287 00:18:04,960 --> 00:18:07,960 Speaker 1: near you, and if you pull this lever, that'll divert 288 00:18:08,040 --> 00:18:11,600 Speaker 1: the trolley onto a parallel track. But wait a minute. 289 00:18:11,640 --> 00:18:14,480 Speaker 1: You realize that there's one worker on that track, and 290 00:18:14,520 --> 00:18:18,080 Speaker 1: if you pull the lever, that one worker will get killed. 291 00:18:18,119 --> 00:18:20,680 Speaker 1: So here's the question. Do you let four people get 292 00:18:20,760 --> 00:18:23,280 Speaker 1: killed, or do you pull the lever so that only 293 00:18:23,359 --> 00:18:26,080 Speaker 1: one person is killed? When people are asked what they 294 00:18:26,080 --> 00:18:29,480 Speaker 1: would do in this scenario, almost everyone pulls the lever. 295 00:18:29,680 --> 00:18:32,440 Speaker 1: After all, it's better that only one person gets killed 296 00:18:32,560 --> 00:18:37,400 Speaker 1: rather than four, right? But now consider a slightly different scenario. 297 00:18:38,119 --> 00:18:41,000 Speaker 1: The second scenario begins with the same premise: the 298 00:18:41,040 --> 00:18:44,040 Speaker 1: trolley is barreling down the track and four workers are 299 00:18:44,040 --> 00:18:46,520 Speaker 1: going to get killed.
But this time you're standing on 300 00:18:46,560 --> 00:18:50,040 Speaker 1: a bridge that goes over the tracks, and you notice 301 00:18:50,040 --> 00:18:53,240 Speaker 1: that there's a large man standing on the bridge watching 302 00:18:53,240 --> 00:18:56,959 Speaker 1: the birds. And you realize that if you push him off, 303 00:18:57,480 --> 00:19:00,840 Speaker 1: he'll land right on the track and his body will 304 00:19:00,920 --> 00:19:05,160 Speaker 1: be sufficient to stop the trolley and save the four workers. 305 00:19:05,720 --> 00:19:09,600 Speaker 1: So do you push him off? In this second scenario, 306 00:19:09,760 --> 00:19:12,000 Speaker 1: almost no one I talk with is willing to push 307 00:19:12,080 --> 00:19:14,760 Speaker 1: the man. Why not? Well, when I ask them, they 308 00:19:14,760 --> 00:19:16,840 Speaker 1: give answers like, that would be murder, and, that would 309 00:19:16,840 --> 00:19:20,480 Speaker 1: just be wrong. But wait, isn't it the same equation 310 00:19:20,800 --> 00:19:24,520 Speaker 1: in both cases? The question is, would you trade one 311 00:19:24,640 --> 00:19:27,960 Speaker 1: life for four? So why do the results come out 312 00:19:28,160 --> 00:19:32,879 Speaker 1: so differently in the second scenario? Ethicists have addressed this 313 00:19:32,960 --> 00:19:36,040 Speaker 1: problem from many angles, but brain imaging has been able 314 00:19:36,080 --> 00:19:40,600 Speaker 1: to provide a fairly straightforward answer. To the brain, the 315 00:19:40,640 --> 00:19:44,440 Speaker 1: first scenario, about pulling the lever, is just a math problem. 316 00:19:44,760 --> 00:19:49,679 Speaker 1: The dilemma activates regions that are involved in solving logical problems.
317 00:19:50,160 --> 00:19:53,520 Speaker 1: But in the second scenario, where you have to physically 318 00:19:53,560 --> 00:19:56,160 Speaker 1: interact with the man and push him to his death, 319 00:19:56,400 --> 00:20:00,560 Speaker 1: that recruits additional networks into the decision. Now you have 320 00:20:00,640 --> 00:20:04,959 Speaker 1: the involvement of brain regions involved in emotion. In this 321 00:20:05,160 --> 00:20:09,159 Speaker 1: second scenario, you're caught in a conflict between two systems 322 00:20:09,480 --> 00:20:12,840 Speaker 1: that have very different opinions. Your rational networks tell you 323 00:20:12,880 --> 00:20:16,159 Speaker 1: that one death is better than four, but your emotional 324 00:20:16,160 --> 00:20:21,439 Speaker 1: networks trigger this gut feeling that murdering a bystander is wrong. 325 00:20:21,840 --> 00:20:24,760 Speaker 1: You're now dealing with these competing drives, with the end 326 00:20:24,800 --> 00:20:29,040 Speaker 1: result that your decision is likely to change entirely from 327 00:20:29,400 --> 00:20:48,760 Speaker 1: the first scenario. Now, the trolley dilemma sheds light on 328 00:20:48,880 --> 00:20:53,480 Speaker 1: real-world situations. Just think of modern warfare, which has 329 00:20:53,520 --> 00:20:57,000 Speaker 1: become more like pulling the lever rather than pushing the 330 00:20:57,040 --> 00:21:00,240 Speaker 1: man off the bridge. When a person pushes 331 00:21:00,320 --> 00:21:03,640 Speaker 1: the button to launch a long-range missile, it involves 332 00:21:03,720 --> 00:21:08,680 Speaker 1: only the networks involved in logical problems. Operating a drone 333 00:21:09,200 --> 00:21:13,800 Speaker 1: can become like a video game. Cyber attacks wreak consequences at 334 00:21:13,880 --> 00:21:17,200 Speaker 1: a distance. So the rational networks are at work here, 335 00:21:17,680 --> 00:21:22,159 Speaker 1: but not necessarily the emotional networks.
The detached nature of 336 00:21:22,320 --> 00:21:27,760 Speaker 1: distance warfare reduces internal conflict, and it makes this easier 337 00:21:27,800 --> 00:21:30,240 Speaker 1: to wage. So in the nineteen sixties there was a 338 00:21:30,280 --> 00:21:34,560 Speaker 1: military thinker who suggested that the button to launch nuclear 339 00:21:34,600 --> 00:21:39,560 Speaker 1: missiles should be implanted in the chest of the president's 340 00:21:39,640 --> 00:21:44,880 Speaker 1: best friend. That way, if the president chooses to launch nukes, 341 00:21:45,359 --> 00:21:48,760 Speaker 1: he'd have to inflict physical violence on his friend first, 342 00:21:48,840 --> 00:21:52,000 Speaker 1: he'd have to tear him open, and that consideration would 343 00:21:52,080 --> 00:21:57,040 Speaker 1: recruit emotional networks into the decision. After all, the world 344 00:21:57,200 --> 00:21:59,960 Speaker 1: would not be better if we all behaved like 345 00:22:00,040 --> 00:22:05,200 Speaker 1: robots when making life and death decisions. Unchecked reason can 346 00:22:05,240 --> 00:22:09,679 Speaker 1: be dangerous. Our emotions are actually a powerful and often 347 00:22:09,840 --> 00:22:14,920 Speaker 1: insightful constituency, and we'd be remiss to exclude them from 348 00:22:14,920 --> 00:22:19,800 Speaker 1: the parliamentary voting. Although the neuroscience is new, this intuition 349 00:22:19,920 --> 00:22:23,760 Speaker 1: about having different drives has a long history. The ancient 350 00:22:23,880 --> 00:22:27,520 Speaker 1: Greeks suggested that we should think of our lives like 351 00:22:27,600 --> 00:22:31,400 Speaker 1: we are charioteers trying to hold on to two horses.
352 00:22:31,640 --> 00:22:34,400 Speaker 1: We've got the horse of reason and the horse of passion, 353 00:22:34,800 --> 00:22:38,560 Speaker 1: and each horse pulls slightly off center in opposite directions, 354 00:22:38,800 --> 00:22:42,440 Speaker 1: and your job is to keep control of both horses, 355 00:22:42,840 --> 00:22:46,439 Speaker 1: keeping yourself moving down the middle of the road. So 356 00:22:46,640 --> 00:22:49,920 Speaker 1: I've asserted that emotions are really important and we shouldn't 357 00:22:50,000 --> 00:22:52,440 Speaker 1: try to be like mister Spock. But I want to 358 00:22:52,480 --> 00:22:54,879 Speaker 1: double click on that claim, and the way we can 359 00:22:54,960 --> 00:22:58,080 Speaker 1: do that is by seeing what happens when a person 360 00:22:58,480 --> 00:23:03,639 Speaker 1: loses the capacity to include emotions in decision making. So 361 00:23:03,760 --> 00:23:07,040 Speaker 1: let's take someone like Tammy Myers, who I filmed in 362 00:23:07,080 --> 00:23:10,880 Speaker 1: my television show called The Brain. Tammy is a former 363 00:23:11,040 --> 00:23:14,000 Speaker 1: engineer and some years ago she got into a motorcycle 364 00:23:14,040 --> 00:23:17,840 Speaker 1: accident and the consequence was damage to her orbitofrontal cortex, 365 00:23:17,880 --> 00:23:21,600 Speaker 1: which is the region just above the orbits of the eyes. Now, 366 00:23:21,720 --> 00:23:26,640 Speaker 1: this is a critical region for integrating signals streaming in 367 00:23:26,680 --> 00:23:29,919 Speaker 1: from the body, signals that tell the rest of the 368 00:23:29,960 --> 00:23:34,400 Speaker 1: brain what state her body is in. Is she hungry? 369 00:23:34,560 --> 00:23:38,119 Speaker 1: Is she nervous? Is she excited? Is she embarrassed? Is 370 00:23:38,160 --> 00:23:42,359 Speaker 1: she thirsty? Is she joyful?
The interesting thing is 371 00:23:42,400 --> 00:23:45,639 Speaker 1: that Tammy doesn't look like someone who has suffered a 372 00:23:45,720 --> 00:23:48,680 Speaker 1: traumatic brain injury. But if you were to spend even 373 00:23:48,800 --> 00:23:52,080 Speaker 1: five minutes with her, you would detect there's a problem 374 00:23:52,480 --> 00:23:56,280 Speaker 1: with her ability to deal with decision making. So let's 375 00:23:56,280 --> 00:23:59,280 Speaker 1: say you say, hey, do you want the lemon drink or 376 00:23:59,359 --> 00:24:02,760 Speaker 1: the blueberry? And she can describe all the pros and 377 00:24:02,800 --> 00:24:05,399 Speaker 1: the cons: well, the lemon drink is a little more sour, 378 00:24:05,800 --> 00:24:08,800 Speaker 1: but the blueberry has more sugar, and so on. But 379 00:24:08,840 --> 00:24:14,600 Speaker 1: she can't actually decide. She can't finalize a choice. Even 380 00:24:14,640 --> 00:24:19,720 Speaker 1: the simplest situations leave her mired in indecision. Why? It's 381 00:24:19,760 --> 00:24:24,320 Speaker 1: because she can no longer read her body's emotional summaries, 382 00:24:24,960 --> 00:24:28,160 Speaker 1: and as a result, decisions become incredibly difficult for her. 383 00:24:28,440 --> 00:24:32,760 Speaker 1: In other words, because of her brain injury, no particular 384 00:24:32,880 --> 00:24:37,000 Speaker 1: choice is now tangibly different from any other choice, and 385 00:24:37,040 --> 00:24:40,800 Speaker 1: without decision making, very little gets done. Tammy reports she 386 00:24:40,880 --> 00:24:44,920 Speaker 1: often spends all day on the sofa. So her brain 387 00:24:44,960 --> 00:24:48,879 Speaker 1: injury tells us something crucial about decision making.
It's easy 388 00:24:49,200 --> 00:24:52,560 Speaker 1: to think about the brain commanding the body from on high, 389 00:24:52,560 --> 00:24:55,879 Speaker 1: but in fact, the brain is in constant feedback with 390 00:24:56,040 --> 00:24:59,600 Speaker 1: the body. The physical signals from the body give a 391 00:24:59,680 --> 00:25:02,600 Speaker 1: quick summary of what's going on and what to do 392 00:25:02,680 --> 00:25:06,560 Speaker 1: about it. To land on a choice, the body and 393 00:25:06,640 --> 00:25:10,240 Speaker 1: the brain have to be in close communication, and for 394 00:25:10,320 --> 00:25:12,840 Speaker 1: that you need the orbitofrontal cortex. Now you can, 395 00:25:12,880 --> 00:25:15,760 Speaker 1: of course, understand this in your own life. Consider a 396 00:25:15,920 --> 00:25:19,680 Speaker 1: situation like this. You get a package misdelivered to your door, 397 00:25:19,720 --> 00:25:22,040 Speaker 1: but it belongs to your next door neighbors. So you 398 00:25:22,160 --> 00:25:24,679 Speaker 1: go over to deliver it to them. But as you 399 00:25:24,720 --> 00:25:29,200 Speaker 1: approach their gate, their dog growls and bares its teeth. 400 00:25:29,240 --> 00:25:32,359 Speaker 1: So do you open the gate and press on to 401 00:25:32,440 --> 00:25:37,800 Speaker 1: their front door? Your knowledge of the statistics of dog attacks, 402 00:25:37,840 --> 00:25:42,560 Speaker 1: that's not the deciding factor here. Instead, the dog's threatening 403 00:25:42,680 --> 00:25:47,520 Speaker 1: posture triggers a set of physiologic responses in your body. 404 00:25:47,840 --> 00:25:50,400 Speaker 1: You get an increased heart rate, you get a tightening 405 00:25:50,400 --> 00:25:53,320 Speaker 1: of the gut, a tensing of the muscles, you get 406 00:25:53,359 --> 00:25:57,840 Speaker 1: pupil dilation, changes in blood hormones, opening of sweat glands, 407 00:25:57,840 --> 00:26:02,000 Speaker 1: and so on.
And all these responses are automatic and unconscious. 408 00:26:02,560 --> 00:26:05,119 Speaker 1: So in this moment, standing there with your hand on 409 00:26:05,240 --> 00:26:08,920 Speaker 1: their fence latch, there are a lot of external details 410 00:26:08,960 --> 00:26:12,760 Speaker 1: you could assess, like what's the color difference between the 411 00:26:12,800 --> 00:26:15,399 Speaker 1: dog's ears and nose? But your brain doesn't care about that. 412 00:26:15,440 --> 00:26:18,080 Speaker 1: What your brain really needs to know right now is 413 00:26:18,160 --> 00:26:22,080 Speaker 1: whether you should face the dog or deliver the package 414 00:26:22,080 --> 00:26:26,199 Speaker 1: another way. The state of your body helps you in 415 00:26:26,240 --> 00:26:29,760 Speaker 1: this task. It serves as a summary of the situation. 416 00:26:30,440 --> 00:26:33,680 Speaker 1: Your physiological signature can be thought of as a low 417 00:26:33,720 --> 00:26:38,600 Speaker 1: resolution headline like this is bad or this is no problem, 418 00:26:38,920 --> 00:26:43,080 Speaker 1: and that helps your brain decide what to do next. 419 00:26:43,640 --> 00:26:47,920 Speaker 1: Most situations involve too many details to compute a purely 420 00:26:48,080 --> 00:26:53,560 Speaker 1: logical decision. To guide the process, we need these abridged summaries: 421 00:26:53,640 --> 00:26:56,560 Speaker 1: I am safe here, I am in danger here. The 422 00:26:56,560 --> 00:27:00,840 Speaker 1: physiological state of the body maintains this instant two way 423 00:27:01,000 --> 00:27:05,000 Speaker 1: dialogue with the brain, and every day we read the 424 00:27:05,119 --> 00:27:08,840 Speaker 1: states of our bodies like this. But in most situations, 425 00:27:08,840 --> 00:27:12,320 Speaker 1: our physiologic signals are more subtle and so we aren't 426 00:27:12,359 --> 00:27:16,639 Speaker 1: aware of them.
But these signals are crucial to steering 427 00:27:16,680 --> 00:27:19,280 Speaker 1: the decisions we have to make. So consider being in 428 00:27:19,280 --> 00:27:22,159 Speaker 1: a restaurant. This is the kind of place which leaves 429 00:27:22,520 --> 00:27:27,680 Speaker 1: Tammy paralyzed with indecision. Do I want an appetizer? Which one? 430 00:27:27,920 --> 00:27:31,200 Speaker 1: What kind of salad dressing? Which dessert should I order? 431 00:27:31,800 --> 00:27:35,639 Speaker 1: Thousands of choices bear down on the restaurant goer, with 432 00:27:35,760 --> 00:27:38,959 Speaker 1: the end result that we spend hours of our lives 433 00:27:39,040 --> 00:27:42,520 Speaker 1: staring at menus, making our neural networks commit to one 434 00:27:42,560 --> 00:27:46,119 Speaker 1: decision over another. And although we don't commonly realize it, 435 00:27:46,520 --> 00:27:51,800 Speaker 1: our body helps us to navigate this boggling complexity. Take 436 00:27:51,840 --> 00:27:54,440 Speaker 1: the choice of which Italian dish you're going to order. 437 00:27:54,480 --> 00:27:57,320 Speaker 1: There's too much data for you to grapple with. You 438 00:27:57,359 --> 00:28:00,680 Speaker 1: can think about calories or price, or salt content, taste, 439 00:28:00,720 --> 00:28:02,920 Speaker 1: or whatever. There's lots of data to draw from here. 440 00:28:03,240 --> 00:28:06,080 Speaker 1: But if you were a robot, you'd be stuck here 441 00:28:06,119 --> 00:28:08,960 Speaker 1: all day trying to make your decision with no obvious 442 00:28:09,000 --> 00:28:13,240 Speaker 1: way to trade off which details matter more. To land 443 00:28:13,280 --> 00:28:15,680 Speaker 1: on a choice, you need a summary of some sort, 444 00:28:15,760 --> 00:28:18,479 Speaker 1: and that's what the feedback from your body gives you.
445 00:28:18,640 --> 00:28:22,800 Speaker 1: For example, thinking about your budget might make your palm sweat, 446 00:28:22,920 --> 00:28:26,080 Speaker 1: or you might salivate thinking about the last time you 447 00:28:26,200 --> 00:28:30,240 Speaker 1: consumed the angel hair pasta, or thinking about the excessive 448 00:28:30,320 --> 00:28:34,560 Speaker 1: creaminess of the alfredo might put a cramp in your intestines. 449 00:28:34,920 --> 00:28:38,680 Speaker 1: You simulate your experience with one pasta, and then you 450 00:28:38,720 --> 00:28:42,240 Speaker 1: simulate the next and the next, and your bodily experience 451 00:28:42,560 --> 00:28:46,520 Speaker 1: helps your brain to quickly place a value on the 452 00:28:46,560 --> 00:28:49,479 Speaker 1: first pasta offering, and another on the second pasta, and 453 00:28:49,520 --> 00:28:51,640 Speaker 1: on the third and so on, and this is what 454 00:28:51,800 --> 00:28:55,480 Speaker 1: allows you to tip the balance in one direction or another. 455 00:28:55,880 --> 00:28:59,520 Speaker 1: You don't just extract the data from the pasta descriptions, 456 00:28:59,800 --> 00:29:04,440 Speaker 1: you also feel the data. These emotional signatures are more 457 00:29:04,760 --> 00:29:08,080 Speaker 1: subtle than the ones related to facing down a barking dog, 458 00:29:08,160 --> 00:29:11,280 Speaker 1: but the idea is the same. Every choice that you 459 00:29:11,360 --> 00:29:15,800 Speaker 1: face is marked by a bodily signature, and that helps 460 00:29:15,840 --> 00:29:20,240 Speaker 1: you decide. Earlier, when I was deciding between the lemon 461 00:29:20,320 --> 00:29:23,360 Speaker 1: and the blueberry exercise drink, I told you that there 462 00:29:23,440 --> 00:29:27,720 Speaker 1: was a battle between networks.
The physiological states from my 463 00:29:27,920 --> 00:29:30,880 Speaker 1: body are the key things that help tip that battle, 464 00:29:30,880 --> 00:29:35,080 Speaker 1: that allow one network to win over another. But for Tammy, 465 00:29:35,200 --> 00:29:38,840 Speaker 1: because of her brain damage, she can't integrate her bodily 466 00:29:38,920 --> 00:29:42,480 Speaker 1: signals into her decision making, so she has no way 467 00:29:42,800 --> 00:29:46,800 Speaker 1: to rapidly compare the overall value between options. She has 468 00:29:46,840 --> 00:29:50,840 Speaker 1: no way to prioritize the dozens of details that she 469 00:29:50,840 --> 00:29:54,800 Speaker 1: can articulate. That's why Tammy stays on the sofa so 470 00:29:54,920 --> 00:29:57,520 Speaker 1: much of the time. None of the choices in front 471 00:29:57,520 --> 00:30:02,480 Speaker 1: of her carry any particular emotional value. There's no way 472 00:30:02,560 --> 00:30:07,280 Speaker 1: to tip one network's campaign over any other, so the 473 00:30:07,360 --> 00:30:12,600 Speaker 1: debates in her neural parliament continue along in deadlock. Because 474 00:30:12,680 --> 00:30:15,760 Speaker 1: the conscious mind has low bandwidth, you don't typically have 475 00:30:15,840 --> 00:30:19,280 Speaker 1: full access to the bodily signals that tip your decisions. 476 00:30:19,680 --> 00:30:24,040 Speaker 1: Most of the action lives far below awareness. Nonetheless, these 477 00:30:24,040 --> 00:30:27,960 Speaker 1: signals can have far reaching consequences on the type of 478 00:30:28,040 --> 00:30:31,920 Speaker 1: person you are and who you'll become.
So what we've 479 00:30:31,920 --> 00:30:35,040 Speaker 1: been introduced to today is this incredible thing that brains 480 00:30:35,120 --> 00:30:40,280 Speaker 1: do called decision making, where neurons consider options and things 481 00:30:40,320 --> 00:30:43,400 Speaker 1: are set up so that one coalition can smash everything 482 00:30:43,400 --> 00:30:47,000 Speaker 1: else down so you get to one outcome. And even 483 00:30:47,040 --> 00:30:51,080 Speaker 1: though we typically don't give it much consideration, decision making 484 00:30:51,120 --> 00:30:54,680 Speaker 1: lies at the heart of everything we do during a day. 485 00:30:55,240 --> 00:30:59,360 Speaker 1: Without the ability to consider alternatives and weigh them and 486 00:30:59,440 --> 00:31:02,960 Speaker 1: select one over others, the complexity of the world would 487 00:31:03,000 --> 00:31:06,400 Speaker 1: just paralyze us. And one of the lessons that surfaced 488 00:31:06,520 --> 00:31:10,080 Speaker 1: is that although you feel like you have a single identity, 489 00:31:10,600 --> 00:31:13,600 Speaker 1: you're not of a single mind. Instead, you're a collection 490 00:31:13,760 --> 00:31:17,320 Speaker 1: of many competing neural networks. You are, in a sense, 491 00:31:17,760 --> 00:31:21,400 Speaker 1: a machine built of conflict. So what we covered is 492 00:31:21,400 --> 00:31:25,520 Speaker 1: that every decision you make involves your past experiences stored 493 00:31:25,560 --> 00:31:28,160 Speaker 1: in the states of your body, as well as an 494 00:31:28,160 --> 00:31:31,920 Speaker 1: analysis of your present situation. Do I have enough money 495 00:31:31,920 --> 00:31:35,160 Speaker 1: to buy X instead of Y? Is option Z available?
496 00:31:35,560 --> 00:31:38,360 Speaker 1: But there's one more part to the story of decisions, 497 00:31:38,400 --> 00:31:41,680 Speaker 1: perhaps the most important part of all, and that is 498 00:31:42,320 --> 00:31:46,400 Speaker 1: predictions about the future. And that is where we're going 499 00:31:46,440 --> 00:31:49,760 Speaker 1: to pick up next week. Until then, keep making good 500 00:31:49,800 --> 00:31:55,320 Speaker 1: decisions and I'll see you next time. Go to eagleman 501 00:31:55,360 --> 00:31:58,360 Speaker 1: dot com slash podcast for more information and to find 502 00:31:58,440 --> 00:32:01,920 Speaker 1: further reading, and send me an email at podcast at 503 00:32:01,920 --> 00:32:05,080 Speaker 1: eagleman dot com with questions or discussions, and check out 504 00:32:05,080 --> 00:32:07,840 Speaker 1: and subscribe to Inner Cosmos on YouTube for videos of 505 00:32:07,880 --> 00:32:11,240 Speaker 1: each episode and to leave comments. Until next time, I'm 506 00:32:11,320 --> 00:32:13,960 Speaker 1: David Eagleman, and you made the very nice decision to 507 00:32:14,080 --> 00:32:16,440 Speaker 1: join me here in the Inner Cosmos.