Hi, this is David Eagleman. I've done a couple dozen of these podcast episodes so far, and I have been so gratified to see all the emails I get at podcasts at eagleman dot com where people have comments or questions. So what I want to do in this short episode today is just address a bunch of questions.

Okay, Laura from Miami asks: is it true that we only use ten percent of our brain? Thank you for that question. It turns out that is not true. It turns out your brain is screaming with activity around the clock, all the time. All the neurons in your brain are active. They're all popping off, even when you're asleep, even when you're in a deep sleep. So it turns out you do not use just ten percent of your brain. You use one hundred percent of your brain, essentially all the time. Now, the question that's been interesting to me is why it's such a sticky myth even though it has no basis in the science. Why does it stick around?
I think it's because it must be sort of aspirational or hopeful: if we think we're all using only ten percent of our brains, then we can think, wow, I could be much smarter than I am and do much more than I can do right now. But in fact it's a myth. Thank you for the question.

I did an episode a little while ago about in-groups and out-groups and the difference that we can measure in the brain, and Harun from Berkeley wrote and asked: what if your brain shows a big difference between in-groups and out-groups, but you don't want that to be true, like with your behavior? So that's a great question, Harun. Just as a reminder, what we did in the study is we had people watch, let's say, a hand get stabbed with a syringe needle. They see this piece of footage and it activates the pain network in the brain, and that is essentially a measure of empathy. But what we then did is we added a single word to each of six hands. So we had Christian, Jewish, Muslim, Scientologist, Hindu, atheist.
And now you see a hand get stabbed, and the question is: which group do you belong to, and do you have as much of an empathic response when you see a member of your out-group get stabbed? And generally, what we found is that you have a lower empathic response when the label is different, when it's not your in-group. So I think the heart of Harun's question is: if you have that big difference in your empathic response, does it mean you're a bad person? Does it have some reflection on how you're going to act? Let me say two things about this. First, it turns out that there are individual differences in this in-group/out-group effect. Everyone seems to have this reaction of caring more about their in-group, but some show a bigger effect than others. But as far as we can tell, this does not map onto people's behavior. In other words, we can't help what our low-level reactions are, what our first response is when we see something and there's a label.
For example, whenever I see crunchy tortilla chips, my brain fires on all cylinders and tells me to grab those and stick them in my mouth so I can enjoy the crunch. But it doesn't mean that I actually do that. It doesn't mean I actually grab the chips. I have goals at different levels, like I've cut carbs out of my diet because I think that makes me healthier. And this is related to an even longer-term issue, that for reasons of vanity I want to be in excellent shape, and the chips would sabotage that. So despite this giant low-level reaction from my brain when I see the chips, I have the capacity to override that with different layers of decision making. I am not a slave to my first low-level responses. And it's the same thing with these in-group and out-group responses. The effect that people show, that's the first, low-level thing that exposes how parts of their brain think. And it's massively important, because it's different if there's no part of your brain that thinks that way at all. But when you have this first reaction, it has to be either implemented or overridden.
In other words, if I didn't like chips at all, it would be a lot less effort to override that, and we wouldn't even be talking about this. So when it comes to a low-level response of preferring your in-group, no problem; that's just the first offer that your brain is making. But most of us care about societal structure and fairness and equality, and so we override that. And in fact, I have a suspicion, with no data to prove this, it's just a suspicion, but I think that perhaps some of the most socially conscious people, the people who fight against religionism or racism or whatever, at least some of those people are battling some of their own quite powerful internal demons. And that's fine, because in the final analysis, actions speak louder than words. Your behavior in the world is the only thing that counts. And I'll just mention one more thing.
This is sort of a long answer, but something that's been discussed among the news outlets in the last several years is something called the Implicit Association Test, and there's a whole flurry of papers about this. You see things on the screen and you have to react by hitting a button, whether it's a good or a bad thing. So maybe it's a particular religion, or a particular race, or a particular sexual orientation, or whatever. And the question is: is your reaction time a little bit slowed down when you're seeing some other group, and could that mean that you implicitly have some sort of negative association with that group? Now, the question in the media has been: could this someday be used in a courtroom? If, let's say, an employer fires somebody, could you use the Implicit Association Test to demonstrate, hey, we think this guy is a racist and that's why he did the firing?
From a neuroscience point of view, I think it would be ridiculous to allow that into a courtroom, because someone might have a racist or sexist or religious response at an unconscious, low level in their brain, and that tells you nothing about how they're going to act in the world, or whether the firing of that employee had anything to do with that low-level brain response. As I said, many people actually act better in the world because they're embarrassed by their internal first reaction. It would be like taking me to court and concluding I must be the one who stole the bag of chips because I have the biggest neural response to the sight of chips. It doesn't tell you anything about my final behavior, my final decision in a situation. The only thing that can give you a good indication of someone's behavior is their prior behavior, not their brain's knee-jerk response. Thanks for that question.

Okay, Jordan from LA asked a question about my episode on dreams. One of the things I was talking about there is that when somebody goes blind, their visual cortex gets taken over by neighboring territories.
And so my hypothesis is that the reason we dream is that dreaming acts like a screen saver that protects the visual cortex from takeover during the nighttime. So Jordan's question is: what about people who go blind later in life? Well, there's actually a lot of data on this sort of thing. People who become blind after the age of seven have more visual content in their dreams than those who become blind earlier, and that is consistent with the fact that in people who become blind later, the occipital lobe (this part back here, your visual cortex) gets less fully taken over by the other senses, and so the activity that you have is experienced more visually. In other words, if you go blind later, your visual cortex is more intact and less likely to get taken over, and so you have more visual dreams.

Okay, this next question is from Sampridi in Boston: why are we so bad at remembering what happened during a dream? Well, this is because during dreaming you've got other brain areas, like the hippocampus and the prefrontal cortex, that are essentially shut down.
They 147 00:08:46,640 --> 00:08:49,680 Speaker 1: are less active during dream sleep than they are during 148 00:08:49,720 --> 00:08:53,360 Speaker 1: the waking state, and presumably this is what accounts for 149 00:08:53,600 --> 00:08:59,360 Speaker 1: our difficulty remembering our dreams. Now, why does your brain 150 00:08:59,640 --> 00:09:02,280 Speaker 1: shut down these areas so that dreams don't get remembered. 151 00:09:02,720 --> 00:09:06,520 Speaker 1: One possibility is that there's no need to write down 152 00:09:06,600 --> 00:09:10,000 Speaker 1: memory if the central purpose of dream sleep is just 153 00:09:10,040 --> 00:09:14,760 Speaker 1: to keep the visual cortex actively fighting off its neighbors. 154 00:09:15,320 --> 00:09:17,320 Speaker 1: And this is why when you wake up, you say, 155 00:09:17,360 --> 00:09:19,720 Speaker 1: oh my gosh, I just experienced this wild dream, and 156 00:09:19,760 --> 00:09:21,880 Speaker 1: you tell it somebody. You try to do it, but 157 00:09:22,160 --> 00:09:26,360 Speaker 1: it's like gossamer, it just goes away and you can't 158 00:09:26,400 --> 00:09:30,440 Speaker 1: remember what the dream was after just fifteen minutes. And 159 00:09:30,480 --> 00:09:34,240 Speaker 1: by the way, I think this is similar to what 160 00:09:34,360 --> 00:09:38,439 Speaker 1: it is like to have Alzheimer's disease, where you say, okay, 161 00:09:38,760 --> 00:09:42,079 Speaker 1: i'll be there in five minutes, I'll meet you downstairs, 162 00:09:42,200 --> 00:09:44,520 Speaker 1: and then it just sort of goes away and you 163 00:09:44,559 --> 00:09:46,800 Speaker 1: can't remember it. 
I think, in a sense, we all know what it is like to be a person with Alzheimer's, because we have that every morning when we wake up. And again, one possibility for why memory does not get written down during dream sleep is simply that there's no need to: the dream is not supposed to be cataloged for the future, if the real purpose of dream sleep is just to keep the visual cortex defended against takeover at nighttime.

Keep sending in your questions to podcasts at eagleman dot com, and listen to the full episodes of Inner Cosmos wherever you listen to your podcasts, and also on YouTube, where you can leave questions or comments. Until next time, I'm David Eagleman, and this is the Inner Cosmos inbox.