Speaker 1: Hey, this is David Eagleman on the Inner Cosmos inbox, and I'm going to answer your questions about the brain. Okay, Christina from Minnesota asks: what do I think about the claims of the government hiding evidence of aliens? Okay, we don't really know, but I'll make a general statement from the science point of view and one from the neuroscience point of view. From the science point of view, it certainly seems unlikely that we're the only intelligent civilization in the cosmos, given that there are at least one hundred billion other galaxies, every galaxy contains about one hundred billion stars, and any number of those may have planets revolving around them. So the number of planets in the so-called Goldilocks zone, where it's not too hot, not too cold, and a planet could theoretically sustain life, is a mind-bogglingly high number. So there's no really good reason to think that we're all alone in the universe.

Speaker 1: So I would say that whenever we have interesting data, we should all be examining it, and therefore, if the government has data, we should be putting that immediately in front of the scientific community. On the other hand, I want to emphasize a general neuroscience point, which is that just because somebody asserts something to be true doesn't necessitate that it is true. There are dozens of ways that people can be deluded or fooled, or can pursue opportunities for attention, or can misunderstand what they're seeing, even with the best intentions. And in the media coverage around this, I often hear things like, oh, this guy is a decorated soldier, and this is being taken seriously by elected officials. Let me just make a general statement. I hope this is not offensive to anybody, but we all have the same brains. And whether you are a decorated soldier or an elected public official, when you go home and you're lying in your bed and you close your eyes and you're thinking about life and insecurities and your own death and so on, you're no different than anyone else.
Speaker 1: All the rest, the resume and the titles and so on, that's all just decorations, and it doesn't mean anything. And so I hope I'm not being offensive when I say that I have met elected officials, and I'm not necessarily more impressed by their intelligence and insight than I am with other people I meet. One does not have to be a genius to get elected. One just needs to want it badly enough. So the fact that a few people with titles assert something to be true has little to no bearing on whether it is actually true. So my general take on this whole thing is that the data is totally insufficient right now for us to say that we have found alien life. And as far as I can tell, there are a lot of claims about the existence of clear data, but not a thing has been demonstrated yet or unveiled to the public. So we are in a position of having only unclear data and a lot of claims, and in a court of law that would translate to little or nothing.

Speaker 1: Let me just add one more thing. Some people make the false assertion that scientists tend to be closed-minded about these alien claims, but I think you would see that scientists are the most willing to dive into data and completely change their minds in an afternoon, faster than you would believe, if they see something that counts as meaningful data. A lot of people stick with their views just as a matter of political positioning, but the scientific mindset is one that is always perfectly happy to change stance, even one hundred and eighty degrees, if the data support it. So the issue right now is that we're waiting on the claimed existence of some data, and if the wait goes on long enough, eventually it's going to start to look less and less likely that the data is actually there. Claims are not sufficient to convince; data is. Thanks for that question.
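To make the scale quoted at the top of this answer concrete, here is a minimal back-of-the-envelope sketch using the episode's rough figures of one hundred billion galaxies with about one hundred billion stars each. The Goldilocks-zone fraction is an assumed placeholder chosen only to illustrate scale; the episode gives no such figure.

```python
# Back-of-the-envelope estimate of potentially habitable planets,
# using the rough figures quoted in the episode. The fraction of
# stars hosting a Goldilocks-zone planet is a placeholder assumption,
# not a measured value.

GALAXIES = 100e9              # ~10^11 galaxies (from the episode)
STARS_PER_GALAXY = 100e9      # ~10^11 stars per galaxy (from the episode)
GOLDILOCKS_FRACTION = 0.01    # assumed: 1 in 100 stars has a habitable-zone planet

total_stars = GALAXIES * STARS_PER_GALAXY
habitable_planets = total_stars * GOLDILOCKS_FRACTION

print(f"Total stars:       {total_stars:.0e}")        # ~1e+22
print(f"Habitable planets: {habitable_planets:.0e}")  # ~1e+20 under this assumption
```

Even if the assumed fraction were a thousand times smaller, the count would remain astronomically large, which is the point being made.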
Speaker 1: Nathan from Colorado asks: how does being a neuroscientist affect your experience of the world? Does knowing about the mechanisms of perception diminish the pleasure of your conscious experience? Terrific question. First of all, one of the weird parts about life is that we can't run a control experiment to know what life would be like from inside a different head. So it's hard to know how my experiences would be different if my trajectory through life hadn't been exactly what it was. But generally I would answer your question by saying my knowledge of neuroscience affects my own sense of experience zero, not at all. So I should unpack that, which is to say, you know, let's imagine that you love mint chocolate chip ice cream. I could write a whole book on the taste receptors on your tongue, and the signals they send to your gustatory cortex, and where the signals go in your brain, and how that leads to the release of dopamine and serotonin and blah blah blah. The question is, if you read the book, would you enjoy the eating of mint chocolate chip ice cream any more or any less? And my assertion is that the information might be fascinating to you, but it wouldn't change your experience at all. Why? Because the information exists at one level, that of molecules, neurons, and so on, and the experience of the ice cream exists at the level of your conscious experience, and never the twain shall meet. Knowing everything at one level has no influence at another level. So the fact that I'm a neuroscientist doesn't really change much of anything about my experience in the world.

Speaker 1: Agga from Turkey asks: what if we have no free will? Does that depress you? I've had so many emails about free will that I'm going to do an episode on that very soon. But let me just summarize what I think is an interesting concept here, which is that what all the experiments have shown about free will is that it's problematic to trust our intuitions about the freedom of our choices. At the moment, we don't have perfect experiments to entirely rule free will in or out.
Speaker 1: It's a very complex topic, and one that our science might simply be too young to nail down thoroughly. But let's entertain for a moment the prospect that there really is no free will. The brain is just a big, giant, wet machine, and when you arrive at that fork in the road, your choice is predetermined every time. If I reran history a thousand times, you'd take the same turn every single time. Now, on the face of it, a life that's predictable doesn't sound like a life worth living. The good news is that the brain's unbelievable complexity means that in actuality, nothing is predictable.

Speaker 1: So in my television show The Brain, I set up a tank with rows of ping pong balls along the bottom, and each one was delicately poised on its own mousetrap, which was sprung and ready to go. When I dropped in one more ping pong ball from the top, it's pretty straightforward to mathematically predict where that ball will land. But as soon as that ball hits the bottom, it sets off an unpredictable chain reaction. It causes other balls to be flung from their mousetraps, and those balls trigger yet other balls, and the situation quickly explodes in complexity. Any error in the initial prediction, no matter how small it is, becomes magnified as balls collide and bounce off the sides and land on other balls on mousetraps. It takes just a moment before it's completely impossible to make any kind of forecast about where the balls are going to be and how things are going to end up. Now, the thing to note about the ping pong balls on the mousetraps is that they follow totally basic physical rules that you learn in high school physics, but where they end up is totally impossible to predict in practice. By analogy, your brain is comprised of billions of brain cells and trillions of signals interacting every second of your life, and so even though it's a physical system, we could never predict precisely what is going to happen next.
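The error magnification described with the ping pong tank is the signature of a chaotic system: a starting difference too small to measure grows until forecasting fails. The sketch below is not a simulation of the tank; it uses the logistic map, a textbook chaotic system, purely to illustrate how quickly a tiny initial error explodes.

```python
# Minimal illustration of sensitive dependence on initial conditions,
# using the logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
# This is a stand-in for the ping pong tank, not a simulation of it.

def logistic_trajectory(x0: float, steps: int, r: float = 4.0) -> list[float]:
    """Iterate the logistic map from x0 for the given number of steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories whose starting points differ by one part in a billion.
a = logistic_trajectory(0.400000000, steps=50)
b = logistic_trajectory(0.400000001, steps=50)

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.2e}")

# The gap grows roughly exponentially; within a few dozen steps the two
# trajectories are unrelated, even though the rule is simple and exact.
```

After a few dozen iterations the two runs are effectively uncorrelated despite a fully deterministic update rule. That is the combination the episode describes: basic rules, unpredictable outcomes.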
Speaker 1: So our brains are like the ping pong ball tank, but massively more complex. You can fit a few hundred ping pong balls in the tank, but your skull houses trillions of times more interactions than the tank, and it goes on screaming with activity every second of your lifetime. And from those innumerable exchanges of energy, your thoughts, your feelings, your decisions emerge. And that's only the beginning of the unpredictability. Every individual brain, yours and mine and everyone else's, is embedded in a world of other brains. So when you're sitting across a dinner table from people, or you're sitting in a lecture hall, or you have the reach of the internet, you can contact all the human neurons on the planet. All the neurons are influencing one another, which creates a system of unimaginable complexity. And this means that even though neurons follow straightforward physical rules, in practice it's always going to be impossible to predict exactly what any individual is going to do next, much less months from now or a year from now. The titanic complexity leaves us with just enough insight to understand a simple fact, which is that our lives are steered by forces far beyond our capacity for any awareness or control. And that's why I'm not too worried if we do run like a giant machine and don't have free will, because it is truly a colossal machine, one whose size alone makes it totally unpredictable, and whose embedding among other machines makes it totally unpredictable.

Speaker 1: Keep sending in your questions to podcasts at Eagleman dot com, and listen to the full episodes of Inner Cosmos wherever you listen to your podcasts, and also on YouTube, where you can leave questions or comments. Until next time, I'm David Eagleman, and this is the Inner Cosmos inbox.