Speaker 1: And my grandfather was born in... I had a proper Victorian for a grandfather. So my grandfather was born in eighteen seventy-nine. What? Yeah, he was fifty years old when he had my father, and my father was getting close to fifty when he had me. That's bananas. It was really interesting being taught to grow tomatoes by a ninety-seven-year-old grandpa. And my mother was always making jokes about, are you sure girls are allowed to grow tomatoes? Hello, I'm Minnie Driver. Welcome to Minnie Questions, Season Two. I've always loved Proust's questionnaire. It was originally a nineteenth-century parlor game where players would ask each other thirty-five questions aimed at revealing the other player's true nature. It's just the scientific method, really. In asking different people the same set of questions, you can make observations about which truths appear to be universal.
Speaker 1: I love this discipline, and it made me wonder: what if these questions were just the jumping-off point? What greater depths would be revealed if I asked these questions as conversation starters with thought leaders and trailblazers across all these different disciplines? So I adapted Proust's questionnaire and wrote my own seven questions that I personally think are pertinent to a person's story. They are: When and where were you happiest? What is the quality you like least about yourself? What relationship, real or fictionalized, defines love for you? What question would you most like answered? What person, place, or experience has shaped you the most? What would be your last meal? And can you tell me something in your life that's grown out of a personal disaster? And I've gathered a group of really remarkable people, ones that I am honored and humbled to have had the chance to engage with. You may not hear their answers to all seven of these questions. We've whittled it down to which question felt closest to their experience, or the most surprising, or created the most fertile ground to connect.
Speaker 1: My guest today is the neuroscientist and author David Eagleman. I'm not sure I've ever had a really long conversation with a polymath before, but you sure don't forget it when it's over, because, you know, you keep waking up thinking about things they said. And it spans everything from quantum physics to spirituality to philosophy to neurolaw and science to coffee at IHOP. David writes about the brain and how it constructs our perception, and how different brains do so differently, and how much that matters for society. He is, among many other things, the executive director of the Center for Science and Law, which is a nonprofit that sets out to improve the legal system by importing our knowledge about the human brain, which then gives options for rehabilitation beyond mass incarceration. He's written tons of books, and you will read and be astonished by all of them. He said something that I think about a lot. He said he was interested in exploring the vastness of our ignorance.
Speaker 1: And he said it with this kind of interest and excitement, and without judgment of our human brains, which he also described, and I'm paraphrasing, as the three-pound meat machine that lives in the dark and basically runs everything. I hope you enjoy listening to David's brain. What relationship, real or fictionalized, defines love for you? I think there are two ways to think about that. One is how we feel about the love for our children. So if you think about Cormac McCarthy's The Road, for example, you know, it's a post-apocalyptic world after nuclear war, and this father does everything in his power just to save his son, just to do everything he can to protect his son and keep him alive. And of course there are many stories like this. Life Is Beautiful, Roberto Benigni. Am I getting the title right of that one? Yeah. When he puts him in the bin at the end. Yeah, absolutely. Yeah, he's in a concentration camp with his son and he does everything he can to protect his son. So that's a kind of love that, now that I'm a father, is very meaningful to me.
Speaker 1: As for romantic love, I think that would change each year of my life for me, which is to say, when we're younger, we all take these great romance fantasies to represent love, and as you get older, more realism seeps in. So I've been married for twelve years now, and I appreciate now the way that real relationships are complex and people change over different time scales, as life changes and as career opportunities change. And I just recently rewatched Fiddler on the Roof, you know, and he asks his wife, you know, do you love me? And she says, do I what? And so, you know, they realize that the romantic notions can't capture it. But other notions, you know, what they do for each other, how they demonstrate their bond to one another, this does capture something important. Do you think that the romantic love, the faith-based love, service-based love, the love of our children, do you think that it's erroneous that we have the same word for it, because there are myriad ways to love, right? Yeah.
Speaker 1: Raymond Carver has a short story called What We Talk About When We Talk About Love, and I've always, it's a great short story, but I've always loved the title as well, because it's about the complexity of it. And like so many words in our language, there's just too much semantic weight that that word is trying to hold, because in fact it is composed of many different things. Yeah, exactly. It's composed of so many things, and which of those things matters to us, that changes through the years. You know, my father, before he died, was in one of these care homes. So I met some of the other people on his hallway, much older men and women who were there, and I think they still cared about love, but it was something so different for them. It wasn't about the, you know, sort of young sexy thing. It was about something else. Yeah, definitely.
Speaker 1: It's really, I'm fascinated by the different permutations of it, of devotion, and of the way in which people love differently, when the love that they have with their version of God, or with animals, or with something that they do, how they love romantically changes directly depending on their relationship with that sort of outsourced love, as it were, the love that we don't really talk about. I always think of love as romantic. I think lots of people do. But yeah, that's right. And the other thing about love is, you know, obviously it goes in two directions. So we all want to get love. We want to receive love in various ways, including from our dogs and so on. But we also want to be good to the people that we care about and love them well. And it's hard, right, because we're made up of all these different neural networks that all have different drives and care about different things.
Speaker 1: And so sometimes you're feeling angry, and sometimes you're feeling distracted by work, and sometimes you're feeling whatever, and so, you know, we're constantly finding ourselves in situations where we don't behave the way that we would like to. One of the books that I'm writing right now has to do with something called the Ulysses contract, which is how you can make a deal with yourself in time to constrain your behavior, by doing something right now that essentially puts you in a contract so that you'll behave better in some future situation. This is what Ulysses did when he lashed himself to the mast. He knew that the sirens' song would tempt him just like any mortal man and crash him into the rocks. So what he did is, the Ulysses of sound mind lashed himself to the mast so that the future Ulysses couldn't behave badly. And I find this a really interesting concept, about the things that we do to make sure that we don't behave badly in the future. This is absolutely fascinating. Carry on, carry on. Well, you know.
Speaker 1: One example is if you're trying to get over some addiction, like, you know, alcoholism. What you do is you clear all the alcohol out of your house so that on a, you know, festive Saturday night or a lonely Sunday night or something, you're not going to go in. Even if you think, oh, I'm sure I won't drink this anymore, you get rid of it. That's the Ulysses contract. Or, you know, for drug addicts, one of the first things they're taught when they're trying to break this is never walk around with more than twenty dollars in your pocket, because at some point someone's going to come up to you and offer to sell you drugs, and you'll be tempted and then you'll give in. So there are many things that we do to make sure that we can, you know, make a choice now that will pay off to keep us acting consistently with our long-term decision making. I mean, I think I could definitely just put a big piece of tape over my mouth and that would be my Ulysses contract, sorted.
Speaker 1: Future me is never going to say the stuff that I am thinking, that I know is going to cause trouble, because I've got a massive piece of tape. And maybe you could, like, TM your name. It could be, like, David Eagleman Ulysses Tape. I would buy that shit. But that requires a modicum of self-knowledge that most people are not interested in interrogating, because they don't want to think that there is something fundamentally wrong with them that's going to affect their situation now, much less in the future when the mermaids are singing and calling you into the ocean. We're so tender and we're so lost as people. I mean, I love that you write these books that really do act as guideposts, because that's what I think they are. And I'm constantly looking for signs and signposts, because it is so fragile and tender, and that, you know, that you're writing a book for that. But I feel for, I feel for all of us, myself included, going, I just, I wonder how deep I can go into who I am, to know how I could save myself from myself. Yes, exactly.
Speaker 1: So, when it comes to this issue about self-interrogation and really trying to understand who we are, one of the hardest things to see is the way that we come off to other people. Because, in fact, this is what one of my other books, which I'm writing right now, is about. It's called Empire of the Invisible, and it's all about what's going on in politics right now. Specifically, it's asking why we each believe that we have the truth, and we see the truth so clearly, and everyone else is misinformed or they're a troll or whatever it is. And if I can only shout in all caps loud enough on Twitter, I could convince everyone that I am right. It's crazy to me that everybody believes this, at whatever part of the political spectrum. And by the way, in terms of relationships, we, you know, tend to all believe this as well, which is, okay, well, I already know how to do relationships. I'm saying the right things all the time. Do you think that that kind of, the surety of that empiricism that is so pervasive, is that human or is that learned? Like, is that hardwired into our brains?
Speaker 1: Is that something that was useful once, when we were discovering fire? Yeah, it couldn't actually be any other way, because of the way we build a model of the world. Remember, your brain is locked in silence and darkness inside your skull, and all it's trying to do is put together an internal model of what is going on out there, which includes other people and how other people behave and how they'll react to what you say. And the thing is that this internal model is inherently limited. It's only built up from the little dribbles of data that you get in during your years. And so the way the brain works is it says, okay, look, I've got this data, I've collected all this data, I know what is true. And it's just built up from what we've taken in. Now, by the way, I will say we're probably better off than we were historically, because now we have, for example, the printing press, and so we have movies and things like that. And so you put all this together and we're exposed to literature and to stories that are much broader than our own experience. So that helps.
Speaker 1: But still, I've only read a finite number of books in my lifetime. I've only met a finite number of people. And that has shaped my experience, just like your experiences have shaped your brain. And so that's why, given that data, you say, okay, I know what is true. This is what is true. It's very, very interesting. I'm really looking forward to reading that book. What person, place, or experience most altered your life? Well, interestingly, it's almost certainly my parents. But the interesting part is that that feels invisible to us. In other words, it's very hard to see all of the ways your parents shaped you, because you knew nothing else. That was just the background furniture of the world, and you grew up against that. So that's certainly what has shaped me the most, my parents. But when I was in graduate school, my thesis advisor, a man named Read Montague, was unbelievably influential on me. He, you know, you come to an age when you're a young person.
Speaker 1: I was like, I guess, maybe when I entered grad school, you know, you're just leaving your parents' home and you're looking for other people in the world. And he was just a great person to really admire in so many ways. He was twenty points ahead of everybody, and he was a terrific athlete, unbelievably strong and fast and so on. And the key thing is, he gave me no charity. I mean, he spent the entirety of grad school beating me up every day. And that was really valuable, because I think, actually, maybe there's sometimes too much charity that happens, because teachers are kind, or they're just too tired or whatever. So if you get something right but you don't get it a hundred percent right, no one really says anything. They don't expect anything more from you. But Read expected a hundred percent all the time, every time. And that was probably the most valuable thing that's happened in my life. And then I guess the last thing I'd say is, when I was a postdoc, one of the people I got to work with was Francis Crick, who was the co-discoverer of the structure of DNA.
Speaker 1: And in my entire life since then, I've not had another experience like that, because he was such a special person, in the sense that not only was he essentially the giant of twentieth-century biology, but he didn't have a life like other people. He never had to write grants and try to get a job as a professor and whatever. So he just had this office and he would read the scientific journals all day. And I asked him, I said, what are you looking for when you're reading there? He said, I don't know. But what he meant was, he's just letting ideas come to him, and he's writing letters to scientists around the world and saying, hey, why don't you do this experiment? What if you did this and this and this? He was just a person who, because he'd won a Nobel Prize as a young person, got to just be a giant thinker his whole life and never had to deal with the constraints that everyone else had to. Wow, how amazing. How amazing that he was your teacher. Yeah. I do think, though, about how things teach you. Yes. We have a big swell here in California at the moment.
Speaker 1: I don't know if you're aware of it yourself, but I am. And yesterday I went out very excited, and I just took so many giant waves on my head, and I was so bummed. And I kind of, I came in and I sat on the beach with my friend, and I was like, oh, it was just the worst. And my friend went, well, do you know what you did? Like, do you know what, do you know what happened? I was like, well, yeah, you know, that first ten-footer, I was too far ahead of it, and on the second one I was just dealing with being pissed off about the first one, and the third one, I was scared. And he was like, great, well, now you can go back out and not do all that shit that you just said. And so I did. I went back out and I took another ten waves on the head, but I did get one really amazing one. Yeah. In your life, can you tell me about something that has grown out of a personal disaster? Uh, yeah. When I was in the third grade, I fell off of a roof of a house that was under construction, and I almost died. Um.
Speaker 1: I fell, you know, from the roof and landed on the brick floor, face first, and I shattered my nose. But I think that's part of what made me a neuroscientist, because as I was falling, I was, first of all, having completely calm, clear thoughts. I was thinking about Alice in Wonderland and how this must have been what it was like when she was falling down the rabbit hole. And just before that, I was thinking, okay, I wonder if I could still grab for the roof, and then I realized, that's tar paper and it's not going to hold, and that's not going to work. And, you know, eventually I just turned and faced the bricks and hit. But the thing is that the whole event seemed to take a very long time. I still remember the thoughts very clearly, because it was, you know, a traumatic event. But when I got to high school and I took physics, I realized that the whole fall had taken point six of a second, and I couldn't figure that out. I couldn't understand how this thing that was so fast seemed to take so long.
Speaker 1: So I'm really interested in our perception of the world, and specifically in the perception of time, and why things seem to go in slow motion during a life-threatening event. And I ended up, you know, growing up to become a neuroscientist, and I studied that. I did these experiments where I dropped people from a hundred-and-fifty-foot-tall tower in free fall, and they're caught in a net below, going seventy miles an hour, and I was able to measure aspects of their time perception on the way down. How did you do that? What I did is I built a device that went on their wrist, and it flashed information at them, visual information, at a certain rate. And depending on how fast they were seeing the world, the question was, would they be able to essentially see in slow motion? Because everybody who's ever gotten in a car accident says, oh, you know, I saw so much.
I saw the hood crumple 334 00:16:40,200 --> 00:16:42,040 Speaker 1: in, the rear-view mirror fall off, and the facial 335 00:16:42,040 --> 00:16:44,400 Speaker 1: expression of the other person, and so on. And so I 336 00:16:44,440 --> 00:16:46,560 Speaker 1: wanted to really test whether that was true, whether you 337 00:16:46,600 --> 00:16:48,840 Speaker 1: could see in slow motion, and it turned out that 338 00:16:48,960 --> 00:16:51,680 Speaker 1: you do not see in slow motion. It's all a 339 00:16:51,800 --> 00:16:55,080 Speaker 1: retrospective trick of memory, which is to say, when you're 340 00:16:55,080 --> 00:16:59,040 Speaker 1: in a life-threatening situation, you're laying down memories really densely. 341 00:16:59,400 --> 00:17:02,080 Speaker 1: Normally you're not laying down much memory at all. You know, 342 00:17:02,080 --> 00:17:03,720 Speaker 1: I don't remember my drive home. It was just, it 343 00:17:03,840 --> 00:17:06,520 Speaker 1: was nothing. But when something really matters, your brain writes 344 00:17:06,520 --> 00:17:08,680 Speaker 1: down every single thing. So when you read that back out, 345 00:17:08,680 --> 00:17:10,000 Speaker 1: when you say, what just happened, what 346 00:17:10,080 --> 00:17:12,880 Speaker 1: just happened, you remember it in such detail that your 347 00:17:12,880 --> 00:17:15,480 Speaker 1: brain estimates, I guess it must have taken longer. It 348 00:17:15,600 --> 00:17:17,800 Speaker 1: must have taken a long time. So it's all about the 349 00:17:17,800 --> 00:17:20,439 Speaker 1: way memory is laid down. That's why we think the 350 00:17:20,520 --> 00:17:23,360 Speaker 1: event took place in slow motion, whereas in fact it's 351 00:17:23,400 --> 00:17:25,479 Speaker 1: not slow motion.
And I realized after I did these 352 00:17:25,520 --> 00:17:27,800 Speaker 1: experiments that it has to be that way, because, you know, 353 00:17:27,880 --> 00:17:30,880 Speaker 1: coming back to the car accidents, when someone says, look, 354 00:17:30,920 --> 00:17:32,960 Speaker 1: I know it went slow motion because I saw these things, 355 00:17:33,200 --> 00:17:35,760 Speaker 1: you can just ask the person, okay, the passenger 356 00:17:35,840 --> 00:17:37,640 Speaker 1: in the car seat next to you, who was screaming. 357 00:17:37,680 --> 00:17:41,359 Speaker 1: Did it sound like the person was screaming in slow motion? Because 358 00:17:41,480 --> 00:17:43,199 Speaker 1: if not, then that means it was not going in 359 00:17:43,200 --> 00:17:46,000 Speaker 1: slow motion, and people have to allow that. Actually, they 360 00:17:46,040 --> 00:17:48,320 Speaker 1: didn't hear things in slow motion, so it's simply that 361 00:17:48,359 --> 00:17:51,000 Speaker 1: they remembered all the details, and so when their brain 362 00:17:51,080 --> 00:17:53,359 Speaker 1: makes an estimate, it says, I guess that must have 363 00:17:53,359 --> 00:17:55,560 Speaker 1: been five seconds, because I don't usually have that much memory. 364 00:17:55,800 --> 00:17:58,280 Speaker 1: Do you think that only a traumatic event can trigger 365 00:17:58,320 --> 00:18:00,760 Speaker 1: that kind of memory sequence? Or can something that is 366 00:18:00,800 --> 00:18:04,560 Speaker 1: intensely pleasurable and amazing do the same thing? Yeah, good question. 367 00:18:04,640 --> 00:18:06,679 Speaker 1: It can be the intensely pleasurable and amazing. It's just 368 00:18:07,040 --> 00:18:09,920 Speaker 1: that that's more rare. But it's an area of your brain 369 00:18:10,000 --> 00:18:12,760 Speaker 1: called the limbic system, and the amygdala in particular, that's 370 00:18:12,760 --> 00:18:15,960 Speaker 1: involved in saying, hey, write this down.
This is important, 371 00:18:16,080 --> 00:18:18,040 Speaker 1: and there aren't that many things that are super important 372 00:18:18,040 --> 00:18:20,439 Speaker 1: for us to write down. Certainly traumatic events count, and certainly 373 00:18:20,520 --> 00:18:23,119 Speaker 1: super pleasurable events count. But otherwise most of the 374 00:18:23,200 --> 00:18:26,320 Speaker 1: time your amygdala says, okay, you know, same old stuff. 375 00:18:26,320 --> 00:18:29,040 Speaker 1: I'm not gonna bother keeping dense memories of this. That's 376 00:18:29,040 --> 00:18:32,120 Speaker 1: so funny, because childbirth, I don't know what your partner 377 00:18:32,240 --> 00:18:35,240 Speaker 1: or wife experienced, but it's so interesting. There are great 378 00:18:35,280 --> 00:18:38,119 Speaker 1: swathes of the thirty-seven hours that I was in 379 00:18:38,200 --> 00:18:42,080 Speaker 1: labor that I remember so acutely and keenly, and they 380 00:18:42,119 --> 00:18:46,000 Speaker 1: involved pain, and they also involved laughter and hilarious things 381 00:18:46,040 --> 00:18:48,239 Speaker 1: that my mother and my sister said. And then there 382 00:18:48,320 --> 00:18:50,920 Speaker 1: must be hours. There were hours and hours and hours 383 00:18:50,920 --> 00:18:53,560 Speaker 1: that I know I was just stumbling through pain, but 384 00:18:53,640 --> 00:18:56,719 Speaker 1: I don't remember exactly. So it's interesting. There are 385 00:18:56,720 --> 00:18:58,720 Speaker 1: parts of that thirty-seven hours that are carved out 386 00:18:58,760 --> 00:19:01,400 Speaker 1: in the boldest relief. So interesting.
I've often wondered why 387 00:19:01,440 --> 00:19:04,160 Speaker 1: my brain chose to remember those bits and not when 388 00:19:04,200 --> 00:19:06,520 Speaker 1: I was sitting in the shower, you know, singing, which 389 00:19:06,560 --> 00:19:08,320 Speaker 1: I know I did because they told me I did it, 390 00:19:08,320 --> 00:19:11,199 Speaker 1: but I don't really remember it. Yeah. Well, what happens 391 00:19:11,240 --> 00:19:13,600 Speaker 1: during pregnancy is you've got all these hormones that are going 392 00:19:13,680 --> 00:19:15,840 Speaker 1: up and down and bouncing all over the place, and 393 00:19:16,200 --> 00:19:19,320 Speaker 1: for better or worse, this just teaches us what biological 394 00:19:19,359 --> 00:19:21,359 Speaker 1: machines we are, which is to say, oh, when this 395 00:19:21,440 --> 00:19:24,159 Speaker 1: hormone's high, then you're remembering and you can remember 396 00:19:24,160 --> 00:19:26,479 Speaker 1: that later, and then when this other thing is happening, 397 00:19:26,560 --> 00:19:30,119 Speaker 1: forget it. You're just not writing anything down. It's amazing. Yeah, 398 00:19:30,160 --> 00:19:33,520 Speaker 1: it can be amazing and depressing and eye-opening and 399 00:19:33,560 --> 00:19:35,960 Speaker 1: so on. I think it's the most important thing for 400 00:19:36,000 --> 00:19:39,719 Speaker 1: self-understanding, for understanding what our experience in the world is. 401 00:19:40,119 --> 00:19:43,760 Speaker 1: I think that grief has taught me that meaning is 402 00:19:43,840 --> 00:19:47,280 Speaker 1: just assigned. We assign meaning, and the depth of our 403 00:19:47,320 --> 00:19:49,920 Speaker 1: experience and the meaningfulness of our life is in direct 404 00:19:49,920 --> 00:19:52,560 Speaker 1: proportion to what meaning we assign to it.
Well, it's 405 00:19:52,560 --> 00:19:55,280 Speaker 1: even worse than that, I think, which is to say, 406 00:19:55,320 --> 00:19:58,440 Speaker 1: a lot of this stuff is evolutionarily dictated, and so, 407 00:19:59,119 --> 00:20:02,040 Speaker 1: you know, when you're a young person and you 408 00:20:02,119 --> 00:20:04,399 Speaker 1: think, oh my god, I'm so in love and 409 00:20:04,400 --> 00:20:07,080 Speaker 1: so on, that meaning, we didn't really have a choice 410 00:20:07,080 --> 00:20:10,040 Speaker 1: in that. That is what has allowed our species to survive. 411 00:20:10,320 --> 00:20:12,520 Speaker 1: So many things are like that. Why is it that, 412 00:20:12,720 --> 00:20:14,600 Speaker 1: you know, a lemon pie in the 413 00:20:14,640 --> 00:20:17,359 Speaker 1: oven smells so good, but, let's say, a piece 414 00:20:17,359 --> 00:20:20,359 Speaker 1: of poop on the sidewalk smells so aversive, so bad, 415 00:20:20,640 --> 00:20:23,840 Speaker 1: given that in both cases it's just 416 00:20:23,880 --> 00:20:26,080 Speaker 1: molecules that are wafting through the air and attaching to 417 00:20:26,119 --> 00:20:28,240 Speaker 1: receptors in your nose? You know, if you study 418 00:20:28,280 --> 00:20:31,800 Speaker 1: olfaction and how that actually works, it's just molecules in different shapes. 419 00:20:31,800 --> 00:20:33,760 Speaker 1: So if I showed you the two different shapes and said, okay, 420 00:20:33,800 --> 00:20:35,320 Speaker 1: one of these is lemon pie, one of these is poop, 421 00:20:35,359 --> 00:20:37,280 Speaker 1: you wouldn't know which is which. You couldn't possibly. And 422 00:20:37,400 --> 00:20:39,760 Speaker 1: so the question is, why is one so pleasurable and one 423 00:20:39,760 --> 00:20:42,080 Speaker 1: so aversive? And the answer has to do with the 424 00:20:42,359 --> 00:20:45,880 Speaker 1: evolutionary meaning.
So the lemon pie tells you, hey, there's food, 425 00:20:45,880 --> 00:20:48,080 Speaker 1: there's high sugars, and they're great. I can keep this 426 00:20:48,200 --> 00:20:52,320 Speaker 1: battery-powered, you know, meat robot going. But the 427 00:20:52,359 --> 00:20:55,040 Speaker 1: poop is full of bacteria, and things that have been 428 00:20:55,040 --> 00:20:59,119 Speaker 1: figured out through evolutionary time to be dangerous to you, pathogenic, 429 00:20:59,359 --> 00:21:02,080 Speaker 1: and so the shorthand that your brain does is to say, 430 00:21:02,080 --> 00:21:04,960 Speaker 1: oh, that's aversive, don't go near that thing. And so 431 00:21:05,040 --> 00:21:07,440 Speaker 1: I often wonder about this issue of all the things 432 00:21:07,480 --> 00:21:11,160 Speaker 1: that we find meaningful in life. The question is, 433 00:21:11,200 --> 00:21:14,200 Speaker 1: how far does the hand of evolution reach in there 434 00:21:14,560 --> 00:21:17,840 Speaker 1: and define what we find meaningful and what we don't? Exactly. 435 00:21:18,040 --> 00:21:20,920 Speaker 1: And spending time unwrapping that, there probably isn't quite 436 00:21:21,040 --> 00:21:22,919 Speaker 1: enough life to do that. Or maybe there is, but 437 00:21:22,960 --> 00:21:24,560 Speaker 1: maybe it would just take all the fun out of it. 438 00:21:24,800 --> 00:21:26,399 Speaker 1: You know, I don't actually think so. You know, my 439 00:21:26,440 --> 00:21:29,520 Speaker 1: analogy is, if you and I sat down, Minnie, for 440 00:21:29,560 --> 00:21:31,800 Speaker 1: the next hour and I gave you a diagram, and 441 00:21:31,840 --> 00:21:34,600 Speaker 1: I showed you exactly why you like the taste of, 442 00:21:34,760 --> 00:21:37,560 Speaker 1: let's say, chocolate, why you think that tastes so delicious, 443 00:21:37,800 --> 00:21:39,960 Speaker 1: you might say, okay, good, I've got it. I understand 444 00:21:40,000 --> 00:21:43,320 Speaker 1: the entire diagram.
But that doesn't change your pleasure about 445 00:21:43,320 --> 00:21:45,720 Speaker 1: it at all. It doesn't improve it, it doesn't diminish it. 446 00:21:45,720 --> 00:21:47,919 Speaker 1: It's like it's a different world. I mean, if I 447 00:21:47,960 --> 00:21:49,719 Speaker 1: did this with you about, you know, the color purple, 448 00:21:49,800 --> 00:21:51,560 Speaker 1: and I said, oh, look here, you've got these color 449 00:21:51,560 --> 00:21:53,960 Speaker 1: photoreceptors and this happens in the visual cortex, blah blah blah, 450 00:21:54,000 --> 00:21:55,800 Speaker 1: it doesn't change the fact that you look at something purple 451 00:21:55,840 --> 00:21:59,960 Speaker 1: and say, oh, that's beautiful. Has having your children made 452 00:22:00,040 --> 00:22:03,359 Speaker 1: you think about neuroscience in a different way? Yeah. 453 00:22:03,400 --> 00:22:06,520 Speaker 1: Because, like, the yogis talk about beginner's mind, and that 454 00:22:06,520 --> 00:22:08,760 Speaker 1: is a place that you are always 455 00:22:08,760 --> 00:22:11,119 Speaker 1: seeking to get back to. 456 00:22:11,160 --> 00:22:13,840 Speaker 1: I've always perceived, having watched my son grow up, 457 00:22:13,920 --> 00:22:16,719 Speaker 1: that beginner's mind is the purest, most beautiful. They are 458 00:22:16,760 --> 00:22:21,000 Speaker 1: so connected to whatever was pre-consciousness, and they've brought 459 00:22:21,040 --> 00:22:23,600 Speaker 1: that in with them.
So yeah. Well, one of the 460 00:22:23,640 --> 00:22:26,000 Speaker 1: things that has sort of been an interesting surprise to 461 00:22:26,040 --> 00:22:29,000 Speaker 1: me is just seeing the punctuated equilibrium, by which I mean, 462 00:22:29,040 --> 00:22:31,840 Speaker 1: you know, things change suddenly, as in, you know, one 463 00:22:31,920 --> 00:22:34,360 Speaker 1: day your daughter can't read, and then, you know, kind 464 00:22:34,359 --> 00:22:36,160 Speaker 1: of a week later, she's a pretty good reader. It's 465 00:22:36,200 --> 00:22:38,000 Speaker 1: these things that you work on with 466 00:22:38,040 --> 00:22:40,600 Speaker 1: her for a long time, then sudden change. And I've always 467 00:22:40,600 --> 00:22:43,280 Speaker 1: found this kind of thing fascinating. It's like the system 468 00:22:43,560 --> 00:22:46,359 Speaker 1: finds something where it says, okay, now I've got it, 469 00:22:46,400 --> 00:22:48,480 Speaker 1: and now I know how to read or ride a 470 00:22:48,480 --> 00:22:50,960 Speaker 1: bicycle or whatever the thing is. So that's been really 471 00:22:50,960 --> 00:22:54,000 Speaker 1: interesting to me, and also to really try to get 472 00:22:54,040 --> 00:22:57,880 Speaker 1: an understanding of which things are pre-programmed and which 473 00:22:57,920 --> 00:23:00,439 Speaker 1: things are just a matter of absorbing from the world. 474 00:23:00,640 --> 00:23:02,639 Speaker 1: And you know, it's always a combination of both. You 475 00:23:02,640 --> 00:23:05,240 Speaker 1: may know this, but the nature versus nurture debate is 476 00:23:05,280 --> 00:23:07,840 Speaker 1: totally dead, because it's always both. You come to the 477 00:23:07,840 --> 00:23:10,280 Speaker 1: table with a lot of pre-programming, and you absorb 478 00:23:10,359 --> 00:23:13,120 Speaker 1: the world, and you absorb your language and your culture 479 00:23:13,119 --> 00:23:15,040 Speaker 1: and your neighborhood and your religion and so on.
That 480 00:23:15,080 --> 00:23:17,480 Speaker 1: all becomes part of who you are. So, you know, 481 00:23:17,560 --> 00:23:19,960 Speaker 1: just really watching my kids and trying to understand 482 00:23:20,000 --> 00:23:23,080 Speaker 1: that is really interesting. Oh, I get so worried that there 483 00:23:23,160 --> 00:23:25,720 Speaker 1: was no way I could have created a scientist because 484 00:23:25,760 --> 00:23:29,159 Speaker 1: of that, because my son is around music and music 485 00:23:29,320 --> 00:23:33,160 Speaker 1: and music and acting and reading and poetry and plays 486 00:23:33,240 --> 00:23:37,920 Speaker 1: and discussing literature and movies and this. And sure enough, 487 00:23:37,960 --> 00:23:40,480 Speaker 1: he comes back from this amazing school that he goes 488 00:23:40,560 --> 00:23:42,800 Speaker 1: to having said, can I have bass lessons? Can I 489 00:23:42,880 --> 00:23:44,680 Speaker 1: learn the bass in September? And I was like, yeah, 490 00:23:44,800 --> 00:23:48,480 Speaker 1: totally, great. And he comes home and he's already playing 491 00:23:48,480 --> 00:23:50,439 Speaker 1: the bass, like he can now play it. And I 492 00:23:50,480 --> 00:23:52,239 Speaker 1: was like, well, I thought we were going to do 493 00:23:52,280 --> 00:23:54,080 Speaker 1: the bass lessons. He was like, yeah, I 494 00:23:54,080 --> 00:23:56,040 Speaker 1: couldn't wait. It was just in there. I had to 495 00:23:56,080 --> 00:23:59,320 Speaker 1: start doing it. And I was thinking, like, part of 496 00:23:59,359 --> 00:24:01,320 Speaker 1: me was going, this is so amazing, and part was going, 497 00:24:01,440 --> 00:24:03,960 Speaker 1: you poor thing, like you could have maybe built a bridge. 498 00:24:04,400 --> 00:24:06,200 Speaker 1: But there was really never any chance that you 499 00:24:06,280 --> 00:24:07,680 Speaker 1: were going to be able to build a bridge because 500 00:24:07,680 --> 00:24:10,119 Speaker 1: of me.
You know, I'll tell you the good news, 501 00:24:10,600 --> 00:24:13,080 Speaker 1: which is that you're totally right in your intuition about this. 502 00:24:13,280 --> 00:24:15,919 Speaker 1: But there's two things worth noting. One is that, you know, 503 00:24:16,040 --> 00:24:20,320 Speaker 1: kids drop into the world having a lot of predispositions 504 00:24:20,359 --> 00:24:22,520 Speaker 1: that might be different from yours. So you will influence 505 00:24:22,560 --> 00:24:25,200 Speaker 1: your child massively, and yet your child will go off 506 00:24:25,240 --> 00:24:27,760 Speaker 1: and do things that you didn't really expect. The second 507 00:24:27,840 --> 00:24:31,280 Speaker 1: thing that's so wonderful about this world is the Internet 508 00:24:31,400 --> 00:24:34,760 Speaker 1: and the fact that you have access to anything anytime. 509 00:24:34,800 --> 00:24:37,359 Speaker 1: So he gets lots of music from you, but boy, 510 00:24:37,400 --> 00:24:39,480 Speaker 1: he can just log on and watch, you know, Carl 511 00:24:39,520 --> 00:24:43,000 Speaker 1: Sagan's Cosmos, or watch Neil deGrasse Tyson, or watch anything 512 00:24:43,000 --> 00:24:45,000 Speaker 1: he wants to watch, some BrainPOP video or some 513 00:24:45,040 --> 00:24:47,760 Speaker 1: TED-Ed, and suddenly be turned on to bridge building in 514 00:24:47,800 --> 00:24:49,880 Speaker 1: a way that, even though he didn't get it from you, 515 00:24:49,920 --> 00:24:52,520 Speaker 1: he got it from TED-Ed. So that's the great 516 00:24:52,560 --> 00:24:55,760 Speaker 1: news about this. I mean, I would obviously walk across 517 00:24:55,840 --> 00:24:58,119 Speaker 1: a bridge that a bass player had built, but you 518 00:24:58,160 --> 00:25:06,920 Speaker 1: know thousands wouldn't. Can you tell me where and when 519 00:25:07,240 --> 00:25:10,680 Speaker 1: you were happiest?
I had an experience the other night when 520 00:25:10,720 --> 00:25:14,040 Speaker 1: I decided with my two children, who are seven and 521 00:25:14,080 --> 00:25:15,600 Speaker 1: ten years old, that we were going to camp out 522 00:25:15,640 --> 00:25:18,959 Speaker 1: on the trampoline outside, and it was very cold, 523 00:25:19,160 --> 00:25:22,040 Speaker 1: and we were very uncomfortable, but it was so much fun. 524 00:25:22,160 --> 00:25:24,439 Speaker 1: It was just all laughter, and so it was just 525 00:25:24,520 --> 00:25:27,040 Speaker 1: such a nice moment with my kids. So as far 526 00:25:27,080 --> 00:25:30,000 Speaker 1: as thinking of a moment, that's my most recent one. 527 00:25:30,240 --> 00:25:32,920 Speaker 1: But I think more generally, you know, I do all 528 00:25:32,920 --> 00:25:37,120 Speaker 1: my writing at IHOP, and these are different 529 00:25:37,119 --> 00:25:40,000 Speaker 1: IHOPs all over the world, actually. But yeah, I've written 530 00:25:40,040 --> 00:25:42,320 Speaker 1: eight books, so, I don't know, like a million words 531 00:25:42,359 --> 00:25:44,760 Speaker 1: I've published so far, and every single one of those 532 00:25:44,760 --> 00:25:46,600 Speaker 1: words has been written, you know, at the International 533 00:25:46,680 --> 00:25:49,879 Speaker 1: House of Pancakes. How many while you were ordering Moons 534 00:25:49,920 --> 00:25:53,920 Speaker 1: Over My Hammy? I actually really just drink the coffee there. 535 00:25:53,960 --> 00:25:55,720 Speaker 1: I don't eat too much in the way of pancakes, 536 00:25:55,760 --> 00:25:58,840 Speaker 1: but it's just the right speed. Starbucks is too, um, 537 00:25:59,200 --> 00:26:01,359 Speaker 1: you know, there are too many people walking in and out, 538 00:26:01,400 --> 00:26:04,000 Speaker 1: and IHOP is slower, and so I just sit 539 00:26:04,119 --> 00:26:06,560 Speaker 1: down for a six-hour window and I just write.
540 00:26:06,720 --> 00:26:10,080 Speaker 1: And that is when I am my happiest, especially when 541 00:26:10,119 --> 00:26:12,960 Speaker 1: I feel like I've nailed something. I've gotten something that 542 00:26:13,080 --> 00:26:16,040 Speaker 1: was just a thought in my head clear 543 00:26:16,240 --> 00:26:18,600 Speaker 1: and onto the page, and I've written, you know, whatever, 544 00:26:18,640 --> 00:26:21,920 Speaker 1: a hundred and fifty-seven words that are crystal clear, exactly 545 00:26:21,920 --> 00:26:24,119 Speaker 1: what I mean here. That's the best feeling that I know. 546 00:26:24,720 --> 00:26:26,959 Speaker 1: I know that you say you love to do that 547 00:26:27,000 --> 00:26:29,520 Speaker 1: at IHOP. Can you do that anywhere? Or do 548 00:26:29,560 --> 00:26:33,000 Speaker 1: you get attached to an experience happening in a place 549 00:26:33,040 --> 00:26:35,520 Speaker 1: and then associate that good feeling with that place? I 550 00:26:35,600 --> 00:26:37,960 Speaker 1: can actually be anywhere. It just has to be a 551 00:26:38,000 --> 00:26:40,359 Speaker 1: place that's moving at the right speed. That's why it's called 552 00:26:40,359 --> 00:26:43,320 Speaker 1: IHOP, because you can only hop very slowly. You 553 00:26:43,400 --> 00:26:47,360 Speaker 1: cannot rush. Like, Starbucks even sounds fast. It's like stars 554 00:26:47,480 --> 00:26:51,640 Speaker 1: and bucks and movement. I'm not gonna write 555 00:26:51,640 --> 00:26:56,080 Speaker 1: at a Coffee Dash, but I love that feeling of just 556 00:26:56,320 --> 00:27:04,199 Speaker 1: going in deep. What is the quality you like 557 00:27:04,280 --> 00:27:08,600 Speaker 1: least about yourself? Ah, well, my whole life this has been true: 558 00:27:08,640 --> 00:27:10,639 Speaker 1: I just take on too much. And I have so 559 00:27:10,680 --> 00:27:13,400 Speaker 1: many friends that are good at being laser-focused.
And 560 00:27:13,480 --> 00:27:15,119 Speaker 1: in fact, just this morning I was talking to a 561 00:27:15,160 --> 00:27:18,200 Speaker 1: colleague of mine who also writes books, and she said, yeah, 562 00:27:18,320 --> 00:27:20,080 Speaker 1: what I do is I start with the table of contents, 563 00:27:20,119 --> 00:27:22,040 Speaker 1: and I write each piece, and I know exactly what's 564 00:27:22,080 --> 00:27:23,520 Speaker 1: going to go in the book and what the framework is. 565 00:27:23,560 --> 00:27:24,920 Speaker 1: And that's not at all how I do it. I'm 566 00:27:24,920 --> 00:27:27,600 Speaker 1: completely on the other end of whatever the spectrum is 567 00:27:27,680 --> 00:27:30,480 Speaker 1: that she's on. You know, I just have ideas 568 00:27:30,520 --> 00:27:33,120 Speaker 1: and I dictate into my notes all day long, and, oh, 569 00:27:33,160 --> 00:27:34,879 Speaker 1: this is gonna be a paragraph here, and this here, 570 00:27:34,920 --> 00:27:37,520 Speaker 1: and then I tie stuff together with time, 571 00:27:37,720 --> 00:27:41,840 Speaker 1: and I probably spend twice as much time putting something 572 00:27:41,880 --> 00:27:44,800 Speaker 1: together that way and deleting whole, you know, scenes and 573 00:27:44,880 --> 00:27:47,560 Speaker 1: paragraphs and chapters. But that's just the way that 574 00:27:47,600 --> 00:27:50,080 Speaker 1: I write. But the problem is, and I mean 575 00:27:50,119 --> 00:27:52,040 Speaker 1: this has always been true, I write, you know, five 576 00:27:52,080 --> 00:27:54,800 Speaker 1: books at once, and, you know, I teach at Stanford, 577 00:27:54,840 --> 00:27:57,520 Speaker 1: I'm running two companies, and I'm about to start my 578 00:27:57,520 --> 00:28:01,800 Speaker 1: own podcast. That podcast is going to be forty-minute monologues 579 00:28:01,840 --> 00:28:04,040 Speaker 1: where I'm just talking for forty minutes.
And so what 580 00:28:04,119 --> 00:28:05,919 Speaker 1: that means is I'm just gonna have a ton of 581 00:28:05,960 --> 00:28:08,080 Speaker 1: work on my plate. I've got a great idea. I've 582 00:28:08,080 --> 00:28:12,280 Speaker 1: got a great idea. You record your podcast, and then 583 00:28:12,400 --> 00:28:15,760 Speaker 1: that becomes your book. So you just record it and 584 00:28:15,800 --> 00:28:18,080 Speaker 1: then you put it through some program so that it 585 00:28:18,119 --> 00:28:21,080 Speaker 1: transcribes it, and then you just edit what you've 586 00:28:21,119 --> 00:28:23,720 Speaker 1: said already. This is how you're going to save time. 587 00:28:23,760 --> 00:28:26,640 Speaker 1: I would like to see this book. Yes, I love 588 00:28:26,720 --> 00:28:29,000 Speaker 1: that idea. You know, the difficulty is, with a book, 589 00:28:29,520 --> 00:28:32,679 Speaker 1: it's like building a cathedral. You know, there's all this stuff. 590 00:28:32,760 --> 00:28:35,320 Speaker 1: It's such a bigger kind of project. And I wish 591 00:28:35,400 --> 00:28:37,960 Speaker 1: I could turn forty-minute monologues into a book, but, 592 00:28:38,080 --> 00:28:41,160 Speaker 1: uh. Perhaps if you do a hundred forty-minute 593 00:28:41,200 --> 00:28:45,000 Speaker 1: monologues, you will have the beginnings of a book. 594 00:28:45,360 --> 00:28:47,480 Speaker 1: I'll have a lot of material, that's for sure. I've 595 00:28:47,520 --> 00:28:51,040 Speaker 1: written one book, David, which is not really comparable to 596 00:28:51,080 --> 00:28:54,320 Speaker 1: anything that you've said. However, it was speaking the thoughts 597 00:28:54,480 --> 00:28:58,080 Speaker 1: that then made it much nearer. It made the reality 598 00:28:58,120 --> 00:29:00,560 Speaker 1: of that book much closer, so I could actually reach 599 00:29:00,600 --> 00:29:02,880 Speaker 1: out and get it. It was having verbalized it.
So 600 00:29:02,920 --> 00:29:06,080 Speaker 1: I wonder if it would maybe speed up the process. 601 00:29:06,120 --> 00:29:08,320 Speaker 1: You hear yourself talking about these ideas and it 602 00:29:08,360 --> 00:29:12,120 Speaker 1: becomes more coherent and certainly externalized, and then it becomes something 603 00:29:12,120 --> 00:29:14,520 Speaker 1: that you can actually grab easier and write out. 604 00:29:14,560 --> 00:29:17,400 Speaker 1: I don't know. No, I totally get that. You reify 605 00:29:17,520 --> 00:29:20,200 Speaker 1: the ideas by saying them out loud. And then one 606 00:29:20,200 --> 00:29:22,440 Speaker 1: trick that I do all the time lately is I 607 00:29:22,480 --> 00:29:24,720 Speaker 1: will then take stuff that I've written and put it into 608 00:29:24,800 --> 00:29:27,280 Speaker 1: programs so that it speaks the text back to me, 609 00:29:27,360 --> 00:29:29,280 Speaker 1: but with a totally different voice, let's say a female 610 00:29:29,360 --> 00:29:32,000 Speaker 1: voice or a British voice, maybe your voice. So I'm listening 611 00:29:32,400 --> 00:29:34,720 Speaker 1: like an audiobook to my own writing, and I think, oh, 612 00:29:34,800 --> 00:29:37,479 Speaker 1: that part sucked, and oh, that logic doesn't quite match up. 613 00:29:37,640 --> 00:29:41,640 Speaker 1: That's amazing. I would love that. I would use that 614 00:29:41,680 --> 00:29:45,200 Speaker 1: for difficult conversations with my son. Let me just 615 00:29:45,200 --> 00:29:48,719 Speaker 1: re-say this, let me just rephrase this, to hear 616 00:29:48,760 --> 00:29:50,920 Speaker 1: that back. That's really cool, as though you're hearing a 617 00:29:50,960 --> 00:29:54,440 Speaker 1: different parent saying it and editing anything that's not so good. Actually, 618 00:29:54,440 --> 00:29:57,080 Speaker 1: as I'm a single mother, that would really help. I 619 00:29:57,120 --> 00:30:00,720 Speaker 1: could do it in, like, a man's voice. Right, perfect.
620 00:30:00,880 --> 00:30:03,120 Speaker 1: And by the way, many authors in the past have 621 00:30:03,440 --> 00:30:06,840 Speaker 1: used this method. Wordsworth, for example, had a lazy Susan 622 00:30:06,880 --> 00:30:08,920 Speaker 1: on the table, you know, one of these circular jobs 623 00:30:08,920 --> 00:30:11,360 Speaker 1: that spins around, and he'd have his different manuscripts on it, 624 00:30:11,400 --> 00:30:13,880 Speaker 1: and he'd work on whatever manuscript until he was slowing down, 625 00:30:13,880 --> 00:30:15,920 Speaker 1: and then he'd spin it and pick a different manuscript 626 00:30:15,920 --> 00:30:17,880 Speaker 1: and work on that. That's exactly how I work. Whenever 627 00:30:17,880 --> 00:30:20,520 Speaker 1: I'm slowing down on something, I switch projects, and as 628 00:30:20,520 --> 00:30:23,760 Speaker 1: a result, I'm always working at top speed on that project. 629 00:30:23,840 --> 00:30:25,600 Speaker 1: So I think there actually is some benefit to it. 630 00:30:25,720 --> 00:30:28,960 Speaker 1: That's amazing. So it sounds like that's not necessarily 631 00:30:29,000 --> 00:30:31,680 Speaker 1: a bad quality, because you've done it always and you're 632 00:30:31,800 --> 00:30:33,440 Speaker 1: used to it, in terms of taking on a lot 633 00:30:33,480 --> 00:30:36,200 Speaker 1: of things. I mean, for example, I'm in Silicon Valley, 634 00:30:36,200 --> 00:30:39,000 Speaker 1: and, you know, the VCs who invest do not 635 00:30:39,160 --> 00:30:41,640 Speaker 1: like this quality about me. I think they 636 00:30:41,680 --> 00:30:43,880 Speaker 1: would much rather see me as the type of person 637 00:30:43,880 --> 00:30:46,840 Speaker 1: who just wakes up and thinks about this company twenty-four seven. 638 00:30:46,920 --> 00:30:50,120 Speaker 1: And I do think about it essentially twenty-four seven, but I'm 639 00:30:50,120 --> 00:30:51,920 Speaker 1: also doing other things at the same time.
You know, 640 00:30:52,000 --> 00:30:54,360 Speaker 1: it has worked out, but it always feels like one 641 00:30:54,400 --> 00:30:57,080 Speaker 1: of those things where I'm leaning as far forward 642 00:30:57,120 --> 00:30:59,280 Speaker 1: as I can into the future and moving as fast 643 00:30:59,360 --> 00:31:02,000 Speaker 1: as I can on all fronts, and as long as it works, 644 00:31:02,200 --> 00:31:04,800 Speaker 1: that's great. You know, at some point I'm going to, 645 00:31:04,960 --> 00:31:06,640 Speaker 1: whatever it is, break a leg or, 646 00:31:06,720 --> 00:31:09,000 Speaker 1: you know, get diagnosed with something or whatever. And then 647 00:31:09,040 --> 00:31:12,200 Speaker 1: everything's gonna, you know, pile up like a giant car accident. 648 00:31:12,680 --> 00:31:15,040 Speaker 1: What would your life look like if you did slow down? 649 00:31:15,280 --> 00:31:17,200 Speaker 1: You know, I just don't think it's my personality. I've 650 00:31:17,200 --> 00:31:19,800 Speaker 1: actually tried. In fact, when I was in college, I 651 00:31:19,800 --> 00:31:24,320 Speaker 1: had this professor I really thought was wonderful, and he said, look, Eagleman, 652 00:31:24,400 --> 00:31:27,240 Speaker 1: life is like you are a lumberjack, and you can't 653 00:31:27,240 --> 00:31:29,520 Speaker 1: go into the forest and take one thwack at each tree. 654 00:31:29,560 --> 00:31:31,800 Speaker 1: You have to pick your tree and really hit that 655 00:31:31,840 --> 00:31:34,000 Speaker 1: tree. And it sounded so wise, and I 656 00:31:34,040 --> 00:31:37,680 Speaker 1: really liked this guy, and so I tried to change myself. 657 00:31:37,680 --> 00:31:51,160 Speaker 1: But that's just not who I am. David, what would 658 00:31:51,160 --> 00:31:53,720 Speaker 1: be your last meal?
I think I would do a 659 00:31:53,720 --> 00:31:59,240 Speaker 1: protein shake, and it's only because that represents my enthusiasm 660 00:31:59,280 --> 00:32:01,640 Speaker 1: about the next steps, about what's coming next, and I 661 00:32:01,640 --> 00:32:04,160 Speaker 1: want to make sure my body is fit and so 662 00:32:04,240 --> 00:32:05,920 Speaker 1: on for the future. So I might as well go 663 00:32:05,960 --> 00:32:08,239 Speaker 1: out on a high note with my eyes still on 664 00:32:08,280 --> 00:32:10,240 Speaker 1: the horizon. I think that's how I'd like to go out. 665 00:32:10,320 --> 00:32:14,400 Speaker 1: You're still feeding your muscles, and feeding them protein. Yeah, yeah, cool, 666 00:32:14,640 --> 00:32:17,959 Speaker 1: I like that. I like that a lot. What flavor 667 00:32:18,000 --> 00:32:20,680 Speaker 1: protein shake would it be? I think chocolate. Why not? 668 00:32:21,040 --> 00:32:22,480 Speaker 1: I knew you were going to say that. I knew 669 00:32:22,520 --> 00:32:23,840 Speaker 1: you were going to say that. By the way, who 670 00:32:23,880 --> 00:32:32,280 Speaker 1: wouldn't? What question would you most like answered? Years ago, 671 00:32:32,480 --> 00:32:34,880 Speaker 1: in two thousand four, I wrote a cover article for 672 00:32:34,920 --> 00:32:38,800 Speaker 1: Discover magazine called Ten Unsolved Questions of Neuroscience. And what's 673 00:32:38,800 --> 00:32:42,560 Speaker 1: interesting is that those are essentially as unsolved now as 674 00:32:42,600 --> 00:32:46,080 Speaker 1: they were then, with one possible exception, actually. But the 675 00:32:46,120 --> 00:32:49,040 Speaker 1: top question for me is the question of consciousness, which 676 00:32:49,120 --> 00:32:51,440 Speaker 1: is why does it feel like something to be you 677 00:32:51,760 --> 00:32:55,840 Speaker 1: or me?
Because the brain is built of eighty six 678 00:32:55,960 --> 00:33:00,280 Speaker 1: billion neurons, which are the specialized cells of the brain, 679 00:33:00,400 --> 00:33:03,040 Speaker 1: and each of these neurons is, you know, sending information 680 00:33:03,120 --> 00:33:06,160 Speaker 1: back and forth with these electrical spikes, and they're releasing chemicals, 681 00:33:06,200 --> 00:33:08,560 Speaker 1: all kinds of complicated stuff. But fundamentally, it's just 682 00:33:08,600 --> 00:33:11,480 Speaker 1: a big biological machine. It's just doing stuff. It's just, 683 00:33:11,600 --> 00:33:14,160 Speaker 1: you know, sending signals and reacting to signals, and as 684 00:33:14,160 --> 00:33:16,320 Speaker 1: far as we can tell, that's all that's going on. 685 00:33:16,440 --> 00:33:19,520 Speaker 1: Because when somebody damages their brain, we can make very 686 00:33:19,560 --> 00:33:22,880 Speaker 1: particular predictions about what the consequences are going to be. 687 00:33:23,040 --> 00:33:25,800 Speaker 1: It will change their risk aversion or their decision making, 688 00:33:25,920 --> 00:33:29,600 Speaker 1: or their ability to name animals or see colors or 689 00:33:29,720 --> 00:33:33,320 Speaker 1: understand music, you know, super specific things. And so that's 690 00:33:33,360 --> 00:33:35,880 Speaker 1: why when we look at hundreds of years of brain damage, 691 00:33:35,920 --> 00:33:37,920 Speaker 1: we say, all right, look, it appears to just be a 692 00:33:37,960 --> 00:33:40,520 Speaker 1: big machine there. But the question is why does it 693 00:33:40,560 --> 00:33:42,320 Speaker 1: feel like something to be alive? Why do you experience 694 00:33:42,440 --> 00:33:45,920 Speaker 1: the beauty of a sunset or the smell of cinnamon 695 00:33:46,040 --> 00:33:48,400 Speaker 1: or the taste of feta cheese on your tongue?
Why 696 00:33:48,440 --> 00:33:51,480 Speaker 1: aren't we just like, you know, my computer? My laptop 697 00:33:51,520 --> 00:33:54,680 Speaker 1: here is sending lots of signals around back and forth, 698 00:33:54,680 --> 00:33:57,719 Speaker 1: but presumably it's not conscious. And when I watch a YouTube 699 00:33:57,800 --> 00:34:00,040 Speaker 1: video that I think is funny, it presumably doesn't find 700 00:34:00,080 --> 00:34:01,720 Speaker 1: it funny. It's just, you know, it's just 701 00:34:01,760 --> 00:34:03,600 Speaker 1: sending zeros and ones around. And when I shut it 702 00:34:03,600 --> 00:34:06,680 Speaker 1: off at night, it doesn't lament its own death or something. 703 00:34:06,960 --> 00:34:09,279 Speaker 1: So this is the question: how do you build 704 00:34:09,280 --> 00:34:13,279 Speaker 1: a biological machine and have it be self aware? Is 705 00:34:13,320 --> 00:34:16,920 Speaker 1: that the fundament of Possibilianism? Mm, yeah, exactly. 706 00:34:16,920 --> 00:34:20,319 Speaker 1: So for anyone who doesn't know, Possibilianism is 707 00:34:20,360 --> 00:34:23,440 Speaker 1: this movement I started about twelve years ago, which is 708 00:34:23,480 --> 00:34:25,960 Speaker 1: simply a way of me trying to capture what the 709 00:34:26,000 --> 00:34:29,080 Speaker 1: scientific temperament is, where we shine a flashlight around the 710 00:34:29,080 --> 00:34:31,319 Speaker 1: possibility space, we say, look, maybe it's that, maybe it's that, 711 00:34:31,360 --> 00:34:33,719 Speaker 1: maybe it's that.
And the reason I sort of tried 712 00:34:33,760 --> 00:34:37,120 Speaker 1: to articulate this is because when you go into a bookstore, 713 00:34:37,120 --> 00:34:39,640 Speaker 1: all you ever see are the books by the atheists, 714 00:34:39,640 --> 00:34:42,120 Speaker 1: the new atheists, and the books by the fundamentally religious, 715 00:34:42,120 --> 00:34:43,600 Speaker 1: and they're often put on the same table in the 716 00:34:43,600 --> 00:34:46,000 Speaker 1: bookstores so that you can sort of choose your side 717 00:34:46,040 --> 00:34:48,880 Speaker 1: and see what's what. But the truth is that our existence 718 00:34:48,880 --> 00:34:53,200 Speaker 1: in the cosmos is so deeply mysterious that almost certainly 719 00:34:53,600 --> 00:34:56,800 Speaker 1: there's something much more interesting going on that is neither 720 00:34:56,840 --> 00:34:59,200 Speaker 1: of those positions. I think you said it as well, 721 00:34:59,360 --> 00:35:02,520 Speaker 1: the vastness of our ignorance. It is full of potential 722 00:35:02,640 --> 00:35:07,719 Speaker 1: as opposed to full of admonishment. Yeah, when I read that, 723 00:35:08,040 --> 00:35:11,240 Speaker 1: you know, being interested in celebrating the vastness of our ignorance, 724 00:35:11,280 --> 00:35:15,160 Speaker 1: it was actually really dynamic, as opposed to, you dumb dumb. 725 00:35:15,360 --> 00:35:18,840 Speaker 1: It wasn't like, you dumb dumb. Right. The part 726 00:35:18,840 --> 00:35:20,920 Speaker 1: that was surprising is that people want to pick one 727 00:35:20,960 --> 00:35:23,839 Speaker 1: answer and then fight for that and say, okay, this 728 00:35:23,920 --> 00:35:26,839 Speaker 1: is the right answer. Yeah. How many people are in 729 00:35:26,840 --> 00:35:29,719 Speaker 1: your movement? Can I be in it? Yeah? Please, I'd 730 00:35:29,760 --> 00:35:31,640 Speaker 1: love to have you.
You know, the interesting part, you know, 731 00:35:31,680 --> 00:35:33,560 Speaker 1: I wrote my book Sum, which is a book of 732 00:35:33,640 --> 00:35:37,640 Speaker 1: literary fiction, and it's forty stories of what happens after 733 00:35:37,760 --> 00:35:39,319 Speaker 1: we die, and it's all made up. It's all meant 734 00:35:39,320 --> 00:35:41,200 Speaker 1: to be, you know, funny and interesting, and none of 735 00:35:41,200 --> 00:35:43,319 Speaker 1: it's meant to be taken seriously. But the part that 736 00:35:43,480 --> 00:35:45,640 Speaker 1: is meant to be taken seriously is the idea of, wow, 737 00:35:45,680 --> 00:35:47,560 Speaker 1: we really have no idea what this is, what our 738 00:35:47,600 --> 00:35:50,480 Speaker 1: existence is all about here. And that's the meta 739 00:35:50,600 --> 00:35:53,240 Speaker 1: lesson that emerges from the book. And so anyway, 740 00:35:53,280 --> 00:35:56,320 Speaker 1: after I talked about Sum on NPR one day and mentioned 741 00:35:56,360 --> 00:35:58,920 Speaker 1: Possibilianism, it sort of became a thing, and 742 00:35:59,160 --> 00:36:01,799 Speaker 1: people started websites and Facebook groups and stuff like that. 743 00:36:01,920 --> 00:36:03,200 Speaker 1: So I don't know, I haven't really checked in on it in 744 00:36:03,239 --> 00:36:05,080 Speaker 1: a while, but I'm glad to see it's moving. I 745 00:36:05,160 --> 00:36:06,960 Speaker 1: like it. I think it's great. Now I've got to 746 00:36:07,000 --> 00:36:09,919 Speaker 1: read Sum as well.
I have conversations based on something 747 00:36:09,960 --> 00:36:13,120 Speaker 1: that my mother has coined post-death, a phrase 748 00:36:13,120 --> 00:36:16,400 Speaker 1: which is called brain share, because, as she's said to 749 00:36:16,440 --> 00:36:19,240 Speaker 1: me in our conversations, she doesn't have a brain anymore, 750 00:36:19,280 --> 00:36:21,279 Speaker 1: which is a huge relief, but she has to use 751 00:36:21,320 --> 00:36:24,799 Speaker 1: mine so that I can feel her thoughts now. And 752 00:36:25,040 --> 00:36:26,719 Speaker 1: it's so funny because a friend of mine was like, well, 753 00:36:26,760 --> 00:36:29,000 Speaker 1: isn't that just your brain? Isn't that just a function 754 00:36:29,040 --> 00:36:31,560 Speaker 1: of your grief? Because she died only a year ago. 755 00:36:31,719 --> 00:36:34,080 Speaker 1: And I said, well, does it really matter? I don't 756 00:36:34,080 --> 00:36:36,799 Speaker 1: really know. I'll never know. It doesn't matter if I 757 00:36:36,840 --> 00:36:38,640 Speaker 1: know or if I don't know. But I hear her 758 00:36:38,719 --> 00:36:42,560 Speaker 1: voice very specifically, and we have these conversations which are 759 00:36:42,719 --> 00:36:45,680 Speaker 1: so fascinating to pick over. They're not just comforting, 760 00:36:45,920 --> 00:36:49,160 Speaker 1: they're strange, because there's clearly an evolution either of my 761 00:36:49,280 --> 00:36:51,360 Speaker 1: idea of her since she died, or of her since 762 00:36:51,360 --> 00:36:53,920 Speaker 1: she died. It's different enough that I recognize her, 763 00:36:53,960 --> 00:36:55,880 Speaker 1: but it's another version of her. Yeah. And you know, 764 00:36:55,880 --> 00:36:58,239 Speaker 1: one of those fascinating things is that the job of 765 00:36:58,320 --> 00:37:02,320 Speaker 1: the brain is to construct these internal models of other people.
766 00:37:02,400 --> 00:37:05,719 Speaker 1: So you have an enormous number of models in your head, 767 00:37:05,880 --> 00:37:08,359 Speaker 1: and you have thousands of these, you know, like, oh, 768 00:37:08,400 --> 00:37:10,560 Speaker 1: your neighbor from down the street years ago, and oh, 769 00:37:10,600 --> 00:37:13,160 Speaker 1: your college roommate and so on. You've got little models. 770 00:37:13,280 --> 00:37:16,200 Speaker 1: Some are more sophisticated than others. So for your model 771 00:37:16,239 --> 00:37:18,840 Speaker 1: of your mother, you're devoting a lot of neural 772 00:37:18,920 --> 00:37:21,279 Speaker 1: real estate to that. Actually you've got a very rich 773 00:37:21,320 --> 00:37:23,960 Speaker 1: model of her. Other models are thinner, you know, 774 00:37:24,040 --> 00:37:26,600 Speaker 1: your barista at Starbucks or someone you don't know that well, 775 00:37:26,680 --> 00:37:28,640 Speaker 1: and you have to make lots of assumptions. But the 776 00:37:28,640 --> 00:37:31,480 Speaker 1: thing that has always struck me as fascinating is, you know, 777 00:37:31,560 --> 00:37:34,600 Speaker 1: in neuroscience, my field, we've essentially spent all 778 00:37:34,640 --> 00:37:36,520 Speaker 1: the time studying, okay, how does vision work, how does 779 00:37:36,600 --> 00:37:38,400 Speaker 1: hearing work, how does decision making work, and so on. 780 00:37:38,560 --> 00:37:41,359 Speaker 1: But the part that's gone underappreciated there is how 781 00:37:41,440 --> 00:37:44,600 Speaker 1: social brains are. Brains are all about other brains, and 782 00:37:44,640 --> 00:37:47,640 Speaker 1: so this is sort of an emerging field called social neuroscience.
783 00:37:47,680 --> 00:37:50,520 Speaker 1: But the point is that a huge amount of the 784 00:37:50,600 --> 00:37:53,600 Speaker 1: territory of your brain is there just to simulate your 785 00:37:53,640 --> 00:37:57,160 Speaker 1: mother and your father and everybody you've ever known. Wow, 786 00:37:57,360 --> 00:38:01,800 Speaker 1: all right, I'm gonna be thinking about that for a long time, David. 787 00:38:01,920 --> 00:38:05,960 Speaker 1: I'm honestly just so chuffed, as we say in England, 788 00:38:06,080 --> 00:38:08,040 Speaker 1: to talk to you. I can't thank you enough for 789 00:38:08,080 --> 00:38:10,239 Speaker 1: your time. Well, thank you, Minnie, it's been such a 790 00:38:10,280 --> 00:38:12,920 Speaker 1: pleasure to talk with you. Thank you very very much. 791 00:38:14,480 --> 00:38:18,279 Speaker 1: Be sure to check out David's books, including Sum, Incognito, 792 00:38:18,480 --> 00:38:22,080 Speaker 1: and his most recent book, Livewired: The Inside Story 793 00:38:22,080 --> 00:38:24,880 Speaker 1: of the Ever-Changing Brain, which was also nominated for 794 00:38:24,880 --> 00:38:28,160 Speaker 1: a Pulitzer Prize. Livewired explores not only what the 795 00:38:28,200 --> 00:38:31,680 Speaker 1: brain is, but also what the brain does, and when 796 00:38:31,680 --> 00:38:33,680 Speaker 1: you sit down to read it, please feel free to 797 00:38:33,719 --> 00:38:36,440 Speaker 1: imagine you're having a cup of coffee at IHOP, sitting 798 00:38:36,480 --> 00:38:39,880 Speaker 1: next to David while he's busy writing five other books. 799 00:38:42,160 --> 00:38:45,440 Speaker 1: Minnie Questions is hosted and written by me, Minnie Driver, 800 00:38:46,000 --> 00:38:53,600 Speaker 1: supervising producer Aaron Kaufman, producer Morgan Levoy, research assistant Marissa Brown. 801 00:38:54,880 --> 00:39:00,560 Speaker 1: Original music, Sorry Baby, by Minnie Driver.
Additional music by 802 00:39:00,600 --> 00:39:05,800 Speaker 1: Aaron Kaufman, executive produced by me, Minnie Driver. Special thanks 803 00:39:06,000 --> 00:39:11,680 Speaker 1: to Jim Nikolay, Will Pearson, Addison No Day, Lisa Castella 804 00:39:11,800 --> 00:39:17,000 Speaker 1: and Annicke Oppenheim at w kPr, de La Pescador, Kate 805 00:39:17,080 --> 00:39:21,600 Speaker 1: Driver and Jason Weinberg, and for constantly solicited tech support, 806 00:39:22,040 --> 00:39:22,840 Speaker 1: Henry Driver