1 00:00:06,040 --> 00:00:08,160 Speaker 1: Hey, welcome to Stuff to Blow Your Mind. My name 2 00:00:08,240 --> 00:00:09,400 Speaker 1: is Robert Lamb. 3 00:00:09,280 --> 00:00:12,120 Speaker 2: And I am Joe McCormick. And today we are bringing 4 00:00:12,160 --> 00:00:15,840 Speaker 2: you an episode from the Vault. This one originally published 5 00:00:16,079 --> 00:00:20,080 Speaker 2: April thirteenth, twenty twenty three, and this was our conversation 6 00:00:20,239 --> 00:00:25,080 Speaker 2: with David Eagleman, neuroscientist and host of the Inner Cosmos podcast. 7 00:00:25,400 --> 00:00:28,120 Speaker 2: I remember this was a very interesting chat. We hope 8 00:00:28,120 --> 00:00:28,720 Speaker 2: you enjoy it. 9 00:00:32,080 --> 00:00:38,560 Speaker 3: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio. 10 00:00:41,880 --> 00:00:44,400 Speaker 1: Hey, welcome to Stuff to Blow Your Mind. My name is... 11 00:00:44,440 --> 00:00:47,800 Speaker 2: Robert Lamb, and I'm Joe McCormick, and hey, welcome back, Rob. 12 00:00:48,320 --> 00:00:50,199 Speaker 2: You were out sick earlier this week. It's good to 13 00:00:50,200 --> 00:00:50,680 Speaker 2: have you back. 14 00:00:51,040 --> 00:00:52,720 Speaker 4: It's good to be back now. 15 00:00:53,000 --> 00:00:56,000 Speaker 2: Because you were out sick, we ended up putting a pause 16 00:00:56,240 --> 00:00:59,680 Speaker 2: on an ongoing series we were doing on childhood amnesia. 17 00:01:00,600 --> 00:01:04,160 Speaker 2: We ended up running a Vault episode on Tuesday, and 18 00:01:04,280 --> 00:01:06,399 Speaker 2: I just wanted to assure people that we will be 19 00:01:06,520 --> 00:01:09,000 Speaker 2: coming back to that subject.
We will be resuming the series, 20 00:01:09,040 --> 00:01:12,959 Speaker 2: probably for next week's core episodes, but because we already 21 00:01:13,000 --> 00:01:15,440 Speaker 2: had it scheduled out this way, today's episode is going 22 00:01:15,480 --> 00:01:18,880 Speaker 2: to be an interview. So in fact, we are talking 23 00:01:18,920 --> 00:01:23,360 Speaker 2: to a return guest, the neuroscientist and author David Eagleman. 24 00:01:23,520 --> 00:01:26,520 Speaker 2: This is actually the second time David has been a 25 00:01:26,520 --> 00:01:29,800 Speaker 2: guest on the show. In September twenty twenty, Rob, you 26 00:01:30,040 --> 00:01:32,640 Speaker 2: spoke to him about his book Livewired, which is 27 00:01:32,680 --> 00:01:35,920 Speaker 2: a popular science book on the subject of brain plasticity. 28 00:01:36,319 --> 00:01:38,720 Speaker 1: Yeah. Absolutely. That was a fun episode. You can find 29 00:01:38,720 --> 00:01:41,200 Speaker 1: it in the archives, and the book Livewired is 30 00:01:41,360 --> 00:01:43,880 Speaker 1: an absolute delight. If you're at all interested in anything 31 00:01:44,240 --> 00:01:47,680 Speaker 1: you hear us discussing in this episode, pick up a 32 00:01:47,720 --> 00:01:48,560 Speaker 1: copy of it. 33 00:01:48,320 --> 00:01:49,000 Speaker 4: It's great.
41 00:02:11,960 --> 00:02:15,440 Speaker 1: That's right. David Eagleman is a neuroscientist at Stanford University 42 00:02:15,520 --> 00:02:18,560 Speaker 1: and an internationally best selling author. He's the co founder 43 00:02:18,800 --> 00:02:22,840 Speaker 1: of two venture backed companies, Neosensory and BrainCheck, and 44 00:02:22,880 --> 00:02:25,200 Speaker 1: he also directs the Center for Science and Law, a 45 00:02:25,320 --> 00:02:29,120 Speaker 1: national nonprofit institute. He's best known for his work on 46 00:02:29,400 --> 00:02:35,600 Speaker 1: sensory substitution, time perception, brain plasticity, synesthesia, and neurolaw. He 47 00:02:35,720 --> 00:02:38,720 Speaker 1: is the writer and presenter of the international PBS series 48 00:02:38,720 --> 00:02:41,080 Speaker 1: The Brain with David Eagleman and the author of the 49 00:02:41,080 --> 00:02:44,200 Speaker 1: companion book, The Brain: The Story of You. He's also 50 00:02:44,200 --> 00:02:47,280 Speaker 1: the writer and presenter of The Creative Brain on Netflix. 51 00:02:47,639 --> 00:02:50,120 Speaker 2: David Eagleman is the author of over one hundred and 52 00:02:50,160 --> 00:02:54,040 Speaker 2: twenty academic publications and many, many books of popular science. 53 00:02:54,520 --> 00:02:58,040 Speaker 2: Eagleman is a TED speaker, a Guggenheim Fellow, and serves 54 00:02:58,080 --> 00:03:01,480 Speaker 2: on several boards, including the American Brain Foundation and the 55 00:03:01,560 --> 00:03:05,520 Speaker 2: Long Now Foundation. He's the Chief Scientific Advisor of the 56 00:03:05,680 --> 00:03:09,480 Speaker 2: Mind Science Foundation and the winner of the Claude Shannon 57 00:03:09,560 --> 00:03:13,320 Speaker 2: Luminary Award from Bell Labs and the McGovern Award for 58 00:03:13,400 --> 00:03:18,080 Speaker 2: Excellence in Biomedical Communication. He serves as the academic editor 59 00:03:18,120 --> 00:03:20,639 Speaker 2: for the Journal of Science and Law.
He was named Science 60 00:03:20,760 --> 00:03:24,000 Speaker 2: Educator of the Year by the Society for Neuroscience and 61 00:03:24,200 --> 00:03:27,480 Speaker 2: was featured as one of the quote Brightest Idea Guys 62 00:03:27,560 --> 00:03:31,680 Speaker 2: by Italy's Style magazine. He served as the scientific advisor 63 00:03:31,720 --> 00:03:35,800 Speaker 2: on several TV shows, including Westworld and Perception, and has 64 00:03:35,800 --> 00:03:39,480 Speaker 2: been profiled on The Colbert Report, Nova Science Now, The 65 00:03:39,520 --> 00:03:43,160 Speaker 2: New Yorker, CNN's Next List, and many other venues. He 66 00:03:43,200 --> 00:03:47,320 Speaker 2: appears regularly on radio and television to discuss literature and science, 67 00:03:47,600 --> 00:03:49,880 Speaker 2: and I guess now he is going to start having 68 00:03:50,040 --> 00:03:54,640 Speaker 2: to add podcasts to the end of his bio here. So, Rob, 69 00:03:54,680 --> 00:03:56,480 Speaker 2: unless you have anything else, I think we should jump 70 00:03:56,560 --> 00:03:58,600 Speaker 2: right into our conversation with David Eagleman. 71 00:04:01,640 --> 00:04:03,440 Speaker 1: Hi, David, welcome back to the show. 72 00:04:03,480 --> 00:04:05,720 Speaker 4: Great, thanks, Rob, for having me again. It's a pleasure 73 00:04:05,720 --> 00:04:08,119 Speaker 4: to be here. And hello, Joe, it's great to meet you. 74 00:04:08,440 --> 00:04:11,040 Speaker 1: David, now you have a great new podcast series out, Inner 75 00:04:11,080 --> 00:04:15,280 Speaker 1: Cosmos, through iHeart. How did you decide what path to 76 00:04:15,360 --> 00:04:17,200 Speaker 1: take with the podcast format? 77 00:04:18,320 --> 00:04:21,560 Speaker 4: You know, it's a great question. The truth is I 78 00:04:21,600 --> 00:04:26,000 Speaker 4: had not listened to many podcasts at all. Now I have.
79 00:04:26,160 --> 00:04:28,480 Speaker 4: But when I first was putting this together with iHeart, 80 00:04:28,520 --> 00:04:30,839 Speaker 4: I thought, look, I want to do a forty five 81 00:04:30,839 --> 00:04:34,680 Speaker 4: minute to hour long monologue every week. And that seemed 82 00:04:34,720 --> 00:04:38,240 Speaker 4: like a terrific idea at first, and then my wife 83 00:04:38,279 --> 00:04:39,880 Speaker 4: said she was going to kill me because it turns 84 00:04:39,880 --> 00:04:42,480 Speaker 4: out that's a ton of work. It takes me about 85 00:04:42,800 --> 00:04:46,000 Speaker 4: twelve hours a week to get a 86 00:04:46,040 --> 00:04:50,719 Speaker 4: good monologue that's almost an hour long. So that's how 87 00:04:50,800 --> 00:04:54,520 Speaker 4: I decided on the format, because I thought it would 88 00:04:54,680 --> 00:04:58,400 Speaker 4: be something special rather than, you know, I've been on 89 00:04:58,600 --> 00:05:01,480 Speaker 4: many different podcasts where we're doing interviews just like this, 90 00:05:01,600 --> 00:05:04,080 Speaker 4: and it's super fun. But I wanted to do something different. 91 00:05:04,680 --> 00:05:08,159 Speaker 4: So that's how I accidentally stumbled into that format. 92 00:05:08,560 --> 00:05:10,839 Speaker 2: So I figure we should give people a taste of 93 00:05:10,839 --> 00:05:12,880 Speaker 2: the kind of things you talk about on your show. 94 00:05:13,240 --> 00:05:15,200 Speaker 2: I got a chance to listen to the episode you 95 00:05:15,240 --> 00:05:18,520 Speaker 2: did about memory and the perception of time, and I 96 00:05:18,520 --> 00:05:20,560 Speaker 2: thought it was great, by the way, a really great 97 00:05:20,560 --> 00:05:23,600 Speaker 2: way to kick off the show.
So your starting premise 98 00:05:23,800 --> 00:05:26,599 Speaker 2: in that episode was that many people who have been 99 00:05:26,640 --> 00:05:30,200 Speaker 2: through intense or life threatening events, maybe falling off of 100 00:05:30,240 --> 00:05:33,839 Speaker 2: a building or seeing a car speeding toward them, report 101 00:05:33,920 --> 00:05:38,479 Speaker 2: afterwards that time seemed to have somehow slowed down for 102 00:05:38,600 --> 00:05:41,440 Speaker 2: them during the pivotal few seconds, almost as if they 103 00:05:41,440 --> 00:05:44,560 Speaker 2: were able to enter a state of slow motion or 104 00:05:44,600 --> 00:05:47,840 Speaker 2: bullet time, like from The Matrix. Can you talk a 105 00:05:47,880 --> 00:05:50,760 Speaker 2: bit about your research on this subject and what you 106 00:05:50,880 --> 00:05:52,360 Speaker 2: discovered about this perception? 107 00:05:52,960 --> 00:05:55,520 Speaker 4: Yeah, so my research, of course, started off very personal, 108 00:05:55,560 --> 00:05:57,320 Speaker 4: which is that I fell off of a house and 109 00:05:57,360 --> 00:05:59,760 Speaker 4: it seemed like things took a long time, and that's 110 00:05:59,800 --> 00:06:02,120 Speaker 4: what got me interested in this. Then when I got older, 111 00:06:02,160 --> 00:06:06,400 Speaker 4: I became a neuroscientist. Eventually I realized that I was 112 00:06:06,480 --> 00:06:10,400 Speaker 4: hearing this story not uncommonly from people who had been 113 00:06:10,440 --> 00:06:14,200 Speaker 4: in a, you know, gunfight or some scary situation or 114 00:06:14,279 --> 00:06:17,240 Speaker 4: car accident or whatever, and they felt that things took longer. 115 00:06:17,360 --> 00:06:19,080 Speaker 4: And so I looked in the literature and there was 116 00:06:19,120 --> 00:06:22,640 Speaker 4: not anything on this.
So that's when I came to 117 00:06:22,640 --> 00:06:25,279 Speaker 4: realize I was going to have to do this myself 118 00:06:25,279 --> 00:06:27,720 Speaker 4: and figure out how to run an experiment on this. 119 00:06:27,880 --> 00:06:31,599 Speaker 4: So you know, briefly, what I did is I built 120 00:06:31,640 --> 00:06:34,240 Speaker 4: a device that I hooked to people's wrists that flashed 121 00:06:34,279 --> 00:06:36,919 Speaker 4: information at them in a certain way so that 122 00:06:36,960 --> 00:06:41,200 Speaker 4: I could tell how rapidly their brain was perceiving, and 123 00:06:41,240 --> 00:06:44,000 Speaker 4: that way I could test whether they were actually seeing 124 00:06:44,040 --> 00:06:47,440 Speaker 4: in slow motion or the whole thing was a trick 125 00:06:47,480 --> 00:06:51,360 Speaker 4: of memory, meaning when you were in an intense situation, 126 00:06:51,480 --> 00:06:53,359 Speaker 4: you wrote down more memories. So when you said what 127 00:06:53,480 --> 00:06:56,320 Speaker 4: just happened, what just happened, it seemed like it 128 00:06:56,400 --> 00:06:59,080 Speaker 4: must have taken longer because you have all these memories. 129 00:06:59,400 --> 00:07:01,720 Speaker 4: So what I did then is dropped people from a one 130 00:07:01,800 --> 00:07:05,280 Speaker 4: hundred and fifty foot tall tower in free fall backwards 131 00:07:05,279 --> 00:07:09,880 Speaker 4: into a net below, and I measured their perception of 132 00:07:09,920 --> 00:07:11,720 Speaker 4: time on the way down this way. And what I 133 00:07:11,800 --> 00:07:15,280 Speaker 4: found after, you know, I did it myself, but then 134 00:07:15,320 --> 00:07:20,080 Speaker 4: we dropped twenty three participants.
What I found is that 135 00:07:20,240 --> 00:07:23,080 Speaker 4: it is in fact a trick of memory, which is 136 00:07:23,120 --> 00:07:25,800 Speaker 4: to say, when everything is hitting the fan, your brain 137 00:07:25,840 --> 00:07:28,720 Speaker 4: writes down much denser memory, and when you read that 138 00:07:28,760 --> 00:07:31,000 Speaker 4: back out, your brain has to make an assumption about, 139 00:07:31,960 --> 00:07:35,120 Speaker 4: you know, how much memory, how much footage maps onto 140 00:07:35,120 --> 00:07:38,440 Speaker 4: how much time, and so it says, oh wow, that 141 00:07:38,480 --> 00:07:40,280 Speaker 4: must have been five seconds, even though it was only 142 00:07:40,320 --> 00:07:42,640 Speaker 4: one second's worth. But the point is people were not 143 00:07:42,800 --> 00:07:45,280 Speaker 4: able to see in slow motion, which, by the way, 144 00:07:45,400 --> 00:07:47,760 Speaker 4: was disappointing for me because I already had, I was 145 00:07:47,760 --> 00:07:50,920 Speaker 4: already talking with the military about building cockpits in a 146 00:07:50,960 --> 00:07:54,440 Speaker 4: way that flashed information more rapidly at people when they'd 147 00:07:54,720 --> 00:07:57,600 Speaker 4: be in some intense situation. But it turns out none 148 00:07:57,680 --> 00:07:59,800 Speaker 4: of that makes a difference. You can't actually get information 149 00:08:00,120 --> 00:08:04,160 Speaker 4: there faster, you can only remember it faster. 150 00:08:04,200 --> 00:08:09,840 Speaker 2: So there's not actually any increased ability of perception. It's 151 00:08:10,000 --> 00:08:12,000 Speaker 2: just a trick of the memory exactly. 152 00:08:12,360 --> 00:08:14,800 Speaker 4: Now, it is the case that, you know, a lot.
153 00:08:15,040 --> 00:08:17,680 Speaker 4: You can do a lot of things pre-consciously, by 154 00:08:17,720 --> 00:08:21,520 Speaker 4: which I mean your conscious awareness of something is always 155 00:08:21,520 --> 00:08:24,120 Speaker 4: the slowest thing on the ladder to ever get any information. 156 00:08:24,240 --> 00:08:26,680 Speaker 4: So by the time your brain puts together all the 157 00:08:26,680 --> 00:08:30,400 Speaker 4: signals and says, okay, this is what just happened, you know, 158 00:08:30,440 --> 00:08:33,480 Speaker 4: that's at least half a second to a second behind 159 00:08:33,640 --> 00:08:37,480 Speaker 4: real time. But the point is your body can react 160 00:08:37,559 --> 00:08:39,880 Speaker 4: much faster than that. Your body can get signals and say, whoa, 161 00:08:39,920 --> 00:08:42,120 Speaker 4: I got to do something about this right away. And 162 00:08:42,200 --> 00:08:45,680 Speaker 4: so you can react, you know, often much faster than 163 00:08:45,720 --> 00:08:47,920 Speaker 4: you can be consciously aware. So you know, I don't 164 00:08:47,920 --> 00:08:49,880 Speaker 4: know if you've been on a... I mean, this is 165 00:08:49,880 --> 00:08:51,480 Speaker 4: what happened to me recently. I was on a hike 166 00:08:51,559 --> 00:08:54,280 Speaker 4: with a friend and a branch snapped back, and I was, 167 00:08:54,600 --> 00:08:57,040 Speaker 4: you know, halfway into the move of ducking out of 168 00:08:57,080 --> 00:08:59,480 Speaker 4: the way of the branch before I consciously realized it, 169 00:08:59,640 --> 00:09:02,840 Speaker 4: or my foot gets halfway to the brake of my 170 00:09:02,880 --> 00:09:05,559 Speaker 4: truck before I realized that there's a car pulling out 171 00:09:05,600 --> 00:09:07,959 Speaker 4: of the driveway ahead of me.
In other words, consciousness 172 00:09:08,000 --> 00:09:09,679 Speaker 4: is always the last guy on the ladder to get 173 00:09:09,720 --> 00:09:14,400 Speaker 4: any information, and your body can almost always react much faster. 174 00:09:15,480 --> 00:09:17,280 Speaker 4: In fact, wait, let me just say one more example 175 00:09:17,280 --> 00:09:19,520 Speaker 4: of that, which is when I was younger, I used 176 00:09:19,520 --> 00:09:23,320 Speaker 4: to play baseball, and my experience was always that, you know, 177 00:09:23,360 --> 00:09:27,439 Speaker 4: I'd be waiting for the pitch, and then I would 178 00:09:27,679 --> 00:09:32,080 Speaker 4: realize after it had happened that I had already hit 179 00:09:32,120 --> 00:09:34,760 Speaker 4: the ball. And I would consciously realize, oh, I have 180 00:09:35,000 --> 00:09:37,199 Speaker 4: just hit the ball. Now throw the bat and run. 181 00:09:38,559 --> 00:09:41,480 Speaker 4: But you know, the whole thing, the ball moving from 182 00:09:41,480 --> 00:09:44,240 Speaker 4: the mound to the plate, and the swing and the batting, 183 00:09:44,280 --> 00:09:47,680 Speaker 4: that's all a really fast process and it often happens pre-consciously. 184 00:09:48,280 --> 00:09:53,680 Speaker 2: Somewhat related to that, this raises questions about the different 185 00:09:53,840 --> 00:09:58,160 Speaker 2: kinds of circumstances that would favor the perception of slow 186 00:09:58,200 --> 00:10:00,560 Speaker 2: motion in intense situations or not. 187 00:10:01,360 --> 00:10:01,440 Speaker 3: So.
188 00:10:01,800 --> 00:10:04,440 Speaker 2: My example was there was one night years ago I 189 00:10:04,520 --> 00:10:07,560 Speaker 2: was driving under an overpass and there was a sudden 190 00:10:07,600 --> 00:10:10,880 Speaker 2: deafening sound and a shudder. And what my wife and 191 00:10:10,920 --> 00:10:14,520 Speaker 2: I deduced later was that somehow like a brick had 192 00:10:14,559 --> 00:10:16,600 Speaker 2: fallen from above and hit the roof of our car 193 00:10:16,760 --> 00:10:19,440 Speaker 2: just above the windshield as we passed under at highway 194 00:10:19,440 --> 00:10:21,880 Speaker 2: speed. And I don't know if somebody threw it or 195 00:10:21,880 --> 00:10:24,400 Speaker 2: if it somehow just fell. But I not only don't 196 00:10:24,440 --> 00:10:27,439 Speaker 2: recall a feeling of stretched time or a greater density 197 00:10:27,440 --> 00:10:31,360 Speaker 2: of memories right before and after the impact, I felt 198 00:10:31,400 --> 00:10:35,840 Speaker 2: almost a kind of retrospective amnesia, like a real paucity 199 00:10:35,920 --> 00:10:39,079 Speaker 2: of detail. And it was like we were suddenly a 200 00:10:39,160 --> 00:10:42,559 Speaker 2: good ways down the road and just trying to remember 201 00:10:42,720 --> 00:10:44,040 Speaker 2: or figure out what had happened. 202 00:10:45,120 --> 00:10:48,040 Speaker 4: Yeah, that's exactly right. It's because you didn't write down 203 00:10:48,040 --> 00:10:50,520 Speaker 4: any memory. And this is generally because as you are 204 00:10:50,600 --> 00:10:53,680 Speaker 4: taking a drive down the highway, your brain is writing 205 00:10:53,720 --> 00:10:58,000 Speaker 4: down very little of the stuff going on. In fact, the interesting 206 00:10:58,040 --> 00:11:01,200 Speaker 4: part is that although we think about memory as being 207 00:11:01,240 --> 00:11:03,520 Speaker 4: like a video recorder or something, in fact it's nothing 208 00:11:03,640 --> 00:11:06,840 Speaker 4: like that.
You write down very little of what happens 209 00:11:06,840 --> 00:11:08,840 Speaker 4: in your life, especially when you're driving on a road 210 00:11:08,880 --> 00:11:12,080 Speaker 4: you've been down before. So what happened is there's the 211 00:11:12,080 --> 00:11:15,079 Speaker 4: deafening crash and suddenly you're thinking what just happened? What 212 00:11:15,160 --> 00:11:17,439 Speaker 4: just happened? And you've got nothing to draw on. There's 213 00:11:17,520 --> 00:11:21,000 Speaker 4: just no footage there. And by the way, just as 214 00:11:21,040 --> 00:11:22,679 Speaker 4: a very quick side note, I think this is what 215 00:11:22,760 --> 00:11:26,360 Speaker 4: happens to people when they are high on marijuana, is 216 00:11:26,400 --> 00:11:28,360 Speaker 4: they say, oh, my gosh, how long have I been 217 00:11:28,400 --> 00:11:30,680 Speaker 4: standing here? It feels like I'm standing here forever. And 218 00:11:30,760 --> 00:11:34,600 Speaker 4: it's because they're not writing down memories in the same way. 219 00:11:34,640 --> 00:11:37,320 Speaker 4: So when their brain looks for how long have I 220 00:11:37,360 --> 00:11:41,559 Speaker 4: been standing here, what it's looking for is footage in time, 221 00:11:41,720 --> 00:11:44,720 Speaker 4: as in, okay, I remember getting here, I remember this happening, 222 00:11:44,760 --> 00:11:46,760 Speaker 4: someone said this, then someone put the glass down and 223 00:11:46,760 --> 00:11:48,720 Speaker 4: blah blah blah, so that it can estimate how long 224 00:11:48,760 --> 00:11:51,719 Speaker 4: it's been there.
But suddenly it can't grab on to 225 00:11:51,880 --> 00:11:55,040 Speaker 4: any memories at all, and so suddenly people are lost 226 00:11:55,559 --> 00:12:00,000 Speaker 4: in time anyway. So this is, Joe, exactly what happens 227 00:12:00,160 --> 00:12:04,320 Speaker 4: when people suddenly are hit by a car that 228 00:12:04,360 --> 00:12:06,240 Speaker 4: they don't see coming, like a car T-bones them 229 00:12:06,360 --> 00:12:09,800 Speaker 4: or something. Or, I might have mentioned in the podcast, 230 00:12:09,840 --> 00:12:12,000 Speaker 4: I can't remember, but, you know, I was once riding 231 00:12:12,040 --> 00:12:15,920 Speaker 4: my bike and the wheel suddenly dropped in a pothole 232 00:12:15,960 --> 00:12:18,440 Speaker 4: and I went flying over the handlebars. But because I 233 00:12:18,520 --> 00:12:22,160 Speaker 4: didn't see that coming, I just had the sensation of suddenly, 234 00:12:22,240 --> 00:12:23,320 Speaker 4: oh my god, what is... you know? 235 00:12:23,320 --> 00:12:23,480 Speaker 1: Here? 236 00:12:23,480 --> 00:12:25,680 Speaker 4: I'm lying on the asphalt, bloody, and I have no 237 00:12:25,720 --> 00:12:29,079 Speaker 4: idea what just happened. And it's precisely because you're not 238 00:12:29,160 --> 00:12:31,360 Speaker 4: writing down any memories. So when your brain says what 239 00:12:31,520 --> 00:12:33,760 Speaker 4: just happened, what just happened, there's nothing to draw on, 240 00:12:34,400 --> 00:12:37,000 Speaker 4: as opposed to the car sliding on ice towards the 241 00:12:37,000 --> 00:12:40,120 Speaker 4: brick wall phenomenon, which is where you say, oh my gosh, 242 00:12:40,240 --> 00:12:43,160 Speaker 4: I'm predicting what's going to happen and this is really 243 00:12:43,160 --> 00:12:44,920 Speaker 4: gonna hurt, this is gonna be bad. And that's when 244 00:12:44,960 --> 00:12:46,360 Speaker 4: you're writing down lots of stuff.
245 00:12:46,720 --> 00:12:48,920 Speaker 2: So would you say you're more likely to have this 246 00:12:49,760 --> 00:12:55,120 Speaker 2: memory density perception, number one, if you see the event 247 00:12:55,320 --> 00:12:58,640 Speaker 2: coming ahead of time, there's expectation of it. But 248 00:12:58,840 --> 00:13:03,680 Speaker 2: also if you're generally in a novel or unusual situation? 249 00:13:03,720 --> 00:13:06,920 Speaker 4: Yeah, that's exactly right, actually. So two aspects of that. 250 00:13:06,960 --> 00:13:09,280 Speaker 4: One is that I just mentioned a moment ago that 251 00:13:09,360 --> 00:13:12,480 Speaker 4: you write down very little memory, and that's because as 252 00:13:12,559 --> 00:13:15,719 Speaker 4: an adult now, your brain has sort of figured out 253 00:13:15,760 --> 00:13:18,319 Speaker 4: a pretty good model of the world, meaning you don't 254 00:13:18,360 --> 00:13:20,760 Speaker 4: need to write stuff down because you've seen all the 255 00:13:20,760 --> 00:13:25,360 Speaker 4: personalities before, you've seen different cities before, you've seen roads, 256 00:13:25,360 --> 00:13:28,320 Speaker 4: and people and events and television shows, and you sort 257 00:13:28,360 --> 00:13:32,200 Speaker 4: of got it. But if something really novel happens, that's 258 00:13:32,200 --> 00:13:34,600 Speaker 4: when your brain writes something down and says, whoa, wait 259 00:13:34,640 --> 00:13:38,000 Speaker 4: a minute, I'm surprised. And that's when stuff starts getting 260 00:13:38,000 --> 00:13:40,199 Speaker 4: written down. So when you look back at the end 261 00:13:40,240 --> 00:13:43,040 Speaker 4: of, let's say, a novel event.
Let's say you go 262 00:13:43,120 --> 00:13:46,640 Speaker 4: on some really wild trip on the weekend to the Galapagos 263 00:13:46,679 --> 00:13:49,160 Speaker 4: Islands and you see new things and so on, then 264 00:13:49,200 --> 00:13:51,600 Speaker 4: it seems like forever since you were at work on Friday. 265 00:13:51,840 --> 00:13:53,840 Speaker 4: But if you just go off for a normal weekend 266 00:13:54,160 --> 00:13:55,680 Speaker 4: and you come back to work, you think, oh, I 267 00:13:55,720 --> 00:13:57,720 Speaker 4: was just here, because you didn't lay down any new 268 00:13:57,760 --> 00:14:03,080 Speaker 4: memories over the weekend. So it is true that things 269 00:14:03,080 --> 00:14:06,440 Speaker 4: that are novel generally seem to last longer. However, it 270 00:14:06,440 --> 00:14:10,319 Speaker 4: should be noted that when things are actually life threatening, 271 00:14:10,800 --> 00:14:14,520 Speaker 4: you have essentially an emergency response memory system that kicks 272 00:14:14,520 --> 00:14:18,520 Speaker 4: into gear. That is a secondary track on which you 273 00:14:18,559 --> 00:14:22,000 Speaker 4: write down memory. And this is underpinned by a part 274 00:14:22,000 --> 00:14:24,760 Speaker 4: of the brain called the amygdala, and its job is 275 00:14:24,800 --> 00:14:28,840 Speaker 4: to say, whoa, everything is going really bad and scary here, 276 00:14:29,200 --> 00:14:30,960 Speaker 4: and I got to write this down, because that, after 277 00:14:31,000 --> 00:14:33,720 Speaker 4: all, is the point of memory: to make sure 278 00:14:33,720 --> 00:14:37,000 Speaker 4: that you write down stuff that is important and specifically 279 00:14:37,120 --> 00:14:38,600 Speaker 4: life threateningly important. 280 00:14:38,880 --> 00:14:41,960 Speaker 2: So if the normal memory system, would that be the 281 00:14:42,160 --> 00:14:46,520 Speaker 2: hippocampal memory system exactly?
If that's the normal memory system, 282 00:14:46,520 --> 00:14:51,160 Speaker 2: and then the amygdala tends to be recruited in intense situations, 283 00:14:51,600 --> 00:14:55,200 Speaker 2: do we know generally if there is any, if there 284 00:14:55,200 --> 00:15:00,120 Speaker 2: are any, characteristic differences between how memories are recorded in 285 00:15:00,160 --> 00:15:01,920 Speaker 2: the hippocampus versus the amygdala? 286 00:15:02,600 --> 00:15:06,400 Speaker 4: Yeah. And it turns out it's a tragic one, which 287 00:15:06,440 --> 00:15:12,960 Speaker 4: is that amygdala memories are unerasable, whereas hippocampal memories can 288 00:15:13,040 --> 00:15:15,960 Speaker 4: be erased. So let me unpack this, because there's two 289 00:15:16,040 --> 00:15:18,720 Speaker 4: surprising parts here. So first of all, the fact that 290 00:15:18,800 --> 00:15:23,440 Speaker 4: hippocampal memories can be erased is terrifying and weird and wild. 291 00:15:23,520 --> 00:15:27,480 Speaker 4: Which is, if I ask you to recall the name 292 00:15:27,560 --> 00:15:33,600 Speaker 4: of your fifth grade teacher and then suddenly that brick 293 00:15:33,680 --> 00:15:36,520 Speaker 4: drops off the highway bridge and hits you in the head, 294 00:15:36,560 --> 00:15:40,960 Speaker 4: God forbid, let's say that happens. You will now have 295 00:15:41,080 --> 00:15:45,080 Speaker 4: amnesia for that one fact. You will not remember anything 296 00:15:45,160 --> 00:15:48,120 Speaker 4: about your fifth grade teacher anymore, or at least 297 00:15:48,120 --> 00:15:52,120 Speaker 4: the fifth grade teacher's name.
Why? It's because the name 298 00:15:52,160 --> 00:15:54,280 Speaker 4: of your fifth grade teacher is stored deep in the 299 00:15:54,280 --> 00:15:56,280 Speaker 4: structure of your brain, and when I ask you to 300 00:15:56,360 --> 00:16:01,800 Speaker 4: recall it, you're actually transferring it from that structural form 301 00:16:02,360 --> 00:16:05,920 Speaker 4: into activity, you know, spikes in the brain. And that's 302 00:16:05,960 --> 00:16:09,480 Speaker 4: how you're remembering the name of your teacher. Now, when 303 00:16:09,520 --> 00:16:13,400 Speaker 4: you're done remembering it, it has to get reconsolidated back 304 00:16:13,440 --> 00:16:15,640 Speaker 4: into its physical form. And if you get hit in 305 00:16:15,680 --> 00:16:21,040 Speaker 4: the head during that moment, it's gone. It's now, you know, 306 00:16:21,080 --> 00:16:23,880 Speaker 4: been transferred from the physical to the, you know, 307 00:16:24,000 --> 00:16:27,320 Speaker 4: activity in spikes. And, you know, before it 308 00:16:27,360 --> 00:16:29,640 Speaker 4: gets transferred back into the physical, it is 309 00:16:29,680 --> 00:16:33,600 Speaker 4: susceptible to erasure, which is weird and terrifying. And by 310 00:16:33,600 --> 00:16:36,840 Speaker 4: the way, this can also be done with protein synthesis inhibitors. 311 00:16:37,440 --> 00:16:40,200 Speaker 4: So people do this in rats.
They've been doing this 312 00:16:40,240 --> 00:16:42,480 Speaker 4: for decades, where, you know, you train a rat how 313 00:16:42,480 --> 00:16:45,480 Speaker 4: to run different mazes and then you put the rat 314 00:16:45,520 --> 00:16:48,200 Speaker 4: on a particular maze where the rat has to remember, oh, yeah, 315 00:16:48,240 --> 00:16:50,760 Speaker 4: that's this one, and then you just feed the rat 316 00:16:50,800 --> 00:16:55,440 Speaker 4: a protein synthesis inhibitor and now it cannot reconsolidate that 317 00:16:55,520 --> 00:16:59,960 Speaker 4: memory into physical form. So number one is hippocampal memory 318 00:17:00,160 --> 00:17:03,200 Speaker 4: can be erased. The number two point is that amygdala 319 00:17:03,280 --> 00:17:06,040 Speaker 4: memories cannot be erased, which is to say, when you 320 00:17:06,200 --> 00:17:10,680 Speaker 4: recruit the emergency control system to say, wow, this is 321 00:17:10,760 --> 00:17:15,760 Speaker 4: really important, write this down, then those are permanent. The 322 00:17:15,920 --> 00:17:18,959 Speaker 4: reason I say that's unfortunate is because those are 323 00:17:19,000 --> 00:17:20,840 Speaker 4: the ones that people want to erase. In other words, 324 00:17:20,880 --> 00:17:23,439 Speaker 4: you know, let's say a rape victim or something like that, 325 00:17:23,440 --> 00:17:26,000 Speaker 4: that is the one thing that she wants to forget 326 00:17:26,440 --> 00:17:27,840 Speaker 4: more than anything, but cannot. 327 00:17:28,640 --> 00:17:31,840 Speaker 1: Now, I was checking out the show as well, and 328 00:17:31,880 --> 00:17:33,560 Speaker 1: I was listening to your, I believe this is an 329 00:17:33,560 --> 00:17:35,840 Speaker 1: episode from just earlier this week, on the topic of 330 00:17:36,400 --> 00:17:39,320 Speaker 1: animal uplift, which I don't think is a term that 331 00:17:39,400 --> 00:17:42,439 Speaker 1: I was familiar with.
Can you give us a brief taste, 332 00:17:42,440 --> 00:17:45,040 Speaker 1: a brief overview of what animal uplift is? 333 00:17:45,720 --> 00:17:48,919 Speaker 4: Yeah, it's this idea that, you know, look, the human 334 00:17:49,000 --> 00:17:51,439 Speaker 4: brain is made out of exactly the same stuff as 335 00:17:51,880 --> 00:17:54,080 Speaker 4: a mouse brain, a dog brain, a giraffe brain. You know, 336 00:17:54,200 --> 00:17:56,200 Speaker 4: it's all the same stuff. It's got the same anatomy, 337 00:17:56,280 --> 00:17:59,560 Speaker 4: same general structure. We just have more of this wrinkly 338 00:17:59,640 --> 00:18:05,320 Speaker 4: outer bit called the cortex. But somehow we, you know, 339 00:18:05,359 --> 00:18:07,560 Speaker 4: we've taken over the whole planet as a species. We've 340 00:18:07,560 --> 00:18:10,960 Speaker 4: gotten off the planet. We've made vaccines and the internet and 341 00:18:11,040 --> 00:18:14,280 Speaker 4: quantum mechanics and so on. There's some real difference in 342 00:18:14,359 --> 00:18:17,679 Speaker 4: what we are doing versus our neighbors in the animal kingdom. 343 00:18:18,320 --> 00:18:22,440 Speaker 4: But the genetic differences, as you know, are not that much. 344 00:18:22,520 --> 00:18:25,320 Speaker 4: I mean, we have enormous similarity with almost every species. Like, 345 00:18:25,359 --> 00:18:27,199 Speaker 4: if you're building a giraffe, you've got to build the 346 00:18:27,200 --> 00:18:28,879 Speaker 4: heart and the lungs and the brain and then the 347 00:18:29,000 --> 00:18:32,000 Speaker 4: esophagus, and all that stuff is really the same stuff. 348 00:18:32,040 --> 00:18:35,399 Speaker 4: And so it's just some small algorithmic difference in the 349 00:18:35,480 --> 00:18:38,439 Speaker 4: DNA that's making our brain run in a more souped 350 00:18:38,480 --> 00:18:42,640 Speaker 4: up way. Okay, the idea of animal uplift is if 351 00:18:42,680 --> 00:18:44,879 Speaker 4: we can figure that out.
And this won't happen, you know, 352 00:18:44,960 --> 00:18:47,000 Speaker 4: for at least a few more decades, but if we 353 00:18:47,040 --> 00:18:50,000 Speaker 4: can figure out, ah, here's the sequence of A's and 354 00:18:50,080 --> 00:18:54,280 Speaker 4: C's and T's and G's that gives us this high intelligence, 355 00:18:55,160 --> 00:18:59,600 Speaker 4: the question is, should we give this to animals? Should 356 00:18:59,600 --> 00:19:04,480 Speaker 4: we help animals become intelligent? Now, let me just mention 357 00:19:04,600 --> 00:19:09,480 Speaker 4: this is an area that bioethicists and philosophers and neuroscientists 358 00:19:09,560 --> 00:19:12,399 Speaker 4: have been talking about for a while, and there's, you know, 359 00:19:12,480 --> 00:19:14,240 Speaker 4: plenty of debate about it. And on one end of 360 00:19:14,280 --> 00:19:16,320 Speaker 4: the spectrum you have people who say that's a terrible idea, 361 00:19:16,400 --> 00:19:18,280 Speaker 4: we wouldn't want to give intelligence to animals, and other 362 00:19:18,320 --> 00:19:21,640 Speaker 4: people say it's a moral obligation, in the same way 363 00:19:21,640 --> 00:19:24,680 Speaker 4: that, you know, if we know how to fix some 364 00:19:24,920 --> 00:19:27,639 Speaker 4: viral disease or fix a broken leg or something, of 365 00:19:27,640 --> 00:19:29,720 Speaker 4: course you should do this for your dog instead of 366 00:19:29,800 --> 00:19:33,720 Speaker 4: letting your dog, you know, not have the medical advances 367 00:19:33,760 --> 00:19:36,720 Speaker 4: that we have made. So anyway, it's a big debate, 368 00:19:36,760 --> 00:19:39,120 Speaker 4: but this is the idea of animal uplift. You make 369 00:19:39,320 --> 00:19:43,040 Speaker 4: an animal as intelligent as a human. And I just 370 00:19:43,119 --> 00:19:47,880 Speaker 4: find this area fascinating.
And, you know, as I propose 371 00:19:47,960 --> 00:19:52,359 Speaker 4: in the podcast, what would the consequences of this be in 372 00:19:52,440 --> 00:19:55,840 Speaker 4: terms of, you know, will World War Five be fought 373 00:19:55,880 --> 00:20:00,800 Speaker 4: by other animal species, not just humans? You know, and 374 00:20:01,080 --> 00:20:04,879 Speaker 4: the way I sort of introduced the podcast is with 375 00:20:04,960 --> 00:20:07,080 Speaker 4: this question of what will my kids look back on, 376 00:20:07,400 --> 00:20:10,760 Speaker 4: or my grandkids? Obviously there's lots of things that will 377 00:20:10,800 --> 00:20:15,160 Speaker 4: be very different between our world right now and their world, 378 00:20:15,160 --> 00:20:17,159 Speaker 4: let's say fifty years from now. But one of 379 00:20:17,200 --> 00:20:19,960 Speaker 4: them is, will they look back and say, wow, I 380 00:20:19,960 --> 00:20:22,320 Speaker 4: can't believe there was a time when humans were the 381 00:20:22,359 --> 00:20:25,240 Speaker 4: only species on Earth that was really doing anything, and 382 00:20:25,680 --> 00:20:29,600 Speaker 4: now we've got all these others, you know, crows running universities, 383 00:20:29,680 --> 00:20:35,600 Speaker 4: and donkeys programming computers and whatever, gophers in the Senate 384 00:20:35,640 --> 00:20:36,080 Speaker 4: and so on.
385 00:20:47,359 --> 00:20:51,240 Speaker 2: So one idea of yours that I came across because 386 00:20:51,320 --> 00:20:53,520 Speaker 2: Rob sent it to me and I found really interesting 387 00:20:54,280 --> 00:20:58,120 Speaker 2: was from a paper you published in twenty twenty one, 388 00:20:58,160 --> 00:21:02,040 Speaker 2: I think maybe in Frontiers in Neuroscience, offering a hypothesis 389 00:21:02,119 --> 00:21:07,800 Speaker 2: about the adaptive function of dreams, which you call the 390 00:21:07,880 --> 00:21:12,439 Speaker 2: defensive activation theory. Could you lay out what is the 391 00:21:12,480 --> 00:21:17,000 Speaker 2: basic controversy about the biological function of dreams and how 392 00:21:17,080 --> 00:21:20,840 Speaker 2: your proposed solution here would answer this question? 393 00:21:21,359 --> 00:21:24,359 Speaker 4: Yes, so it turns out there is no controversy about 394 00:21:24,400 --> 00:21:28,720 Speaker 4: the purpose of dreams, because nobody knows, right? Everyone, I 395 00:21:28,760 --> 00:21:31,840 Speaker 4: mean in the sense that everyone's got a little hypothesis 396 00:21:31,880 --> 00:21:36,040 Speaker 4: about it. But really it's complicated, and people think, well, 397 00:21:36,080 --> 00:21:37,720 Speaker 4: maybe it has something to do with learning and memory. 398 00:21:37,800 --> 00:21:41,600 Speaker 4: Maybe it just has to do with, you know, energy restoration. 399 00:21:41,880 --> 00:21:45,359 Speaker 4: Maybe it has to do with, you know, obviously the 400 00:21:45,359 --> 00:21:48,280 Speaker 4: Freudians thought that there was some important meaning in the 401 00:21:48,320 --> 00:21:50,960 Speaker 4: content of dreams and so on. But no one really knows, 402 00:21:51,000 --> 00:21:54,440 Speaker 4: and certainly no one has a quantitative hypothesis that can 403 00:21:54,440 --> 00:21:56,960 Speaker 4: make predictions about dreams and how much dream time we have.
404 00:21:58,000 --> 00:22:02,840 Speaker 4: But my student and I developed a theory that actually 405 00:22:02,880 --> 00:22:06,840 Speaker 4: does make quantitative predictions across animal species. It predicts 406 00:22:06,840 --> 00:22:11,240 Speaker 4: how much each animal species will dream. And to 407 00:22:11,440 --> 00:22:14,159 Speaker 4: explain, I need to take one step back, which is about 408 00:22:14,320 --> 00:22:18,439 Speaker 4: brain plasticity, which is this term that we use to 409 00:22:18,480 --> 00:22:21,600 Speaker 4: explain that the brain is very malleable, the human brain 410 00:22:21,640 --> 00:22:25,800 Speaker 4: in particular, and it's constantly reconfiguring its own circuitry, and 411 00:22:25,840 --> 00:22:28,719 Speaker 4: that's how it learns and remembers, and that's how it 412 00:22:29,160 --> 00:22:32,640 Speaker 4: learns new skills and all that. So it turns out, 413 00:22:32,880 --> 00:22:35,320 Speaker 4: and this is what my last book, Live Wired, was about, 414 00:22:35,640 --> 00:22:39,440 Speaker 4: the massive flexibility of the brain. It turns out, 415 00:22:40,000 --> 00:22:43,040 Speaker 4: as probably a lot of people already know, if 416 00:22:43,119 --> 00:22:47,320 Speaker 4: you go blind at a young age, the visual part 417 00:22:47,320 --> 00:22:50,359 Speaker 4: of your brain gets taken over, and in fact, if 418 00:22:50,400 --> 00:22:54,320 Speaker 4: you're born blind, that takeover is complete. The rest of 419 00:22:54,400 --> 00:22:57,760 Speaker 4: the territories in your brain, involved in hearing and touch 420 00:22:57,920 --> 00:23:01,280 Speaker 4: and other things, these all take over what we would 421 00:23:01,280 --> 00:23:04,359 Speaker 4: normally think of as the visual cortex, and it's no 422 00:23:04,440 --> 00:23:07,840 Speaker 4: longer visual. It's now, you know, subserving other functions. 423 00:23:07,920 --> 00:23:08,240 Speaker 1: Okay.
424 00:23:09,080 --> 00:23:12,080 Speaker 4: One of the surprises in neuroscience was a study that 425 00:23:12,119 --> 00:23:14,560 Speaker 4: came out about a decade ago from some colleagues of 426 00:23:14,560 --> 00:23:17,200 Speaker 4: mine at Harvard where they put people in a scanner. These 427 00:23:17,200 --> 00:23:21,359 Speaker 4: are normally sighted people, but they blindfolded them tightly and 428 00:23:21,359 --> 00:23:22,920 Speaker 4: they put them in the scanner and they were looking 429 00:23:22,920 --> 00:23:26,080 Speaker 4: at their brain's response to touch or to sound or 430 00:23:26,119 --> 00:23:28,680 Speaker 4: things like that. And what they found, to their surprise, 431 00:23:28,800 --> 00:23:32,679 Speaker 4: is that after an hour, you could start seeing the 432 00:23:32,720 --> 00:23:37,040 Speaker 4: first hints of signals in the visual cortex in response 433 00:23:37,080 --> 00:23:40,760 Speaker 4: to touch and sounds. So, in other words, the visual 434 00:23:40,760 --> 00:23:45,400 Speaker 4: cortex was starting to get annexed by these other territories 435 00:23:45,520 --> 00:23:49,199 Speaker 4: of touch and sound after one hour. And so 436 00:23:49,320 --> 00:23:52,439 Speaker 4: this was a much more rapid kind of movement than 437 00:23:52,520 --> 00:23:56,640 Speaker 4: anyone had expected. And so what my student and I 438 00:23:56,640 --> 00:24:03,200 Speaker 4: immediately realized is that this is the basis of dreaming. 439 00:24:03,240 --> 00:24:07,120 Speaker 4: It's because we are on a planet that rotates, and 440 00:24:07,600 --> 00:24:10,200 Speaker 4: we spend half our time in the darkness, away from 441 00:24:10,400 --> 00:24:13,120 Speaker 4: the light of our star, and so in the dark 442 00:24:13,200 --> 00:24:15,160 Speaker 4: you can still hear and touch and taste and smell 443 00:24:15,320 --> 00:24:17,800 Speaker 4: just fine, but you can't see.
And obviously I'm talking 444 00:24:17,840 --> 00:24:23,800 Speaker 4: about evolutionary time, you know, not our modern electricity-blessed times. 445 00:24:24,160 --> 00:24:28,640 Speaker 4: And so what this means is the visual system in 446 00:24:28,680 --> 00:24:31,440 Speaker 4: particular has a real disadvantage, which is it is in 447 00:24:31,640 --> 00:24:35,320 Speaker 4: danger of getting taken over by the other senses. And 448 00:24:35,359 --> 00:24:38,119 Speaker 4: this is because of the brain's great plasticity, and so 449 00:24:38,240 --> 00:24:41,639 Speaker 4: as a result, the visual system needs a way to 450 00:24:41,720 --> 00:24:45,800 Speaker 4: defend its territory during the night. And that's what dreaming is. 451 00:24:45,920 --> 00:24:49,600 Speaker 4: Dreaming is essentially a screen saver. It's making sure that 452 00:24:50,200 --> 00:24:52,919 Speaker 4: at night time, when you're curled up in the corner 453 00:24:52,920 --> 00:24:56,760 Speaker 4: of your cave, staying out of trouble, sleeping, and sleeping 454 00:24:56,760 --> 00:24:59,119 Speaker 4: has other benefits too, in terms of energy restoration and 455 00:24:59,119 --> 00:25:01,840 Speaker 4: so on. So when that's happening, you know, you can 456 00:25:01,880 --> 00:25:04,240 Speaker 4: still feel if something touches your skin, or if you're smelling 457 00:25:04,280 --> 00:25:06,320 Speaker 4: something or whatever. All that can still function in the dark, 458 00:25:06,359 --> 00:25:08,879 Speaker 4: but you're not seeing anything at all. And so what 459 00:25:08,920 --> 00:25:11,760 Speaker 4: happens is you've got this circuitry that just blows activity 460 00:25:11,840 --> 00:25:16,000 Speaker 4: into the visual system to make sure it stays active 461 00:25:16,040 --> 00:25:18,439 Speaker 4: during the night. Every ninety minutes, you have this wave 462 00:25:18,960 --> 00:25:21,640 Speaker 4: of random activity that just gets blown in there.
463 00:25:21,880 --> 00:25:24,800 Speaker 4: And because we're visual creatures, we see, we have full, 464 00:25:24,880 --> 00:25:27,080 Speaker 4: rich visual experience even though our eyes are closed and 465 00:25:27,160 --> 00:25:31,520 Speaker 4: it's dark out, and it's because the brain is just making 466 00:25:31,560 --> 00:25:34,600 Speaker 4: sure that it's keeping this 467 00:25:34,720 --> 00:25:38,320 Speaker 4: competition going so the visual system doesn't get taken over. Interestingly, 468 00:25:38,720 --> 00:25:41,600 Speaker 4: dream sleep is something we find across the animal kingdom, 469 00:25:41,960 --> 00:25:44,919 Speaker 4: but what we were able to demonstrate is that it 470 00:25:45,000 --> 00:25:48,240 Speaker 4: correlates with how plastic the animal species is. So some 471 00:25:48,760 --> 00:25:52,120 Speaker 4: animals drop out of the womb and they figure out 472 00:25:52,160 --> 00:25:54,960 Speaker 4: in thirty minutes how to run, how to walk. Very quickly, 473 00:25:55,000 --> 00:25:59,639 Speaker 4: they reach adolescence, they can reproduce, and so on. You know, 474 00:25:59,680 --> 00:26:02,800 Speaker 4: they're obviously very pre-programmed, let's just put 475 00:26:02,800 --> 00:26:06,840 Speaker 4: it that way. But other creatures, like humans, are extremely plastic. 476 00:26:06,920 --> 00:26:09,560 Speaker 4: We take forever to learn how to walk, to wean, 477 00:26:10,280 --> 00:26:14,320 Speaker 4: to reach reproductive age, things like that, precisely because we're 478 00:26:14,359 --> 00:26:19,160 Speaker 4: extremely plastic, and so we have lots of dreaming because 479 00:26:19,200 --> 00:26:21,800 Speaker 4: we have to protect our visual cortex at night.
But 480 00:26:22,320 --> 00:26:25,600 Speaker 4: other animals that are, you know, these pre-programmed types, 481 00:26:26,320 --> 00:26:29,080 Speaker 4: they have just a tiny bit of visual dreaming, but 482 00:26:29,480 --> 00:26:31,640 Speaker 4: not a lot. And by the way, I'll just mention 483 00:26:31,760 --> 00:26:36,040 Speaker 4: that the amount of visual dreaming we have goes down 484 00:26:36,080 --> 00:26:38,480 Speaker 4: with age. So as an infant, you're dreaming all the time, 485 00:26:38,560 --> 00:26:40,520 Speaker 4: and as you get older and older, you dream less 486 00:26:40,520 --> 00:26:43,560 Speaker 4: and less as a fraction of your sleep. And you 487 00:26:43,600 --> 00:26:46,320 Speaker 4: know that's just a correlation. But in theory, what that 488 00:26:46,400 --> 00:26:49,840 Speaker 4: suggests is, you know, as an infant, your visual system 489 00:26:49,920 --> 00:26:53,199 Speaker 4: is very highly at risk of getting taken over, and 490 00:26:53,240 --> 00:26:55,800 Speaker 4: as you get older and things get more cemented into place, 491 00:26:55,800 --> 00:26:57,520 Speaker 4: it's less at risk of getting taken over, so you 492 00:26:57,560 --> 00:27:00,280 Speaker 4: don't need as much screensaver time. 493 00:27:00,320 --> 00:27:02,760 Speaker 2: This kind of reminds me of studies I've read on 494 00:27:03,240 --> 00:27:07,160 Speaker 2: a related subject, which is prolonged blindfolding of normally 495 00:27:07,200 --> 00:27:11,680 Speaker 2: sighted people, where apparently it's very common for people under 496 00:27:11,720 --> 00:27:15,680 Speaker 2: those circumstances to experience a lot of visual hallucinations. Does 497 00:27:15,720 --> 00:27:18,040 Speaker 2: that have any relationship to what you're talking about here? 498 00:27:18,080 --> 00:27:20,800 Speaker 4: It does, thank you for asking.
That's perfect because this 499 00:27:20,880 --> 00:27:24,560 Speaker 4: is all part of the defensive activation theory, which is 500 00:27:24,600 --> 00:27:27,760 Speaker 4: to say, if a system is used to having data 501 00:27:27,800 --> 00:27:31,640 Speaker 4: coming in and suddenly it's not getting that data anymore, 502 00:27:32,480 --> 00:27:34,840 Speaker 4: it fights back from the inside: it starts producing that 503 00:27:34,960 --> 00:27:39,160 Speaker 4: data itself. So one example of this is, let's say, blindfolding, 504 00:27:39,280 --> 00:27:42,720 Speaker 4: or you also see this, for example, you know, 505 00:27:42,920 --> 00:27:45,640 Speaker 4: when people get thrown in solitary confinement in the dark, 506 00:27:45,680 --> 00:27:49,720 Speaker 4: they start having hallucinations, both auditory and visual, because they're 507 00:27:49,720 --> 00:27:51,760 Speaker 4: not getting that data that they're used to and 508 00:27:51,760 --> 00:27:55,160 Speaker 4: supposed to get. There's also something called Bonnet syndrome, Charles 509 00:27:55,240 --> 00:27:58,440 Speaker 4: Bonnet syndrome, which is people start losing their vision, but 510 00:27:58,480 --> 00:28:00,840 Speaker 4: they don't realize that they're losing their vision because they 511 00:28:00,960 --> 00:28:06,080 Speaker 4: start having hallucinations that essentially fill in for them. This 512 00:28:06,200 --> 00:28:08,480 Speaker 4: is all the same issue, though, which is that the 513 00:28:08,520 --> 00:28:11,160 Speaker 4: brain is used to getting certain inputs, suddenly it's not 514 00:28:11,280 --> 00:28:14,080 Speaker 4: getting it anymore, and so it starts generating it itself. 515 00:28:14,400 --> 00:28:17,640 Speaker 4: One more example is tinnitus, which is ringing in the ears.
516 00:28:17,960 --> 00:28:21,720 Speaker 4: This typically comes about because somebody loses hearing in some 517 00:28:22,440 --> 00:28:25,560 Speaker 4: frequency or some band of frequencies and the brain says, 518 00:28:25,560 --> 00:28:28,520 Speaker 4: wait a minute, I'm not hearing anything at twelve thousand 519 00:28:28,560 --> 00:28:32,080 Speaker 4: hurts anymore, so I'm going to start making myself and 520 00:28:32,160 --> 00:28:36,000 Speaker 4: it starts making this sound by itself. So this all 521 00:28:36,040 --> 00:28:38,040 Speaker 4: falls under the defensive activation theory. 522 00:28:39,120 --> 00:28:42,120 Speaker 2: One of the interesting things I recall about the studies 523 00:28:42,120 --> 00:28:48,200 Speaker 2: on prolonged blindfolding was that the hallucinations that were reported 524 00:28:48,240 --> 00:28:50,600 Speaker 2: were not entirely random. So it wasn't just you know, 525 00:28:50,640 --> 00:28:53,160 Speaker 2: people seeing strange scenes play out in front of them. 526 00:28:53,200 --> 00:28:57,680 Speaker 2: That they would often hallucinate stuff that you would expect 527 00:28:57,840 --> 00:29:00,360 Speaker 2: to see in that place in the room based on 528 00:29:00,400 --> 00:29:03,800 Speaker 2: other senses. So like if they heard someone come to 529 00:29:03,840 --> 00:29:06,640 Speaker 2: the door of the room, they would hallucinate the image 530 00:29:06,640 --> 00:29:07,960 Speaker 2: of that person in the door. 531 00:29:08,520 --> 00:29:12,080 Speaker 4: Yeah, that's perfect. And by the way, I think this 532 00:29:12,400 --> 00:29:15,560 Speaker 4: also has a lot to tell us about dream content 533 00:29:16,120 --> 00:29:18,960 Speaker 4: because the thing about dreams, I love the way you 534 00:29:19,000 --> 00:29:21,719 Speaker 4: put this, because you could, in theory, dream about anything 535 00:29:21,760 --> 00:29:25,920 Speaker 4: at all. 
You could dream that you are in Cambodia 536 00:29:26,080 --> 00:29:28,800 Speaker 4: and that you are in the fourteen hundreds and you're 537 00:29:28,920 --> 00:29:32,800 Speaker 4: a magician who's doing something. But, you know, you tend 538 00:29:32,840 --> 00:29:35,160 Speaker 4: to dream about, you know, your work and your spouse 539 00:29:35,200 --> 00:29:37,800 Speaker 4: and your drive and whatever, you know, things that are 540 00:29:37,800 --> 00:29:42,360 Speaker 4: more local to you. And it's precisely because when you 541 00:29:43,200 --> 00:29:47,960 Speaker 4: slam random activity into the visual system, the synapses, 542 00:29:47,960 --> 00:29:51,040 Speaker 4: the connections that are essentially hot from the day's work, 543 00:29:52,240 --> 00:29:54,280 Speaker 4: you know, those are the things that tend to 544 00:29:54,280 --> 00:29:57,960 Speaker 4: get activated, and the association is very loose in a 545 00:29:58,200 --> 00:30:01,200 Speaker 4: sleeping dream state, and so what happens is, you know, 546 00:30:01,280 --> 00:30:05,440 Speaker 4: things can go off on weird tangents, but physics still 547 00:30:05,520 --> 00:30:07,960 Speaker 4: works fine in a dream. You know, rocks don't float 548 00:30:08,040 --> 00:30:12,120 Speaker 4: upwards and stuff like that, and so, you know, essentially 549 00:30:12,160 --> 00:30:16,160 Speaker 4: you're just rebooting things that were there during the day. 550 00:30:16,520 --> 00:30:19,240 Speaker 4: And this is closely related to what you're saying about, 551 00:30:20,640 --> 00:30:22,840 Speaker 4: you know, all the associations that your brain builds up 552 00:30:22,880 --> 00:30:25,200 Speaker 4: over a lifetime. So you hear the voice and you're 553 00:30:25,240 --> 00:30:29,440 Speaker 4: expecting to see that person, and that's exactly what happens.
Actually, 554 00:30:29,480 --> 00:30:31,640 Speaker 4: I just want to mention one other thing about the dreams, 555 00:30:32,080 --> 00:30:35,680 Speaker 4: which is people will often ask me, well, what about 556 00:30:35,840 --> 00:30:38,480 Speaker 4: a blind person, how do they dream? And the answer 557 00:30:38,640 --> 00:30:41,760 Speaker 4: is blind people also dream, because you have this very 558 00:30:41,840 --> 00:30:44,920 Speaker 4: ancient circuitry in your head that's blasting activity into the 559 00:30:44,960 --> 00:30:47,880 Speaker 4: back of the brain, the occipital cortex, which is normally 560 00:30:47,880 --> 00:30:50,240 Speaker 4: the visual cortex. But if you're born blind, 561 00:30:50,640 --> 00:30:53,200 Speaker 4: it's, you know, long taken over by hearing and touch, 562 00:30:53,520 --> 00:30:56,920 Speaker 4: and so a blind person's dream is all about hearing 563 00:30:56,960 --> 00:31:00,520 Speaker 4: and touch. They don't see anything, but they say, oh, 564 00:31:00,600 --> 00:31:02,480 Speaker 4: I was, you know, moving through the living room and 565 00:31:02,560 --> 00:31:04,960 Speaker 4: I felt the furniture was rearranged, and then there was 566 00:31:05,000 --> 00:31:07,360 Speaker 4: a big dog in the corner and I ran from 567 00:31:07,440 --> 00:31:09,280 Speaker 4: it and I was scared. And so, you know, they've 568 00:31:09,280 --> 00:31:12,680 Speaker 4: got full, rich dream experience. It's just that it is 569 00:31:13,360 --> 00:31:15,960 Speaker 4: not visual, because that part of their brain is no 570 00:31:16,040 --> 00:31:16,760 Speaker 4: longer visual.
571 00:31:17,320 --> 00:31:20,000 Speaker 2: Well, that makes me wonder then, if 572 00:31:20,840 --> 00:31:25,120 Speaker 2: your hypothesis about the defensive activation of dreaming is correct, 573 00:31:25,200 --> 00:31:28,920 Speaker 2: does that mean that dreaming now works to protect 574 00:31:29,160 --> 00:31:32,880 Speaker 2: that part of the brain that would normally be used for visual processing, 575 00:31:32,920 --> 00:31:36,200 Speaker 2: but in its role in processing auditory 576 00:31:36,240 --> 00:31:37,560 Speaker 2: and other stimuli? 577 00:31:37,840 --> 00:31:42,239 Speaker 4: A great question. No, it's that these circuits that 578 00:31:42,360 --> 00:31:46,760 Speaker 4: underlie dreaming are extremely ancient, and so they are assuming 579 00:31:47,280 --> 00:31:50,360 Speaker 4: that you've got perfectly fine vision. And if you don't 580 00:31:50,360 --> 00:31:53,640 Speaker 4: have vision for some reason, then the circuits aren't going 581 00:31:53,720 --> 00:31:58,720 Speaker 4: to change. They're just doing a basic architectural job of saying, hey, guys, 582 00:31:58,760 --> 00:32:01,680 Speaker 4: every ninety minutes, just blast some activity into the back 583 00:32:01,720 --> 00:32:02,120 Speaker 4: of the brain. 584 00:32:02,200 --> 00:32:02,440 Speaker 1: There. 585 00:32:02,680 --> 00:32:04,320 Speaker 4: That's all they're doing. And they don't know if you're 586 00:32:04,320 --> 00:32:04,880 Speaker 4: blind or not. 587 00:32:05,560 --> 00:32:09,120 Speaker 1: Now, speaking about dreams, I guess it's not too much 588 00:32:09,120 --> 00:32:12,840 Speaker 1: of a leap to start talking about consciousness. I was wondering, 589 00:32:12,960 --> 00:32:16,840 Speaker 1: where do you think we are in terms of corralling, testing, 590 00:32:16,960 --> 00:32:22,240 Speaker 1: and even eliminating various theories concerning the nature of human consciousness? Yeah?
591 00:32:22,280 --> 00:32:26,280 Speaker 4: Boy, this still remains to my mind the central unsolved 592 00:32:26,360 --> 00:32:29,040 Speaker 4: mystery of neuroscience. What's interesting, by the way, I wrote 593 00:32:29,080 --> 00:32:32,920 Speaker 4: an article, the cover article is Discover magazine back in 594 00:32:32,960 --> 00:32:36,480 Speaker 4: something like two thousand and six, called ten Unsolved Questions 595 00:32:36,480 --> 00:32:39,440 Speaker 4: of Neuroscience. And what's fascinating to me is it's now 596 00:32:39,480 --> 00:32:42,840 Speaker 4: twenty twenty three and they are equally as unsolved. I mean, 597 00:32:42,840 --> 00:32:46,640 Speaker 4: it's funny because we're making so much progress in the 598 00:32:46,640 --> 00:32:49,760 Speaker 4: field in some ways, and yet in other ways we're 599 00:32:49,800 --> 00:32:54,400 Speaker 4: just facing some really tough problems. So the consciousness, you know, 600 00:32:54,440 --> 00:32:57,680 Speaker 4: what is consciousness is really I think the central one. 601 00:32:57,960 --> 00:33:01,640 Speaker 4: And you know, for any listeners who are wondering what 602 00:33:01,800 --> 00:33:03,960 Speaker 4: is the question, the question is how do you take 603 00:33:04,000 --> 00:33:07,960 Speaker 4: eighty six billion cells and stick them together and hook 604 00:33:08,000 --> 00:33:11,040 Speaker 4: them up in such a way that you have private, 605 00:33:11,080 --> 00:33:17,479 Speaker 4: subjective internal experience. So you know, the smell of apple 606 00:33:17,560 --> 00:33:20,160 Speaker 4: pie and the taste of feta cheese, and the pain 607 00:33:20,200 --> 00:33:22,400 Speaker 4: of pain, and the redness of red and so on 608 00:33:23,240 --> 00:33:27,280 Speaker 4: how does that happen? 
Because, you know, my laptop computer 609 00:33:28,440 --> 00:33:31,280 Speaker 4: has lots of signals running around, zeros and ones running around 610 00:33:31,280 --> 00:33:35,160 Speaker 4: in its transistors, but presumably it's not experiencing anything. It 611 00:33:35,200 --> 00:33:39,400 Speaker 4: can play a YouTube video for me, but presumably it 612 00:33:39,440 --> 00:33:42,120 Speaker 4: doesn't find it funny the way I do. And so 613 00:33:42,360 --> 00:33:45,560 Speaker 4: this is really the question of consciousness. We don't know 614 00:33:45,640 --> 00:33:50,120 Speaker 4: the answer to it. I can just tell you my 615 00:33:51,080 --> 00:33:54,800 Speaker 4: general feeling on this, which is when you look at 616 00:33:54,800 --> 00:33:57,360 Speaker 4: the history of science, what you find is that in 617 00:33:57,440 --> 00:34:02,040 Speaker 4: every era there were big pieces of information missing, and 618 00:34:02,560 --> 00:34:04,600 Speaker 4: yet the scientists were in a position of having to 619 00:34:04,640 --> 00:34:09,680 Speaker 4: try to explain everything without knowing some other thing. Here's 620 00:34:09,680 --> 00:34:13,120 Speaker 4: an example. You know, when the pump was invented, people 621 00:34:13,160 --> 00:34:15,680 Speaker 4: suddenly said, oh, I see, the heart is like a pump, 622 00:34:15,960 --> 00:34:18,280 Speaker 4: and then it was obvious, oh, click, it falls into place. 623 00:34:18,520 --> 00:34:20,279 Speaker 4: But before that, everyone's trying to figure out what the 624 00:34:20,320 --> 00:34:21,680 Speaker 4: heck the heart was doing, but no one had the 625 00:34:21,719 --> 00:34:26,400 Speaker 4: concept of a pump. Or, you know, before the magnetosphere 626 00:34:26,440 --> 00:34:28,400 Speaker 4: of the Earth was discovered, you'd have no way to 627 00:34:28,440 --> 00:34:31,920 Speaker 4: explain the northern lights.
You'd have to make up some 628 00:34:32,080 --> 00:34:35,080 Speaker 4: crazy story about the northern lights and so on. Anyway, 629 00:34:35,400 --> 00:34:38,040 Speaker 4: I feel like we're in that situation now with consciousness. 630 00:34:38,040 --> 00:34:42,000 Speaker 4: There's something right at the edges. We're all listening for 631 00:34:42,120 --> 00:34:45,279 Speaker 4: its whispers. We can sort of feel that there's something there, 632 00:34:45,280 --> 00:34:47,520 Speaker 4: but we don't know exactly what it is that we're 633 00:34:48,040 --> 00:34:50,600 Speaker 4: missing that will allow us to explain how you take 634 00:34:50,600 --> 00:34:53,359 Speaker 4: a bunch of physical stuff and have it experience. 635 00:35:03,040 --> 00:35:06,680 Speaker 1: Now, a topic that's being discussed a lot right now is, 636 00:35:06,719 --> 00:35:12,239 Speaker 1: of course, as always artificial intelligence, but specifically generative artificial intelligence, 637 00:35:13,239 --> 00:35:16,239 Speaker 1: especially with so many of these text and image creative 638 00:35:16,280 --> 00:35:18,640 Speaker 1: tools that are available just to the average person to 639 00:35:18,719 --> 00:35:23,200 Speaker 1: experiment with and share the results of. And I was wondering, 640 00:35:23,239 --> 00:35:27,719 Speaker 1: what's your take on generative artificial intelligence and how it 641 00:35:27,800 --> 00:35:30,480 Speaker 1: relates or doesn't relate to human creativity. 642 00:35:30,800 --> 00:35:33,880 Speaker 4: Yeah, so actually this is my next episode because I'm 643 00:35:33,960 --> 00:35:38,160 Speaker 4: fascinated by this. Yeah, I'm just so Okay. So, you 644 00:35:38,200 --> 00:35:41,080 Speaker 4: guys may know I'm a neuroscientist, but I'm also a writer, 645 00:35:41,280 --> 00:35:45,600 Speaker 4: including of fiction. 
And so suddenly, when generative AI 645 00:36:45,680 --> 00:36:48,799 Speaker 4: started blowing up, really at the end of last year, 646 00:36:49,360 --> 00:36:51,880 Speaker 4: I of course, like many artists, thought, oh my gosh, 647 00:36:51,920 --> 00:36:54,560 Speaker 4: what does this mean for me? What's the 648 00:36:54,600 --> 00:37:00,000 Speaker 4: future for writers? But actually, in my next episode, I 649 00:37:00,160 --> 00:37:03,480 Speaker 4: make a four-part argument for why I think it'll be 650 00:37:03,560 --> 00:37:07,080 Speaker 4: an important part of the symbiosis between humans and machines 651 00:37:07,120 --> 00:37:08,080 Speaker 4: that eventually comes about. 652 00:37:08,120 --> 00:37:09,120 Speaker 1: But it's not going to. 653 00:37:09,080 --> 00:37:12,759 Speaker 4: Replace writers and artists. There are many reasons. You know, 654 00:37:12,840 --> 00:37:14,360 Speaker 4: one thing is it can only do short-form 655 00:37:14,400 --> 00:37:16,960 Speaker 4: stuff, and it does it very nicely. You know, 656 00:37:16,960 --> 00:37:19,440 Speaker 4: if you want to write a little blog post or 657 00:37:19,440 --> 00:37:22,040 Speaker 4: a little jingle or poem or whatever, like, it's great 658 00:37:22,040 --> 00:37:24,840 Speaker 4: for that, but to actually write a novel is a 659 00:37:24,880 --> 00:37:27,600 Speaker 4: completely different sort of thing, because what the author is 660 00:37:27,640 --> 00:37:31,320 Speaker 4: doing there is planting clues and having, let's say, a 661 00:37:31,320 --> 00:37:34,600 Speaker 4: cliffhanger that doesn't come back for two or three chapters, 662 00:37:34,640 --> 00:37:38,400 Speaker 4: and, you know, there's a continuity through time, where what 663 00:37:38,480 --> 00:37:40,920 Speaker 4: the author knows is what the end of the story 664 00:37:41,080 --> 00:37:45,120 Speaker 4: is and then writes towards that.
But AI can't even 666 00:36:45,160 --> 00:36:46,879 Speaker 4: do things like make up a joke, because to make 667 00:36:46,920 --> 00:36:48,839 Speaker 4: up a joke you have to know the punchline first 668 00:36:48,840 --> 00:36:51,439 Speaker 4: and then construct the joke to meet it. But it's 669 00:36:51,480 --> 00:36:55,440 Speaker 4: doing everything in the forward direction, so there 670 00:36:55,480 --> 00:36:58,680 Speaker 4: are reasons like that. I'd also make the argument that 671 00:36:59,400 --> 00:37:03,000 Speaker 4: we as readers, I think, actually really care about the 672 00:37:03,040 --> 00:37:06,160 Speaker 4: heartbeat behind the page, which is to say, if you 673 00:37:06,239 --> 00:37:09,279 Speaker 4: offered me two books, and one was written by AI 674 00:37:09,440 --> 00:37:13,319 Speaker 4: and one was written by you, Rob, you know, 675 00:37:13,400 --> 00:37:16,000 Speaker 4: I would absolutely want the one that's written by a 676 00:37:16,040 --> 00:37:20,120 Speaker 4: real human, because I know that you're a human with 677 00:37:20,160 --> 00:37:25,759 Speaker 4: all the, you know, limitations and anxieties and joys and 678 00:37:25,800 --> 00:37:28,880 Speaker 4: ecstasies of a real human. And that's what I 679 00:37:28,920 --> 00:37:33,160 Speaker 4: care about as a fellow human. And you know, part 680 00:37:33,200 --> 00:37:35,319 Speaker 4: of my evidence for this is a colleague of mine 681 00:37:35,360 --> 00:37:38,799 Speaker 4: here in Silicon Valley announced recently that he'd written a 682 00:37:38,800 --> 00:37:41,440 Speaker 4: book that was half by him and half by 683 00:37:41,520 --> 00:37:46,439 Speaker 4: ChatGPT. And I actually read most of the book, 684 00:37:46,440 --> 00:37:49,680 Speaker 4: and it's actually a good book, but I was 685 00:37:49,719 --> 00:37:52,080 Speaker 4: not inspired when I heard that; I read it for 686 00:37:52,160 --> 00:37:55,799 Speaker 4: other reasons.
I thought, that sounds terrible, and I was 687 00:37:55,800 --> 00:37:58,000 Speaker 4: trying to figure out, why did I feel that way? 688 00:37:58,040 --> 00:38:00,840 Speaker 4: Why did I feel that it was uninteresting to me? 689 00:38:01,000 --> 00:38:04,640 Speaker 4: And it has to do with this heartbeat behind 690 00:38:04,760 --> 00:38:08,200 Speaker 4: the page that matters. Here's the analogy that I'm 691 00:38:08,520 --> 00:38:13,359 Speaker 4: thinking about nowadays: when cameras first came 692 00:38:13,400 --> 00:38:19,919 Speaker 4: on the scene, visual painters all panicked and thought, we're 693 00:38:19,920 --> 00:38:22,439 Speaker 4: done for, because why would anyone want me to sit 694 00:38:22,480 --> 00:38:25,560 Speaker 4: here and paint something for weeks and weeks when you 695 00:38:25,560 --> 00:38:28,360 Speaker 4: can just get a perfect representation of it in a 696 00:38:28,440 --> 00:38:31,279 Speaker 4: fraction of a second with a camera? And the answer is, 697 00:38:31,360 --> 00:38:34,040 Speaker 4: cameras did not kill visual painting. They just ended up 698 00:38:34,080 --> 00:38:38,160 Speaker 4: filling a different neighboring niche and became their own art form. 699 00:38:38,200 --> 00:38:40,680 Speaker 4: But visual painting still exists because you can do other 700 00:38:40,760 --> 00:38:44,000 Speaker 4: things with it. And, you know, at least 701 00:38:44,160 --> 00:38:49,600 Speaker 4: right at the moment, all these text-generation programs are extraordinarily 702 00:38:49,880 --> 00:38:53,000 Speaker 4: boring in what they come up with, because they get 703 00:38:53,040 --> 00:38:56,719 Speaker 4: pushed through reinforcement learning with humans, so that humans say, oh, 704 00:38:56,920 --> 00:38:59,040 Speaker 4: don't say that, don't say that, that might offend someone, 705 00:38:59,080 --> 00:39:00,880 Speaker 4: and so on, which is fine.
I mean, I'm not 706 00:39:00,920 --> 00:39:03,480 Speaker 4: opposed to that. But the thing is that good literature 707 00:39:04,000 --> 00:39:06,759 Speaker 4: is stuff that really challenges us. Any good piece of 708 00:39:06,760 --> 00:39:09,200 Speaker 4: literature that you find is something that's full of stuff where we think, 709 00:39:09,280 --> 00:39:13,440 Speaker 4: oh yikes, that sounds like a terrible thing that just happened. 710 00:39:13,840 --> 00:39:17,200 Speaker 4: And none of these large language models are even willing 711 00:39:17,239 --> 00:39:19,239 Speaker 4: to go near that or touch that. So I think 712 00:39:19,280 --> 00:39:21,840 Speaker 4: they're going to be quite a distance from real literature 713 00:39:21,920 --> 00:39:23,360 Speaker 4: for the foreseeable future. 714 00:39:23,719 --> 00:39:26,520 Speaker 2: I would tend to think also, with literary craft, a 715 00:39:26,560 --> 00:39:30,359 Speaker 2: lot of what we really like about literary style is 716 00:39:30,760 --> 00:39:35,719 Speaker 2: being surprised, you know, surprised 717 00:39:35,719 --> 00:39:39,240 Speaker 2: by a strange word choice or a strange metaphor 718 00:39:39,360 --> 00:39:42,280 Speaker 2: or something. Those are the things that feel really good. 719 00:39:42,400 --> 00:39:46,719 Speaker 2: But can a generative AI tell the difference between 720 00:39:46,960 --> 00:39:50,920 Speaker 2: a comparison or a word choice that is strange in 721 00:39:51,000 --> 00:39:54,239 Speaker 2: a pleasing and exciting way versus one that will be 722 00:39:54,400 --> 00:39:58,040 Speaker 2: essentially interpreted as a hallucination or an error by the AI? 723 00:39:58,760 --> 00:40:03,400 Speaker 4: That's interesting. I would say, I mean, to say something positive 724 00:40:03,440 --> 00:40:06,279 Speaker 4: about these AIs.
I think probably it would be able 725 00:40:06,320 --> 00:40:07,920 Speaker 4: to do that, because remember, all it's doing is a 726 00:40:07,960 --> 00:40:11,439 Speaker 4: statistical game of saying, okay, what's the most probable thing 727 00:40:11,520 --> 00:40:14,920 Speaker 4: to come next, and you can turn up the temperature 728 00:40:14,960 --> 00:40:17,680 Speaker 4: on it so that it does things that are increasingly 729 00:40:17,800 --> 00:40:23,279 Speaker 4: less probable but somehow make sense. I had not thought 730 00:40:23,280 --> 00:40:26,320 Speaker 4: about that, but I think these things might be great 731 00:40:26,600 --> 00:40:30,880 Speaker 4: at making really good metaphors that are surprising, because one 732 00:40:30,920 --> 00:40:32,400 Speaker 4: of the things that authors have to deal with all 733 00:40:32,440 --> 00:40:36,760 Speaker 4: the time is that they recycle metaphors, and it's totally, 734 00:40:36,960 --> 00:40:39,560 Speaker 4: you know, soporific to the reader. It puts them 735 00:40:39,560 --> 00:40:43,560 Speaker 4: to sleep. But a good author, I was just reading 736 00:40:44,160 --> 00:40:47,560 Speaker 4: the other day, it was Frank Herbert, who in 737 00:40:47,640 --> 00:40:52,920 Speaker 4: Dune said something about the waves throwing white robes 738 00:40:53,040 --> 00:40:55,880 Speaker 4: over the rocks. That's how he was describing the foam 739 00:40:55,960 --> 00:40:58,520 Speaker 4: hitting the rocks, which is beautiful because it wakes you 740 00:40:58,600 --> 00:41:00,279 Speaker 4: up; in that moment you think, oh, what a nice 741 00:41:00,280 --> 00:41:03,439 Speaker 4: way of describing that.
But if one of these large 742 00:41:03,520 --> 00:41:06,480 Speaker 4: language models simply says, hey, I want to make something 743 00:41:06,520 --> 00:41:10,600 Speaker 4: not the most probable thing, but less probable, less probable, 744 00:41:10,760 --> 00:41:12,680 Speaker 4: I'll bet it could come up with really good stuff 745 00:41:12,719 --> 00:41:15,400 Speaker 4: like that that no human author has yet tried. Let 746 00:41:15,480 --> 00:41:19,120 Speaker 4: me just give you one example, which is when AlphaGo 747 00:41:19,200 --> 00:41:24,320 Speaker 4: beat Lee Sedol, the Go champion, back in, 748 00:41:24,360 --> 00:41:27,800 Speaker 4: I think, twenty seventeen. You know, here's the best 749 00:41:27,880 --> 00:41:29,560 Speaker 4: human in the world at playing the game of Go, 750 00:41:30,680 --> 00:41:34,840 Speaker 4: and the AI program beats him, and everybody sort of 751 00:41:34,880 --> 00:41:36,839 Speaker 4: watched that and thought, wow, that's the end of that. 752 00:41:37,440 --> 00:41:39,680 Speaker 4: But the most interesting part of the story was what 753 00:41:39,719 --> 00:41:44,600 Speaker 4: happened next, which is Lee Sedol ended up then playing 754 00:41:45,239 --> 00:41:53,080 Speaker 4: against his human opponents and took on different sorts 755 00:41:53,080 --> 00:41:55,879 Speaker 4: of moves that he had seen AlphaGo play that 756 00:41:55,960 --> 00:41:59,120 Speaker 4: no human had played before. It was just doing 757 00:41:59,160 --> 00:42:01,680 Speaker 4: these weird things that were totally in the rules. They 758 00:42:01,680 --> 00:42:04,320 Speaker 4: were legal, but no one had thought of doing them before. 759 00:42:04,360 --> 00:42:08,560 Speaker 4: So now he started doing this and started beating his 760 00:42:08,719 --> 00:42:12,120 Speaker 4: human opponents at a much higher rate.
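The "turn up the temperature" idea David describes, nudging a model away from its single most probable next word, can be sketched in a few lines of Python. This is a toy illustration with made-up scores, not any real model's API:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Turn raw next-word scores into a probability distribution.

    Dividing the scores by a higher temperature flattens the
    distribution, so less probable words get sampled more often;
    a lower temperature sharpens it toward the top choice.
    """
    scaled = [score / temperature for score in logits]
    top = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - top) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for three candidate next words.
logits = [3.0, 1.0, 0.2]

low = softmax_with_temperature(logits, temperature=0.5)   # conservative
high = softmax_with_temperature(logits, temperature=2.0)  # adventurous

print(low[0], high[0])
```

At low temperature the top-scoring word dominates; at high temperature the probabilities move closer together, which is the knob that lets a model reach for the less obvious word or metaphor.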
So the point 761 00:42:12,239 --> 00:42:14,600 Speaker 4: is we can learn from AI, and I think there's 762 00:42:14,640 --> 00:42:18,000 Speaker 4: going to be this really interesting collaboration that happens into 763 00:42:18,040 --> 00:42:21,319 Speaker 4: the future, where we see new things happening, in 764 00:42:21,360 --> 00:42:24,080 Speaker 4: this case new metaphors that come out in literature, and 765 00:42:24,160 --> 00:42:26,000 Speaker 4: we think, wow, I would have never thought of that one, 766 00:42:26,080 --> 00:42:29,320 Speaker 4: and then we can use them. 767 00:42:29,480 --> 00:42:32,759 Speaker 2: Another element of literary style, though, that I think about with 768 00:42:33,480 --> 00:42:38,839 Speaker 2: generative AI is the role of insight in writing, and 769 00:42:39,040 --> 00:42:42,480 Speaker 2: it makes me wonder what insight actually is. This is 770 00:42:42,520 --> 00:42:44,880 Speaker 2: obviously something we prize, you know, when we read a 771 00:42:44,920 --> 00:42:47,080 Speaker 2: novel that we really like and we say that it 772 00:42:47,239 --> 00:42:50,320 Speaker 2: is true, you know, there's something true in it. Obviously 773 00:42:50,360 --> 00:42:53,279 Speaker 2: the story is literally fictional, it didn't happen, but it 774 00:42:53,680 --> 00:42:57,200 Speaker 2: observes something about life that we perceive as, like, deeply 775 00:42:57,280 --> 00:43:01,239 Speaker 2: correct. And, I don't know, I would have an 776 00:43:01,239 --> 00:43:04,680 Speaker 2: intuition that says I would come across insights like that, 777 00:43:04,840 --> 00:43:08,000 Speaker 2: or things that feel like insights like that, less 778 00:43:08,160 --> 00:43:11,040 Speaker 2: often in something generated by AI. I can't prove that, 779 00:43:11,840 --> 00:43:14,880 Speaker 2: but it does raise this question of what insight is.
780 00:43:15,560 --> 00:43:19,000 Speaker 4: Yeah, you know, I tell you, I think I'm siding 781 00:43:19,000 --> 00:43:22,640 Speaker 4: with the AI on this one. Because what these large 782 00:43:22,680 --> 00:43:26,200 Speaker 4: language models are, essentially, is every human all put together. 783 00:43:26,560 --> 00:43:30,000 Speaker 4: So whatever insights people have had previously, this is all 784 00:43:30,040 --> 00:43:33,080 Speaker 4: available to the language model, and so there's no reason 785 00:43:33,080 --> 00:43:36,879 Speaker 4: that it can't put something together that's very insightful. And 786 00:43:36,920 --> 00:43:39,680 Speaker 4: it's not that it's having the insight; it's that it 787 00:43:39,719 --> 00:43:42,520 Speaker 4: gets to say, okay, well, here's a billion people who 788 00:43:42,520 --> 00:43:45,960 Speaker 4: have written stuff down, and I've noticed that, you know, 789 00:43:46,239 --> 00:43:48,279 Speaker 4: a number of these, maybe two hundred of these people, 790 00:43:48,320 --> 00:43:50,520 Speaker 4: have all said the same thing over here. And maybe 791 00:43:50,600 --> 00:43:54,520 Speaker 4: Joe's never read that sentence, you know, that paragraph, but 792 00:43:54,520 --> 00:43:57,000 Speaker 4: there's something going on over here, and it 793 00:43:57,040 --> 00:43:59,040 Speaker 4: puts it together, and then you say, oh my gosh, 794 00:43:59,080 --> 00:44:01,839 Speaker 4: that was really insightful, because it's not a machine telling 795 00:44:01,840 --> 00:44:04,640 Speaker 4: you the story, it's a billion people telling you the story. 796 00:44:05,320 --> 00:44:09,200 Speaker 1: Now, Joe and I are actually currently doing some episodes 797 00:44:09,280 --> 00:44:13,000 Speaker 1: of our own podcast on the subject of infantile amnesia.
798 00:44:13,520 --> 00:44:17,200 Speaker 1: You know, why we don't remember our earliest childhood or, 799 00:44:17,239 --> 00:44:21,160 Speaker 1: you know, infancy and birth and so forth. And 800 00:44:21,560 --> 00:44:24,800 Speaker 1: we've heard from some listeners, as we inevitably knew we would, 801 00:44:25,120 --> 00:44:27,560 Speaker 1: who say that they do remember their birth, or they 802 00:44:27,600 --> 00:44:31,560 Speaker 1: do remember very early childhood. And we were just wondering 803 00:44:31,600 --> 00:44:35,440 Speaker 1: what your take is on people who have that 804 00:44:35,520 --> 00:44:38,560 Speaker 1: experience or seem to have that memory. What may be 805 00:44:38,680 --> 00:44:39,319 Speaker 1: going on there? 806 00:44:40,040 --> 00:44:43,600 Speaker 4: Yeah, I mean, here's what we think in the 807 00:44:43,760 --> 00:44:46,239 Speaker 4: science generally, which is that, you know, memory is 808 00:44:46,280 --> 00:44:50,200 Speaker 4: something that unpacks slowly with time. It's a cognitive development 809 00:44:50,280 --> 00:44:53,399 Speaker 4: in some sense. And you know, as you guys know, 810 00:44:53,520 --> 00:44:56,200 Speaker 4: it's about three years old for girls and three and 811 00:44:56,200 --> 00:44:58,239 Speaker 4: a half years old for boys that they start laying 812 00:44:58,239 --> 00:45:01,239 Speaker 4: down their first memories. Here's the interesting thing. Memory is 813 00:45:01,320 --> 00:45:05,600 Speaker 4: a myth-making machine, and we're constantly reinventing our past. 814 00:45:05,920 --> 00:45:10,719 Speaker 4: And so one of the difficult things to assess when 815 00:45:10,719 --> 00:45:14,319 Speaker 4: someone says, hey, I remember, whatever, being born or being 816 00:45:14,320 --> 00:45:17,319 Speaker 4: one year old, is that it's really difficult to know the 817 00:45:17,400 --> 00:45:21,040 Speaker 4: degree to which they think that's true.
But it's not 818 00:45:21,080 --> 00:45:23,879 Speaker 4: true, because we're all told stories by our parents of, oh, 819 00:45:23,960 --> 00:45:25,880 Speaker 4: when you were an infant, you did this hilarious thing, 820 00:45:25,920 --> 00:45:27,759 Speaker 4: and blah blah blah, and you hear the story once 821 00:45:27,840 --> 00:45:31,319 Speaker 4: or twice, and eventually it becomes a false memory. So 822 00:45:32,680 --> 00:45:35,640 Speaker 4: I think it's very difficult to be able 823 00:45:35,719 --> 00:45:38,680 Speaker 4: to tell this sort of thing. And of course, for 824 00:45:38,800 --> 00:45:43,800 Speaker 4: someone who has a memory, it's very difficult to tell them, hey, 825 00:45:43,960 --> 00:45:47,400 Speaker 4: that might be false and you just think you remember that. 826 00:45:47,400 --> 00:45:51,120 Speaker 4: That makes people angry. But you know, the truth is, 827 00:45:51,160 --> 00:45:54,000 Speaker 4: this kind of stuff comes up all the time in 828 00:45:54,080 --> 00:45:57,680 Speaker 4: courts of law, in the realm of eyewitness testimony, because 829 00:45:58,840 --> 00:46:01,920 Speaker 4: people think that their memories are like a video recorder, 830 00:46:01,960 --> 00:46:07,799 Speaker 4: and they're simply not. There's a giant psychology literature on 831 00:46:07,880 --> 00:46:12,080 Speaker 4: this, showing all kinds of ways that false 832 00:46:12,160 --> 00:46:15,000 Speaker 4: memories get introduced, and so on. You know, a colleague 833 00:46:15,000 --> 00:46:19,240 Speaker 4: of mine did a really great study right after September eleventh, 834 00:46:19,239 --> 00:46:22,040 Speaker 4: two thousand and one.
She was in New York, and 835 00:46:22,200 --> 00:46:26,760 Speaker 4: she went and interviewed a bunch of people in downtown 836 00:46:26,800 --> 00:46:29,480 Speaker 4: and Midtown New York about what they had just seen 837 00:46:30,200 --> 00:46:32,880 Speaker 4: on September eleventh, and then she was clever enough to 838 00:46:33,080 --> 00:46:36,400 Speaker 4: also ask them to describe a memory from September tenth, 839 00:46:36,920 --> 00:46:39,160 Speaker 4: the day before, like, I had lunch here, and I did this, 840 00:46:39,239 --> 00:46:41,120 Speaker 4: and I did that. And then she went and tracked 841 00:46:41,120 --> 00:46:44,919 Speaker 4: all these people down a year later and asked them 842 00:46:44,920 --> 00:46:47,960 Speaker 4: to tell their memories again about September eleventh and September tenth 843 00:46:48,160 --> 00:46:50,480 Speaker 4: of the year before, and it turns out that in 844 00:46:50,520 --> 00:46:53,719 Speaker 4: both cases the memories drifted. So this comes back to 845 00:46:53,760 --> 00:46:56,680 Speaker 4: the beginning of our conversation. Even amygdala memories, even the 846 00:46:56,680 --> 00:46:59,360 Speaker 4: scariest memories that you have, it doesn't mean they're accurate. 847 00:47:00,120 --> 00:47:03,680 Speaker 4: And so, I mean, it doesn't surprise me about 848 00:47:03,680 --> 00:47:06,480 Speaker 4: September eleventh, because especially the more you tell a story, 849 00:47:06,800 --> 00:47:09,520 Speaker 4: the more you start laying down these ruts in the road, 850 00:47:10,040 --> 00:47:13,359 Speaker 4: and that becomes the story, that becomes the truth. And 851 00:47:13,800 --> 00:47:15,680 Speaker 4: you know, we've all run into these things in our 852 00:47:15,719 --> 00:47:18,520 Speaker 4: life where someone suddenly shows us a photograph or something 853 00:47:18,560 --> 00:47:20,239 Speaker 4: and says, wait, that's not my memory.
Look, here's the 854 00:47:20,280 --> 00:47:22,680 Speaker 4: thing here, and you go, oh, gosh, I had actually 855 00:47:22,719 --> 00:47:25,520 Speaker 4: misremembered that thing that happened, or where I was 856 00:47:25,560 --> 00:47:28,520 Speaker 4: standing or what I was doing. Anyway, so this is 857 00:47:28,560 --> 00:47:32,960 Speaker 4: the concern when people say, oh, I remember, whatever, being 858 00:47:33,000 --> 00:47:34,920 Speaker 4: born or this event when I was really young, is 859 00:47:34,920 --> 00:47:38,760 Speaker 4: that we know how easy it is to believe memories 860 00:47:38,800 --> 00:47:39,399 Speaker 4: that are not true. 861 00:47:40,400 --> 00:47:42,560 Speaker 2: Now correct me if I'm wrong, but I think I 862 00:47:42,600 --> 00:47:45,680 Speaker 2: recall reading that in some of these cases, 863 00:47:45,920 --> 00:47:50,200 Speaker 2: like where there's a big public event, 864 00:47:50,320 --> 00:47:52,920 Speaker 2: people are asked to write down their experiences that day, 865 00:47:52,960 --> 00:47:56,319 Speaker 2: and then the researchers contact them again later and have 866 00:47:56,440 --> 00:47:58,280 Speaker 2: them try it again and say, what do you remember 867 00:47:58,320 --> 00:48:00,759 Speaker 2: about that day? Not only do they often get details wrong, 868 00:48:00,840 --> 00:48:03,480 Speaker 2: but don't they often insist that the way they remember 869 00:48:03,520 --> 00:48:06,080 Speaker 2: it now is correct and what they wrote at the 870 00:48:06,080 --> 00:48:06,880 Speaker 2: time was wrong? 871 00:48:07,840 --> 00:48:11,839 Speaker 4: Exactly. That's exactly right. Yeah, because it's so hard to 872 00:48:11,880 --> 00:48:14,719 Speaker 4: disbelieve our own memories about things. You know, this is 873 00:48:14,760 --> 00:48:17,680 Speaker 4: obviously at the heart of lots of spousal arguments too.
874 00:48:17,840 --> 00:48:20,360 Speaker 4: You know, you have two brains, you have two different 875 00:48:20,360 --> 00:48:25,719 Speaker 4: ways of remembering what precisely happened. Yes, that's exactly right. 876 00:48:26,600 --> 00:48:28,640 Speaker 1: All right, well, you already mentioned that you have the 877 00:48:28,680 --> 00:48:32,880 Speaker 1: episode coming up about AI creativity. I'm definitely excited to 878 00:48:32,960 --> 00:48:35,080 Speaker 1: check that one out. Is there anything else you want 879 00:48:35,120 --> 00:48:38,360 Speaker 1: to tease for listeners, what else they can expect 880 00:48:38,360 --> 00:48:41,280 Speaker 1: from future episodes of Inner Cosmos? 881 00:48:41,320 --> 00:48:43,560 Speaker 4: Yeah, well, okay, so my next one after that is going 882 00:48:43,640 --> 00:48:47,040 Speaker 4: to be, is AI sentient? Because, you know, this 883 00:48:47,120 --> 00:48:49,840 Speaker 4: is a big question now as these models get larger 884 00:48:49,840 --> 00:48:52,720 Speaker 4: and larger, and things are moving at an extraordinary pace now. 885 00:48:53,040 --> 00:48:55,160 Speaker 4: What does sentience mean? And this is related to the 886 00:48:55,239 --> 00:48:58,000 Speaker 4: question you asked me, Rob, about consciousness and so on. 887 00:48:58,239 --> 00:49:01,800 Speaker 4: So I think this actually gives an interesting tool for 888 00:49:02,080 --> 00:49:05,080 Speaker 4: studying consciousness that we haven't had before. But I have 889 00:49:05,160 --> 00:49:10,520 Speaker 4: other episodes. My one after that is about counterfeiting 890 00:49:10,719 --> 00:49:13,839 Speaker 4: money and what it is that we notice about counterfeits, 891 00:49:14,360 --> 00:49:19,120 Speaker 4: or do not. I have an episode on, will 892 00:49:19,200 --> 00:49:23,160 Speaker 4: you perceive the event that kills you?
And 893 00:49:23,440 --> 00:49:25,319 Speaker 4: this is just a topic I've been thinking about for 894 00:49:25,360 --> 00:49:27,120 Speaker 4: a long time and have put together a lot of 895 00:49:27,120 --> 00:49:31,839 Speaker 4: work on, about, you know, if suddenly something, let's 896 00:49:31,880 --> 00:49:35,120 Speaker 4: say a brick from the pedestrian bridge over the 897 00:49:35,160 --> 00:49:38,040 Speaker 4: highway, fell on your head when you were in a convertible, 898 00:49:38,239 --> 00:49:42,000 Speaker 4: the question is, would you perceive dying, or would you 899 00:49:42,080 --> 00:49:44,799 Speaker 4: be dead before you knew anything happened? And what does 900 00:49:44,880 --> 00:49:46,520 Speaker 4: that look like? Does it look like, you know, suddenly 901 00:49:46,520 --> 00:49:50,239 Speaker 4: the footage just ends but there's no pain? Stuff like that. 902 00:49:51,320 --> 00:49:54,399 Speaker 4: So I have lots and lots of episodes. Can we 903 00:49:54,440 --> 00:49:57,680 Speaker 4: create new senses for humans? Which is a big part 904 00:49:57,680 --> 00:49:59,320 Speaker 4: of what I've been doing over the last eight years 905 00:49:59,320 --> 00:50:04,880 Speaker 4: with a company that I run called Neosensory. Yeah, and I 906 00:50:04,960 --> 00:50:08,480 Speaker 4: have forty-six episodes this year, all of which I've outlined, 907 00:50:09,000 --> 00:50:12,520 Speaker 4: and then it's just a matter of spending the twelve 908 00:50:12,520 --> 00:50:15,319 Speaker 4: hours per week writing the hour-long monologue. 909 00:50:15,800 --> 00:50:19,920 Speaker 1: Awesome, it sounds exciting. I'm excited to check out more episodes. 910 00:50:20,560 --> 00:50:24,160 Speaker 4: Great. Thank you guys so much for having me. It's 911 00:50:24,200 --> 00:50:25,319 Speaker 4: been a pleasure to see you all. 912 00:50:25,520 --> 00:50:28,520 Speaker 1: Yeah, thanks for coming on the show.
913 00:50:29,280 --> 00:50:31,640 Speaker 2: All right, well, that was our conversation with David Eagleman. 914 00:50:32,000 --> 00:50:35,040 Speaker 2: Once again, much appreciation to David for taking the time 915 00:50:35,080 --> 00:50:37,600 Speaker 2: to chat with us today. If you want to check 916 00:50:37,600 --> 00:50:40,000 Speaker 2: out his new show, and we do recommend it, once again, 917 00:50:40,080 --> 00:50:43,279 Speaker 2: it is called Inner Cosmos with David Eagleman. You can 918 00:50:43,320 --> 00:50:45,919 Speaker 2: find it on the iHeart app or wherever you get 919 00:50:45,920 --> 00:50:47,200 Speaker 2: your podcasts. 920 00:50:47,280 --> 00:50:49,759 Speaker 1: Just a reminder that Stuff to Blow Your Mind is 921 00:50:49,880 --> 00:50:53,719 Speaker 1: a science podcast with core episodes on Tuesdays and Thursdays 922 00:50:53,719 --> 00:50:56,640 Speaker 1: in the Stuff to Blow Your Mind podcast feed. On 923 00:50:56,680 --> 00:50:59,360 Speaker 1: Mondays we do listener mail episodes. On Wednesdays we do 924 00:50:59,360 --> 00:51:01,640 Speaker 1: a short-form Artifact or Monster Fact episode, and on 925 00:51:01,680 --> 00:51:03,719 Speaker 1: Fridays we set aside most serious concerns to just talk 926 00:51:03,719 --> 00:51:06,120 Speaker 1: about a weird film on Weird House Cinema. 927 00:51:06,360 --> 00:51:09,480 Speaker 2: Huge thanks to our audio producer JJ Posway. If you 928 00:51:09,480 --> 00:51:11,440 Speaker 2: would like to get in touch with us with feedback 929 00:51:11,480 --> 00:51:13,760 Speaker 2: on this episode or any other, to suggest a topic 930 00:51:13,800 --> 00:51:16,000 Speaker 2: for the future, or just to say hello, you can 931 00:51:16,040 --> 00:51:18,840 Speaker 2: email us at contact at stuff to blow your mind 932 00:51:18,960 --> 00:51:26,839 Speaker 2: dot com. 933 00:51:26,880 --> 00:51:29,840 Speaker 3: Stuff to Blow Your Mind is a production of iHeartRadio.
For 934 00:51:29,920 --> 00:51:32,720 Speaker 3: more podcasts from iHeartRadio, visit the iHeartRadio app, 935 00:51:32,840 --> 00:51:48,680 Speaker 3: Apple Podcasts, or wherever you're listening to your favorite shows.