Robert: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Robert: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

Joe: And I'm Joe McCormick. And hey, welcome back, Rob. You were out sick earlier this week. It's good to have you back.

Robert: It's good to be back.

Joe: Now, because you were out sick, we ended up putting a pause on an ongoing series we were doing on childhood amnesia. So we ended up running a vault episode on Tuesday, and I just wanted to assure people that we will be coming back to that subject. We will be resuming the series, probably for next week's core episodes. But because we already had it scheduled out this way, today's episode is going to be an interview. In fact, we are talking to a return guest, the neuroscientist and author David Eagleman. This is actually the second time David has been a guest on the show. In September twenty twenty, Rob, you spoke to him about his book Livewired, which is a popular science book on the subject of brain plasticity.

Robert: Yeah, absolutely, that was a fun episode. You can find it in the archives, and the book Livewired is an absolute delight.
If you're at all interested in anything you hear us discussing in this episode, pick up a copy of it. It's great.

Joe: So this week we invited David back on the show because he now has a fantastic, brand-new podcast on our very own network, on the iHeart network, and it is called Inner Cosmos with David Eagleman. So before we get started with our interview, I thought we should just share a bit of background about David. This is from his website.

Robert: That's right. David Eagleman is a neuroscientist at Stanford University and an internationally bestselling author. He's the co-founder of two venture-backed companies, Neosensory and BrainCheck, and he also directs the Center for Science and Law, a national nonprofit institute. He's best known for his work on sensory substitution, time perception, brain plasticity, synesthesia, and neurolaw. He is the writer and presenter of the international PBS series The Brain with David Eagleman and the author of the companion book The Brain: The Story of You. He's also the writer and presenter of The Creative Brain on Netflix.
Joe: David Eagleman is the author of over one hundred and twenty academic publications and many, many books of popular science. Eagleman is a TED speaker, a Guggenheim Fellow, and serves on several boards, including the American Brain Foundation and the Long Now Foundation. He's the Chief Scientific Advisor of the Mind Science Foundation and the winner of the Claude Shannon Luminary Award from Bell Labs and the McGovern Award for Excellence in Biomedical Communication. He serves as the academic editor for the Journal of Science and Law, was named Science Educator of the Year by the Society for Neuroscience, and was featured as one of the "Brightest Idea Guys" by Italy's Style magazine. He has served as a scientific advisor on several TV shows, including Westworld and Perception, and has been profiled on The Colbert Report, Nova ScienceNow, The New Yorker, CNN's Next List, and many other venues. He appears regularly on radio and television to discuss literature and science.
And I guess now he's going to have to add podcasts to the end of his bio here. So, Rob, unless you have anything else, I think we should jump right into our conversation with David Eagleman.

Robert: Hi, David, welcome back to the show.

David: Great, thanks, Rob, for having me again. It's a pleasure to be here. And hello, Joe, it's great to meet you.

Robert: Now, David, you have a great new podcast series out, Inner Cosmos, through iHeart. How did you decide what path to take with the podcast format?

David: You know, it's a great question. Truth is, I had not listened to many podcasts at all. Now I have, but when I was first putting this together with iHeart, I thought, look, I want to do a forty-five-minute to hour-long monologue every week. And that seemed like a terrific idea at first, and then my wife said she was going to kill me, because it turns out that's a ton of work. It takes me about, I don't know, about twelve hours a week to get a good monologue that's almost an hour long.
So that's how I decided on the format, because I thought it would be something special, rather than, you know... I've been on many different podcasts where we're doing interviews just like this, and it's super fun. But I wanted to do something different. So that's how I accidentally stumbled into that format.

Joe: So I figure we should give people a taste of the kind of things you talk about on your show. I got a chance to listen to the episode you did about memory and the perception of time, and I thought it was great, by the way, a really great way to kick off the show. So your starting premise in that episode was that many people who have been through an intense or life-threatening event, maybe falling off of a building or seeing a car speeding toward them, report afterwards that time seemed to have somehow slowed down for them during the pivotal few seconds, almost as if they were able to enter a state of slow motion, or "bullet time" like in The Matrix. Can you talk a bit about your research on this subject and what you discovered about this perception?
David: Yeah, so my research, of course, started off very personal, which is that I fell off of a house and it seemed like things took a long time, and that's what got me interested in this. Then when I got older and became a neuroscientist, eventually I realized that I was hearing this story not uncommonly from people who had been in a gunfight or some scary situation or car accident or whatever, and they felt that things took longer. And so I looked in the literature and there was not anything on this. So that's when I realized I was going to have to do this myself and figure out how to run an experiment on this. So, you know, briefly, what I did is I built a device that I hooked to people's wrists that flashed information at them in a certain way, so that I could tell how rapidly their brain was perceiving, and that way I could test whether they were actually seeing in slow motion or the whole thing was a trick of memory, meaning when you were in an intense situation, you wrote down more memories.
So when you ask, "What just happened?", it seemed like it must have taken longer, because you have all these memories. So what I did then was drop people from a one-hundred-and-fifty-foot-tall tower, in free fall, backwards, into a net below, and I measured their perception of time on the way down this way. And what I found, after... you know, I did this on myself, but then we dropped twenty-three participants. What I found is that it is in fact a trick of memory, which is to say, when everything is hitting the fan, your brain writes down much denser memory, and when you read that back out, your brain has to make an assumption about, you know, how much memory, how much footage, maps onto how much time. And so it says, oh wow, that must have been five seconds, even though it was only one second's worth.
But the point is, people were not able to see in slow motion, which, by the way, was disappointing for me, because I was already talking with the military about building cockpits in a way that flashed information more rapidly at people when they're in some intense situation. But it turns out none of that makes a difference. You can't actually get information in there faster; you can only remember it faster.

Joe: So there's not actually any increased ability of perception. It's just a trick of the memory.

David: Exactly. Now, it is the case that you can do a lot of things preconsciously, by which I mean your conscious awareness of something is always the slowest thing on the ladder to ever get any information. So by the time your brain puts together all the signals and says, okay, this is what just happened, you know, that's at least half a second to a second behind real time. But the point is, your body can react much faster than that.
Your body can get signals and say, whoa, I've got to do something about this right away, and so you can react, you know, often much faster than you can be consciously aware. So, you know, this is what happened to me recently: I was on a hike with a friend and a branch snapped back, and I was, you know, halfway into the move of ducking out of the way of the branch before I consciously realized it. Or my foot gets halfway to the brake of my truck before I realize that there's a car pulling out of the driveway ahead of me. In other words, consciousness is always the last guy on the ladder to get any information, and your body can almost always react much faster. In fact, let me just give one more example of that, which is, when I was younger, I used to play baseball, and my experience was always that, you know, I'd be waiting for the pitch, and then I would realize after it had happened that I had already hit the ball, and I would consciously realize, oh, I have just hit the ball.
Now throw the bat and run. But, you know, the whole thing, the ball moving from the mound to the plate, and the swing and the batting, that's all a really fast process, and it often happens preconsciously.

Joe: Somewhat related to that, this raises questions about the different kinds of circumstances that would favor the perception of slow motion in intense situations, or not. My example was, there was one night years ago when I was driving under an overpass, and there was a sudden deafening sound and a shudder. And what my wife and I deduced later was that somehow, like, a brick had fallen from above and hit the roof of our car just above the windshield as we passed under at highway speed. And I don't know if somebody threw it or if it somehow just fell, but I not only don't recall a feeling of stretched time or a greater density of memories right before and after the impact, I felt almost a kind of retrospective amnesia, like a real paucity of detail. And it was like we were suddenly a good ways down the road and just trying to remember or figure out what had happened.
David: Yeah, that's exactly right. It's because you didn't write down any memory. And this is generally because, as you are taking a drive down the highway, your brain is writing down very little of what's going on. In fact, the interesting part is that although we think about memory as being like a video recorder or something, in fact it's nothing like that. You write down very little of what happens in your life, especially when you're driving on a road you've been down before. So what happened is, there's the deafening crash, and suddenly you're thinking, what just happened, what just happened? And you've got nothing to draw on. There's just no footage there. And by the way, just as a very quick side note, I think this is what happens to people when they are high on marijuana. They say, oh my gosh, how long have I been standing here? It's like I've been standing here forever. And it's because they're not writing down memories in the same way. So when their brain looks for how long have I been standing here, what it's looking for is footage in time, as in: okay, I remember getting here, I remember this happening.
Someone said this, and someone put the glass down, and blah blah blah, so that I can estimate how long it's been. But suddenly it can't grab onto any memories at all, and so suddenly people are lost in time. Anyway, so this, Joe, is exactly what happens when people are suddenly hit by a car that they don't see coming, like a car T-bones them or something. Or, and I might have mentioned this in the podcast, I can't remember, but, you know, I was once riding my bike and the wheel suddenly dropped into a pothole and I went flying over the handlebars. But because I didn't see that coming, I just had the sensation of, suddenly, oh my god, what's just... you know, here I am lying on the asphalt, bloody, and I have no idea what just happened. And it's precisely because you're not writing down any memories. So when your brain says, what just happened?
What just happened? There's nothing to draw on. As opposed to the car-sliding-on-ice-toward-the-brick-wall phenomenon, which is where you say, oh my gosh, I'm predicting what's going to happen, and this is really gonna hurt, this is gonna be bad. And that's when you're writing down lots of stuff.

Joe: So would you say you're more likely to have this memory-density perception, number one, if you see the event coming ahead of time, if there's expectation of it, but also if you're just generally in a novel or unusual situation?

David: Yeah, that's exactly right, actually. So, two aspects of that. One is that I just mentioned a moment ago that you write down very little memory, and that's because, as an adult now, your brain has sort of figured out a pretty good model of the world, meaning you don't need to write stuff down, because you've seen all the personalities before, you've seen different cities before, you've seen roads and people and events and television shows, and you've sort of got it.
But if something really novel happens, that's when your brain writes something down and says, whoa, wait a minute, I'm surprised. And that's when stuff starts getting written down. So when you look back at the end of, let's say, a novel event... let's say you go on some really wild trip on the weekend to the Galapagos Islands, and you see new things and so on. Then it seems like forever since you were at work on Friday. But if you just go off for a normal weekend and you come back to work, you think, oh, I was just here, because you didn't lay down any new memories over the weekend. So it is true that things that are novel generally seem to last longer. However, it should be noted that when things are actually life-threatening, you have essentially an emergency-response memory system that kicks into gear. That is a secondary track on which you write down memory.
And this is underpinned by a part of the brain called the amygdala, and its job is to say, whoa, everything is going really bad and scary here, and I've got to write this down. Because that, after all, is the point of memory: to make sure that you write down stuff that is important, and specifically life-threateningly important.

Joe: So if the normal memory system... would that be the hippocampal memory system?

David: Exactly.

Joe: If that's the normal memory system, and then the amygdala tends to be recruited in intense situations, do we know generally if there are any characteristic differences between how memories are recorded in the hippocampus versus the amygdala?

David: Yeah, and it turns out it's a tragic one, which is that amygdala memories are unerasable, whereas hippocampal memories can be erased. So let me unpack this, because there are two surprising parts here.
So first of all, the fact that hippocampal memories can be erased is terrifying and weird and wild. Which is: if I ask you to recall the name of your fifth-grade teacher, and then suddenly that brick drops off the highway bridge and hits you in the head, God forbid, let's say that happens, you will now have amnesia for that one fact. You will not remember anything about your fifth-grade teacher anymore, or at least the fifth-grade teacher's name. Why? It's because the name of your fifth-grade teacher is stored deep in the structure of your brain, and when I ask you to recall it, you're actually transferring it from that structural form into activity, you know, spikes in the brain. And that's how you're remembering the name of your teacher. Now, when you're done remembering it, it has to get reconsolidated back into its physical form, and if you get hit in the head during that moment, it's gone.
It's now... you know, it's been transferred from the physical to, you know, activity in spikes, and if, before it gets transferred back into the physical, it is susceptible to erasure, which is weird and terrifying. And by the way, this can also be done with protein synthesis inhibitors. So people do this in rats. They've been doing this for decades, where, you know, you train a rat to run different mazes, and then you put the rat on a particular maze where the rat has to remember, oh yeah, that's this one, and then you just feed the rat a protein synthesis inhibitor, and now it cannot reconsolidate that memory into physical form. So number one is, hippocampal memories can be erased. The number two point is that amygdala memories cannot be erased, which is to say, when you recruit the emergency control system to say, wow, this is really important, write this down, then those are permanent. The reason I say that's unfortunate is because those are the ones that people want to erase. In other words, you know, let's say a rape victim or something like that.
306 00:16:54,400 --> 00:16:56,960 Speaker 1: That is the one thing that she wants to forget 307 00:16:57,400 --> 00:17:00,960 Speaker 1: more than anything, but cannot. Now, I was checking out 308 00:17:02,080 --> 00:17:03,760 Speaker 1: the show as well, and I was listening to your 309 00:17:03,800 --> 00:17:05,679 Speaker 1: I believe this is an episode from just earlier this 310 00:17:05,720 --> 00:17:09,560 Speaker 1: week on the topic of animal uplift, which I don't 311 00:17:09,560 --> 00:17:12,119 Speaker 1: think is a term that I was familiar with. Can 312 00:17:12,160 --> 00:17:14,399 Speaker 1: you give us a brief taste, a brief overview of 313 00:17:14,480 --> 00:17:18,680 Speaker 1: what animal uplift is. Yeah, it's this idea that you know, look, 314 00:17:18,800 --> 00:17:21,840 Speaker 1: the human brain is made out of exactly the same 315 00:17:21,880 --> 00:17:24,840 Speaker 1: stuff that a mouse brain, the dog brain, the giraffe brain. 316 00:17:24,920 --> 00:17:26,520 Speaker 1: You know, it's all the same stuff. It's got the 317 00:17:26,560 --> 00:17:29,399 Speaker 1: same anatomy, same general structure. We just have more of 318 00:17:29,440 --> 00:17:34,280 Speaker 1: this wrinkly outer bit called the cortex. And but somehow 319 00:17:35,000 --> 00:17:37,720 Speaker 1: we are you know, we've taken over the whole planet 320 00:17:37,760 --> 00:17:39,560 Speaker 1: as a species. We've gotten off the planet, We've made 321 00:17:39,640 --> 00:17:43,480 Speaker 1: vaccines and internet and quantum mechanics and so like, there's 322 00:17:43,480 --> 00:17:47,240 Speaker 1: some real difference in what we are doing versus our 323 00:17:47,280 --> 00:17:52,000 Speaker 1: neighbors in the animal kingdom, But the genetic differences, as 324 00:17:52,040 --> 00:17:54,040 Speaker 1: you know, are not that much. 
I mean, we have 325 00:17:54,359 --> 00:17:57,440 Speaker 1: enormous similarity with almost everything. Like, if you're building a giraffe, 326 00:17:57,440 --> 00:17:58,960 Speaker 1: you've got to build the heart and the lungs and 327 00:17:59,000 --> 00:18:01,439 Speaker 1: the brain and the esophagus, and all that stuff is 328 00:18:01,760 --> 00:18:04,520 Speaker 1: really the same stuff. And so it's just some small 329 00:18:04,600 --> 00:18:08,640 Speaker 1: algorithmic difference in the DNA that's making our brain run 330 00:18:08,680 --> 00:18:11,240 Speaker 1: in a more souped-up way. Okay, the idea of 331 00:18:11,280 --> 00:18:14,800 Speaker 1: animal uplift is, if we can figure that out, and 332 00:18:14,840 --> 00:18:16,840 Speaker 1: this won't happen, you know, for at least a few 333 00:18:16,840 --> 00:18:19,760 Speaker 1: more decades, but if we can figure out, ah, here's 334 00:18:19,800 --> 00:18:22,200 Speaker 1: the sequence of A's and C's and T's and G's 335 00:18:22,520 --> 00:18:27,199 Speaker 1: that gives us this high intelligence, the question is, should 336 00:18:27,280 --> 00:18:34,520 Speaker 1: we give this to animals? Should we help animals become intelligent? Now, 337 00:18:34,640 --> 00:18:38,640 Speaker 1: let me just mention, this is an area that bioethicists 338 00:18:38,640 --> 00:18:42,359 Speaker 1: and philosophers and neuroscientists have been talking about for a while, 339 00:18:42,640 --> 00:18:45,159 Speaker 1: and there's plenty of debate about it. On one end 340 00:18:45,160 --> 00:18:47,320 Speaker 1: of the spectrum.
You have people saying that's a terrible idea, 341 00:18:47,320 --> 00:18:49,520 Speaker 1: we wouldn't want to give intelligence to animals, and other people 342 00:18:49,560 --> 00:18:52,719 Speaker 1: say it's a moral obligation, in the same way that, 343 00:18:52,920 --> 00:18:56,240 Speaker 1: you know, if we know how to fix some viral 344 00:18:56,320 --> 00:18:58,800 Speaker 1: disease or fix a broken leg or something, of course 345 00:18:58,840 --> 00:19:00,920 Speaker 1: you should do this for your dog instead of letting 346 00:19:00,920 --> 00:19:04,840 Speaker 1: your dog, you know, not have the medical advances that 347 00:19:04,880 --> 00:19:07,800 Speaker 1: we have made. So anyway, it's a big debate. But 348 00:19:07,840 --> 00:19:10,440 Speaker 1: this is the idea of animal uplift. You make an 349 00:19:10,480 --> 00:19:14,320 Speaker 1: animal as intelligent as a human. And I just find 350 00:19:14,359 --> 00:19:18,879 Speaker 1: this area fascinating. And, you know, as I proposed in 351 00:19:18,920 --> 00:19:23,320 Speaker 1: the podcast, what would the consequences of this be, in 352 00:19:23,440 --> 00:19:26,760 Speaker 1: terms of, you know, will World War Five be fought 353 00:19:26,840 --> 00:19:31,240 Speaker 1: by other animal species, not just humans? And, you know, 354 00:19:31,440 --> 00:19:35,199 Speaker 1: the way I sort of introduced the podcast is 355 00:19:35,720 --> 00:19:37,920 Speaker 1: with this question of what will my kids look back 356 00:19:37,920 --> 00:19:41,520 Speaker 1: on, or my grandkids.
Obviously there's lots of things that 357 00:19:41,560 --> 00:19:45,080 Speaker 1: will be very different between our world right now and 358 00:19:45,520 --> 00:19:47,600 Speaker 1: their world, let's say, fifty years from now, but 359 00:19:47,720 --> 00:19:50,800 Speaker 1: one of them is, will they look back and say, wow, 360 00:19:50,840 --> 00:19:53,160 Speaker 1: I can't believe there was a time when humans were 361 00:19:53,160 --> 00:19:55,640 Speaker 1: the only species on Earth that was really doing anything, 362 00:19:56,080 --> 00:19:59,320 Speaker 1: and now we've got all these other, you know, crows 363 00:19:59,480 --> 00:20:05,920 Speaker 1: running universities, and donkeys programming computers, and, whatever, gophers 364 00:20:05,960 --> 00:20:20,959 Speaker 1: in the Senate, and so on. So one idea of yours 365 00:20:21,280 --> 00:20:23,159 Speaker 1: that I came across, because Rob sent it to me and 366 00:20:23,200 --> 00:20:28,200 Speaker 1: I found really interesting, was from a paper you published 367 00:20:28,240 --> 00:20:31,560 Speaker 1: in twenty twenty one, I think maybe in Frontiers in Neuroscience, 368 00:20:31,560 --> 00:20:37,080 Speaker 1: offering a hypothesis about the adaptive function of dreams, which 369 00:20:37,160 --> 00:20:42,160 Speaker 1: you call the defensive activation theory. Could you lay 370 00:20:42,160 --> 00:20:46,080 Speaker 1: out what is the basic controversy about the biological function 371 00:20:46,160 --> 00:20:50,520 Speaker 1: of dreams and how your proposed solution here would 372 00:20:50,880 --> 00:20:53,760 Speaker 1: answer this question?
Yes, so it turns out there is 373 00:20:53,840 --> 00:20:58,800 Speaker 1: no controversy about the purpose of dreams, because nobody knows, right? Everyone, 374 00:20:59,560 --> 00:21:01,960 Speaker 1: I mean in the sense of everyone's got a little 375 00:21:02,040 --> 00:21:07,000 Speaker 1: hypothesis about it, but really it's complicated, and people think, well, 376 00:21:07,040 --> 00:21:08,720 Speaker 1: maybe it has something to do with learning and memory. 377 00:21:08,760 --> 00:21:12,640 Speaker 1: Maybe it just has to do with, you know, energy restoration. 378 00:21:12,840 --> 00:21:16,320 Speaker 1: Maybe it has to do with, you know... Obviously, the 379 00:21:16,320 --> 00:21:19,199 Speaker 1: Freudians thought that there was some important meaning in the 380 00:21:19,280 --> 00:21:21,880 Speaker 1: content of dreams and so on. But no one really knows, 381 00:21:21,920 --> 00:21:25,359 Speaker 1: and certainly no one has a quantitative hypothesis that can 382 00:21:25,400 --> 00:21:27,959 Speaker 1: make predictions about dreams and how much dream time we have. 383 00:21:28,960 --> 00:21:33,760 Speaker 1: But my student and I developed a theory that actually 384 00:21:33,800 --> 00:21:37,760 Speaker 1: does make quantitative predictions across animal species. It predicts, 385 00:21:37,800 --> 00:21:42,320 Speaker 1: actually, how much each animal species will dream.
And to 386 00:21:42,359 --> 00:21:44,800 Speaker 1: explain, I need to take one step back, which is about 387 00:21:45,280 --> 00:21:49,320 Speaker 1: brain plasticity, which is this term that we use to 388 00:21:49,440 --> 00:21:52,560 Speaker 1: explain that the brain is very malleable, the human brain 389 00:21:52,600 --> 00:21:56,760 Speaker 1: in particular, and it's constantly reconfiguring its own circuitry, and 390 00:21:56,800 --> 00:22:00,080 Speaker 1: that's how it learns and remembers, and that's how it 391 00:22:00,160 --> 00:22:03,600 Speaker 1: learns new skills and all that. So it turns out, 392 00:22:03,840 --> 00:22:06,000 Speaker 1: this is what my last book, Live Wired, was about, 393 00:22:06,600 --> 00:22:10,400 Speaker 1: the massive flexibility of the brain. It turns out, 394 00:22:10,960 --> 00:22:14,000 Speaker 1: as probably a lot of people already intuit, if 395 00:22:14,040 --> 00:22:18,280 Speaker 1: you go blind at a young age, the visual part 396 00:22:18,280 --> 00:22:21,320 Speaker 1: of your brain gets taken over, and in fact, if 397 00:22:21,320 --> 00:22:25,280 Speaker 1: you're born blind, that takeover is complete. The rest of 398 00:22:25,320 --> 00:22:28,679 Speaker 1: the territories in your brain, involved in hearing and touch 399 00:22:28,800 --> 00:22:32,240 Speaker 1: and other things, these all take over what we would 400 00:22:32,280 --> 00:22:35,280 Speaker 1: normally think of as the visual cortex, and it's no 401 00:22:35,359 --> 00:22:39,840 Speaker 1: longer visual. It's now, you know, subserving other functions. Okay. 402 00:22:40,000 --> 00:22:43,000 Speaker 1: One of the surprises in neuroscience was a study that 403 00:22:43,040 --> 00:22:45,520 Speaker 1: came out about a decade ago from some colleagues of 404 00:22:45,560 --> 00:22:47,600 Speaker 1: mine at Harvard, where they put people in a scanner.
405 00:22:47,960 --> 00:22:51,919 Speaker 1: These are normally sighted people, but they blindfolded them tightly, 406 00:22:52,200 --> 00:22:53,639 Speaker 1: and they put them in the scanner, and they were 407 00:22:53,680 --> 00:22:56,879 Speaker 1: looking at their brains' response to touch or to sound 408 00:22:57,000 --> 00:22:59,600 Speaker 1: or things like that. And what they found, to their surprise, 409 00:22:59,720 --> 00:23:03,560 Speaker 1: is that after an hour, you could start seeing the 410 00:23:03,640 --> 00:23:08,000 Speaker 1: first hints of signals in the visual cortex in response 411 00:23:08,040 --> 00:23:11,679 Speaker 1: to touch and sounds. So, in other words, the visual 412 00:23:11,720 --> 00:23:16,360 Speaker 1: cortex was starting to get annexed by these other territories 413 00:23:16,440 --> 00:23:20,200 Speaker 1: of touch and sound after one hour. And so 414 00:23:20,280 --> 00:23:23,440 Speaker 1: this was a much more rapid kind of movement than 415 00:23:23,480 --> 00:23:27,560 Speaker 1: anyone had expected. And so what my student and I 416 00:23:27,560 --> 00:23:34,160 Speaker 1: immediately realized is that this is the basis of dreaming. 417 00:23:34,200 --> 00:23:38,080 Speaker 1: It's because we are on a planet that rotates, and 418 00:23:38,520 --> 00:23:41,160 Speaker 1: we spend half our time in the darkness, away from 419 00:23:41,359 --> 00:23:44,080 Speaker 1: the light of our star. And so in the dark 420 00:23:44,119 --> 00:23:46,120 Speaker 1: you can still hear and touch and taste and smell 421 00:23:46,200 --> 00:23:48,760 Speaker 1: just fine, but you can't see. And obviously I'm talking 422 00:23:48,800 --> 00:23:54,760 Speaker 1: about evolutionary time, you know, not our modern electricity-blessed times.
423 00:23:55,040 --> 00:23:59,560 Speaker 1: And so what this means is the visual system in 424 00:23:59,600 --> 00:24:02,399 Speaker 1: particular has a real disadvantage, which is it is in 425 00:24:02,560 --> 00:24:06,240 Speaker 1: danger of getting taken over by the other senses. And 426 00:24:06,280 --> 00:24:09,120 Speaker 1: this is because of the brain's great plasticity, and so, 427 00:24:09,200 --> 00:24:12,600 Speaker 1: as a result, the visual system needs a way to 428 00:24:12,640 --> 00:24:16,800 Speaker 1: defend its territory during the night, and that's what dreaming is. 429 00:24:16,880 --> 00:24:21,920 Speaker 1: Dreaming is essentially a screensaver. It's making sure that at nighttime, 430 00:24:22,320 --> 00:24:24,440 Speaker 1: when you're curled up in the corner of your cave, 431 00:24:24,680 --> 00:24:28,440 Speaker 1: staying out of trouble, sleeping, and sleeping has other benefits 432 00:24:28,480 --> 00:24:30,760 Speaker 1: too in terms of energy restoration and so on, but 433 00:24:30,920 --> 00:24:33,399 Speaker 1: when that's happening, you know, you can still feel if 434 00:24:33,400 --> 00:24:35,920 Speaker 1: something touches your skin, or if you're smelling something or whatever. 435 00:24:35,920 --> 00:24:37,600 Speaker 1: All that can still function in the dark, but you're 436 00:24:37,600 --> 00:24:40,280 Speaker 1: not seeing anything at all. And so what happens is 437 00:24:40,280 --> 00:24:43,800 Speaker 1: you've got this circuitry that just blows activity into the 438 00:24:43,880 --> 00:24:47,399 Speaker 1: visual system to make sure it stays active during the night. 439 00:24:47,440 --> 00:24:51,000 Speaker 1: Every ninety minutes, you have this wave of random 440 00:24:51,000 --> 00:24:53,560 Speaker 1: activity that just gets blown in there.
And because we're 441 00:24:53,640 --> 00:24:56,800 Speaker 1: visual creatures, we see, we have full, rich visual experience 442 00:24:56,800 --> 00:24:58,800 Speaker 1: even though our eyes are closed and it's dark out, 443 00:25:00,080 --> 00:25:03,120 Speaker 1: and it's because the brain 444 00:25:03,200 --> 00:25:06,679 Speaker 1: is just making sure that it's keeping this competition going so 445 00:25:06,720 --> 00:25:10,280 Speaker 1: the visual system doesn't get taken over. Interestingly, dream sleep 446 00:25:10,280 --> 00:25:13,280 Speaker 1: is something we find across the animal kingdom. But what 447 00:25:13,320 --> 00:25:16,600 Speaker 1: we were able to demonstrate is that it correlates with 448 00:25:16,600 --> 00:25:21,440 Speaker 1: how plastic the animal species is. So some animals drop 449 00:25:21,480 --> 00:25:23,720 Speaker 1: out of the womb and they figure out in thirty 450 00:25:23,720 --> 00:25:26,120 Speaker 1: minutes how to run, how to walk; very quickly they 451 00:25:26,520 --> 00:25:30,679 Speaker 1: reach adolescence, they can reproduce, all kinds of, you know... 452 00:25:30,680 --> 00:25:33,840 Speaker 1: they're just obviously very preprogrammed, let's just put it 453 00:25:33,880 --> 00:25:37,800 Speaker 1: that way. But other creatures, like humans, are extremely plastic. 454 00:25:37,880 --> 00:25:40,440 Speaker 1: We take forever to learn how to walk, to wean, 455 00:25:41,200 --> 00:25:46,320 Speaker 1: to reach reproductive age, things like that, precisely because we're extremely plastic, 456 00:25:46,640 --> 00:25:50,359 Speaker 1: and so we have lots of dreaming, because we have 457 00:25:50,440 --> 00:25:54,119 Speaker 1: to protect our visual cortex at night. But other animals 458 00:25:54,119 --> 00:25:57,720 Speaker 1: that are, you know, these preprogrammed types, they have 459 00:25:58,359 --> 00:26:00,920 Speaker 1: just a tiny bit of visual dreaming, but not a lot.
460 00:26:01,480 --> 00:26:03,480 Speaker 1: And by the way, I'll just mention that the amount 461 00:26:03,520 --> 00:26:07,520 Speaker 1: of visual dreaming we have goes down with age. So 462 00:26:07,560 --> 00:26:09,719 Speaker 1: as an infant, you're dreaming all the time, and as 463 00:26:09,720 --> 00:26:11,800 Speaker 1: you get older and older, you dream less and less 464 00:26:11,840 --> 00:26:14,920 Speaker 1: as a fraction of your sleep. And, you know, that's 465 00:26:14,960 --> 00:26:18,040 Speaker 1: just a correlation. But in theory, what that suggests is, 466 00:26:18,440 --> 00:26:21,400 Speaker 1: you know, as an infant, your visual system is very 467 00:26:22,040 --> 00:26:24,399 Speaker 1: highly at risk of getting taken over, and as you 468 00:26:24,440 --> 00:26:26,920 Speaker 1: get older and things get more cemented into place, it's 469 00:26:27,000 --> 00:26:28,680 Speaker 1: less at risk of getting taken over, so you don't 470 00:26:28,680 --> 00:26:31,600 Speaker 1: need as much screensaver time. Incidentally, this kind of 471 00:26:31,600 --> 00:26:35,160 Speaker 1: reminds me of studies I've read on a related subject, 472 00:26:35,200 --> 00:26:41,000 Speaker 1: which is prolonged blindfolding of normally sighted people. Apparently 473 00:26:41,000 --> 00:26:44,480 Speaker 1: it's very common for people under those circumstances to experience 474 00:26:44,520 --> 00:26:47,760 Speaker 1: a lot of visual hallucinations. Does that have any relationship 475 00:26:47,840 --> 00:26:49,840 Speaker 1: to what you're talking about here?
It does, thank you 476 00:26:49,880 --> 00:26:52,439 Speaker 1: for asking, that's perfect, because this is all part of 477 00:26:52,480 --> 00:26:56,760 Speaker 1: the defensive activation theory, which is to say, if a 478 00:26:56,840 --> 00:27:00,600 Speaker 1: system is used to having data coming in and suddenly 479 00:27:00,960 --> 00:27:04,399 Speaker 1: it's not getting that data anymore, it fights back from 480 00:27:04,440 --> 00:27:07,200 Speaker 1: the inside. It starts producing that data itself. So one 481 00:27:07,240 --> 00:27:10,880 Speaker 1: example of this is, let's say, blindfolding. Or you also 482 00:27:10,960 --> 00:27:14,480 Speaker 1: see this, for example, you know, when people get 483 00:27:14,480 --> 00:27:18,160 Speaker 1: thrown in solitary confinement in the dark, they start having hallucinations, 484 00:27:18,200 --> 00:27:21,480 Speaker 1: both auditory and visual, because they're not getting that data 485 00:27:21,520 --> 00:27:23,359 Speaker 1: and they're used to it, they're supposed to get it. 486 00:27:23,520 --> 00:27:27,120 Speaker 1: There's also something called Bonnet syndrome, Charles Bonnet syndrome, which 487 00:27:27,160 --> 00:27:30,200 Speaker 1: is people start losing their vision, but they don't realize 488 00:27:30,200 --> 00:27:33,240 Speaker 1: that they're losing their vision, because they start having hallucinations 489 00:27:33,240 --> 00:27:37,639 Speaker 1: that essentially fill in for them. This is all the 490 00:27:37,680 --> 00:27:40,160 Speaker 1: same issue, though, which is that the brain is used 491 00:27:40,160 --> 00:27:43,120 Speaker 1: to getting certain inputs, suddenly it's not getting them anymore, 492 00:27:43,160 --> 00:27:47,040 Speaker 1: and so it starts generating them itself. One more example is tinnitus, 493 00:27:47,080 --> 00:27:50,000 Speaker 1: which is ringing in the ears.
This typically comes about 494 00:27:50,080 --> 00:27:54,680 Speaker 1: because somebody loses hearing in some frequency or some band 495 00:27:54,680 --> 00:27:57,200 Speaker 1: of frequencies, and the brain says, wait a minute, I'm 496 00:27:57,200 --> 00:28:00,800 Speaker 1: not hearing anything at twelve thousand hertz anymore, so 497 00:28:00,840 --> 00:28:03,560 Speaker 1: I'm going to start making it myself, and it starts 498 00:28:03,680 --> 00:28:07,480 Speaker 1: making this sound by itself. So this all falls under 499 00:28:07,520 --> 00:28:11,240 Speaker 1: the defensive activation theory. One of the interesting things I 500 00:28:11,280 --> 00:28:16,480 Speaker 1: recall about the studies on prolonged blindfolding was that the 501 00:28:16,520 --> 00:28:20,879 Speaker 1: hallucinations that were reported were not entirely random. So it 502 00:28:20,920 --> 00:28:23,520 Speaker 1: wasn't just, you know, people seeing strange scenes play out 503 00:28:23,520 --> 00:28:27,360 Speaker 1: in front of them; they would often hallucinate stuff 504 00:28:27,400 --> 00:28:30,439 Speaker 1: that you would expect to see in that place in 505 00:28:30,440 --> 00:28:33,600 Speaker 1: the room based on other senses. So, like, if they 506 00:28:33,680 --> 00:28:35,800 Speaker 1: heard someone come to the door of the room, they 507 00:28:35,840 --> 00:28:39,960 Speaker 1: would hallucinate the image of that person in the door. Yeah, 508 00:28:40,000 --> 00:28:43,680 Speaker 1: that's perfect. And by the way, I think this also 509 00:28:43,920 --> 00:28:47,400 Speaker 1: has a lot to tell us about dream content, because 510 00:28:47,720 --> 00:28:50,240 Speaker 1: the thing about dreams, I love the way you put this, 511 00:28:50,320 --> 00:28:53,000 Speaker 1: because you could, in theory, dream about anything at all.
512 00:28:53,040 --> 00:28:57,360 Speaker 1: You could dream that you are in Cambodia and that 513 00:28:57,440 --> 00:29:01,640 Speaker 1: you are in the fourteen hundreds and you're a magician 514 00:29:01,720 --> 00:29:03,880 Speaker 1: who's doing something. But, you know, you tend to 515 00:29:03,920 --> 00:29:06,240 Speaker 1: dream about, you know, your work and your spouse and 516 00:29:06,320 --> 00:29:08,880 Speaker 1: your drive and whatever, you know, things that are more 517 00:29:08,960 --> 00:29:13,440 Speaker 1: local to you. And it's precisely because when you 518 00:29:14,120 --> 00:29:18,880 Speaker 1: slam random activity into the visual system, the synapses, 519 00:29:18,920 --> 00:29:22,720 Speaker 1: the connections that are essentially hot from the day's work, 520 00:29:23,160 --> 00:29:25,200 Speaker 1: you know, those are the things that tend to 521 00:29:25,240 --> 00:29:29,600 Speaker 1: get activated. And the associations are very loose in a sleeping 522 00:29:29,720 --> 00:29:32,480 Speaker 1: dream state. And so what happens is, you know, things 523 00:29:32,480 --> 00:29:36,800 Speaker 1: can go off on weird tangents, but physics still works 524 00:29:36,800 --> 00:29:39,440 Speaker 1: fine in a dream. You know, rocks don't float upwards 525 00:29:39,440 --> 00:29:43,040 Speaker 1: and stuff like that. And so, you know, essentially 526 00:29:43,120 --> 00:29:47,120 Speaker 1: you're just rebooting things that were there during the day. 527 00:29:47,440 --> 00:29:51,040 Speaker 1: And this is closely related to what you're saying about, 528 00:29:51,560 --> 00:29:53,800 Speaker 1: you know, all the associations that your brain builds up 529 00:29:53,840 --> 00:29:56,160 Speaker 1: over a lifetime. So you hear the voice and you're 530 00:29:56,160 --> 00:30:00,440 Speaker 1: expecting to see that person, and that's exactly what happens.
Actually, 531 00:30:00,440 --> 00:30:02,560 Speaker 1: I just want to mention one other thing about dreams, 532 00:30:03,000 --> 00:30:06,720 Speaker 1: which is people will often ask me, well, what about 533 00:30:06,760 --> 00:30:09,719 Speaker 1: a blind person, how do they dream? And the answer is, 534 00:30:10,360 --> 00:30:13,160 Speaker 1: blind people also dream, because you have this very ancient 535 00:30:13,200 --> 00:30:16,080 Speaker 1: circuitry in your head that's blasting activity into the back 536 00:30:16,120 --> 00:30:18,920 Speaker 1: of the brain, the occipital cortex, which is normally the 537 00:30:19,000 --> 00:30:21,800 Speaker 1: visual cortex. But if you're born blind, it's, 538 00:30:21,840 --> 00:30:24,560 Speaker 1: you know, long since taken over by hearing and touch, and 539 00:30:24,640 --> 00:30:28,240 Speaker 1: so a blind person's dream is all about hearing and touch. 540 00:30:28,320 --> 00:30:31,840 Speaker 1: They don't see anything, but they say, oh, I was, 541 00:30:31,960 --> 00:30:34,000 Speaker 1: you know, moving through the living room and I felt 542 00:30:34,000 --> 00:30:36,240 Speaker 1: the furniture was rearranged, and then there was a big 543 00:30:36,280 --> 00:30:38,600 Speaker 1: dog in the corner and I ran from it and 544 00:30:38,600 --> 00:30:41,080 Speaker 1: I was scared. And so, you know, they've got full, 545 00:30:41,280 --> 00:30:45,040 Speaker 1: rich dream experience. It's just that it is not visual, 546 00:30:45,240 --> 00:30:47,720 Speaker 1: because that part of their brain is no longer visual.
547 00:30:48,200 --> 00:30:50,880 Speaker 1: Well, that makes me wonder, then, if 548 00:30:51,920 --> 00:30:56,040 Speaker 1: your hypothesis about the defensive activation of dreaming is correct, 549 00:30:56,160 --> 00:30:59,840 Speaker 1: does that mean that dreaming is now there to protect 550 00:31:00,120 --> 00:31:03,840 Speaker 1: that part of the brain, what would normally be used for 551 00:31:03,880 --> 00:31:07,120 Speaker 1: visual processing, but in its role in processing auditory 552 00:31:07,160 --> 00:31:11,120 Speaker 1: and other stimuli? A great question. Nope. It's that 553 00:31:11,640 --> 00:31:16,760 Speaker 1: these circuits that underlie dreaming are extremely ancient, and so 554 00:31:16,800 --> 00:31:20,240 Speaker 1: they are assuming that you've got perfectly fine vision. And 555 00:31:20,560 --> 00:31:23,720 Speaker 1: if you don't have vision for some reason, then the 556 00:31:23,840 --> 00:31:26,200 Speaker 1: circuits aren't going to change. They're just doing a 557 00:31:26,240 --> 00:31:30,600 Speaker 1: basic architectural job of saying, hey guys, every ninety minutes, 558 00:31:30,640 --> 00:31:33,360 Speaker 1: just blast some activity into the back of the brain there. 559 00:31:33,640 --> 00:31:35,280 Speaker 1: That's all they're doing. And they don't know if you're 560 00:31:35,280 --> 00:31:39,120 Speaker 1: blind or not. Now, speaking about dreams, I guess it's 561 00:31:39,200 --> 00:31:41,320 Speaker 1: not too much of a leap to start talking 562 00:31:41,320 --> 00:31:44,440 Speaker 1: about consciousness. I was wondering, where do you think 563 00:31:44,440 --> 00:31:47,800 Speaker 1: we are in terms of, I don't know, grilling, testing, 564 00:31:47,880 --> 00:31:53,480 Speaker 1: and even eliminating various theories concerning the nature of human consciousness?
Yeah, boy, 565 00:31:53,680 --> 00:31:57,800 Speaker 1: this still remains, to my mind, the central unsolved mystery 566 00:31:57,840 --> 00:32:00,640 Speaker 1: of neuroscience. What's interesting, by the way, I wrote an article, 567 00:32:01,280 --> 00:32:04,280 Speaker 1: the cover article of Discover magazine, back in something like 568 00:32:04,320 --> 00:32:08,160 Speaker 1: two thousand and six, called Ten Unsolved Questions of Neuroscience. 569 00:32:08,560 --> 00:32:11,000 Speaker 1: And what's fascinating to me is it's now twenty twenty 570 00:32:11,080 --> 00:32:13,960 Speaker 1: three and they are equally as unsolved. I mean, it's 571 00:32:14,080 --> 00:32:17,480 Speaker 1: funny, because we're making so much progress in 572 00:32:17,520 --> 00:32:20,000 Speaker 1: the field in some ways, and yet in other ways 573 00:32:20,440 --> 00:32:25,120 Speaker 1: we're just facing some really tough problems. So consciousness, 574 00:32:25,160 --> 00:32:28,000 Speaker 1: you know, what is consciousness, is really, I think, the 575 00:32:28,040 --> 00:32:32,000 Speaker 1: central one. And, you know, for any listeners who 576 00:32:32,040 --> 00:32:34,440 Speaker 1: are wondering what is the question, the question is how 577 00:32:34,440 --> 00:32:38,280 Speaker 1: do you take eighty six billion cells and stick them 578 00:32:38,280 --> 00:32:40,720 Speaker 1: together and hook them up in such a way that 579 00:32:41,120 --> 00:32:45,400 Speaker 1: you have private, subjective, internal experience? So, you know, the 580 00:32:47,160 --> 00:32:50,520 Speaker 1: smell of apple pie and the taste of feta cheese 581 00:32:50,520 --> 00:32:52,760 Speaker 1: and the pain of pain and the redness of red 582 00:32:52,840 --> 00:32:57,080 Speaker 1: and so on. How does that happen?
Because, you know, 583 00:32:57,120 --> 00:33:01,440 Speaker 1: my laptop computer has lots of signals running around, zeroes 584 00:33:01,440 --> 00:33:05,000 Speaker 1: and ones running around its transistors, but presumably it's not 585 00:33:05,080 --> 00:33:08,880 Speaker 1: experiencing anything. It can play a YouTube video 586 00:33:09,040 --> 00:33:11,560 Speaker 1: for me, but presumably it doesn't find it funny the 587 00:33:11,600 --> 00:33:14,200 Speaker 1: way I do. And so this is really the question 588 00:33:14,200 --> 00:33:19,000 Speaker 1: of consciousness. We don't know the answer to it. I 589 00:33:19,040 --> 00:33:23,480 Speaker 1: can just tell you my general feeling on this, which 590 00:33:23,520 --> 00:33:27,400 Speaker 1: is, when you look at the history of science, what 591 00:33:27,440 --> 00:33:30,080 Speaker 1: you find is that in every era there were big 592 00:33:30,560 --> 00:33:34,560 Speaker 1: pieces of information missing, and yet the scientists were in 593 00:33:34,600 --> 00:33:37,680 Speaker 1: a position of having to try to explain everything without 594 00:33:37,880 --> 00:33:42,400 Speaker 1: knowing some other thing. Here's an example. You know, when 595 00:33:42,440 --> 00:33:45,520 Speaker 1: the pump was invented, people suddenly said, oh, I see, 596 00:33:45,560 --> 00:33:47,840 Speaker 1: the heart is like a pump, and then it was obvious, oh, 597 00:33:47,880 --> 00:33:50,760 Speaker 1: click, it falls into place. But before that, everyone's trying to 598 00:33:50,760 --> 00:33:52,120 Speaker 1: figure out what the heck the heart was doing, but 599 00:33:52,160 --> 00:33:55,160 Speaker 1: no one had the concept of a pump. Or, you know, 600 00:33:55,280 --> 00:33:58,880 Speaker 1: before the magnetosphere of the Earth was discovered, you'd have 601 00:33:58,880 --> 00:34:01,800 Speaker 1: no way to explain the northern lights.
You'd have to 602 00:34:02,360 --> 00:34:04,959 Speaker 1: make up some crazy story about the northern lights and 603 00:34:05,000 --> 00:34:07,680 Speaker 1: so on. Anyway, I feel like we're in that situation 604 00:34:07,760 --> 00:34:12,080 Speaker 1: now with consciousness. There's something right at the edges. We're 605 00:34:12,080 --> 00:34:15,400 Speaker 1: all listening for its whispers. We can sort of feel 606 00:34:15,400 --> 00:34:17,680 Speaker 1: that there's something there, but we don't know exactly what 607 00:34:17,719 --> 00:34:21,120 Speaker 1: it is that we're missing that will allow us to explain 608 00:34:21,160 --> 00:34:23,319 Speaker 1: how you take a bunch of physical stuff and have 609 00:34:23,400 --> 00:34:37,160 Speaker 1: it experience. Now, a topic that's being discussed a lot 610 00:34:37,239 --> 00:34:40,239 Speaker 1: right now is, of course, as always, artificial intelligence, but 611 00:34:40,320 --> 00:34:45,439 Speaker 1: specifically generative artificial intelligence, especially with so many of these 612 00:34:45,480 --> 00:34:48,640 Speaker 1: text and image creative tools that are available just to 613 00:34:48,680 --> 00:34:52,279 Speaker 1: the average person to experiment with and share the results of. 614 00:34:52,520 --> 00:34:57,120 Speaker 1: And I was wondering, what's your take on generative artificial 615 00:34:57,120 --> 00:35:02,359 Speaker 1: intelligence and how it relates or doesn't relate to human creativity? Yeah, so, 616 00:35:02,480 --> 00:35:06,640 Speaker 1: actually, this is my next episode, because I'm fascinated by this. Yeah, 617 00:35:06,680 --> 00:35:10,279 Speaker 1: I'm just so... Okay. So you guys may know I'm 618 00:35:10,280 --> 00:35:13,360 Speaker 1: a neuroscientist, but I'm also a writer, including of fiction.
619 00:35:13,480 --> 00:35:17,759 Speaker 1: And so suddenly, when generative AI started blowing up, 620 00:35:17,840 --> 00:35:20,920 Speaker 1: really at the end of last year, I, of course, 621 00:35:20,960 --> 00:35:23,359 Speaker 1: like many artists, thought, oh my gosh, what does this 622 00:35:23,440 --> 00:35:27,239 Speaker 1: mean for me? What's the future for writers? 623 00:35:27,520 --> 00:35:32,000 Speaker 1: But actually, in my next episode, I make a four 624 00:35:32,040 --> 00:35:35,319 Speaker 1: part argument why I think it'll be an important part 625 00:35:35,360 --> 00:35:39,040 Speaker 1: of the symbiosis between humans and machines that eventually comes about, 626 00:35:39,080 --> 00:35:42,879 Speaker 1: but it's not going to replace writers and artists. There 627 00:35:42,880 --> 00:35:44,399 Speaker 1: are many reasons. You know, one thing is it can 628 00:35:44,440 --> 00:35:46,440 Speaker 1: only do short-form stuff, and it does that 629 00:35:46,800 --> 00:35:48,520 Speaker 1: very nicely. You know, if you want to write 630 00:35:48,560 --> 00:35:51,960 Speaker 1: a little blog post or a little jingle or poem 631 00:35:52,040 --> 00:35:54,239 Speaker 1: or whatever, like, it's great for that. But to 632 00:35:54,280 --> 00:35:57,120 Speaker 1: actually write a novel is a completely different sort of 633 00:35:57,160 --> 00:35:59,879 Speaker 1: thing, because what the author is doing there is planting 634 00:36:00,239 --> 00:36:03,719 Speaker 1: clues and having, let's say, a cliffhanger that doesn't come back 635 00:36:03,760 --> 00:36:07,319 Speaker 1: for two or three chapters, and, you know, there's a 636 00:36:07,400 --> 00:36:10,560 Speaker 1: continuity through time, where the author knows what 637 00:36:10,640 --> 00:36:14,560 Speaker 1: the end of the story is and then writes towards that.
638 00:36:14,960 --> 00:36:16,680 Speaker 1: But AI can't even do things like make 639 00:36:16,760 --> 00:36:18,440 Speaker 1: up a joke, because to make up a joke you 640 00:36:18,480 --> 00:36:20,640 Speaker 1: have to know the punchline first and then construct the 641 00:36:20,719 --> 00:36:23,520 Speaker 1: joke to meet it. But it's doing everything in the 642 00:36:23,600 --> 00:36:27,279 Speaker 1: forward direction. So there are reasons like that. 643 00:36:27,320 --> 00:36:32,000 Speaker 1: I'd also make the argument that we as readers, I think, 644 00:36:32,160 --> 00:36:36,320 Speaker 1: actually really care about the heartbeat behind the page, which 645 00:36:36,360 --> 00:36:39,000 Speaker 1: is to say, if you offered me two books and 646 00:36:39,160 --> 00:36:41,520 Speaker 1: one was written by AI and one was written by 647 00:36:41,840 --> 00:36:45,839 Speaker 1: you, Rob, you know, I would absolutely want 648 00:36:45,880 --> 00:36:49,719 Speaker 1: the one that's written by a real human, because I 649 00:36:49,800 --> 00:36:53,239 Speaker 1: know that you're a human with all the, you know, 650 00:36:53,920 --> 00:36:58,520 Speaker 1: limitations and anxieties and joys and ecstasies of a real human. 651 00:36:58,920 --> 00:37:01,200 Speaker 1: And that's what I care about as a fellow human. 652 00:37:02,160 --> 00:37:05,319 Speaker 1: And, you know, part of my evidence for this is, 653 00:37:05,480 --> 00:37:08,960 Speaker 1: a colleague of mine here in Silicon Valley announced recently 654 00:37:09,040 --> 00:37:11,560 Speaker 1: that he'd written a book that was half by him 655 00:37:11,560 --> 00:37:16,600 Speaker 1: and half by ChatGPT. And I actually read 656 00:37:16,760 --> 00:37:18,760 Speaker 1: most of the book, and it's actually a good book, 657 00:37:18,880 --> 00:37:22,480 Speaker 1: but I was not inspired when I heard that. I 658 00:37:22,560 --> 00:37:26,160 Speaker 1: read it for other reasons.
I thought, that sounds terrible, 659 00:37:26,280 --> 00:37:28,319 Speaker 1: and I was trying to figure out, why did 660 00:37:28,320 --> 00:37:30,640 Speaker 1: I feel that way? Why did I feel that it 661 00:37:30,680 --> 00:37:32,759 Speaker 1: was uninteresting to me? And it has to do with 662 00:37:32,840 --> 00:37:38,080 Speaker 1: this heartbeat behind the page that matters. Here's 663 00:37:38,120 --> 00:37:41,560 Speaker 1: the analogy that I'm thinking about nowadays: 664 00:37:41,800 --> 00:37:48,600 Speaker 1: when cameras first came on the scene, visual painters all 665 00:37:49,080 --> 00:37:52,680 Speaker 1: panicked and thought, we're done for, because why would anyone 666 00:37:52,719 --> 00:37:55,600 Speaker 1: want me to sit here and paint something for weeks 667 00:37:55,600 --> 00:37:58,359 Speaker 1: and weeks when you can just get a perfect representation 668 00:37:58,440 --> 00:38:00,800 Speaker 1: of it in a fraction of a second with a camera? 669 00:38:01,400 --> 00:38:04,320 Speaker 1: And the answer is, cameras did not kill visual painting. 670 00:38:04,320 --> 00:38:07,719 Speaker 1: They just ended up filling a different neighboring niche and 671 00:38:07,880 --> 00:38:10,720 Speaker 1: became their own art form. But visual painting still exists 672 00:38:10,719 --> 00:38:14,200 Speaker 1: because you can do other things with it. And, 673 00:38:14,360 --> 00:38:17,880 Speaker 1: you know, at least right at the moment, all these text 674 00:38:17,920 --> 00:38:22,160 Speaker 1: generation programs are extraordinarily boring in what they come up 675 00:38:22,200 --> 00:38:26,240 Speaker 1: with, because they get pushed through reinforcement learning with humans, 676 00:38:26,600 --> 00:38:29,000 Speaker 1: so that humans say, oh, don't say that, don't say 677 00:38:29,040 --> 00:38:31,239 Speaker 1: that, that might offend someone, and so on, which is fine.
678 00:38:31,280 --> 00:38:33,200 Speaker 1: I mean, I'm not opposed to that. But the thing 679 00:38:33,280 --> 00:38:36,759 Speaker 1: is that good literature is stuff that really challenges us. 680 00:38:36,880 --> 00:38:38,839 Speaker 1: Any good piece of literature that you find is something 681 00:38:38,880 --> 00:38:42,120 Speaker 1: that's full of stuff where you think, oh, yikes, that 682 00:38:42,200 --> 00:38:45,759 Speaker 1: sounds like a terrible thing that just happened. And none 683 00:38:45,800 --> 00:38:48,480 Speaker 1: of these large language models are even willing to go 684 00:38:48,680 --> 00:38:50,600 Speaker 1: near that or touch that. So I think they're going 685 00:38:50,640 --> 00:38:53,360 Speaker 1: to be quite a distance from real literature for the 686 00:38:53,400 --> 00:38:57,280 Speaker 1: foreseeable future. I would tend to think also, with literary craft, 687 00:38:57,360 --> 00:39:00,840 Speaker 1: a lot of what we really like about literary style 688 00:39:01,200 --> 00:39:06,120 Speaker 1: is being surprised, you know, 689 00:39:06,200 --> 00:39:09,239 Speaker 1: surprised by, like, a strange word choice or a strange 690 00:39:09,680 --> 00:39:13,280 Speaker 1: metaphor or something; those are the things that feel really good. 691 00:39:13,320 --> 00:39:17,960 Speaker 1: But I wonder, can a generative AI tell the difference between a 692 00:39:19,239 --> 00:39:22,000 Speaker 1: comparison or a word choice that is strange in a 693 00:39:22,080 --> 00:39:25,720 Speaker 1: pleasing and exciting way versus one that will be essentially 694 00:39:25,760 --> 00:39:29,040 Speaker 1: interpreted as a hallucination or an error by the AI? 695 00:39:29,440 --> 00:39:33,960 Speaker 1: Oh, that's interesting. I would say, I mean, to say something 696 00:39:33,960 --> 00:39:37,040 Speaker 1: positive about these AIs.
I think probably it would be 697 00:39:37,080 --> 00:39:38,800 Speaker 1: able to do that, because remember, all it's doing is 698 00:39:38,800 --> 00:39:42,160 Speaker 1: playing a statistical game of saying, okay, what's the most probable 699 00:39:42,200 --> 00:39:45,319 Speaker 1: thing to come next, and you can turn up the 700 00:39:45,360 --> 00:39:47,600 Speaker 1: temperature on it so that it does things that are 701 00:39:47,920 --> 00:39:54,000 Speaker 1: increasingly less probable but somehow make sense. I had not 702 00:39:54,040 --> 00:39:56,160 Speaker 1: thought about that, but I think these things might be 703 00:39:56,280 --> 00:40:01,560 Speaker 1: great at making really good metaphors that are surprising, because 704 00:40:01,719 --> 00:40:03,200 Speaker 1: one of the things that authors have to deal with 705 00:40:03,239 --> 00:40:06,800 Speaker 1: all the time is that they recycle metaphors, and it's 706 00:40:07,239 --> 00:40:10,000 Speaker 1: totally, you know, soporific to the reader. 707 00:40:10,040 --> 00:40:13,880 Speaker 1: It puts them to sleep. But a good author. I 708 00:40:13,920 --> 00:40:17,680 Speaker 1: was just reading the other day, it was Frank Herbert, 709 00:40:17,880 --> 00:40:22,720 Speaker 1: who in Dune said something about the waves throwing 710 00:40:22,880 --> 00:40:26,200 Speaker 1: white robes over the rocks. That's how he was describing 711 00:40:26,239 --> 00:40:29,120 Speaker 1: the foam hitting the rocks, which is beautiful, because it 712 00:40:29,120 --> 00:40:30,799 Speaker 1: wakes you up in that moment and you think, oh, what 713 00:40:30,880 --> 00:40:33,759 Speaker 1: a nice way of describing that.
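[Editor's note: the "temperature" mentioned here is a real sampling parameter in language models. A minimal Python sketch of the idea follows; it is purely illustrative, a toy over a tiny list of scores, and not any particular model's implementation.]

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    """Pick an index from `logits` after softmax-with-temperature scaling.

    Low temperature sharpens the distribution toward the most probable
    choice; high temperature flattens it, so less probable choices get
    picked more often -- the knob described in the conversation above.
    """
    scaled = [score / temperature for score in logits]
    peak = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1                       # guard against float rounding
```

With scores like `[2.0, 1.0, 0.1]`, a very low temperature (say 0.05) returns index 0 essentially every time, while a high temperature (say 5.0) spreads the draws across all three options.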
But if 714 00:40:33,760 --> 00:40:36,640 Speaker 1: one of these large language models simply says, hey, I 715 00:40:36,680 --> 00:40:38,680 Speaker 1: want to make not the most probable thing 716 00:40:38,880 --> 00:40:42,399 Speaker 1: but something less probable and less probable, I'll bet it could 717 00:40:42,400 --> 00:40:44,200 Speaker 1: come up with really good stuff like that that no 718 00:40:44,400 --> 00:40:47,440 Speaker 1: human author has yet tried. Let me just give one example, 719 00:40:47,480 --> 00:40:53,520 Speaker 1: which is when AlphaGo beat Lee Sedol, the Go 720 00:40:53,880 --> 00:40:57,800 Speaker 1: champion, back in, I think, twenty sixteen. You know, 721 00:40:58,000 --> 00:41:00,040 Speaker 1: here's the best human in the world playing the 722 00:41:00,080 --> 00:41:04,200 Speaker 1: game of Go, and the AI program beats him, and 723 00:41:04,840 --> 00:41:07,239 Speaker 1: everybody sort of watched that and thought, wow, that's 724 00:41:07,280 --> 00:41:10,080 Speaker 1: the end of that. But the most interesting part of 725 00:41:10,080 --> 00:41:13,600 Speaker 1: the story was what happened next, which is Lee Sedol 726 00:41:13,880 --> 00:41:22,920 Speaker 1: ended up then playing against his human opponents and took 727 00:41:22,920 --> 00:41:25,920 Speaker 1: on different sorts of moves that he had seen AlphaGo 728 00:41:25,960 --> 00:41:29,520 Speaker 1: play that no human had played before. Like, it 729 00:41:29,600 --> 00:41:31,960 Speaker 1: was just doing these weird things that were totally in 730 00:41:32,000 --> 00:41:34,600 Speaker 1: the rules, they were legal, but no one had thought 731 00:41:34,600 --> 00:41:36,879 Speaker 1: of doing it before. So now he started doing this 732 00:41:37,239 --> 00:41:41,800 Speaker 1: and started beating his human opponents at a much higher rate.
733 00:41:42,440 --> 00:41:45,080 Speaker 1: So the point is we can learn from AI, and 734 00:41:45,120 --> 00:41:48,160 Speaker 1: I think there's going to be this really interesting collaboration 735 00:41:48,239 --> 00:41:51,719 Speaker 1: that happens into the future where we see new things happening, 736 00:41:52,000 --> 00:41:54,480 Speaker 1: and in this case, new metaphors that come out in 737 00:41:54,560 --> 00:41:56,439 Speaker 1: literature, and we think, wow, I would have never thought 738 00:41:56,480 --> 00:42:00,840 Speaker 1: of that one, and then we can use them. Another 739 00:42:00,960 --> 00:42:04,920 Speaker 1: element of literary style, though, that I think about with generative 740 00:42:04,960 --> 00:42:10,280 Speaker 1: AI is the role of insight in writing, and it 741 00:42:10,320 --> 00:42:14,000 Speaker 1: makes me wonder what insight actually is. This is obviously 742 00:42:14,200 --> 00:42:16,120 Speaker 1: something we prize, you know, when we read a novel 743 00:42:16,160 --> 00:42:18,840 Speaker 1: that we really like and we say that it is true. 744 00:42:19,000 --> 00:42:21,800 Speaker 1: You know, there's something true in it. Obviously the story 745 00:42:21,960 --> 00:42:25,680 Speaker 1: is literally fictional, it didn't happen, but it observes something 746 00:42:25,719 --> 00:42:29,439 Speaker 1: about life that we perceive as, like, deeply correct. And 747 00:42:29,800 --> 00:42:32,880 Speaker 1: I don't know, I would have an intuition that 748 00:42:32,920 --> 00:42:36,120 Speaker 1: says I would come across insights like that, or things 749 00:42:36,160 --> 00:42:39,719 Speaker 1: that feel like insights like that, less in 750 00:42:39,760 --> 00:42:43,160 Speaker 1: something generated by AI. I can't prove that, but it 751 00:42:43,480 --> 00:42:47,239 Speaker 1: does raise this question of what insight is.
Yeah, you know, 752 00:42:47,320 --> 00:42:50,600 Speaker 1: I tell you, I think I'm siding with the AI 753 00:42:50,760 --> 00:42:54,640 Speaker 1: on this one, because what these large language models are, 754 00:42:54,840 --> 00:42:58,680 Speaker 1: essentially, is every human all put together. So whatever insights 755 00:42:58,680 --> 00:43:01,840 Speaker 1: people have had, obviously this is all available to the 756 00:43:01,920 --> 00:43:04,720 Speaker 1: language model, and so there's no reason that it can't 757 00:43:05,000 --> 00:43:08,440 Speaker 1: put something together that's very insightful. And it's not that 758 00:43:08,560 --> 00:43:11,600 Speaker 1: it's having the insight, it's that it gets to say, okay, well, 759 00:43:11,640 --> 00:43:15,479 Speaker 1: here's a billion people who have written stuff down, and 760 00:43:15,600 --> 00:43:18,200 Speaker 1: I've noticed that, you know, a number of these, maybe 761 00:43:18,239 --> 00:43:20,360 Speaker 1: two hundred of these people, have all said the same 762 00:43:20,400 --> 00:43:23,400 Speaker 1: thing over here. And maybe Joe's never read that sentence, 763 00:43:23,600 --> 00:43:27,000 Speaker 1: you know, that paragraph, but there's something 764 00:43:27,040 --> 00:43:28,840 Speaker 1: going on over here, and it puts it together, and 765 00:43:28,840 --> 00:43:31,200 Speaker 1: then you say, oh my gosh, that was really insightful, 766 00:43:31,520 --> 00:43:33,920 Speaker 1: because it's not a machine telling you the story. It's 767 00:43:33,960 --> 00:43:37,200 Speaker 1: a billion people telling you the story.
Now, Joe and 768 00:43:37,239 --> 00:43:40,879 Speaker 1: I are actually currently doing some episodes of our own 769 00:43:40,880 --> 00:43:44,719 Speaker 1: podcast on the subject of infantile amnesia, you know, 770 00:43:44,840 --> 00:43:48,360 Speaker 1: why we don't remember our earliest childhood or, you know, 771 00:43:48,400 --> 00:43:53,000 Speaker 1: infancy and birth and so forth. And we've 772 00:43:53,000 --> 00:43:55,760 Speaker 1: heard from some listeners, as we inevitably knew we would, 773 00:43:56,080 --> 00:43:58,520 Speaker 1: who say that they do remember their births or they 774 00:43:58,520 --> 00:44:02,560 Speaker 1: do remember very early childhood. And we were just wondering 775 00:44:02,560 --> 00:44:06,440 Speaker 1: what your take is on people who have that 776 00:44:06,480 --> 00:44:09,480 Speaker 1: experience or seem to have that memory, what may be 777 00:44:09,600 --> 00:44:13,680 Speaker 1: going on there. Yeah, I mean, here's what we think 778 00:44:14,280 --> 00:44:16,480 Speaker 1: in neuroscience generally, which is that, you know, memory is 779 00:44:17,080 --> 00:44:20,480 Speaker 1: something that unpacks slowly with time. It's a cognitive 780 00:44:20,520 --> 00:44:24,319 Speaker 1: development in some sense. And you know, as you guys know, 781 00:44:24,480 --> 00:44:27,160 Speaker 1: it's at about three years old for girls and three and 782 00:44:27,160 --> 00:44:29,160 Speaker 1: a half years old for boys that they start laying 783 00:44:29,200 --> 00:44:32,200 Speaker 1: down their first memories. Here's the interesting thing. Memory is 784 00:44:32,280 --> 00:44:36,080 Speaker 1: a myth making machine, and we're constantly reinventing our past.
785 00:44:36,800 --> 00:44:41,640 Speaker 1: And so one of the difficult things to assess when 786 00:44:41,680 --> 00:44:45,279 Speaker 1: someone says, hey, I remember, whatever, being born or being 787 00:44:45,280 --> 00:44:48,319 Speaker 1: one year old, is it's really difficult to know the 788 00:44:48,360 --> 00:44:51,960 Speaker 1: degree to which they think that's true but it's not 789 00:44:52,040 --> 00:44:54,799 Speaker 1: true, because we're all told stories by our parents of, oh, 790 00:44:54,880 --> 00:44:56,799 Speaker 1: when you were an infant you did this hilarious thing 791 00:44:56,880 --> 00:44:58,719 Speaker 1: and blah blah blah, and you hear the story once 792 00:44:58,800 --> 00:45:03,719 Speaker 1: or twice and eventually it becomes a false memory. So I 793 00:45:03,760 --> 00:45:06,839 Speaker 1: think it's very difficult to be able to 794 00:45:06,840 --> 00:45:10,040 Speaker 1: tell this sort of thing. And of course, for someone 795 00:45:10,080 --> 00:45:14,719 Speaker 1: who has a memory, it's very difficult to tell them, hey, 796 00:45:14,880 --> 00:45:17,440 Speaker 1: that might be false and you just think you remember 797 00:45:17,480 --> 00:45:21,960 Speaker 1: that; that makes people angry. But you know, the truth 798 00:45:22,040 --> 00:45:24,160 Speaker 1: is, this kind of stuff comes up all the 799 00:45:24,239 --> 00:45:28,200 Speaker 1: time in courts of law, in the realm of eyewitness testimony, 800 00:45:28,280 --> 00:45:32,360 Speaker 1: because people think that their memories are like a video 801 00:45:32,400 --> 00:45:38,640 Speaker 1: recorder, and they're simply not. There's a giant psychology literature 802 00:45:38,640 --> 00:45:41,840 Speaker 1: on this showing all kinds of ways that false 803 00:45:41,880 --> 00:45:45,480 Speaker 1: memories get introduced, and so on.
You know, 804 00:45:45,520 --> 00:45:48,080 Speaker 1: a colleague of mine did a really great study right 805 00:45:48,280 --> 00:45:52,080 Speaker 1: after September eleventh, two thousand and one. She was in 806 00:45:52,080 --> 00:45:55,840 Speaker 1: New York, and she went and interviewed a bunch of 807 00:45:55,840 --> 00:45:59,560 Speaker 1: people in downtown and Midtown New York about what they 808 00:45:59,600 --> 00:46:03,439 Speaker 1: had seen on September eleventh, and then she was clever 809 00:46:03,560 --> 00:46:06,479 Speaker 1: enough to also ask them to describe a memory from 810 00:46:06,520 --> 00:46:09,640 Speaker 1: September tenth, the day before, like, I had lunch here 811 00:46:09,680 --> 00:46:11,239 Speaker 1: and I did this and I did that. And then 812 00:46:11,280 --> 00:46:13,040 Speaker 1: she went and tracked all these people down a year 813 00:46:13,160 --> 00:46:17,320 Speaker 1: later and asked them to tell their memories again about 814 00:46:17,320 --> 00:46:20,799 Speaker 1: September eleventh and September tenth of the year before, and it 815 00:46:20,800 --> 00:46:23,879 Speaker 1: turns out that in both cases the memories drifted. So 816 00:46:24,000 --> 00:46:26,080 Speaker 1: this comes back to the beginning of our conversation. Even 817 00:46:26,120 --> 00:46:29,160 Speaker 1: amygdala memories, even the scariest memories that you have, it 818 00:46:29,160 --> 00:46:33,440 Speaker 1: doesn't mean they're accurate. And so, I mean, it especially 819 00:46:33,480 --> 00:46:36,160 Speaker 1: doesn't surprise me about September eleventh, because the 820 00:46:36,200 --> 00:46:38,880 Speaker 1: more you tell a story, the more you start laying 821 00:46:38,880 --> 00:46:42,400 Speaker 1: down these ruts in the road, and that becomes the story, 822 00:46:42,400 --> 00:46:45,560 Speaker 1: that becomes the truth.
And you know, we've all run 823 00:46:45,600 --> 00:46:48,360 Speaker 1: into these things in our life where someone suddenly shows 824 00:46:48,400 --> 00:46:50,680 Speaker 1: us a photograph or something and it's like, that's not my memory. 825 00:46:50,719 --> 00:46:52,440 Speaker 1: Look, here's the thing here, and you go, oh gosh, 826 00:46:52,480 --> 00:46:56,040 Speaker 1: I had actually misremembered that thing that happened, or 827 00:46:56,040 --> 00:46:58,960 Speaker 1: where I was standing or what I was doing, anyway. 828 00:46:59,000 --> 00:47:01,839 Speaker 1: So this is the concern when people say, oh, 829 00:47:01,960 --> 00:47:05,120 Speaker 1: I remember, whatever, being born or this event when I 830 00:47:05,160 --> 00:47:07,920 Speaker 1: was really young: it's that we know how easy it 831 00:47:08,000 --> 00:47:12,200 Speaker 1: is to believe memories that are not true. Now, correct 832 00:47:12,200 --> 00:47:14,280 Speaker 1: me if I'm wrong, but I think I recall reading 833 00:47:14,360 --> 00:47:17,399 Speaker 1: that in some of these cases, like where 834 00:47:17,400 --> 00:47:21,520 Speaker 1: there's a big public event, people 835 00:47:21,520 --> 00:47:23,920 Speaker 1: are asked to write down their experiences that day, 836 00:47:23,920 --> 00:47:27,239 Speaker 1: and then the researchers contact them again later and have 837 00:47:27,360 --> 00:47:29,520 Speaker 1: them try it again: so what do you remember about 838 00:47:29,560 --> 00:47:31,719 Speaker 1: that day? Not only do they often get the details wrong, 839 00:47:31,760 --> 00:47:34,440 Speaker 1: but don't they often insist that the way they remember 840 00:47:34,480 --> 00:47:37,000 Speaker 1: it now is correct and what they wrote at the 841 00:47:37,040 --> 00:47:41,919 Speaker 1: time was wrong? Exactly, that's exactly right.
Yeah, because it's 842 00:47:41,960 --> 00:47:45,360 Speaker 1: so hard to disbelieve our own memories about things. You know, 843 00:47:45,440 --> 00:47:47,640 Speaker 1: this is obviously at the heart of lots of spousal 844 00:47:47,800 --> 00:47:50,759 Speaker 1: arguments too. You know, you have two brains, you have 845 00:47:50,800 --> 00:47:56,000 Speaker 1: two different ways of remembering what precisely happened. Yes, that's 846 00:47:56,000 --> 00:47:59,200 Speaker 1: exactly right. All right, well, you already mentioned that you 847 00:47:59,239 --> 00:48:03,280 Speaker 1: have the episode coming up about AI creativity. I'm definitely 848 00:48:03,320 --> 00:48:05,759 Speaker 1: excited to check that one out. Is there anything else 849 00:48:05,800 --> 00:48:08,799 Speaker 1: you want to tease out for listeners, what else they 850 00:48:08,800 --> 00:48:13,080 Speaker 1: can expect from future episodes of Inner Cosmos? Yeah, well, okay. 851 00:48:13,080 --> 00:48:14,799 Speaker 1: So my next one after that is going to be 852 00:48:15,120 --> 00:48:18,399 Speaker 1: "is AI sentient?", because, you know, this is a big 853 00:48:18,520 --> 00:48:21,360 Speaker 1: question now as these models get larger and larger and 854 00:48:21,400 --> 00:48:24,279 Speaker 1: things are moving at an extraordinary pace. Now, what does 855 00:48:24,320 --> 00:48:26,480 Speaker 1: sentience mean? And this is related to the question you 856 00:48:26,520 --> 00:48:29,520 Speaker 1: asked me, Rob, about consciousness and so on. So I 857 00:48:29,560 --> 00:48:33,520 Speaker 1: think this actually gives us an interesting tool into studying 858 00:48:33,560 --> 00:48:36,279 Speaker 1: consciousness that we haven't had before.
But I have other 859 00:48:36,280 --> 00:48:42,040 Speaker 1: episodes. My one after that is about counterfeiting money 860 00:48:42,080 --> 00:48:45,360 Speaker 1: and what it is that we notice about counterfeits and 861 00:48:45,480 --> 00:48:50,319 Speaker 1: what we do not. I have an episode on: will you 862 00:48:50,760 --> 00:48:54,520 Speaker 1: perceive the event that kills you? This 863 00:48:54,600 --> 00:48:56,400 Speaker 1: is just a topic I've been thinking about for a 864 00:48:56,440 --> 00:48:58,279 Speaker 1: long time and have put together a lot of work 865 00:48:58,280 --> 00:49:03,000 Speaker 1: on, about, you know, if suddenly something, let's say 866 00:49:03,040 --> 00:49:06,560 Speaker 1: a brick from the pedestrian bridge over the highway, 867 00:49:06,920 --> 00:49:09,000 Speaker 1: fell on your head when you were in a convertible. 868 00:49:09,200 --> 00:49:12,960 Speaker 1: The question is, would you perceive dying, or would you 869 00:49:13,040 --> 00:49:15,799 Speaker 1: be dead before you knew anything happened? And what does 870 00:49:15,840 --> 00:49:17,440 Speaker 1: that look like? Does it look like, you know, suddenly 871 00:49:17,480 --> 00:49:21,160 Speaker 1: the footage just ends, but there's no pain, stuff like that? 872 00:49:22,239 --> 00:49:25,319 Speaker 1: So I have lots and lots of episodes. Can we 873 00:49:25,360 --> 00:49:28,600 Speaker 1: create new senses for humans? Which is a big part 874 00:49:28,640 --> 00:49:30,280 Speaker 1: of what I've been doing over the last eight years 875 00:49:30,280 --> 00:49:35,120 Speaker 1: with a company that I run called Neosensory.
Yeah, and 876 00:49:35,680 --> 00:49:38,600 Speaker 1: I have forty six episodes this year, all of which 877 00:49:38,600 --> 00:49:43,120 Speaker 1: I've outlined, and then it's just a matter of spending 878 00:49:43,120 --> 00:49:45,560 Speaker 1: the twelve hours per week writing the hour long 879 00:49:45,600 --> 00:49:49,839 Speaker 1: monologue. Awesome. It sounds exciting. I'm excited to check out 880 00:49:50,120 --> 00:49:54,440 Speaker 1: more episodes. Great. Thank you guys so much for having me. 881 00:49:54,960 --> 00:49:57,000 Speaker 1: It's been a pleasure to see you all. Yeah, thanks for 882 00:49:57,000 --> 00:50:01,160 Speaker 1: coming on the show. All right, well, that was our 883 00:50:01,200 --> 00:50:05,120 Speaker 1: conversation with David Eagleman. Once again, much appreciation to David 884 00:50:05,160 --> 00:50:07,520 Speaker 1: for taking the time to chat with us today. If 885 00:50:07,600 --> 00:50:09,680 Speaker 1: you want to check out his new show, and we 886 00:50:09,719 --> 00:50:12,799 Speaker 1: do recommend it, once again, it is called Inner Cosmos 887 00:50:12,840 --> 00:50:15,480 Speaker 1: with David Eagleman. You can find it on the iHeart 888 00:50:15,480 --> 00:50:18,960 Speaker 1: app or wherever you get your podcasts. Just a reminder 889 00:50:19,000 --> 00:50:22,000 Speaker 1: that Stuff to Blow Your Mind is a science podcast 890 00:50:22,080 --> 00:50:25,120 Speaker 1: with core episodes on Tuesdays and Thursdays in the Stuff 891 00:50:25,160 --> 00:50:28,279 Speaker 1: to Blow Your Mind podcast feed. On Mondays we do 892 00:50:28,360 --> 00:50:31,320 Speaker 1: listener mail episodes. On Wednesdays we do a short form Artifact 893 00:50:31,400 --> 00:50:33,520 Speaker 1: or Monster Fact episode, and on Fridays we set aside 894 00:50:33,520 --> 00:50:35,480 Speaker 1: most serious concerns to just talk about a weird film 895 00:50:35,520 --> 00:50:39,120 Speaker 1: on Weird House Cinema.
Huge thanks to our audio producer 896 00:50:39,200 --> 00:50:41,399 Speaker 1: JJ Posway. If you would like to get in touch 897 00:50:41,440 --> 00:50:43,680 Speaker 1: with us with feedback on this episode or any other, 898 00:50:43,719 --> 00:50:45,840 Speaker 1: to suggest a topic for the future, or just to 899 00:50:45,880 --> 00:50:48,960 Speaker 1: say hello, you can email us at contact at Stuff 900 00:50:49,000 --> 00:50:58,799 Speaker 1: to Blow Your Mind dot com. Stuff to Blow Your Mind 901 00:50:58,840 --> 00:51:01,680 Speaker 1: is a production of iHeartRadio. For more podcasts from 902 00:51:01,760 --> 00:51:05,160 Speaker 1: iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever 903 00:51:05,200 --> 00:51:10,480 Speaker 1: you listen to your favorite shows.