Speaker 1: The analogy that I like to use for this is: imagine you are at a club, and there's lots and lots of noise going on, lots of things going on, lots of people talking to each other, there's loud music and so on, and you're trying to communicate something to somebody who's dancing alongside you, and it's very, very difficult because of all of that noise. But now you get real close to them and you kind of, you know, hold your hand to their ears and you start to talk directly into their ears. Now they're going to receive that communication with much higher fidelity, be able to tune out noise and selectively attend to that communication. You've enhanced the communication between that person and the person they're talking to. That's what happens at synapses. There's lots and lots of synaptic firing, lots and lots of communication happening. But when cells start to attach to each other, they communicate much more preferentially. They can transmit signals that express that form of learning. In other words, if there's an experience that happens that is learned by the brain, the brain can express a form of plasticity or a form of memory in the strength of the connections. So if the connections grow stronger, that's a signal that this memory has been learned. And most of the information that we have about this comes from animal models, comes from slice recordings, where we can see evidence for enhancements in the connectivity, enhancement of the communication between cells as a result of a learning experience.

Speaker 2: You've reminded me why I hate clubs. I'm sorry, no one invites me to them anymore.

Speaker 1: I share that with you.

Speaker 3: Hi, I'm Daniel. I'm a particle physicist, and every day I rely more on my computer's memory instead of my own.

Speaker 2: Yeah, I am Kelly Weinersmith. I'm a biologist, and I'm having a lot more of those moments where you walk into a room and think, why am I in here?
Speaker 3: I walk around campus here at UC Irvine, and a lot of folks go, hey, Professor Whiteson, and I go, hey, and I think, I have no idea who you are and why we know each other. And sometimes it's just because they were in a class I taught with four hundred people, and the relationship is a little asymmetrical. And sometimes it's just because my memory is terrible, and maybe we had coffee and I've forgotten. So I apologize to folks out there who I'm pretending to recognize.

Speaker 2: Yeah, yeah, no, I'm there also. And every once in a while I'll be standing in the grocery store, and I'll stand in front of the aisle for a little too long, and my daughter will go, you forgot, didn't you? Yeah, I have no idea what I'm looking for right now. And hopefully this doesn't give us too much anxiety thinking about what the root cause of our memory problems is.

Speaker 3: I have that as well. Is that a memory issue or is that a distraction issue? Like where you're looking for capers, but then you see a jar of pickles, and it makes you think about that last time you had a pickle, and then you're like, hmm, I wonder if you can pickle at home. And then five minutes later you're dreaming about a whole pickling building in your backyard, and you've forgotten that you were looking for capers.

Speaker 2: So I don't like pickles, so no, that's not my scenario. But sometimes it'll be like, you know, I see my reflection in a bottle, and I'll be like, oh, is that spot skin cancer? When am I gonna die?

Speaker 3: There we go. Yeah, exactly.

Speaker 2: But it is often distraction instead of forgetfulness. But yeah, sometimes it's hard to disentangle those things.

Speaker 3: Well, I just learned something deeply troubling about you: that you don't like pickles.

Speaker 2: What? You know, Daniel, you've made a really great point about your memory, because we have definitely talked about it, and I think even on the show, we've talked about this.
Speaker 3: That's embarrassing, but not as embarrassing as being closed-minded to the wonderful world of pickles. I'm with you, like, the big dill pickle is okay with a sandwich, but I'm not munching on one in general. But have you ever done home pickling? You know, like, you can pickle cauliflower or carrots. It's really wonderful.

Speaker 2: The only pickled item I've ever enjoyed is cowboy candy, which is when you slice up really hot jalapeños and put them in, like, a sugary pickle. Oh, so good. That I love on sandwiches. Other than that, I have not met a pickle that I like.

Speaker 3: I'm sorry. I have to work on that, all right?

Speaker 2: Okay, all right, well, I'll keep my mind open. But today we got a wonderful question from our listener, Simon, who is interested in memory, and so let's go ahead and listen to Simon's question now.

Speaker 4: Hi, Daniel and Kelly. This is Simon from New York. I was a huge fan of Daniel and Jorge Explain the Universe; however, I am loving the new show with Kelly and have never missed an episode. Here's my question, I think mainly for Kelly. I am now into my seventies and, not surprisingly, find myself reflecting a lot on my life and the thousands, perhaps millions, of memories that go into a lifetime. I have often wondered exactly how the human brain, using biological substances, chemicals, and electricity, actually stores memories. I have tried to read about this, but have not found anything to satisfy my curiosity or wonder. Perhaps this blurs too much into the question of what is consciousness and sentience, and it's not something you want to delve into. But if you would like to tackle it, I would love to hear about what insights biology and physics have to offer. I have very recently been reading a lot about computers, trying to understand how they really work, and that seems somewhat related, but only somewhat. Anyway, thanks to both of you for all you do.
My weeks would not be complete without listening to your podcast episodes.

Speaker 2: All right, Simon, we love your accent.

Speaker 3: We do.

Speaker 2: I grew up in New Jersey, and I miss hearing that accent more often, so that was awesome. And thank you for this fantastic question. And we got really lucky. So folks, remember we talked in the past about whether or not tryptophan from turkey actually makes you sleepy on Thanksgiving, and we interviewed Mark Mapstone for that, and he suggested that if we were interested in memory, we should talk to his colleague, and his colleague was willing to come on the show. So on today's show we have doctor Michael Yassa. He's a professor at the University of California, Irvine, where he's also the director of the Center for the Neurobiology of Learning and Memory and is the director of the UCI Brain Initiative. So, like, clearly the perfect person for this topic.

Speaker 3: Yes, amazing. And also it's one more notch on my personal goal to have my entire neighborhood on the podcast. For those of you who don't know, at UC Irvine many of us live in this faculty neighborhood right next to campus, so we're all friends and neighbors and we know each other. And by now we've had a significant fraction of that neighborhood on the podcast. Because I want to know, like, hey, who's an expert on flight? I'm like, oh, I know that guy who's on the next street. And so it's a fantastic resource.

Speaker 2: That is pretty great. So let's go ahead and bring Mike on the show. But we should mention you were off telling people about your amazing research ideas, so you were not able to join us for this interview. So I flew solo with Mike.

Speaker 3: Thanks very much for handling this while I was goofing around in the area.

Speaker 2: My pleasure. I had a blast. All right, welcome to the show, Mike. Thanks for being with us today.

Speaker 1: Thanks for having me.
Speaker 2: So what got you interested in studying memory? Let's start by getting to know you a bit.

Speaker 1: Sure. So, I don't think that I really knew much about memory when I was an undergraduate. I was fascinated by the brain just by virtue of taking a couple of classes that kind of inspired that passion, that love for everything brain related. And one of the things that was really fascinating about it is that I felt, even at the time, and this was in the late nineties, that we knew next to nothing. Unlike other classes that I took, where there was sort of a big body of knowledge, it felt like with the brain there's just so much more that we really didn't understand. So that became really fascinating. As I started to dive a little bit more deeply into different aspects of how the brain functions, memory came about as one that was front and foremost, particularly because we started to see, or I started to see, that memory loss is just very devastating, unlike any other cognitive domain. If, you know, you have an attentional deficit, or if you have a deficit with executive function, that can be somewhat circumscribed. It's a contained kind of deficit. Not memory loss, which is just utterly devastating. You know, seeing patients with Alzheimer's disease, seeing patients with various forms of memory loss, that was really compelling. And I started to understand a bit more that memory is what makes us who we are. It's so fundamental to our core. It's the essence of our consciousness. Everything that we do, we do because of some experience that we've had that we've been able to store, and it just became not just fascinating but, like, entirely all consuming. So I focused on memory from all of its aspects. One is trying to understand its fundamental inner workings, and two, trying to understand how it breaks down in a variety of different conditions. And if we can do that, maybe we can help people. Yeah.
Speaker 2: I've got a neighbor with Alzheimer's, and it's been totally devastating for the two of them. So yeah. So this interview was inspired by a question that we got from a listener, and one of the things that they asked about is: does the concept of memory blur the line with consciousness and sentience? How do you view these concepts?

Speaker 1: Yeah, you know, it's interesting. There's a somewhat related question, which is, you know, you can have a computer have memory. We can talk about memory in a computing platform, in a robot or a robotic application. Certainly when you think about ChatGPT, well, that can hold on to memory for some period of time and use that to guide how it responds to the user and so on. So memory in and of itself may not be the thing that I would say is associated with sentience. I think that it's the way that our memory works, not just as humans but as, sort of, you know, live organisms. It's not like the way that you would do it in a computer. So let me elaborate. Memory stored in a computer is very much one to one. Everything that you see and learn, you're storing with incredibly high fidelity. You try to retrieve it twenty years from now, thirty years from now, it's exactly the same; there's no degradation. But that creates a problem for a memory system, in that it's more difficult to extract generalities. It's more difficult to generate knowledge based on memory. But say you're a human and you're encoding memories all the time. These memories are stored, but we know that there's blurriness of memories, there's forgetting of memories. There's all sorts of things that we tend to think of as memory problems, but in fact one could argue they're not bugs, they're features of the system. Because memory is not intended to be a super high fidelity kind of system.
It's intended to get enough information in so that you can generalize knowledge, so you can learn from experience and be able to guide your future decision making. So a computer's memory is really about storage with high fidelity. You don't want to write a file in Word, store it, and then later on get back an abstract version of it rather than what you actually wrote; you want to have an accurate record of what you actually wrote. But for the brain, what you might want to get later on is that abstract version. It's just enough knowledge to be able to guide your future decision making. And that's the reality of how memory evolved in live organisms. And maybe the thing that makes it very different from non-sentient beings is that it never really evolved to think too much about the past. It evolved almost entirely to think about the future. So the reason why you might store something is because you want to use that knowledge to guide future decision making, to make sure that you do things that are adaptive, that promote your survival. You're not going back to the same poisonous berry bush. You know to run from a bear out in the wild, as opposed to going up and saying hello. Those kinds of things are based on memory equipping us to make better predictions for the future.

Speaker 2: So can we dig in a little bit more to this trade-off? So why can't you remember everything perfectly and generalize when the time is right?

Speaker 1: Yeah. So it turns out that you can mathematically and computationally model this, and it gives you a pretty straightforward answer. And the way that it works is that if you were to encode every single experience that you have in a very high resolution and high fidelity kind of approach, then it becomes very difficult to generalize that to new situations. The representations in the brain become almost hyper-specific. They're very, very specific to those instances in which they were encoded.
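A toy numerical sketch of the trade-off being described, with invented vectors and noise levels: a memory that stores every exemplar with perfect fidelity recognizes only literal repeats, while a lossy memory that blurs its exemplars into a prototype still matches a never-before-seen instance of the same thing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy "experiences" of the same underlying thing
# (say, ten encounters with the same kind of berry bush).
true_pattern = rng.normal(size=8)
experiences = [true_pattern + 0.3 * rng.normal(size=8) for _ in range(10)]

# High-fidelity memory: keep an exact copy of every exemplar.
exact_store = [e.copy() for e in experiences]

# Lossy memory: blur the exemplars together into a single prototype.
prototype = np.mean(experiences, axis=0)

# A novel instance of the same thing, never seen before.
novel = true_pattern + 0.3 * rng.normal(size=8)

# Exact matching fails: the novel instance equals none of the stored copies.
print(any(np.allclose(novel, e) for e in exact_store))         # False

# The prototype, having averaged the noise away, sits close to the novel
# instance and so supports generalization.
print(np.linalg.norm(novel - prototype))                       # small
print(min(np.linalg.norm(novel - e) for e in exact_store))     # typically larger
```

The blur is what does the generalizing: averaging the exemplars cancels their instance-specific noise, which is the fuzziness across instances discussed next.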
Speaker 1: So being able to extract the generalities, or the knowledge that you can apply to new things, requires that there's enough blur, enough fuzziness across the different instances so you can generalize that knowledge. I'll give you an example. If I were to ask you what the capital of the United States is, you'd have a very, very quick answer for me, which is Washington, D.C., exactly. Now, if I ask you, when did you first learn that?

Speaker 2: I don't know.

Speaker 1: So now, if I had asked you the day right after you first learned it, maybe in class or maybe from your parents, you would have had a pretty good memory for it. That's a pretty exciting thing that you just learned. But the reality is you've learned it so many times, over so many different exposures. You've heard it a million times in many different settings. And what's most important is not so much the first time you heard it or the last time you heard it, but the fact that you extracted that piece of knowledge, that core piece of knowledge, out, and now that's part of your body of knowledge. So the specifics around how and when and where we encoded specific things may not be all that important, from an evolutionary standpoint, to hold on to, whereas the memory for the actual knowledge is important. That's what's going to guide your future action. That's what's important to generalize to new situations. So in some ways there's no evolutionary pressure to hold on to the specifics over time. But it is an interesting point: like, why can't we have both? Well, you can think about it from an energetic standpoint. If you have a limited-resource system, why would you invest your energy into storing the specifics when you know you're not going to use them down the line?

Speaker 2: Yeah, no, fair enough. So you mentioned the analogy of the brain as a computer.
Is there any analogy over history comparing a brain to something else that you think is a helpful analogy? Or is the brain just so different that it doesn't make sense to try to compare it to anything we have experience with?

Speaker 1: So that is a really interesting question. And I have thought about this before, and I keep coming back to the computer as the closest thing. And I always tell people I think it is possible that we will get to a point, perhaps with quantum computing and other types of things, where we might be able to approximate human-brain-like functionality in a computer. To this day, we don't have that. Even with the incredible advent of AI and large language models and all of those things, still there are certain things that human brains can do, and mammalian brains in general can do, that computers just are not capable of. But there is not another device out there that is capable of this level of processing that I can think of to associate with that analogy. And don't get me wrong: the computational kinds of things that have been built are incredible, and the minuscule amount of time that it takes for them to be able to process and provide an answer is just wild beyond the imagination. But there are still some things that brains can do that the computer can't. In particular, it has to do with the ability to extract knowledge, the ability to error correct, the ability to do the kinds of decision making that we do as humans that are very difficult to encapsulate in a computer, the ability to have emotional reactions. Those things are still very difficult to model. While you can tell the computer all you want about what we think the human experience is, we still don't understand that well enough to be able to model it in a computer.

Speaker 2: Yeah, fair enough. All right, so let's pull back to memory a little bit.
So we've talked about how memory differs from sentience. Are there different kinds of memory, and do our brains store different kinds of memory differently?

Speaker 1: Yes. And oftentimes we just say memory as, like, one big blanket umbrella kind of term, but it is important to know that there are different memory types, different memory systems in the brain, and they serve different functions. So let me give you a couple of examples. One type of memory, which I tend to like quite a bit and we study quite often in my research laboratory, is what we call episodic memory: memory for episodes, memory for events that happened to us. And we tend to kind of operationally define it as remembering what happened, where it happened, and when it happened, and whenever you have, kind of, like, the collection or the conjunction of those three, you can label that as an episode, and you have a memory for a particular episode in your life. Typically you think about that as also the root of our autobiographical memory, our memory for autobiographical experiences, things that happen to us. But that's very different from remembering how to tie your shoelaces, or how to ride a bicycle, or how to do something like your tennis swing or your golf swing. You know, those kinds of things are trained in the brain very differently. They involve very different systems. A lot of times they require more trial-and-error kind of learning, and they tend to be a bit less accessible to consciousness. So those we'll call sort of implicit kinds of learning. So if you look at how the brain is organized, pretty much every patch of cortex is capable of some form of memory. Another term that we typically use in neuroscience is plasticity. The idea that the brain is plastic means it's capable of change.
So whenever you have an experience, cells that are responding to that experience are capable of change, and that change typically is thought of as a change in the connections, in the way that cells communicate with each other, but some change that reflects a record of the experience that you had. Now, those changes happen throughout the brain. They can happen in our visual cortex, our visual system, our auditory system. They can happen in the episodic memory systems in the brain, or they can happen in these more implicit memory kinds of systems in the brain that typically support more unconscious function, like knowing how to ride a bicycle, like knowing how to swing the golf club, those kinds of things. Those are stored separately. And we know this to be true because we see patients that have deficits in one type of memory and not another, because they have maybe a focal stroke or some damage or some deficit that impacted one system and not the other. So they struggle with that one type of memory that's affected in that system, but everything else seems to be intact.

Speaker 2: And by type of memory, does that mean, like, category of memories, like the tennis swing and the other sort of muscle memory things? Or, like, I could forget high school if I had a stroke in the right place, right?

Speaker 1: Although that's actually increasingly difficult. So typically, if there is a stroke that is focal, it might affect motor memory. It might affect memory that allows you to kind of move your hands in the right way and be able to support that kind of function. But episodic memory is this really weird thing. Initially, it does depend on key regions of the brain, one of them being the hippocampus. That's a really important region for episodic memory. But over time, memories start to become somewhat independent of the hippocampus. They start to become stored elsewhere. And that's one of the reasons why in Alzheimer's disease, where we know the hippocampus is one of the earliest regions to degenerate,
as that starts to go away, you see a loss of recent memories, things that were recently acquired, maybe weeks, months, or a couple of years before the decline started. But things from long ago, like high school, are preserved. And the reason they're preserved is that they've now been consolidated. That's sort of a technical term for made strong and made resilient to loss. And that's because they're stored sort of in parallel throughout the brain. So just a focal deficit there is likely not going to wipe out those particular memories, but it's much more likely to wipe out, say, your ability to have the right swing, or ride a bicycle, or anything like that. Those are the kinds of things that are much more focally stored.

Speaker 2: Okay. And can we dig a little bit more into exactly how the brain stores memories? Like, how do neurons connect with each other? Yeah, let's dig in.

Speaker 1: Yeah. So brain cells are very, very unique compared to other cells in the body. And, as I always tell my students, this is true ninety-five percent of the time according to our knowledge today; sometimes, you know, several weeks from now, months, years, things can get revised. So, just according to our current knowledge today: brain cells are able to communicate with one another in a way that other cells in the body are not able to. And the way that they communicate with each other is using a combination of electricity and chemistry. So the transmission of signals within a brain cell is entirely electrical, and we can talk about that in a second, but the transmission from one cell to the next, most of the time, is chemical. It involves the release of a neurochemical that goes from one cell, binds to the other, and then initiates another electrical signal from the next cell to the next cell.
So it goes really fast electrical, somewhat slower chemical, really fast electrical, somewhat slower chemical, and so on, and you have this sort of progression of communication between cells. And that's really important, because these cells need to bring in signals that essentially encode the outside world, bring that knowledge into the brain to create some sort of representation of it, and then act on them: allow us to move, allow us to avoid a threat, allow us to seek reward, all of those kinds of things. So that's how the brain typically communicates. But your question is, how does memory happen? And there were sort of lots of answers over the years. We used to think, well, maybe it's encoded in the DNA in cells. Maybe it's encoded in the cell size. Maybe it's encoded in whatever else is happening to change cell shape. And the current answer is that it is much more likely to be encoded in the connections, in the way that these cells communicate with one another. So let's say cell A is firing and cell B is receiving a signal from cell A. If I were to somehow modify the frequency with which cell A communicates with cell B, by making it fire more, or increase the neurotransmitter release, or increase the number of receptors on the second cell that receives that neurotransmitter, I can make it so that the communication between those cells is enhanced. And the analogy that I like to use for this is: imagine you are at a club, and there's lots and lots of noise going on, lots of things going on, lots of people talking to each other, there's loud music and so on, and you're trying to communicate something to somebody who's dancing alongside you, and it's very, very difficult because of all of that noise. But now you get real close to them and you kind of, you know, hold your hand to their ears and you start to talk directly into their ears.
Now they're going to receive that communication with much higher fidelity, be able to tune out noise and selectively attend to that communication. You've enhanced the communication between that person and the person they're talking to. That's what happens at synapses. There's lots and lots of synaptic firing and lots of communication happening. But when cells start to attach to each other, they communicate much more preferentially. They can transmit signals that express that form of learning. In other words, if there's an experience that happens that is learned by the brain, the brain can express a form of plasticity, or a form of memory, in the strength of the connections. So if the connections grow stronger, that's a signal that this memory has been learned. And most of the information that we have about this comes from animal models, comes from slice recordings, where we can see evidence for enhancements in the connectivity, enhancement in the communication between cells as a result of a learning experience.

Speaker 2: You've reminded me why I hate clubs. I'm sorry, no one invites me to them anymore.

Speaker 1: I share that with you.
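A minimal sketch of the strengthening rule just described, often summarized as Hebbian learning ("cells that fire together wire together"); all constants are invented for the demo. The one input whose firing reliably coincides with the downstream cell's ends up with a far stronger synapse than the background chatter, which is the club analogy in code.

```python
import numpy as np

rng = np.random.default_rng(1)

n_inputs = 20                  # cells projecting onto a downstream cell B
w = np.full(n_inputs, 0.1)     # initial synaptic weights, all equal
rate = 0.05                    # invented learning rate

for _ in range(30):            # 30 moments at which cell B fires
    pre = (rng.random(n_inputs) < 0.2).astype(float)  # background chatter
    pre[0] = 1.0               # "cell A" fires every time B does
    post = 1.0                 # cell B's activity at this moment
    # Hebbian update: only synapses whose presynaptic activity coincides
    # with postsynaptic activity are strengthened.
    w = np.clip(w + rate * pre * post, 0.0, 1.0)

print(f"A->B synapse:            {w[0]:.2f}")          # saturates at 1.00
print(f"mean background synapse: {w[1:].mean():.2f}")  # stays much weaker
```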
Speaker 2: Yeah, yeah. Okay. So we've got these connections, they get strengthened, but then it feels like there's another step between having this connection and then having, like, a specific memory. So, you know, we're not at the point, and correct me if I'm wrong about this, where we could, even in mice, see which neurons are firing together and know they're thinking about food. So where do we go from there?

Speaker 1: So what you're talking about actually is a very, very big problem that we deal with in neuroscience and in cognitive science, and it's the credit assignment problem: how do you know that a particular cell is assigned to a particular memory, or a particular connection is assigned to a particular memory? It's a very challenging question, and we don't know the answer yet. But there's been a lot of research on what's called mechanisms of allocation. In other words, how can you allocate particular synapses, brain cell connections, and cells to a particular memory and not another? Right? So how can we get the specificity that we need in the system? And there's been a flurry of work in recent years on understanding that there are certain proteins that are used to, quote, label synapses, label particular cells, and assign them to one memory and not another. It's just really brilliant work by some colleagues in the field that is trying to really get at this specificity question. Now, we're still in early days. We don't have final answers yet, but I think we have some tentative ideas that you can, with specific proteins that are expressed in the synapse, essentially label them, or prime them to be the ones that are modified by this experience and maybe not another.

Speaker 2: Wow, that's pretty cool, right?

Speaker 1: Yeah, because for the longest time we thought, well, how can we ever have any specificity in our brains? If memory just activates cells, how do you know which cells are the ones that are involved here? And this might actually provide us with some answers.

Speaker 2: Wow. I feel like I recently heard about the connectome project, which I think is trying to figure out all of the neurons. So does this suggest that once we have a connectome, next we need to work on the proteome that connects to the connectome to really understand how all of this works? Or how helpful is this connectome project going to be?

Speaker 1: Very helpful. I think that every time we try to map another -ome, it is very helpful.
At some point we'll have an everything-ome, and, you know, with that sort of large-scale data effort, we're going to need also the AI, the machine learning, all of those tools, to parse through it and actually figure out what's going on. But it's interesting, you know, Kelly. When I was coming up as a student, there was always this question that was asked by faculty in my department and many other departments. And it's a theoretical question, and I want to pose that question to you, which is: if we were to map every single neuron in the brain and every single connection in the brain, would we have learned anything about how the brain actually functions? And whenever they asked that question, you know, some of us were tempted to say, well, yeah, of course, you'd have all the data, right? And the answer they wanted us to get to is: no, you're no better off, because you have a ton of data but no way to really test hypotheses and understand function. You have to have the right model. You have to have the right kind of strategy to go into that data and look for what's necessary. But I would argue that over the last twenty, thirty years, that thinking has evolved, and now we can go in completely unsupervised, without having a model, which means also we avoid some of the biases that might come from a model, and ask: what does the data tell us? Sure, there is an explosion of data, but there are patterns that are hidden in there, and if you train up AI enough, it can pick up on those patterns and maybe tell us that there's a new model, there's a different model; the way that we were thinking about the brain in forming hypotheses may not have been right all along. So I go back to things like the connectome projects, and trying to resolve the connectome, the proteome, the epigenome, all of those kinds of things, as different layers of knowledge about the nervous system.
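A hedged sketch of that model-free, "ask what the data tell us" approach, on fabricated data: given a synthetic connectivity matrix with hidden community structure, a generic spectral embedding plus a few k-means steps can surface the groups without any prior hypothesis about what they mean. This is not a real connectome pipeline, just the shape of the idea.

```python
import numpy as np

rng = np.random.default_rng(2)

# Fabricated "connectome": 60 cells in 3 hidden communities; connections are
# dense within a community (p = 0.5) and sparse between them (p = 0.05).
truth = np.repeat([0, 1, 2], 20)
p = np.where(truth[:, None] == truth[None, :], 0.5, 0.05)
adj = np.triu((rng.random((60, 60)) < p).astype(float), 1)
adj = adj + adj.T                                    # undirected graph

# Step 1, model-free: embed each cell by the top eigenvectors of the
# adjacency matrix (spectral embedding; no hypothesis about meaning).
_, vecs = np.linalg.eigh(adj)
emb = vecs[:, -3:]

# Step 2, model-free: a few Lloyd (k-means) iterations to group the cells.
cent = emb[rng.choice(60, size=3, replace=False)]
for _ in range(10):
    assign = np.argmin(((emb[:, None] - cent[None]) ** 2).sum(-1), axis=1)
    for k in range(3):
        if (assign == k).any():
            cent[k] = emb[assign == k].mean(axis=0)

# The recovered clusters should largely line up with the hidden communities.
for k in range(3):
    print(k, np.bincount(assign[truth == k], minlength=3))
```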
Speaker 1: And if we have all of that information, it stands to reason that we should be able to pick up on patterns, and patterns can maybe transform the way that we think about the brain. Maybe we had it wrong all along. Maybe the brain does quantum computing. Maybe there are all sorts of things that we just would have never imagined that are buried in that data.

Speaker 2: It sounds like an exciting time to be in the field.

Speaker 1: Oh, absolutely, absolutely.

Speaker 2: I'm going to pull us back a little bit in our conversation. You were talking about how you can strengthen connections between neurons. Does that happen because you're thinking of the same memory over and over again? And so how do memories get strengthened, and how do we lose some of them?

Speaker 1: Yeah. So the strengthening of memory, or the idea of consolidation, is exactly what it sounds like: it's making memories more solidified or more strengthened. It can happen because we are repeatedly rehearsing the memory, so we're thinking about it over and over. But the good news, Kelly, is that that happens completely incidentally, without you intentionally trying to do it. Your brain constantly brings up old memories and thinks about them, even if you're not consciously aware of it; it happens while you sleep at night. Okay? So the brain is sort of, on the back burner, constantly playing through these memories, replaying through these memories, and even when you go to sleep, it's replaying through these memories. So the strengthening of memories doesn't have to be this intentional thing, which I think is a really powerful thing to tell students. Also, you don't have to sit here and, like, regurgitate and rehearse everything over and over and over. Just go through and understand, and then get a good night's sleep, right? Which is something a lot of them really struggle to do. But during that period, you might think, well, that's just rest.
585 00:29:24,400 --> 00:29:26,960 Speaker 1: Actually the brain can be quite active during that period 586 00:29:27,000 --> 00:29:29,080 Speaker 1: of time and can be playing through these memories and 587 00:29:29,120 --> 00:29:32,760 Speaker 1: storing them and trying to make them more resistant to forgetting. Now, 588 00:29:33,160 --> 00:29:37,000 Speaker 1: one thing to note also is that when you replay memories, 589 00:29:37,040 --> 00:29:40,560 Speaker 1: when you bring back memories, you don't end up with 590 00:29:40,680 --> 00:29:44,720 Speaker 1: the same thing being stored again. So memories are reconstructive 591 00:29:44,720 --> 00:29:47,000 Speaker 1: in nature. If I were to bring back an experience, 592 00:29:47,000 --> 00:29:48,920 Speaker 1: say, that I have from a few weeks ago and 593 00:29:49,040 --> 00:29:51,680 Speaker 1: talk about it with you, what I end up storing 594 00:29:51,760 --> 00:29:55,640 Speaker 1: now and reinforcing is a somewhat altered version of that memory. 595 00:29:55,880 --> 00:29:58,800 Speaker 1: It's not the same thing, because memory is reconstructed by 596 00:29:58,880 --> 00:30:01,880 Speaker 1: piecing together the pieces. Other pieces just get incorporated because 597 00:30:01,880 --> 00:30:04,239 Speaker 1: we're having a conversation about it. And then later on 598 00:30:04,280 --> 00:30:07,160 Speaker 1: what I remember is some amalgamation of all of those 599 00:30:07,200 --> 00:30:09,640 Speaker 1: experiences when I brought it back and changed it ever 600 00:30:09,760 --> 00:30:12,600 Speaker 1: so slightly. My colleague here at UC Irvine, Beth 601 00:30:12,640 --> 00:30:16,000 Speaker 1: Loftus, built her career on studying false memories and how 602 00:30:16,080 --> 00:30:19,280 Speaker 1: they arise, and they are very, very frequent. They arise 603 00:30:19,360 --> 00:30:21,960 Speaker 1: all the time. We generate them more as we get older, 604 00:30:22,160 --> 00:30:24,640 Speaker 1: and they happen just by virtue of our memory being 605 00:30:24,960 --> 00:30:29,920 Speaker 1: a reconstructive system rather than a high fidelity video camera 606 00:30:30,240 --> 00:30:33,520 Speaker 1: or a picture of reality. It's just a sort of 607 00:30:33,520 --> 00:30:36,200 Speaker 1: a hodgepodge construction of what that reality might have been. 608 00:30:36,360 --> 00:30:39,520 Speaker 1: And again we say, why is this happening? Why can't 609 00:30:39,560 --> 00:30:42,120 Speaker 1: we have a high fidelity version of things? And it's 610 00:30:42,160 --> 00:30:44,960 Speaker 1: possible that we just don't need to. So even with 611 00:30:45,040 --> 00:30:48,400 Speaker 1: these false memories arising, it's rare that you really 612 00:30:48,400 --> 00:30:50,760 Speaker 1: need to remember exactly what happened and where it happened 613 00:30:50,800 --> 00:30:52,560 Speaker 1: and when it happened. Usually you just need to remember 614 00:30:52,600 --> 00:30:55,440 Speaker 1: the core knowledge. So if we focus on, hey, the 615 00:30:55,480 --> 00:30:59,040 Speaker 1: core knowledge is being remembered accurately, that's all that matters. Everything 616 00:30:59,080 --> 00:31:01,840 Speaker 1: else can go to crap, and nobody's going to be 617 00:31:01,920 --> 00:31:04,560 Speaker 1: less able to survive. So from a survival standpoint, it 618 00:31:04,600 --> 00:31:07,720 Speaker 1: certainly doesn't matter that this strengthening be a 619 00:31:07,800 --> 00:31:10,440 Speaker 1: very high fidelity strengthening.
It just matters that the core 620 00:31:10,480 --> 00:31:13,480 Speaker 1: component knowledge is the thing that's strengthened. Like, the capital 621 00:31:13,520 --> 00:31:16,480 Speaker 1: of the United States is Washington; everything else, who cares. 622 00:31:16,880 --> 00:31:19,880 Speaker 2: But if a lawyer is prodding you for details 623 00:31:19,880 --> 00:31:22,640 Speaker 2: in a court case, that's when you're in some trouble. 624 00:31:22,880 --> 00:31:26,000 Speaker 1: Absolutely, and you know the good news is, well, the 625 00:31:26,160 --> 00:31:29,640 Speaker 1: somewhat good news is, some lawyers, some federal judges, some 626 00:31:29,800 --> 00:31:35,040 Speaker 1: juries get the lecture about false memories and understand that 627 00:31:35,120 --> 00:31:37,000 Speaker 1: when you call witnesses to the stand and you're asking 628 00:31:37,080 --> 00:31:40,160 Speaker 1: very, very specific questions, their recollection is going 629 00:31:40,200 --> 00:31:42,880 Speaker 1: to be some combination of what actually happened, what the 630 00:31:42,960 --> 00:31:45,600 Speaker 1: brain sort of reconstructed it to be, what kinds of 631 00:31:45,680 --> 00:31:49,680 Speaker 1: questions are being asked, the pressure, any interrogation that happened earlier. 632 00:31:49,760 --> 00:31:52,800 Speaker 1: All of those things sort of weave their way into 633 00:31:52,840 --> 00:31:54,800 Speaker 1: that memory. You're never going to be able to get 634 00:31:54,800 --> 00:31:58,719 Speaker 1: this beautiful, accurate, one hundred percent depiction of what happened; 635 00:31:58,760 --> 00:32:01,400 Speaker 1: you're going to get some version of it that could 636 00:32:01,400 --> 00:32:03,000 Speaker 1: be quite a bit more corrupted. 637 00:32:03,240 --> 00:32:05,120 Speaker 2: Gosh, there could be a whole podcast on that topic. 638 00:32:05,760 --> 00:32:09,960 Speaker 2: So my co host wanted me to dig into how 639 00:32:10,000 --> 00:32:12,040 Speaker 2: we learn about this kind of stuff. So you mentioned 640 00:32:12,040 --> 00:32:15,200 Speaker 2: that there are animal models, but just focusing on people 641 00:32:15,280 --> 00:32:19,840 Speaker 2: right now, how good are our techniques for watching how 642 00:32:19,880 --> 00:32:22,240 Speaker 2: the brain works in living humans to sort of try 643 00:32:22,280 --> 00:32:23,560 Speaker 2: to get a handle on some of this stuff? 644 00:32:23,760 --> 00:32:27,840 Speaker 1: Yeah. So when we started out, the discipline of neuropsychology 645 00:32:28,000 --> 00:32:31,280 Speaker 1: had very, very poor tools available to it. So you 646 00:32:31,400 --> 00:32:34,400 Speaker 1: had to develop cognitive assessments, which have come a long way. 647 00:32:34,480 --> 00:32:36,480 Speaker 1: You can have the right kinds of cognitive assessments and 648 00:32:36,520 --> 00:32:38,720 Speaker 1: so on, but you really had to work with patients 649 00:32:38,720 --> 00:32:42,120 Speaker 1: in the clinic who come in presenting with memory problems, 650 00:32:42,200 --> 00:32:45,480 Speaker 1: with a variety of different conditions, and essentially it's akin 651 00:32:45,560 --> 00:32:49,239 Speaker 1: to lesion studies. You're working with patients who might have 652 00:32:49,640 --> 00:32:52,000 Speaker 1: a circumscribed focal deficit in the brain. You can see that 653 00:32:52,040 --> 00:32:54,760 Speaker 1: on structural MRI, for example, and say this part of 654 00:32:54,760 --> 00:32:57,400 Speaker 1: the brain is damaged or missing.
Therefore they have this 655 00:32:57,560 --> 00:32:59,760 Speaker 1: kind of dysfunction, so you might be able to say 656 00:32:59,800 --> 00:33:01,760 Speaker 1: something about the function of this part of the brain 657 00:33:01,800 --> 00:33:06,400 Speaker 1: in a healthy person. But our tools have evolved significantly since, 658 00:33:06,600 --> 00:33:09,640 Speaker 1: so two major advances that I can tell you about are 659 00:33:09,680 --> 00:33:12,520 Speaker 1: still today the chief ways by which we study this. 660 00:33:12,960 --> 00:33:14,880 Speaker 1: One is functional MRI, and we tend to do a 661 00:33:14,960 --> 00:33:16,680 Speaker 1: heck of a lot of that in my lab. So 662 00:33:16,840 --> 00:33:19,560 Speaker 1: functional MRI operates on the principle that you can put 663 00:33:19,600 --> 00:33:22,960 Speaker 1: somebody in the scanner, totally intact brain, and give them 664 00:33:23,040 --> 00:33:25,840 Speaker 1: a game to play, or a memory task, or any 665 00:33:25,840 --> 00:33:28,920 Speaker 1: sort of challenge that would engage the memory bits of 666 00:33:29,000 --> 00:33:32,200 Speaker 1: their brain, for example. And what you're imaging with functional 667 00:33:32,280 --> 00:33:35,920 Speaker 1: MRI is not neural activity directly. What you're imaging is 668 00:33:35,960 --> 00:33:39,840 Speaker 1: blood flow. The idea being that if there's a patch 669 00:33:39,920 --> 00:33:44,160 Speaker 1: of cortex, a patch of brain, that is more active, that is 670 00:33:44,280 --> 00:33:47,840 Speaker 1: engaged in this challenge, it's going to require oxygen and 671 00:33:47,880 --> 00:33:51,360 Speaker 1: glucose and it's going to try to extract that out 672 00:33:51,360 --> 00:33:55,880 Speaker 1: of the blood flow. So by mapping how much oxygenated 673 00:33:55,960 --> 00:33:59,760 Speaker 1: blood and deoxygenated blood are going to different areas in 674 00:33:59,800 --> 00:34:03,040 Speaker 1: the brain, you can generate a contrast, because it turns 675 00:34:03,040 --> 00:34:06,080 Speaker 1: out that the degree of oxygenation has a different magnetic signal. 676 00:34:06,320 --> 00:34:07,880 Speaker 1: So that's sort of a little hack that we pull 677 00:34:07,920 --> 00:34:11,600 Speaker 1: in MRI, because we're changing magnetic fields. So by measuring 678 00:34:11,640 --> 00:34:15,640 Speaker 1: that contrast, we can get an indirect proxy to where 679 00:34:15,719 --> 00:34:19,440 Speaker 1: neural activity might be by virtue of that blood flow change. 680 00:34:19,880 --> 00:34:22,400 Speaker 1: So that's been a really, really helpful technique since the 681 00:34:22,560 --> 00:34:25,480 Speaker 1: late nineties, early two thousands. In the early days of 682 00:34:25,480 --> 00:34:28,319 Speaker 1: functional MRI, people did a bunch of, like, really just 683 00:34:28,440 --> 00:34:31,560 Speaker 1: awful studies, because the technology was new and we didn't 684 00:34:31,560 --> 00:34:33,800 Speaker 1: know what to do with it, and folks didn't really 685 00:34:34,320 --> 00:34:36,839 Speaker 1: think beyond, you know, the X marks the spot kind 686 00:34:36,840 --> 00:34:39,680 Speaker 1: of approach, right? I want to know what the fill 687 00:34:39,719 --> 00:34:42,080 Speaker 1: in the blank part of the brain is. It even 688 00:34:42,120 --> 00:34:43,799 Speaker 1: got as absurd as I want to know what the 689 00:34:43,840 --> 00:34:46,200 Speaker 1: god part of the brain is.
Right. So people started 690 00:34:46,239 --> 00:34:48,080 Speaker 1: to do those kinds of studies, try to go in 691 00:34:48,160 --> 00:34:50,800 Speaker 1: and say, X marks the spot, where's the stuff happening? 692 00:34:51,160 --> 00:34:54,919 Speaker 2: Is this the same system where they had that dead trout? Oh? 693 00:34:55,000 --> 00:34:58,080 Speaker 1: Yeah, you know that, that trout study. Of course. So 694 00:34:58,239 --> 00:34:59,960 Speaker 1: at the end of the day, it is a statistical 695 00:35:00,160 --> 00:35:04,840 Speaker 1: approach to comparing activation, and yeah, that study is really 696 00:35:04,840 --> 00:35:07,480 Speaker 1: compelling because you could show that you see activation essentially 697 00:35:07,520 --> 00:35:10,920 Speaker 1: in something that is dead, and people every now and 698 00:35:10,960 --> 00:35:12,600 Speaker 1: then will kind of make fun of this, and remember 699 00:35:12,680 --> 00:35:15,080 Speaker 1: the old days of fMRI, when folks didn't really know 700 00:35:15,120 --> 00:35:16,920 Speaker 1: what they were doing, and you could get something like 701 00:35:16,960 --> 00:35:19,480 Speaker 1: this, right, and in some cases you could even get 702 00:35:19,480 --> 00:35:21,520 Speaker 1: it published, which is crazy when you think 703 00:35:21,560 --> 00:35:25,400 Speaker 1: about it. But we've come a long way since. So 704 00:35:25,480 --> 00:35:28,160 Speaker 1: the beauty of functional MRI now is that, one, we 705 00:35:28,320 --> 00:35:30,359 Speaker 1: understand how to do the X marks the spot much, 706 00:35:30,440 --> 00:35:32,239 Speaker 1: much better. We now have a much better handle on the 707 00:35:32,440 --> 00:35:35,279 Speaker 1: statistical challenges, the way to build the right contrasts, the 708 00:35:35,320 --> 00:35:37,800 Speaker 1: way to correct for multiple comparisons, all the things that 709 00:35:37,840 --> 00:35:40,880 Speaker 1: you tend to think of when you're doing large scale statistics. 710 00:35:41,600 --> 00:35:44,360 Speaker 1: That discipline had to kind of come to functional imaging 711 00:35:44,560 --> 00:35:46,920 Speaker 1: and inform it, and that has happened, which is great. 712 00:35:47,640 --> 00:35:49,440 Speaker 1: But the second part is, I told you before that 713 00:35:49,520 --> 00:35:52,719 Speaker 1: memory is all about the connections, and functional MRI initially 714 00:35:53,040 --> 00:35:56,400 Speaker 1: was all about blobbology, right, trying to find little hotspots 715 00:35:56,400 --> 00:35:58,960 Speaker 1: in the brain. You get pretty pictures on the cover of Science 716 00:35:58,960 --> 00:36:01,960 Speaker 1: and all that with little hotspots in the brain, and 717 00:36:02,040 --> 00:36:04,440 Speaker 1: that was the approach. But we know that memory is 718 00:36:04,440 --> 00:36:07,239 Speaker 1: not in the hotspots, memory's in the connections, so we've 719 00:36:07,239 --> 00:36:11,360 Speaker 1: started to move much more towards connectivity analyses and asking 720 00:36:11,400 --> 00:36:14,759 Speaker 1: about how the different parts of the brain are communicating 721 00:36:14,840 --> 00:36:19,960 Speaker 1: dynamically and essentially coactivating with each other to support solving 722 00:36:19,960 --> 00:36:22,799 Speaker 1: this challenge or doing this memory test or memory game 723 00:36:22,880 --> 00:36:25,720 Speaker 1: while you're in the scanner.
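(As a concrete illustration of the connectivity idea, correlating time courses between regions rather than hunting for isolated hotspots, here is a minimal Python sketch; the region names and signals are invented for illustration, not taken from any real analysis pipeline.)

```python
# Minimal sketch of a functional-connectivity analysis: correlate the activity
# time courses of brain regions instead of looking for isolated hotspots.
import numpy as np

rng = np.random.default_rng(1)
regions = ["hippocampus", "entorhinal", "prefrontal", "parietal"]

# Pretend BOLD time series: 4 regions x 300 scanner volumes of fake data.
bold = rng.standard_normal((4, 300))
bold[1] += 0.7 * bold[0]  # make two regions co-fluctuate, as coupled regions would

# Functional connectivity here = pairwise Pearson correlation of time courses.
fc = np.corrcoef(bold)

for i in range(len(regions)):
    for j in range(i + 1, len(regions)):
        print(f"{regions[i]:<12} <-> {regions[j]:<12} r = {fc[i, j]:+.2f}")
```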
So that was the advent 724 00:36:25,719 --> 00:36:29,040 Speaker 1: of functional connectivity kinds of approaches, which are, I think, 725 00:36:29,120 --> 00:36:32,319 Speaker 1: far more compelling, far more robust against some of the 726 00:36:32,360 --> 00:36:36,160 Speaker 1: initial critiques of functional MRI, and they reflect the 727 00:36:36,200 --> 00:36:38,040 Speaker 1: true nature of how the brain works. The brain is 728 00:36:38,080 --> 00:36:41,640 Speaker 1: one big dynamical system. It's not just regions working in isolation. 729 00:36:41,960 --> 00:36:44,880 Speaker 1: Everything is connecting with each other, so we owe it 730 00:36:44,920 --> 00:36:47,239 Speaker 1: to ourselves to try to understand it in that much 731 00:36:47,239 --> 00:36:51,319 Speaker 1: more complex way. So functional MRI still remains a very, 732 00:36:51,400 --> 00:36:54,000 Speaker 1: very powerful tool, but now the analyses that we can 733 00:36:54,040 --> 00:36:56,759 Speaker 1: do are just far more advanced. The other thing that 734 00:36:56,880 --> 00:37:00,640 Speaker 1: I think is just incredibly powerful, aside from animal models, 735 00:37:00,880 --> 00:37:05,040 Speaker 1: is the ability to directly record electrical activity from cells, 736 00:37:05,640 --> 00:37:08,279 Speaker 1: or from what are called local field potentials, the 737 00:37:08,360 --> 00:37:11,440 Speaker 1: areas around cells that also have electrical activity. These can 738 00:37:11,480 --> 00:37:16,120 Speaker 1: be measured directly in patients, and these are typically patients 739 00:37:16,160 --> 00:37:19,839 Speaker 1: that are going to undergo surgery for epileptic seizures. So 740 00:37:19,920 --> 00:37:22,759 Speaker 1: the surgery is done to remove the part of the 741 00:37:22,760 --> 00:37:26,160 Speaker 1: brain where the seizures are emanating from. And typically when 742 00:37:26,160 --> 00:37:28,120 Speaker 1: they come into a hospital, they're in a hospital for 743 00:37:28,120 --> 00:37:31,120 Speaker 1: about a week or so. They get implanted with electrodes 744 00:37:31,160 --> 00:37:33,719 Speaker 1: to record from the parts of the brain where the 745 00:37:33,760 --> 00:37:38,040 Speaker 1: clinician might suspect that the epilepsy is happening, and they're taken 746 00:37:38,120 --> 00:37:41,400 Speaker 1: off of anti epileptic medication, and essentially they're waiting to 747 00:37:41,480 --> 00:37:44,520 Speaker 1: induce a seizure, and once that seizure is induced, they 748 00:37:44,560 --> 00:37:47,160 Speaker 1: track the location. That's how they decide on a way 749 00:37:47,200 --> 00:37:50,319 Speaker 1: to do the surgery. There's about a week or so 750 00:37:50,400 --> 00:37:53,879 Speaker 1: while they're in the hospital with electrodes penetrating deep into 751 00:37:53,920 --> 00:37:56,719 Speaker 1: their cortex, and many of those electrodes go directly into the 752 00:37:56,760 --> 00:37:59,759 Speaker 1: memory bits of the brain, like the hippocampus. And those 753 00:38:00,360 --> 00:38:04,640 Speaker 1: are just an incredible group of individuals, because most 754 00:38:04,880 --> 00:38:07,120 Speaker 1: of them also want to help science and they understand 755 00:38:07,120 --> 00:38:10,239 Speaker 1: the opportunity the scientists have while they're lying in a 756 00:38:10,239 --> 00:38:13,560 Speaker 1: hospital bed for about a week to understand something fundamental 757 00:38:13,600 --> 00:38:16,319 Speaker 1: about the brain.
So every now and then they give 758 00:38:16,400 --> 00:38:18,680 Speaker 1: us the opportunity to give them a challenge, maybe on 759 00:38:18,719 --> 00:38:21,239 Speaker 1: an iPad or a computer while they're lying there, and 760 00:38:21,440 --> 00:38:24,080 Speaker 1: they try to solve this challenge, try to play this 761 00:38:24,160 --> 00:38:27,000 Speaker 1: memory game or do this memory test while we're recording 762 00:38:27,120 --> 00:38:30,839 Speaker 1: direct electrical activity from their brain cells, which is incredible. 763 00:38:31,239 --> 00:38:34,560 Speaker 1: So it gives us almost the same degree of information 764 00:38:34,600 --> 00:38:36,480 Speaker 1: that you can get in an animal model. Now, in 765 00:38:36,520 --> 00:38:38,040 Speaker 1: an animal model, in a rodent, you can stick 766 00:38:38,080 --> 00:38:41,400 Speaker 1: in more electrodes, you can get higher fidelity. And with patients 767 00:38:41,400 --> 00:38:43,720 Speaker 1: you have to do things that are only clinically warranted. 768 00:38:43,760 --> 00:38:46,160 Speaker 1: So there's an ethical obligation, of course, to make sure 769 00:38:46,200 --> 00:38:48,920 Speaker 1: that nothing is being done that would ever put the 770 00:38:48,960 --> 00:38:53,120 Speaker 1: patient at increased risk. So that also poses some limitations 771 00:38:53,120 --> 00:38:55,920 Speaker 1: as to how you can record activity and get that data. 772 00:38:56,040 --> 00:38:58,640 Speaker 1: But it's just incredible access that we have to the 773 00:38:58,719 --> 00:39:02,520 Speaker 1: brain in partnership with these remarkable individuals, and we've learned 774 00:39:02,520 --> 00:39:04,200 Speaker 1: a heck of a lot about how the brain works 775 00:39:04,200 --> 00:39:07,120 Speaker 1: and how memory works from those direct electrical recordings. 776 00:39:07,719 --> 00:39:11,040 Speaker 2: Wow. I previously wrote a chapter on brain computer interfaces 777 00:39:11,080 --> 00:39:14,560 Speaker 2: and I was reading about, like, Utah arrays. Is this 778 00:39:14,719 --> 00:39:17,000 Speaker 2: the same thing or is this a different kind of electrode? 779 00:39:17,080 --> 00:39:20,239 Speaker 1: Yes. So, Utah arrays are one way to do it. 780 00:39:20,680 --> 00:39:24,640 Speaker 1: Utah arrays are a little bit more invasive. They involve 781 00:39:24,800 --> 00:39:27,320 Speaker 1: several electrodes that are kind of going through the surface 782 00:39:27,400 --> 00:39:30,040 Speaker 1: of the cortex at the same time. They're no longer 783 00:39:30,160 --> 00:39:33,160 Speaker 1: kind of the standard practice for most patients. They're still 784 00:39:33,280 --> 00:39:35,960 Speaker 1: used in some cases where they're clinically warranted, but in 785 00:39:36,000 --> 00:39:39,680 Speaker 1: many cases they're not, because you suspect that what's happening 786 00:39:39,760 --> 00:39:42,800 Speaker 1: is deep in the brain, so you stick direct single 787 00:39:42,840 --> 00:39:45,600 Speaker 1: electrodes all the way down to where you suspect the 788 00:39:45,680 --> 00:39:48,440 Speaker 1: action might be, and you avoid some of the potential 789 00:39:48,520 --> 00:39:50,920 Speaker 1: damage that happens with Utah arrays.
So they're used in 790 00:39:50,960 --> 00:39:54,760 Speaker 1: some clinical contexts, but in many others, we can stick 791 00:39:54,840 --> 00:39:58,719 Speaker 1: these much thinner, slimmer electrodes directly into the parts of 792 00:39:58,760 --> 00:40:01,440 Speaker 1: the brain that we suspect the epilepsy is emanating from, 793 00:40:01,719 --> 00:40:04,720 Speaker 1: so far less damage that way, and those patients typically 794 00:40:04,719 --> 00:40:08,120 Speaker 1: have better outcomes than patients implanted with Utah arrays. Now, 795 00:40:08,200 --> 00:40:11,280 Speaker 1: your point about BCIs, there's a number of companies 796 00:40:11,320 --> 00:40:14,160 Speaker 1: out there that are trying to develop brain computer interfaces 797 00:40:14,239 --> 00:40:16,960 Speaker 1: using these kinds of arrays. I think the particular 798 00:40:17,040 --> 00:40:20,560 Speaker 1: challenge for those enterprises is, how do you create a 799 00:40:20,600 --> 00:40:23,759 Speaker 1: way to measure directly from the brain and to be 800 00:40:23,800 --> 00:40:26,880 Speaker 1: able to stimulate and influence the brain without causing too 801 00:40:26,960 --> 00:40:30,359 Speaker 1: much damage? Having electrodes that are thin enough, that are 802 00:40:30,640 --> 00:40:33,080 Speaker 1: made from the right material so that you don't cause 803 00:40:33,120 --> 00:40:35,600 Speaker 1: a lot of tissue damage, because ideally, what you want 804 00:40:35,600 --> 00:40:37,400 Speaker 1: to do is create an interface that helps people, so 805 00:40:37,440 --> 00:40:40,200 Speaker 1: you don't want to inadvertently cause more damage. 806 00:40:40,400 --> 00:40:42,960 Speaker 2: Now, when we were thinking about Utah arrays and damage, 807 00:40:43,000 --> 00:40:45,359 Speaker 2: you know, we thought about like a cup with jello 808 00:40:45,480 --> 00:40:48,279 Speaker 2: in it, and you stick some needles in there, and 809 00:40:48,360 --> 00:40:50,400 Speaker 2: as you move the jello around, if the needles are 810 00:40:50,440 --> 00:40:52,120 Speaker 2: kind of staying in place, that would sort of mess 811 00:40:52,200 --> 00:40:53,600 Speaker 2: up the brain. Is that a good way to think 812 00:40:53,640 --> 00:40:55,080 Speaker 2: about it? Does it all move together? 813 00:40:55,520 --> 00:40:59,200 Speaker 1: Yeah? The brain certainly is as vulnerable, maybe, as a 814 00:40:59,239 --> 00:41:02,960 Speaker 1: cup of jello. But the key is also flexibility. So 815 00:41:02,960 --> 00:41:05,520 Speaker 1: you're right. When you have these electrodes, there's a bit 816 00:41:05,560 --> 00:41:08,759 Speaker 1: of a compromise. You want them to be flexible so 817 00:41:08,800 --> 00:41:10,719 Speaker 1: that they're moving with the brain. You're absolutely right, and 818 00:41:10,760 --> 00:41:13,439 Speaker 1: that'll cause less tissue damage. But at the same time, 819 00:41:13,520 --> 00:41:16,000 Speaker 1: flexibility can come at a cost, which is that what 820 00:41:16,120 --> 00:41:19,200 Speaker 1: they're targeting might change. So you want to make them 821 00:41:19,200 --> 00:41:21,720 Speaker 1: flexible enough so that they don't cause damage, but rigid 822 00:41:21,800 --> 00:41:23,960 Speaker 1: enough so that they can continue to target the same region, 823 00:41:24,800 --> 00:41:27,359 Speaker 1: so it is not an easy challenge at all.
But 824 00:41:27,560 --> 00:41:30,520 Speaker 1: there's been some developments recently in doing these kinds of 825 00:41:30,600 --> 00:41:33,839 Speaker 1: arrays with animals, and we haven't yet ported that over 826 00:41:33,920 --> 00:41:36,080 Speaker 1: to humans and done the FDA approval and all of 827 00:41:36,080 --> 00:41:39,280 Speaker 1: those kinds of things. It's happening soon. There's already experiments 828 00:41:39,280 --> 00:41:41,840 Speaker 1: that try to test out one of the technologies, like 829 00:41:41,920 --> 00:41:45,959 Speaker 1: Neuropixels, for example. Those Neuropixels technologies have been 830 00:41:46,400 --> 00:41:49,799 Speaker 1: incredibly powerful for animal models, for recording from rodents and 831 00:41:49,840 --> 00:41:53,400 Speaker 1: from non human primates, and their small form factor, they're thinner, 832 00:41:53,400 --> 00:41:55,720 Speaker 1: but they have a ton of electrode contacts on there, 833 00:41:56,040 --> 00:41:58,200 Speaker 1: so it can really give you information from a lot 834 00:41:58,239 --> 00:42:02,480 Speaker 1: of different sites simultaneously. And porting that over to 835 00:42:02,560 --> 00:42:04,840 Speaker 1: humans I think will be a really helpful thing to do. 836 00:42:05,160 --> 00:42:07,360 Speaker 1: But that's only been done in some limited experiments and 837 00:42:07,440 --> 00:42:10,200 Speaker 1: not widespread yet. So I'm hoping that some variants of those 838 00:42:10,280 --> 00:42:12,600 Speaker 1: kinds of technologies will make their way to primetime soon. 839 00:42:12,840 --> 00:42:16,160 Speaker 2: I've watched some videos of people with brain computer interfaces 840 00:42:16,200 --> 00:42:18,040 Speaker 2: that were able to do incredible things. But one of 841 00:42:18,080 --> 00:42:20,560 Speaker 2: the things that sounded totally devastating to me was, if 842 00:42:20,600 --> 00:42:23,719 Speaker 2: I understand this correctly, it's that over time, the brain has 843 00:42:23,719 --> 00:42:26,600 Speaker 2: a response to those electrodes and like kind of walls 844 00:42:26,600 --> 00:42:28,520 Speaker 2: them off and the connection gets less good. I don't 845 00:42:28,560 --> 00:42:31,120 Speaker 2: know exactly what's happening. Do we have any progress in 846 00:42:31,120 --> 00:42:31,560 Speaker 2: that area? 847 00:42:32,080 --> 00:42:34,360 Speaker 1: Well, so that's another thing that needs to be tackled. Also, 848 00:42:34,440 --> 00:42:37,840 Speaker 1: with some of these newer silicon probes, they're less likely 849 00:42:37,920 --> 00:42:41,800 Speaker 1: to have the inflammatory and the calcification kinds of responses 850 00:42:41,800 --> 00:42:44,080 Speaker 1: that happen around electrodes, because remember, this is a foreign 851 00:42:44,160 --> 00:42:47,840 Speaker 1: object entering the brain, and the brain's natural disposition towards 852 00:42:47,880 --> 00:42:50,840 Speaker 1: foreign objects is to attack it, right? That's why we have 853 00:42:51,280 --> 00:42:54,319 Speaker 1: brain immune cells, we have microglia, we have a lot 854 00:42:54,320 --> 00:42:58,120 Speaker 1: of cells that are dedicated to detecting and eliminating foreign objects. 855 00:42:58,680 --> 00:43:02,000 Speaker 1: So you tend to see them aggregate around electrode 856 00:43:02,000 --> 00:43:05,760 Speaker 1: contact locations and things like that.
But there are ways 857 00:43:05,800 --> 00:43:08,399 Speaker 1: with different substances to kind of maybe fool the brain 858 00:43:08,480 --> 00:43:10,880 Speaker 1: a little bit into thinking this is okay. You can 859 00:43:10,920 --> 00:43:13,040 Speaker 1: try to also reduce the brain's immune response to some 860 00:43:13,120 --> 00:43:16,120 Speaker 1: extent when these things are coming in. So there's approaches 861 00:43:16,120 --> 00:43:18,319 Speaker 1: that are being developed to try to get better long 862 00:43:18,400 --> 00:43:21,120 Speaker 1: term outcomes, but we're still very early 863 00:43:20,920 --> 00:43:22,799 Speaker 2: in this game. And for listeners who are maybe not 864 00:43:22,880 --> 00:43:25,640 Speaker 2: as familiar with brain computer interfaces, what are some reasons 865 00:43:25,680 --> 00:43:28,160 Speaker 2: that people might get a brain computer interface? 866 00:43:28,320 --> 00:43:31,360 Speaker 1: There's a variety of reasons. So, for example, for someone 867 00:43:31,400 --> 00:43:34,520 Speaker 1: who has lost the ability to control their limbs because 868 00:43:34,560 --> 00:43:37,640 Speaker 1: of a stroke or a focal deficit, being able to 869 00:43:37,680 --> 00:43:40,680 Speaker 1: have a brain computer interface shortcut the signals so that they 870 00:43:40,680 --> 00:43:43,960 Speaker 1: can still control their limbs and their body is remarkable. 871 00:43:44,000 --> 00:43:46,840 Speaker 1: And for patients who have had those kinds of approaches, it 872 00:43:46,920 --> 00:43:49,840 Speaker 1: is just life changing. They go from being a paraplegic or 873 00:43:49,880 --> 00:43:52,160 Speaker 1: quadriplegic to being able to have use of their arms 874 00:43:52,239 --> 00:43:56,240 Speaker 1: or their legs again. So there's incredible utility there. For 875 00:43:56,440 --> 00:44:00,880 Speaker 1: folks who might have epilepsy, for example, there is a 876 00:44:00,920 --> 00:44:05,279 Speaker 1: brain computer interface that is a stimulator that is implanted 877 00:44:05,280 --> 00:44:07,920 Speaker 1: in the brain, that responds to the earliest signs of 878 00:44:07,920 --> 00:44:11,480 Speaker 1: epileptic seizures and that is able, with electrical stimulation, 879 00:44:11,600 --> 00:44:14,600 Speaker 1: essentially to knock it out. So now instead of having to 880 00:44:14,680 --> 00:44:17,280 Speaker 1: have the person be going in for surgery and lopping 881 00:44:17,360 --> 00:44:19,440 Speaker 1: out parts of the brain, or having them be devastated 882 00:44:19,480 --> 00:44:23,359 Speaker 1: by epileptic seizures, you can have an implanted BCI that 883 00:44:23,520 --> 00:44:26,400 Speaker 1: responds in a closed loop system, so it uses the 884 00:44:26,400 --> 00:44:29,279 Speaker 1: responses of the brain itself to tell the stimulator what 885 00:44:29,360 --> 00:44:32,000 Speaker 1: to do. And that's a completely closed loop, so it 886 00:44:32,040 --> 00:44:35,680 Speaker 1: doesn't require any user intervention from the outside. That allows it to 887 00:44:35,719 --> 00:44:38,960 Speaker 1: do a much better job of helping the patients overcome 888 00:44:39,600 --> 00:44:43,080 Speaker 1: seizures or epilepsy. So there's a number of different uses.
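(A toy caricature of that closed loop, sense, detect, stimulate, with no user in the loop, might look like the sketch below. The power-threshold "detector" and every number in it are invented for illustration; real responsive neurostimulators are far more sophisticated.)

```python
# Toy caricature of a closed-loop neurostimulator: read brain activity, detect
# the earliest signs of seizure-like activity, stimulate, and repeat.
import numpy as np

rng = np.random.default_rng(2)
THRESHOLD = 5.0  # made-up detection threshold on windowed signal power

def read_electrode_window(t: int) -> np.ndarray:
    """Stand-in for the implant's recording; power spikes near t == 60."""
    noise = rng.standard_normal(256)
    return noise * (4.0 if 58 <= t <= 62 else 1.0)

def stimulate() -> None:
    print("  -> stimulation pulse delivered")

for t in range(120):  # each tick is one analysis window
    window = read_electrode_window(t)
    power = float(np.mean(window ** 2))  # crude proxy for abnormal activity
    if power > THRESHOLD:
        print(f"t={t}: abnormal power {power:.1f} detected")
        stimulate()  # closed loop: the brain's own signal triggers the response
```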
889 00:44:43,160 --> 00:44:46,240 Speaker 1: I can imagine that for movement disorders, for a variety 890 00:44:46,280 --> 00:44:49,200 Speaker 1: of different conditions where you might want to implant something 891 00:44:49,280 --> 00:44:51,200 Speaker 1: that communicates with the brain and feeds it the right 892 00:44:51,239 --> 00:44:53,600 Speaker 1: signal at the right time, there's going to be a 893 00:44:53,719 --> 00:44:57,080 Speaker 1: huge use for a BCI. Then there's a whole other 894 00:44:57,160 --> 00:45:02,040 Speaker 1: class of uses that may not work via implantation, right? 895 00:45:02,120 --> 00:45:05,880 Speaker 1: So these external devices, maybe devices that are communicating with 896 00:45:05,920 --> 00:45:10,600 Speaker 1: the brain in an external fashion, portable or mobile, and they 897 00:45:10,640 --> 00:45:14,960 Speaker 1: allow it to improve function for stroke rehabilitation, or improve 898 00:45:15,000 --> 00:45:17,480 Speaker 1: function in some other way. There's a lot of folks 899 00:45:17,520 --> 00:45:20,520 Speaker 1: also kind of, you know, taking the transhumanist approach here 900 00:45:20,520 --> 00:45:22,760 Speaker 1: and trying to develop BCIs to just improve our function. 901 00:45:22,960 --> 00:45:25,319 Speaker 1: We just want to be better at something, right? So, 902 00:45:25,360 --> 00:45:27,239 Speaker 1: what if I can control this robotic arm to do 903 00:45:27,320 --> 00:45:29,920 Speaker 1: some, you know, whatever it is. So I don't care too 904 00:45:30,000 --> 00:45:33,800 Speaker 1: much about those, but certainly the utility for helping patients 905 00:45:33,880 --> 00:45:34,600 Speaker 1: is huge. 906 00:45:34,920 --> 00:45:37,200 Speaker 2: So getting a little more sci fi here, do you 907 00:45:37,239 --> 00:45:40,080 Speaker 2: think we'll ever be able to know what somebody is 908 00:45:40,120 --> 00:45:42,839 Speaker 2: thinking by having, you know, a cap on their head 909 00:45:42,920 --> 00:45:45,600 Speaker 2: or electrodes in their brain, or is that just way 910 00:45:45,640 --> 00:45:46,279 Speaker 2: too far off? 911 00:45:46,520 --> 00:45:49,600 Speaker 1: I think we already do, so, I think we are, 912 00:45:49,719 --> 00:45:53,799 Speaker 1: to some extent, with an electrode cap. EEG is really the 913 00:45:53,840 --> 00:45:56,640 Speaker 1: technology that you're talking about, or there's other ways to 914 00:45:56,640 --> 00:46:00,480 Speaker 1: do it also, various ultrasound technologies and so on. You 915 00:46:00,520 --> 00:46:04,440 Speaker 1: can detect and stimulate pretty easily, but the problem is 916 00:46:04,560 --> 00:46:06,600 Speaker 1: the kinds of things that you can get the system 917 00:46:06,640 --> 00:46:09,360 Speaker 1: to do are still fairly rudimentary. So I can tell, 918 00:46:09,440 --> 00:46:12,600 Speaker 1: for example, based on EEG signal, whether the person is 919 00:46:12,640 --> 00:46:16,279 Speaker 1: going to move their hand or make some overt kind 920 00:46:16,320 --> 00:46:20,040 Speaker 1: of gesture, and motor control is a somewhat simpler system 921 00:46:20,080 --> 00:46:25,200 Speaker 1: than, like, memory, executive decision, or emotion, or having very 922 00:46:25,200 --> 00:46:29,120 Speaker 1: complex feelings like guilt, right? Things like that are 923 00:46:29,719 --> 00:46:33,160 Speaker 1: much more difficult to capture in the simple signals that 924 00:46:33,160 --> 00:46:36,640 Speaker 1: we're capturing with BCIs right now.
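(To give a feel for how "rudimentary" that kind of decoding is, here is a minimal sketch of classifying an imagined hand movement from EEG band power. The feature, the effect size, and the classifier are all illustrative assumptions; real EEG decoding pipelines involve much more signal processing.)

```python
# Minimal sketch of simple EEG motor decoding: classify an imagined hand
# movement from mu-band (8-12 Hz) power over left vs right motor cortex.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Pretend features for 100 trials: [left-hemisphere power, right-hemisphere power].
labels = rng.integers(0, 2, size=100)   # 0 = imagined left hand, 1 = imagined right hand
mu_power = rng.normal(1.0, 0.2, size=(100, 2))
mu_power[labels == 1, 0] -= 0.5         # right-hand imagery suppresses left-hemisphere mu

# A linear classifier is enough for this kind of coarse, two-class decoding.
clf = LogisticRegression().fit(mu_power, labels)
print("training accuracy:", clf.score(mu_power, labels))
```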
So do I anticipate 925 00:46:36,640 --> 00:46:38,960 Speaker 1: that at some point we'll be able to do it one hundred percent? 926 00:46:39,120 --> 00:46:42,000 Speaker 1: There's no doubt the technology's there. It's just a matter 927 00:46:42,080 --> 00:46:44,799 Speaker 1: of, are we recording from enough, are we able to 928 00:46:44,840 --> 00:46:48,399 Speaker 1: build the sophisticated models to model this function? And we're 929 00:46:48,400 --> 00:46:51,759 Speaker 1: making rapid progress in that arena. So from a sci 930 00:46:51,760 --> 00:46:55,400 Speaker 1: fi perspective, I'm sure you've heard this before: the moving vehicle, 931 00:46:55,480 --> 00:46:57,680 Speaker 1: the car, was sci fi at one point until someone 932 00:46:57,680 --> 00:47:00,600 Speaker 1: invented it, right? So the things that we think of 933 00:47:00,600 --> 00:47:02,799 Speaker 1: as sci fi are just science that's not here yet. 934 00:47:02,880 --> 00:47:21,480 Speaker 2: All right, Yeah, the future is now. So let's talk 935 00:47:21,520 --> 00:47:24,279 Speaker 2: about losing memory a little bit as we wrap things up. 936 00:47:24,640 --> 00:47:27,440 Speaker 2: So we talked about how your brain is working overnight 937 00:47:27,480 --> 00:47:31,080 Speaker 2: to strengthen memories. How do we lose memories over time? 938 00:47:31,440 --> 00:47:33,160 Speaker 1: So there's a number of different ways this happens. 939 00:47:33,680 --> 00:47:37,040 Speaker 1: Memory can be lost because of decay, so just the 940 00:47:37,080 --> 00:47:40,200 Speaker 1: passage of time a lot of times can make memories 941 00:47:40,280 --> 00:47:43,480 Speaker 1: just harder to remember, harder to access. And we've known 942 00:47:43,520 --> 00:47:47,360 Speaker 1: this since the eighteen eighties. Hermann Ebbinghaus was the first to 943 00:47:47,440 --> 00:47:50,560 Speaker 1: kind of do this, using experiments on himself. He would 944 00:47:50,680 --> 00:47:54,080 Speaker 1: learn lists of nonsense syllables and then try to map 945 00:47:54,120 --> 00:47:57,200 Speaker 1: his own forgetting curve: how much forgetting happens over 946 00:47:57,200 --> 00:47:59,279 Speaker 1: the first twenty four hours, the next twenty four hours, the next 947 00:47:59,280 --> 00:48:02,000 Speaker 1: twenty four hours. Yeah, I know, experiments on yourself. Very, 948 00:48:02,040 --> 00:48:04,480 Speaker 1: very boring times in the eighteen eighties, so he didn't have 949 00:48:04,520 --> 00:48:06,520 Speaker 1: too much else to do, so he did this with himself, 950 00:48:07,520 --> 00:48:09,920 Speaker 1: but he mapped what's called the forgetting curve, which we still see 951 00:48:09,960 --> 00:48:11,880 Speaker 1: to this day. We can look at any sort of 952 00:48:11,880 --> 00:48:14,120 Speaker 1: memory function or any memory task that we do in 953 00:48:14,160 --> 00:48:16,320 Speaker 1: the lab and we see a very clear forgetting curve: 954 00:48:16,600 --> 00:48:18,600 Speaker 1: a lot of forgetting in the first twenty four hours, then 955 00:48:18,600 --> 00:48:20,839 Speaker 1: things kind of taper off.
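(A common textbook idealization of that curve is exponential decay, retention R = exp(-t / S) for elapsed time t and a "stability" constant S. The sketch below uses a made-up S purely for illustration; it is not fit to Ebbinghaus's data.)

```python
# Sketch of an Ebbinghaus-style forgetting curve: steep loss in the first day,
# then a taper. The functional form and constant are textbook idealizations.
import math

S = 1.2  # made-up stability constant, in days

for days in [0, 1, 2, 3, 7]:
    retention = math.exp(-days / S)
    print(f"after {days} day(s): ~{retention:.0%} retained")
```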
So there's that. And 956 00:48:20,880 --> 00:48:24,440 Speaker 1: the suspicion is that mostly it's decay, but sometimes it 957 00:48:24,480 --> 00:48:28,200 Speaker 1: happens because of interference, because similar memories get in there 958 00:48:28,239 --> 00:48:30,760 Speaker 1: and kind of interfere with one another, compete with one another, 959 00:48:30,960 --> 00:48:32,799 Speaker 1: so the memories that you have for them get 960 00:48:32,800 --> 00:48:36,600 Speaker 1: a little bit fuzzy. And again, these are maybe features, 961 00:48:36,640 --> 00:48:39,240 Speaker 1: not bugs, right, where maybe the system is not intending 962 00:48:39,280 --> 00:48:42,160 Speaker 1: to hold onto memories with high fidelity for long. So 963 00:48:42,239 --> 00:48:46,000 Speaker 1: interference and decay help us extract what's most important and 964 00:48:46,120 --> 00:48:48,319 Speaker 1: keep that over time, and then other things can kind 965 00:48:48,320 --> 00:48:50,920 Speaker 1: of go away. So those are natural things that happen 966 00:48:50,920 --> 00:48:53,239 Speaker 1: in every brain all the time, and there's nothing to 967 00:48:53,280 --> 00:48:55,919 Speaker 1: be concerned about: decay, interference, nothing to be concerned about. 968 00:48:56,080 --> 00:48:59,000 Speaker 1: But as we get older, and by older, I hate 969 00:48:59,000 --> 00:49:01,800 Speaker 1: to say this, you know, talking about like forties 970 00:49:01,400 --> 00:49:06,160 Speaker 2: and above. That's not what I wanted to hear, man. Well, 971 00:49:06,200 --> 00:49:06,600 Speaker 2: I'm going to. 972 00:49:06,640 --> 00:49:09,840 Speaker 1: say gradually in our fourth decade, and then maybe a 973 00:49:09,840 --> 00:49:14,160 Speaker 1: little bit more precipitously over time, memory does get more difficult. 974 00:49:14,320 --> 00:49:18,160 Speaker 1: So things do degrade, and we start to maybe lose, 975 00:49:18,200 --> 00:49:20,960 Speaker 1: to some extent, our ability to make new memories, encode 976 00:49:20,960 --> 00:49:23,520 Speaker 1: new memories. Our memories for the old stuff are still 977 00:49:23,600 --> 00:49:25,600 Speaker 1: there, still resilient, even though every now and then we 978 00:49:25,680 --> 00:49:28,520 Speaker 1: might have a problem with access. So we get distracted, 979 00:49:28,560 --> 00:49:30,919 Speaker 1: we lose retrieval cues, but you know it's there, because 980 00:49:30,920 --> 00:49:33,840 Speaker 1: if you get the right reminder, boom, it comes back, right? 981 00:49:34,000 --> 00:49:36,000 Speaker 1: It's just a matter of, like, tip of the tongue, 982 00:49:36,080 --> 00:49:38,440 Speaker 1: you know, being able to find exactly the right 983 00:49:38,560 --> 00:49:41,360 Speaker 1: cue for retrieval. That becomes more difficult when we're just 984 00:49:41,400 --> 00:49:44,760 Speaker 1: more distracted. But as we get older, making new memories 985 00:49:44,760 --> 00:49:47,640 Speaker 1: becomes harder. And then for some folks who might go 986 00:49:47,719 --> 00:49:52,080 Speaker 1: down the trajectory to Alzheimer's disease, right, then that becomes 987 00:49:52,320 --> 00:49:54,880 Speaker 1: exceptionally more difficult. And that's one of the first things 988 00:49:54,920 --> 00:49:59,200 Speaker 1: to go.
And there's a very difficult line between what's 989 00:49:59,440 --> 00:50:02,280 Speaker 1: normal, I hate to say that word, maybe more typical 990 00:50:02,600 --> 00:50:05,960 Speaker 1: age associated memory impairment, which you can expect in every brain, 991 00:50:06,480 --> 00:50:08,640 Speaker 1: and something that may be a bit more cause 992 00:50:08,680 --> 00:50:10,879 Speaker 1: for concern, because it may be going down the road to 993 00:50:10,880 --> 00:50:14,400 Speaker 1: Alzheimer's disease. Dissociating those two, say in the sixties 994 00:50:14,440 --> 00:50:17,120 Speaker 1: and seventies, is actually very difficult. It's not very easy, 995 00:50:17,360 --> 00:50:20,400 Speaker 1: because both can start out as a form of forgetfulness. 996 00:50:20,920 --> 00:50:24,480 Speaker 1: But with Alzheimer's disease or with dementia, it's very progressive, 997 00:50:24,840 --> 00:50:27,280 Speaker 1: so it does get worse and worse and worse over time. 998 00:50:28,080 --> 00:50:30,680 Speaker 1: That change in a healthy aging brain or a typical 999 00:50:30,680 --> 00:50:34,439 Speaker 1: aging brain is far less steep, so you don't see 1000 00:50:34,480 --> 00:50:37,200 Speaker 1: too much changing over time. You don't see this degradation 1001 00:50:37,280 --> 00:50:41,000 Speaker 1: to the point where it becomes very noticeable by family, friends, neighbors, 1002 00:50:41,160 --> 00:50:44,160 Speaker 1: and so on. So those are the forms of memory 1003 00:50:44,320 --> 00:50:48,000 Speaker 1: loss or memory change that happen. Some totally innocuous, no 1004 00:50:48,120 --> 00:50:50,440 Speaker 1: cause for concern, they happen every day to everyone, to 1005 00:50:50,480 --> 00:50:53,239 Speaker 1: the best of us. And then some that are a 1006 00:50:53,280 --> 00:50:54,640 Speaker 1: bit more cause for concern 1007 00:50:54,719 --> 00:50:57,479 Speaker 2: as we get older. And mechanistically, is it just that the 1008 00:50:57,560 --> 00:51:01,759 Speaker 2: messages are not getting sent between those neurons anymore, or... This is 1009 00:51:01,760 --> 00:51:04,760 Speaker 1: something we've done quite a bit of work on, actually. Mechanistically, 1010 00:51:04,800 --> 00:51:07,759 Speaker 1: what seems to be the case is that, first, the 1011 00:51:07,840 --> 00:51:10,200 Speaker 1: part of the brain that's really important for encoding these 1012 00:51:10,200 --> 00:51:13,600 Speaker 1: episodic memories, the hippocampus, as I said, has a very 1013 00:51:13,600 --> 00:51:17,520 Speaker 1: interesting change in its dynamics. So this is a massive 1014 00:51:17,560 --> 00:51:19,880 Speaker 1: information processing hub in the brain. Even though it's a 1015 00:51:19,880 --> 00:51:24,200 Speaker 1: small structure, it carries a big information processing load, and 1016 00:51:24,640 --> 00:51:27,560 Speaker 1: it kind of can shift its state from encoding new 1017 00:51:27,560 --> 00:51:31,479 Speaker 1: information to remembering old information. And there are a few 1018 00:51:31,600 --> 00:51:34,480 Speaker 1: changes that happen to the cells in that system as 1019 00:51:34,520 --> 00:51:37,640 Speaker 1: we get older that bias the system towards remembering old 1020 00:51:37,640 --> 00:51:41,040 Speaker 1: information and away from encoding new information.
And there's lots 1021 00:51:41,040 --> 00:51:43,520 Speaker 1: of sort of reasons why that is: we tend to 1022 00:51:44,040 --> 00:51:49,160 Speaker 1: change the excitation inhibition balance, we tend to change neurotransmitter concentrations, 1023 00:51:49,200 --> 00:51:50,800 Speaker 1: all of those kinds of things as we get older 1024 00:51:51,000 --> 00:51:54,799 Speaker 1: that cause that change in the information processing balance. But 1025 00:51:54,920 --> 00:51:57,440 Speaker 1: the other thing that happens, more in the context of 1026 00:51:57,640 --> 00:52:01,520 Speaker 1: Alzheimer's disease, is you also start to deprive the hippocampus 1027 00:52:01,560 --> 00:52:03,880 Speaker 1: of its main input coming in from the rest of 1028 00:52:03,920 --> 00:52:07,239 Speaker 1: the brain. So there's a region that sits alongside the 1029 00:52:07,320 --> 00:52:11,400 Speaker 1: hippocampus called the entorhinal cortex, and that region shrivels 1030 00:52:11,480 --> 00:52:14,720 Speaker 1: up in Alzheimer's disease. It's one of the first regions 1031 00:52:14,760 --> 00:52:18,719 Speaker 1: to deposit what's called tangle pathology, or tau tangles, and 1032 00:52:18,760 --> 00:52:22,319 Speaker 1: that's a marker of cell death. So there's massive cell 1033 00:52:22,400 --> 00:52:25,080 Speaker 1: loss that's happening in that entorhinal cortex early on, which 1034 00:52:25,160 --> 00:52:29,840 Speaker 1: deprives the hippocampus of its principal inputs. So in computer 1035 00:52:29,920 --> 00:52:32,640 Speaker 1: science we have this old adage, garbage in, garbage out, 1036 00:52:33,360 --> 00:52:36,560 Speaker 1: and that's essentially what's happening in the hippocampus. The 1037 00:52:36,640 --> 00:52:39,680 Speaker 1: quality of the information that's coming in starts to become 1038 00:52:39,840 --> 00:52:42,759 Speaker 1: much more degraded in the context of Alzheimer's disease, so 1039 00:52:42,760 --> 00:52:45,080 Speaker 1: we can't possibly expect it to do a good job 1040 00:52:45,239 --> 00:52:48,920 Speaker 1: encoding information and storing it with high fidelity if the information 1041 00:52:49,000 --> 00:52:52,359 Speaker 1: coming in is, in some ways, quote, garbage. So that 1042 00:52:52,760 --> 00:52:54,680 Speaker 1: tends to be one of the things that is also 1043 00:52:54,719 --> 00:52:57,680 Speaker 1: mechanistically associated with memory loss in the aging brain. 1044 00:52:57,920 --> 00:53:00,360 Speaker 2: Do we know why that region starts to degrade in 1045 00:53:00,400 --> 00:53:01,480 Speaker 2: some people and not others? 1046 00:53:01,719 --> 00:53:05,319 Speaker 1: You know, there's hypotheses, and I have my pet hypotheses, 1047 00:53:05,320 --> 00:53:07,719 Speaker 1: and the field has its pet hypotheses and so on, 1048 00:53:07,800 --> 00:53:10,799 Speaker 1: but there's a variety of contributors. We suspect that in 1049 00:53:10,840 --> 00:53:15,680 Speaker 1: the context of Alzheimer's disease, it's a combination of inflammatory changes, 1050 00:53:15,760 --> 00:53:19,080 Speaker 1: so the brain's immune system is dysregulated in Alzheimer's disease, 1051 00:53:19,080 --> 00:53:22,800 Speaker 1: where the inflammatory response that normally is very healthy starts 1052 00:53:22,800 --> 00:53:24,959 Speaker 1: to kind of take a turn and become very pathological.
1053 00:53:25,400 --> 00:53:28,080 Speaker 1: There is vascular damage that happens as we get older, 1054 00:53:28,080 --> 00:53:31,120 Speaker 1: and we suspect that some of those early vascular insults, 1055 00:53:31,160 --> 00:53:33,680 Speaker 1: so related to blood flow, stiffening of the arteries and 1056 00:53:33,719 --> 00:53:38,040 Speaker 1: so on, can also contribute to that. There's metabolic regulation 1057 00:53:38,200 --> 00:53:41,800 Speaker 1: that changes, so the metabolic demands of different regions also 1058 00:53:42,120 --> 00:53:45,480 Speaker 1: become dysregulated as we get older. And then one thing 1059 00:53:45,520 --> 00:53:48,120 Speaker 1: that we've worked quite a bit on is excitation inhibition 1060 00:53:48,239 --> 00:53:51,759 Speaker 1: balance, and the notion that normally the brain keeps this 1061 00:53:51,920 --> 00:53:56,480 Speaker 1: dynamic balance beautifully between stop signals and go signals, and as 1062 00:53:56,520 --> 00:54:00,560 Speaker 1: we get older there actually is an overabundance of the 1063 00:54:00,600 --> 00:54:05,160 Speaker 1: go signals, which can drive the brain pathologically more towards 1064 00:54:05,200 --> 00:54:07,640 Speaker 1: memory loss, and not enough of the stop signals, so 1065 00:54:08,080 --> 00:54:10,920 Speaker 1: that imbalance can also be a contributor. So it's a 1066 00:54:11,000 --> 00:54:13,880 Speaker 1: multifactorial issue. There's lots of contributors. This is not just 1067 00:54:14,040 --> 00:54:16,920 Speaker 1: one single pathology kind of model, as much as some 1068 00:54:17,000 --> 00:54:19,160 Speaker 1: of the field likes to believe that. It's a bit 1069 00:54:19,200 --> 00:54:19,840 Speaker 1: more complex. 1070 00:54:20,080 --> 00:54:21,719 Speaker 2: So as someone who just learned that they're in the 1071 00:54:21,760 --> 00:54:24,440 Speaker 2: old category, I... 1072 00:54:24,360 --> 00:54:26,799 Speaker 1: Wouldn't necessarily call it that. I would just say we're 1073 00:54:26,800 --> 00:54:30,160 Speaker 1: maybe done developing and we're now just kind of hitting 1074 00:54:30,160 --> 00:54:32,320 Speaker 1: that hump of we're going to start the aging process. 1075 00:54:32,600 --> 00:54:34,120 Speaker 2: Got it, got it, I'll think of it that way. 1076 00:54:34,120 --> 00:54:38,200 Speaker 2: I like that better. What kinds of science based 1077 00:54:38,680 --> 00:54:40,920 Speaker 2: things can we do to maintain our memory? So I 1078 00:54:41,000 --> 00:54:43,120 Speaker 2: see all these apps; pitch me, is there any science 1079 00:54:43,160 --> 00:54:44,200 Speaker 2: behind any of those things? 1080 00:54:45,200 --> 00:54:47,960 Speaker 1: Well, the app stuff, probably not. I will tell you 1081 00:54:48,040 --> 00:54:51,000 Speaker 1: that the four big things that are really, really important, 1082 00:54:51,040 --> 00:54:53,279 Speaker 1: and when I say really important, I mean they are 1083 00:54:53,320 --> 00:54:57,680 Speaker 1: supported both by epidemiological data and by clinical trials. So 1084 00:54:57,760 --> 00:54:59,600 Speaker 1: these things are really, really helpful. The first one is 1085 00:54:59,640 --> 00:55:03,600 Speaker 1: physical activity. We know that physical activity maintains brain health 1086 00:55:03,640 --> 00:55:06,600 Speaker 1: well into older adulthood. We know that it can delay 1087 00:55:06,640 --> 00:55:09,360 Speaker 1: the onset of Alzheimer's disease.
It can make the outcomes 1088 00:55:09,400 --> 00:55:13,000 Speaker 1: better for patients, and it's protective; it really knocks that risk down. 1089 00:55:13,040 --> 00:55:16,560 Speaker 1: So a sedentary lifestyle is a risk factor. We've done work 1090 00:55:16,600 --> 00:55:19,719 Speaker 1: also to show that even activity as brief as ten 1091 00:55:19,719 --> 00:55:22,880 Speaker 1: minutes of walking can be helpful to memory. So it 1092 00:55:22,920 --> 00:55:25,160 Speaker 1: doesn't take a lot. You know, I would say thirty 1093 00:55:25,200 --> 00:55:27,960 Speaker 1: minutes of, you know, mild to moderate activity like walking, 1094 00:55:28,320 --> 00:55:31,560 Speaker 1: brisk walking, that's sufficient to be able to give people 1095 00:55:32,080 --> 00:55:34,440 Speaker 1: a way to handle that risk factor. The second thing 1096 00:55:34,520 --> 00:55:36,680 Speaker 1: that's really important, and this kind of goes back to 1097 00:55:36,680 --> 00:55:40,200 Speaker 1: your idea about apps and kind of cognitive engagement: it 1098 00:55:40,239 --> 00:55:42,319 Speaker 1: turns out that with apps and brain games and all of 1099 00:55:42,360 --> 00:55:45,720 Speaker 1: that, the efficacy is a little bit tenuous, so there's conflicting reports. 1100 00:55:46,080 --> 00:55:49,560 Speaker 1: But we know that social engagement is really important. So, 1101 00:55:49,600 --> 00:55:52,439 Speaker 1: in other words, if you remove social engagement, if folks 1102 00:55:52,440 --> 00:55:55,680 Speaker 1: become isolated, say past retirement, that's a risk factor 1103 00:55:55,800 --> 00:55:58,719 Speaker 1: for sure. But if they're able to continue to be 1104 00:55:58,800 --> 00:56:02,840 Speaker 1: social, build out richer social networks in person, you know, volunteering, 1105 00:56:02,880 --> 00:56:06,280 Speaker 1: community centers, churches, synagogues, whatever it is, or being around 1106 00:56:06,320 --> 00:56:10,200 Speaker 1: people in general. And that can also combine with physical activity. 1107 00:56:10,280 --> 00:56:13,320 Speaker 1: So let's say it's dance class: now it's physical activity 1108 00:56:13,360 --> 00:56:16,440 Speaker 1: and social contact, right? Those kinds of things, even in 1109 00:56:16,440 --> 00:56:19,240 Speaker 1: interventional studies, have been shown to have very positive results. 1110 00:56:19,880 --> 00:56:23,120 Speaker 1: Then the third piece is diet. A heart healthy diet 1111 00:56:23,160 --> 00:56:25,520 Speaker 1: is a brain healthy diet. And the one that has 1112 00:56:25,560 --> 00:56:28,440 Speaker 1: been tried and true in clinical trials is the Mediterranean diet. 1113 00:56:28,960 --> 00:56:31,160 Speaker 1: So the Mediterranean diet, which has lots and lots of variants, 1114 00:56:31,200 --> 00:56:34,640 Speaker 1: but think very colorful, leafy green vegetables and so on, 1115 00:56:35,160 --> 00:56:39,040 Speaker 1: healthy fats, and reducing, you know, things like red meat 1116 00:56:39,080 --> 00:56:41,439 Speaker 1: and so on. So that one also has been tried 1117 00:56:41,480 --> 00:56:44,440 Speaker 1: in clinical trials compared to other diets and seems to 1118 00:56:44,600 --> 00:56:47,600 Speaker 1: be able to stave off risk. And then the last piece, 1119 00:56:47,680 --> 00:56:49,920 Speaker 1: which I think is one of the most important, is sleep. 1120 00:56:50,640 --> 00:56:54,000 Speaker 1: We all need good quality and quantity of sleep every night.
1121 00:56:54,080 --> 00:56:57,439 Speaker 1: It turns out that actually during sleep we go through 1122 00:56:57,440 --> 00:56:59,800 Speaker 1: a process of glymphatic clearance and we clear out a 1123 00:56:59,800 --> 00:57:01,960 Speaker 1: lot of the pathologies that can lead down 1124 00:57:02,000 --> 00:57:04,480 Speaker 1: the path to Alzheimer's disease, or at least be contributors 1125 00:57:04,520 --> 00:57:08,160 Speaker 1: to it. And studies have shown that if you have 1126 00:57:08,280 --> 00:57:11,160 Speaker 1: sleep loss, folks for example who sleep less than six 1127 00:57:11,200 --> 00:57:13,239 Speaker 1: hours a night versus those who sleep more than seven 1128 00:57:13,320 --> 00:57:16,720 Speaker 1: or eight hours, there's differences in their amyloid uptake. So 1129 00:57:16,960 --> 00:57:19,520 Speaker 1: that amyloid pathology is one of the chief pathologies in 1130 00:57:19,560 --> 00:57:21,880 Speaker 1: Alzheimer's disease. You see a lot more of it in 1131 00:57:21,920 --> 00:57:24,560 Speaker 1: those who sleep those fewer hours, and a lot less 1132 00:57:24,560 --> 00:57:27,640 Speaker 1: of it in those who sleep longer. So sleep disruption, 1133 00:57:27,800 --> 00:57:30,960 Speaker 1: sleep loss, that's a risk factor. The good news is 1134 00:57:31,120 --> 00:57:34,920 Speaker 1: most sleep problems are treatable, whether it's because of obstructive 1135 00:57:34,920 --> 00:57:38,280 Speaker 1: sleep apnea, or insomnia or any other reason, restless leg, 1136 00:57:38,520 --> 00:57:40,560 Speaker 1: all of those things. There's good treatments out there that 1137 00:57:40,600 --> 00:57:43,400 Speaker 1: will help people sleep better. So those are the four 1138 00:57:43,520 --> 00:57:45,600 Speaker 1: sort of chief things. There's many other smaller things, but 1139 00:57:45,880 --> 00:57:47,520 Speaker 1: those are the four ones that I like to lead 1140 00:57:47,600 --> 00:57:50,440 Speaker 1: with, because they have just excellent data in their favor. 1141 00:57:50,680 --> 00:57:53,120 Speaker 2: Well, I'm excited about the sleep thing. I like sleeping. 1142 00:57:54,120 --> 00:57:56,200 Speaker 2: Maybe it's time to get out and exercise a little 1143 00:57:56,240 --> 00:57:59,320 Speaker 2: more. So, my co host is a physicist. He always 1144 00:57:59,400 --> 00:58:03,160 Speaker 2: ends on aliens. So here we go. If an 1145 00:58:03,160 --> 00:58:05,600 Speaker 2: alien were to land on Earth today, do you think 1146 00:58:05,640 --> 00:58:07,760 Speaker 2: they would store memories in a similar way? 1147 00:58:08,120 --> 00:58:09,840 Speaker 1: You know, if you had asked me that question ten 1148 00:58:09,920 --> 00:58:11,960 Speaker 1: years ago, I would have said yes. I think our 1149 00:58:11,960 --> 00:58:15,600 Speaker 1: memory system is exceptional. It's wonderful, it's brilliant. Why not? 1150 00:58:15,960 --> 00:58:18,240 Speaker 1: It should be like a model to strive to achieve. 1151 00:58:19,000 --> 00:58:20,960 Speaker 1: But I will take the opportunity, since we've got a 1152 00:58:20,960 --> 00:58:23,360 Speaker 1: couple more minutes, Kelly, and tell you about something that 1153 00:58:23,840 --> 00:58:27,000 Speaker 1: changes my answer to this question.
And that is the 1154 00:58:27,040 --> 00:58:30,160 Speaker 1: discovery that was made first by James McGaugh, who is 1155 00:58:30,240 --> 00:58:33,480 Speaker 1: the founding director of my center here at UC Irvine, 1156 00:58:33,600 --> 00:58:35,720 Speaker 1: and we continue to do work with this group of 1157 00:58:35,800 --> 00:58:40,480 Speaker 1: remarkable individuals who have what's called highly superior autobiographical memory. 1158 00:58:41,360 --> 00:58:43,439 Speaker 1: We spent a lot of time during this conversation talking 1159 00:58:43,480 --> 00:58:46,800 Speaker 1: about how memory is fallible. There's forgetting, there's interference. It's 1160 00:58:46,840 --> 00:58:49,919 Speaker 1: not meant to store everything with high fidelity because that's 1161 00:58:49,960 --> 00:58:52,919 Speaker 1: going to compromise your knowledge generation and so on. Well, 1162 00:58:52,960 --> 00:58:56,400 Speaker 1: these folks would beg to differ. And it's incredible because 1163 00:58:56,440 --> 00:58:59,000 Speaker 1: they do store things with high fidelity. They can remember 1164 00:58:59,200 --> 00:59:01,160 Speaker 1: everything that happened in their lives since they were 1165 00:59:01,200 --> 00:59:04,240 Speaker 1: teenagers and tell you exactly what happened on what day 1166 00:59:04,240 --> 00:59:07,160 Speaker 1: of the week, what month, and so on. And they don't 1167 00:59:07,200 --> 00:59:11,360 Speaker 1: have a problem extracting generalities and knowledge also. So sometimes 1168 00:59:11,360 --> 00:59:12,960 Speaker 1: I'll joke with them and say, I think you're like 1169 00:59:13,040 --> 00:59:17,000 Speaker 1: the X-Men of our generation, the X-people of our generation. 1170 00:59:17,400 --> 00:59:19,600 Speaker 1: It's remarkable, and we still don't understand how they do 1171 00:59:19,680 --> 00:59:22,520 Speaker 1: it and how their brains are wired differently. So if 1172 00:59:22,520 --> 00:59:26,520 Speaker 1: aliens, sufficiently advanced aliens, I think, if they're reaching 1173 00:59:26,560 --> 00:59:29,120 Speaker 1: Earth before we reach them, they're probably far more advanced 1174 00:59:29,160 --> 00:59:31,600 Speaker 1: than us. They're more likely to have figured out 1175 00:59:31,640 --> 00:59:34,360 Speaker 1: a way to do that which sort of combines the 1176 00:59:34,400 --> 00:59:37,480 Speaker 1: best of what we have built into our computers and 1177 00:59:37,520 --> 00:59:40,360 Speaker 1: what we have built into our brains. Just no compromise, 1178 00:59:40,440 --> 00:59:41,360 Speaker 1: no sacrifice. 1179 00:59:41,640 --> 00:59:44,120 Speaker 2: Awesome. Well, I wish Daniel were here to hear that, 1180 00:59:44,280 --> 00:59:46,480 Speaker 2: but I'm sure he'll enjoy hearing that explanation when he 1181 00:59:46,480 --> 00:59:48,760 Speaker 2: gets back. Thank you so much for your time, Mike. 1182 00:59:48,840 --> 00:59:50,280 Speaker 2: This was absolutely fascinating. 1183 00:59:50,360 --> 00:59:52,640 Speaker 1: You're very welcome. I very much enjoyed it, Kelly. Thank you. 1184 01:00:00,120 --> 01:00:03,880 Speaker 2: Daniel and Kelly's Extraordinary Universe is produced by iHeartRadio. We 1185 01:00:03,920 --> 01:00:06,280 Speaker 2: would love to hear from you, we really would. 1186 01:00:06,520 --> 01:00:09,280 Speaker 3: We want to know what questions you have about this 1187 01:00:09,480 --> 01:00:11,120 Speaker 3: extraordinary universe.
1188 01:00:11,240 --> 01:00:14,200 Speaker 2: We want to know your thoughts on recent shows, suggestions 1189 01:00:14,240 --> 01:00:17,240 Speaker 2: for future shows. If you contact us, we will get 1190 01:00:17,280 --> 01:00:17,680 Speaker 2: back to you. 1191 01:00:17,880 --> 01:00:18,880 Speaker 1: We really mean it. 1192 01:00:19,000 --> 01:00:23,560 Speaker 3: We answer every message. Email us at questions at danielandkelly 1193 01:00:23,720 --> 01:00:24,520 Speaker 3: dot org, or 1194 01:00:24,520 --> 01:00:26,600 Speaker 2: you can find us on social media. We have accounts 1195 01:00:26,720 --> 01:00:30,680 Speaker 2: on X, Instagram, and Bluesky, and on all of those platforms 1196 01:00:30,680 --> 01:00:33,600 Speaker 2: you can find us at D and K Universe. 1197 01:00:33,760 --> 01:00:35,280 Speaker 1: Don't be shy, write to us.