Brought to you by the reinvented 2012 Camry. It's ready. Are you?

Welcome to Stuff You Should Know from HowStuffWorks.com.

Hey, and welcome to the podcast. I'm Josh Clark. With me is Charles W. "Chuck" Bryant. I remembered your name. And this is Stuff You Should Know.

Lizzy, had you cracked up there? If that makes it into the cut... It's not going to, because that was pre-"Hey, and welcome to the podcast." Everything before that always gets cut out, you know that. I don't know, a little giggle beforehand might be left in. We would be fired if we didn't cut out everything that came before that. Right. Maybe.

Chuck, you ever heard of a guy named Nicholas Carr? Yeah, sure. I know you have. Have you ever heard of a little rag called The Atlantic? Yes, I have. Back in 2008, those two things collided, Nicholas Carr and The Atlantic, and he had a very great, um, headline of an article called "Is Google Making Us Stupid?" And Nicholas Carr, he went on to write a book.
He followed that normal process where, like, you're writing the book and you're like, I need some extra cash, so I'll excerpt this, or rewrite, you know, like fifteen pages of it and sell it to The Atlantic or whatever. We need to get in on that gig. Well, we'd have to write a book, or be writing a book. Let's write a book. We should do that.

But it was... I can't remember what the book was called. But the article actually made a bigger splash than the book, didn't it? Basically he was saying, like, we are reconfiguring the way we learn through our interactions with the internet. Um, like, there's constantly things trying to get our attention on a web page. You know, it's not like a book, without pictures or, um, you know, flashing lights, that kind of thing. We read horizontally, as he put it, rather than vertically, meaning, you know, we just kind of skim the surface of a bunch of different stuff rather than really deeply getting into one thing. That kind of deep reading, which helps flex the imagination, is pretty much nonexistent on the web, unless you're, like, into LiveJournal, you know, Harry Potter erotica. Right.
So, at the basis of his argument in "Is Google Making Us Stupid?" was the idea that it's actually reforming the way we form memories. He chose Google, you know, just to get a headline, as the editor did. But, um, the idea is that the internet, the way we read, is changing the way that we absorb information and therefore form memory.

It says a lot that he was asking if it's making us stupid, because we here in the West equate memory, a good memory, with intelligence, with smarts. Um, I guess what I'm trying to drive at, very clumsily, is: how does memory work? I thought that was a great setup. And who knows what the heck we're going to look like in a hundred years as a species, how our brains are going to be firing, and what effect it's going to have on us. We're going to have mighty humps. Yeah, well, that's the thing. I mean, you can debate all day long: is it stupid, or are we just in the middle of evolution? Probably in the middle of an evolution. I don't think we're going to end up like in Idiocracy.
Although, you never know, it's always a possibility. I watched Idiocracy, so that might mean I'm on that road myself. It's a great movie. Did you like it? I did. I thought it was what it was. But, like, the one-joke premise... most times when a movie has a one-joke premise, that kind of gets old for me. What one joke? Sort of the one joke that everybody's stupid. Yeah, it wore thin. All right, moving on.

Memories, Josh, are what make us who we are. I imagine if someone has complete amnesia, they usually don't have a sense of self. Well, yeah, you know, it depends. Do you remember H.M., Henry Molaison? No. So he was the patient that settled this big debate over, when we think of memory, whether there's one part of the brain that's responsible for memory or whether it's a bunch of different parts of the brain. And he proved that the multiple-memory-systems idea works, because they used to think, like, oh, you've just got a big old filing cabinet, and your brain just sticks it in whatever file it belongs in, and then you go and pull it out when you need it. Exactly.
And that's a very, it's a very Sesame Street way of putting it. It is. Um, but you know, they were working with what they had to work with at the time, and they were wrong. But H.M.... at, like, age twenty-three, this guy, um, who became known as H.M., Patient H.M., had a temporal lobectomy to cure his epilepsy. Oh, that guy. The surgery also removed his hippocampus, so he could tell you, you know, where he went to high school, who his oldest friend was, that kind of thing, but he couldn't tell you what he had for lunch that day, because he lost the ability to form new memories. So the fact that he could maintain old memories but couldn't form new memories proved that there are multiple systems involved for different types of memory. Like Memento. Right, great movie. He would have proven it, too. Memento did not have Ellen Page. No, it didn't. So it's on your list of acceptable films. She would have written in by now. Um, or not. She may just be like, I hate those guys so much. Maybe so.

Uh, so let's talk a bit about memory, Josh. Um, let's say we were talking about breakfast this morning.
If you remember what you had for breakfast, you might think that that is a very simple thing that happened, when in fact it is a very complex reconstruction from different parts of your brain, putting it together: maybe the smell of your eggs, of your bacon [unclear], you didn't need that slab of ham; what it looked like; maybe what it felt like, um, in your mouth; how it tasted. So you're recalling all these different parts, but it doesn't feel like a complex thing. And even just that... I mean, there's so much more to it: the tablecloth, whether you were angry at the weather guy, remembering even what eggs are. Yeah, you know, these memories that go way back. But we conceive of it as this one little snapshot of a memory called "what you had for breakfast." Right. Um, and all of those different things put together are called neural projections. Right, Chuck? Okay, so go ahead.

Well, the other instance, too, that they always mention is riding a bike. You never forget how to ride a bike, and it seems like a very easy thing, but there are so many things going on when you ride a bike.
How you get on the bike, how you mount the bike, where your feet go, how you move it forward, where should I put my hands? Um, what about this car barreling down? I should probably not ride in the center of the road going the wrong way. So it's just, like, hundreds of memories. I can't put a number on it, you know, how many there are. Well, the reason why it's so difficult is because this is all a seamless process, right? Exactly. We think of riding a bike as if it's one single file that you pull out of your cabinet called "ride a bike." Yeah. And at times it's so second nature, it's so natural to us, that we kind of detach ourselves from it and call it things like muscle memory. There's no such thing as muscle memory.
Your muscles are there, but they don't have a capacity for remembering anything. Well, and we still don't even know how we recall, even though we now have a better handle on the storage of memory. And we should disclaim this episode by saying that this is a rough sketch of what we know right now about how we form and retrieve memory, which is more than we've ever known. Yeah, and I think we're hot on the trail. It's really starting to come together and make sense.

So should we start with encoding? Yeah, which is basically your senses. It's rooted in your senses. Encoding is the first step in creating a memory. It begins with perception, and when we talk about perception, we're talking about your sensory perception, um, right. Like, right now you appear to me as, you know, a little scruffy, looking good. You've got your Braves cap on, almost a smile, right, the IKEA lights gleaming off of your eyeballs a little bit.
All of this is visual information, right? But in my brain it's nothing more than electrical impulses traveling through the optic nerve to my hippocampus. Right, right. You might smell me, um, you might, you know. The example they use in the article is your first girlfriend. And that's right on the money, man. I mean, I still remember all that stuff. When you see that first girl that you fall in love with, you know what she looked like, what she smelled like, the first time she shook your hand. I hate to say this, but it's the exact same thing as breakfast this morning or riding the bike. Well, true, in a way, but there's also a point made later on, that we'll talk about, that things that are more important to you are more likely to be rooted in your long-term memory. So... Oh yeah. Okay, breakfast is pretty important to me, right? True. Is that something in your long-term memory, though? I remember every breakfast I've ever eaten. Okay, that's because you've only had breakfast four times.

Uh, so where are we? Within the hippocampus? Well, yeah, we're with encoding.
So basically, what just happened is the light bouncing off of you, that gives you shape and all that, is coming into my eyes and being transformed into electrical information. It doesn't matter what it is; that's the language of the brain, right. It translates information and stores it in the hippocampus initially. Right, that's like the big processing, sorting, routing hub. Yeah. So your hippocampus is, like, this, uh, region of the brain shaped like a seahorse, hence the name, um, that basically says, okay... So I don't get that part. Hippocampus? It's not shaped like a hippopotamus? Shaped like a seahorse. And why? Plus, the name hippocampus means "seahorse." I did not realize that. Man, I hope it does. That's how I've always taken it. So let's call it the seahorse-o-campus.

But basically, what the hippocampus does is it takes in all this information, including stuff that I have no idea I'm taking in at the moment, um, and says: this is important, this isn't important, you can leave this out, let's send this over here, let's send this over there.
Let's create this neural projection by combining this, this, and this. It's basically the man behind the curtain. The hippocampus is the center of forming new memories, and along with the frontal cortex, they work hand in hand at this. Yeah, okay. And it's a really efficient way to deal with your surroundings, Chuck, because consider this. Let's say you or I... um, well, let's say we're doing it together. We're coming out of the woods into a meadow, and it is, um, a primal area, and we're scared of bears, so we're scanning the meadow for bears. Right. We don't see any bears, but there's birds, there's flowers, there's butterflies, and we're kind of taking in all of these things, but we're not really taking them in, because none of them are the bear, which is what we're tasked with finding. Right. So the hippocampus isn't forming any memories of the butterfly or the daisies or whatever. Maybe we would know, oh, well, there's a splash of white against the green, so there are flowers there.
If we were asked later on what kind of flowers were in that meadow, we couldn't say. So it's just filtering out what we would consider unnecessary information. Exactly. Okay.

Um, it travels... well, we said this perception and encoding is where it starts, but then it has to go somewhere from there, and this is where the chemistry of the brain comes in, which is endlessly fascinating to me. Yes, Josh. We have nerve cells, neurons, and these, uh, connect with other cells at a point called the synapse. Right. Well, it's funny that the author of this article put it like that, because it's actually not a point. It's a gap. You'd think that all these things connect, but there is a gap, and the leap to the other side, the leap of the gap, is, uh, performed via neurotransmitters. That's right. And then they're latched onto by a dendrite, the little feathery things on the cells that collect... Yeah, they accept the transmission. Come on in, transmission. Welcome to my cell. So, Chuck, I'm going to give you a couple of stats here real quick.
Okay. There are possibly as many as a hundred billion neurons in your brain. That's a lot. Um, and each of them has many tens of thousands, or many, many thousands, of connections, up to, um... Which are synapses. Yes, which leads to as many as a quadrillion synapses in the human brain. And they can connect... is it an infinite amount of times, if need be? What do you mean? Is there any limit to the amount of neural connections these cells can make? Well, I think ten thousand is the high end that I've heard. But I love that you had an answer for that. Um, and they're constantly going to and fro, forming new connections, I think something like thirty to sixty times a second. Your neurons are firing all over your head, and they're never settled. They're always changing, always forming new connections. The more that you do something, the stronger the connection is going to be.
We might know that in the real world as practice or repetition, right. But another word for it is plasticity, where the organizational structure of your brain actually changes shape, as you're saying, through practicing something, through the repeated firing of a neural connection, right, which is just an electrical impulse that triggers the release of neurotransmitters that cross the synapse and are accepted by the dendrite. Right. And the neurotransmitters are the message carried; like, a certain type of neurotransmitter, like dopamine, says, hey, everything's just fine, right, um. And this information is just passed along from one neuron to another, if the impulse is strong enough. Right. But then, when you do it again and again and again, more channels that allow the neurotransmitters to be released by one neuron and accepted by another are dug, which means that this thing fires more efficiently. And all of a sudden, by practicing your violin, this one piece of music, slowly, over and over again, you get faster and faster and faster at it until you can play it perfectly.
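The strengthen-with-repetition, weaken-with-disuse behavior the guys are describing is, in computational terms, the idea behind Hebbian plasticity. Here's a minimal toy sketch of that idea in Python; the learning rate, decay rate, and weight ceiling are made-up illustrative parameters, not figures from the episode or from neuroscience:

```python
# Toy model of "practice strengthens a connection": each repetition nudges a
# synapse's weight (its transmission efficiency) toward a ceiling of 1.0,
# while a small background decay models "use it or lose it".
# All constants are illustrative, not measured values.

def practice(weight: float, repetitions: int,
             learning_rate: float = 0.2, decay: float = 0.01) -> float:
    """Return the connection weight after `repetitions` firings."""
    for _ in range(repetitions):
        weight += learning_rate * (1.0 - weight)  # strengthening, with diminishing returns
        weight *= (1.0 - decay)                   # slight decay applied every step
    return weight

# A connection fired twice stays weak; one drilled fifty times nears its ceiling.
weak = practice(0.1, repetitions=2)
strong = practice(0.1, repetitions=50)
```

The diminishing-returns update is why the violin passage improves fast at first and then plateaus near "perfect": each repetition closes only a fraction of the remaining gap.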
That's exactly what's going on. Your neural connection is at top performance, peak performance. Practice makes perfect. That's how I learned to play, uh, the intro to "Stairway to Heaven" when I first got my guitar: I played it over and over and over and over until I learned it. Give me a guitar today, and I can monkey through about a third of it very clumsily, because I forget. Or... I can't write in cursive anymore. I can't either. I mean, I'd really have to concentrate, and there are definitely letters that I would forget how to write. That was a jarring realization for me. Like, how do you make the Q? And whatever happened to that Z, that weird Z? Like, it's all gone. And even, like, the S and the R and all that, like, normal stuff, it's just gone. Yeah, you don't want to see my cursive writing.

So what you're talking about is, while you can refine the organization of your brain to peak performance, your neural connections also have a kind of use-it-or-lose-it aspect to them as well. Remember that study with the kittens? No. This is the really sad study.
I think it's funny that you're asking me how much I remember and I keep saying no. Um, there was this study, um, that involved kittens having one eye sewn shut from birth. They were allowed to frolic and play and do whatever, but they just had one eye sewn shut, um. And then after, I think, like, eight or ten or twelve weeks, the eye was, um, released, opened up again, and the kittens were blind for life in that eye. And I guess they killed the kittens and looked at what was going on in the brain, and they found that, say the left eye had been sewn shut during that stage of development, um, the neural connections had all traveled to the right eye, which was seeing, and the ones that had been there for the left eye, that formed the optic nerve, were withered and dead. You know what they called that experiment? They called it the saddest experiment in the history of the world. It's pretty bad.
It's a pretty bad experiment, but it basically goes to show you that not only will neural connections wither and die if they're not used, they'll also migrate to places where they can be used. There's kind of, like, a survival of the fittest, like, um, a land grab for firing, because the more a neural connection is fired, the more important it is, the stronger it's going to be. Right, right. And FYI: giraffe neurons can grow up to three feet in length. Really? Are they all in the neck? Isn't that cool? That is pretty cool.

Uh, so, we were talking about encoding. Um, you have to really be paying attention to properly encode, and we also talked about filtering things out. What they don't know, and this may be the first time we've said it, is: are we screening these stimuli out during the first, initial sensory stage, or are we literally processing them and saying, no, we don't need this, get rid of it? Yeah, it would make sense to me that it comes afterward, but they don't know. Yeah, it makes sense to me too. Like, the hippocampus is like, that's not a bear, so forget it. Yeah, forget that.
So are we at short-term and long-term, Josh? Yeah. You have to store all memories; even if it's just for a blip, you're going to be storing it, or it's not a memory. Even the shortest of short-term memories is a memory. And there are three stages, they believe, in which we store these memories. We've already talked about the sensory stage. Then you have short-term, if it's deemed important enough to remember at least for a little while, and then eventually long-term, yeah, if it's really important to you.

There are different ways to look at long-term memory. The way that I found is that a long-term memory is this dormant neural projection, all the different neural connections that make up that rich memory, you know, from long ago. It's there; it can be activated again. It's a long-term memory; it's a short-term memory when it's active; and then there's working memory, which isn't in this article.
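The three-stage story the guys walk through (sensory filtering → short-term holding → long-term consolidation) can be caricatured in code. This is only a cartoon of the model as described above; the importance scores and thresholds are invented stand-ins for whatever filtering the hippocampus actually does:

```python
from collections import deque

SHORT_TERM_CAPACITY = 7  # the "about seven items" span mentioned in the episode

def store(perceptions):
    """Route incoming perceptions through the three stages described above.

    Each perception is a (thing, importance) pair; importance is a made-up
    0-to-1 score standing in for the hippocampus's filtering.
    """
    short_term = deque(maxlen=SHORT_TERM_CAPACITY)  # old items simply fall off
    long_term = set()
    for thing, importance in perceptions:
        if importance < 0.2:      # sensory stage: most stimuli never register
            continue
        short_term.append(thing)  # held briefly in the limited short-term span
        if importance > 0.8:      # important enough to consolidate
            long_term.add(thing)
    return short_term, long_term

# The meadow scene from earlier: the bear registers, the flowers do not.
st, lt = store([("butterflies", 0.1), ("daisies", 0.1),
                ("bear!", 0.95), ("birdsong", 0.3)])
```

The `maxlen=7` deque is doing the work here: anything beyond the span silently displaces the oldest item, which is the "twenty or thirty seconds and it's gone" behavior in miniature.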
Working memory is like bringing something to mind and then the act of consciously keeping it in mind, like repeating a phone number over and over again that you knew before, but you're having to remind yourself of it; you're keeping it in your working memory. Well, it's funny you mention phone numbers, because short-term memory is really limited. I love that stat: it says that short-term memory can hold about seven items for no more than twenty or thirty seconds at a time. So when you see something like a phone number, you shouldn't be able to remember it. So what you do is break it down, or it's usually already broken down for you, into three sets of three... or two sets of three and one set of four; three sets of three would be missing a digit. And that's how you remember things, that and saying it over and over. And there are all sorts of exercises you can do to remember things, like when you meet somebody. That's where I'm bad. You know how I am with remembering names; I can never remember names. But you're also very friendly.
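The chunking trick described here, splitting a ten-digit number into two sets of three and one set of four, can be sketched in a few lines of Python. The function name and grouping are just illustrative, not anything from the episode:

```python
# A minimal sketch of the chunking idea: splitting a ten-digit phone
# number into the familiar 3-3-4 groups, so the number occupies three
# "items" of short-term memory instead of ten.

def chunk_phone_number(number: str) -> list[str]:
    """Split a ten-digit phone number into groups of 3, 3, and 4."""
    digits = "".join(c for c in number if c.isdigit())
    if len(digits) != 10:
        raise ValueError("expected exactly ten digits")
    return [digits[0:3], digits[3:6], digits[6:10]]

print(chunk_phone_number("404-555-0123"))  # → ['404', '555', '0123']
```

Three chunks fit comfortably under the roughly-seven-item limit mentioned above, which is the whole point of the trick.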
You can be like, hey, I recognize your face, what's your name? Well, yeah, and I might remember. You could say you met this girl Frances at this thing that we did, and I'd say, who? And then I'd say, oh, the lady who wore the overalls and the flip-flops. I'll remember things like that forever. Names? Forget about it. That's why you have all the names of everyone you've ever met written down on your hand. That's why I have no friends. But long-term memory can store everything forever if you wanted it to. Yeah, that seems like a gee-whiz, we-don't-really-know-exactly-what's-going-on kind of statement, you know? I totally agree. When I read that, I was like, really? Yeah, I don't know about that one. But I would agree that it's at least as much as we need. Or another way to look at it is: what if we're all operating pretty much at capacity? How much more incredibly intelligent would we be if we had even more memory storage? But like we mentioned earlier, things that are important to you, you're more likely to remember.
And then when you're encoding, how you're perceiving things matters. That's probably why, when I meet someone, I look at their shoes and what they're wearing, I guess, and I'm distracted; I'm not thinking about the fact that they said their name when they shook my hand. Well, you just hit two big points. One, something that is meaningful to you, you're going to remember more. That's because emotion is usually attached. The seat of emotion is the amygdala, and it is directly connected to the hippocampus; it's got like a direct line to the hippocampus. The saddle on the seahorse, if you will. Yes, that's great, that's really good. And one of the big theories behind what emotions are, why we have them, is that they're basically learning guides, teaching guides. Like, you feel fear, you're going to remember that you felt fear when you saw a bear, and you're going to stay away from bears.
Or joy makes us, you know, feel familiar with other people, with groups, which also keeps bears away; eight people could beat up a bear rather than just one. So we have emotions, and we learn from our emotions, which is why we manage to remember things so much more clearly when there's an emotion attached. And if you examine most of your memories, there's probably going to be some sort of emotional memory beneath the surface there. Like, have you ever watched a movie, and it's really dramatic and intense, and that scene ends and you kind of come out of it, like you were just totally sucked in, and you kind of come out of it because the next scene has started and the buildup hasn't happened yet for that scene, but you realize you have this kind of remnant, uneasy feeling that you have no idea what it belongs to any longer?
And then you realize, wait a minute, I was just identifying with the movie. So I think that's the same kind of underlying emotional memory that can be attached to anything, and that makes it more poignant and more likely to be remembered. Yeah, exactly. And that may be more important to one person than another. So it's typical to say, I have a good memory or a bad memory, when what's probably more likely is that you might be really good at remembering some things but not others, you know what I'm saying? And if you're having trouble remembering something, it's not like your entire memory system is not working; it's probably just one part, because I think there are three stages to actually keeping a memory around, and it just means one of those is not working quite well. Why don't you tell me? Okay. Well, let's take the example of eyeglasses. Neither one of us wears them, but let's say we did, and we're going to bed.
Right. You're going to bed in a separate room from me, okay. And you take your glasses off and you toss them onto the nightstand and go to sleep. Right. If you looked at your eyeglasses where you set them, you would be perceiving their placement, encoding it, which is going to make it likelier for that memory to be retained. And then when you wake up, since the memory is retained, you'll be able to retrieve it. So those are the three steps: awareness, retention, and then retrieval. And any of those three is where the breakdown can occur. Yeah, and I've heard there are all sorts of tips, like if you say out loud as you're doing it, I'm putting my eyeglasses on the nightstand, that might help you. You might seem a little weird, but that will help you remember it. So, Chuck, what about aging? There's this underlying fear among everybody that as we get older, our memories are going to go.
And that's true in a lot of cases, but it's not necessarily, you know, it's often associated with Alzheimer's or dementia or something, and that's not necessarily the underlying mechanism. It's not the mechanism, but it does happen. There is a breakdown that starts with the onset of sexual maturity, oddly; it's linked to that. Then you start forgetting things, and I think it gets worse and worse until we reach our fifties. It's like twenties to fifties is when you really start to have some trouble initially, right. But the brain isn't changing its structure or anything; it's the connections that start to fail. Is that right? That's what I understand, although they did say the brain and the hippocampus shrink in your seventies. It depends. Yes, I think. What they're finding, though, is that a lot of it has to do with a lack of stimulation. Well, that's huge. Yeah.
They found that rats that are raised with lots of toys, or that are given lots of toys and a stimulating environment later on in life, have literally fatter, healthier brain cells, neurons, than their counterparts, and the same is true in humans as well. At the very least, we know that our neurons shrink as we get older, like you said. But they found that stimulating environments, like if you're in a nursing home and there's a lot going on, rather than just, go sit in your room, right, the people at the lots-going-on nursing home are going to be a lot more, I guess, intelligent later in life, or at the very least they're going to have better memories, is another way to put it. Yeah. Well, Emily's grandmother, as you know, is ninety, and she has a very robust personality and memory, and I think it's all due to the fact that she exercises that muscle quite a bit. She does word puzzles every day; she's on the internet more than I am.
She's on all of our Facebook pages, and she just, you know, that's how you stay vital. That'd be a heck of an endorsement: Facebook lets you live forever. But it's true, though. Any way you want to go about it, if you're exercising that noodle, it's going to stay strong, and you can regenerate and stay vital and not slip darkly into the night. And also, they're pretty sure about a reduction in the production of acetylcholine, which is a neurotransmitter that's strongly associated with memory formation. Yeah, they kind of pinpointed that. Yeah, they're not exactly sure how it works, but they know that the more acetylcholine you have, the better memory you have, and vice versa. But you can actually reverse that, right? What, through the mental exercises? Yeah, I think you can boost production like that, and I'm sure pretty soon they'll have acetylcholine shots we can just shoot right into our brains like memory junkies. If you're a smoker or a drinker or generally unhealthy, it's going to impact your memory too.
And then, lastly, this is the one that I find the most fascinating about memory. There's something called sleep-dependent memory consolidation. And basically what happens, remember when we were talking about sleepwalking, is you go through two phases: one where your body is active but your brain is out, and then REM sleep, the deepest sleep, where your body can't move but your brain's going. It's basically taking advantage of you napping so that it can do some paperwork or whatever, and it goes through and fires all the neural connections that were used that day. Maybe some are kind of fading a little bit here and there; it fires those. And while you're sleeping, your brain is basically creating perception again by firing your neural projections, which I think is probably the best explanation for dreams I've ever heard. Yeah, that's it. Well, I will just close by saying, if I have ever met you and I don't remember your name, please don't be offended, because I guarantee you I recognize your face.
Who was it, Tammy with the overalls and the flip-flops? I'll remember all that stuff. I feel like the last time I was in New York, Emily marveled: I saw, I think, two different people that I said, hey, that person was at our trivia night a year and a half ago, or two years ago or whatever. She said, you remember that? Yeah. So I remember all those faces. That's very good. Just not the names. And if you think your memory is going, try paying more attention. Distraction is one of the greatest threats to memory formation, and if you don't form a memory properly, you're not going to remember it. I mean, that sounds so basic, but yeah, proper encoding requires concentration. Really, look at the glasses as you set them next to the alarm clock and think, I just put them there. Yeah. So obey, humans. Boom.
If you want to learn more about memory, you should probably type "human memory," because I'm pretty sure if you just type "memory," a lot of computer stuff is going to come up, in the handy search bar at howstuffworks.com. And you don't want to read any of that. Well, yeah. Now it's time for listener mail, Josh. I'm going to call this one Maggot Mania. Yeah, that really got to people. I always find it, like, eye-opening when I just tell a story from my life and everybody's like, oh my god, I couldn't eat. And I'm like, oh, sorry. All right, this is from Kath in Australia. Oh, hello. Guys, I have done the exact same thing as you, Josh. I put some meat in a plastic bag in the kitchen garbage bin, which also had a lid, and woke up to a moving floor. Only I was not wearing my glasses. So she might forget where she laid them. I was doing a little cleaning, and it slowly dawned on me what had happened. I had begun my morning by sweeping.
I had made cookies the night prior and was noticing all these little balls of dough on the floor. They were very hard to sweep up. I became more and more confused at the huge amount of tiny balls of dough until I bent down and had a closer look. It was like a horror movie. My blood went cold, and I crouched in utter panic. I then looked across the floor, and these little dough balls had made their way across the entire apartment, and they'd made it to the bedroom carpet. I was totally disgusted and horrified. I think I was even doing that half-panic-cry, swearing-quietly-to-myself thing. I'm not aware of that. I got rid of them by patiently sweeping them up. I couldn't bear to squish them and have to clean up that mess, and I put them into a plastic bag that I left sitting outside. I sprayed the bag with disinfectant and bug spray after every dump of maggots, but they were still squirming. I will never, ever, ever leave meat unattended again. I live in Australia, where I should have been aware of this.
I thought she made cookies. She made meat cookies? Was that just unrelated? I think it was unrelated, and that maybe explains why she thought there was dough on the floor, since she'd made cookies the night before. This is Australia; maybe they have meat cookies down there. I want some meat cookies. That's it. That's from Kath. Wow, that was a weird exposition there, Chuck. That was Kath, yeah, from Australia. Well, thanks, Kath the maggot hater. Are you just Kath, K-A-T-H? Kath, short for Katherine, or just Kath? Well, thanks a lot. We appreciate it. I imagine you've moved by now, and very sensible of you. Yes. If you've ever made meat cookies or anything that sounds equally awesome, we want to hear about it. And if you've got a recipe, cool. And if you're willing to send us some of these things, even better. Right, right. So you can get our mailing address by sending us an email at stuffpodcast@howstuffworks.com. For more on this and thousands of other topics, visit howstuffworks.com.
To learn more about the podcast, click on the podcast icon in the upper right corner of our homepage. The HowStuffWorks iPhone app has arrived. Download it today on iTunes. Brought to you by the reinvented two thousand twelve Camry. It's ready. Are you?