Guess what, Mango? What's up? Will, do you remember back in the episode where we asked will we ever live without sleep? We talked about how sleep appears to be this really critical component of our brain's memory consolidation. Right, yeah, definitely.

Well, I started thinking about some of those rat brain monitoring studies again, because I was reading about these two neuroscientists at Wake Forest Baptist Medical Center in North Carolina. Their names are Sam Deadwyler and Robert Hampson, and they've been studying rat brains for a few decades now. But one of their studies on what's going on in these tiny rat noggins suggests some fascinating implications for the future of adding technology to the human brain. How's that?

Well, all right, so let me set this up here. So, in a study, they got two sets of rats that were trained to run between two areas of their cage, and on one side of the cage they learned to press these levers in a certain sequence, and that helps them get a reward.
However, over time, one set of rats was trained to wait for up to thirty seconds before they could press the appropriate lever to get their reward. Now, the second set didn't have a wait. But when the second set was forced to experience a delay of their own, they were completely thrown off and they forgot which lever they needed to push.

So here's where the crazy part comes in. The brain activity in the first set of rats, you know, the ones that had to learn to wait, had been recorded once they'd learned which lever to push. And then, using electrodes, Deadwyler and Hampson stimulated this same series of brain activity in the second set of rats, and this time they began behaving as though they had been trained like the first set of rats, and began choosing the right lever despite the fact that they had not actually been trained to do this. That's insane. Yeah, it's as though they had memories and planned for things they had not actually experienced. And this got me thinking, you know, as this evolves, what will it mean for humans?
And how are scientists currently using machines in the human brain? And what are our brains capable of when computers are built in? These are just a few of the questions we'll be asking in today's episode. So let's get started.

Hey there, podcast listeners, welcome to Part-Time Genius. I'm Will Pearson, and as always I'm joined by my good friend Mangesh Hattikudur, and on the other side of the soundproof glass, eating but apparently not sharing a big old bag of Smarties, is our friend and producer Tristan McNeil. And that seems fitting, because in today's episode we're going to be talking about the incredible advances in understanding the human brain, as well as hacking the brain with technology, so that we can treat diseases like Alzheimer's and Parkinson's, as well as help those who have been affected by stroke, but also just to see how much more the brain is capable of if given some additional firepower.
Yeah, I'm sure this is just the first of a ton of episodes we'll do on the brain, but there's just something so fascinating about the research happening right now, and so many things that once seemed like science fiction are now possible. So that's what we decided to focus on.

Yeah, and I'm so excited to have a real superstar in this field on with us in a bit. His name is John Krakauer, and he's a neuroscientist and neurologist at Johns Hopkins, and he's also the director of the Brain, Learning, Animation and Movement Laboratory, also known as BLAM. The best acronym in the business. And obviously he's so interesting, I can't wait to get him on the line. But before we do, we need to back up just a little and talk about how we got where we are in terms of the technology being added to our neural systems. I mean, not ours specifically, but humans' in general. Right, right. Well, you know, it's crazy to realize that two thousand seventeen marks sixty years since the first human trials for cochlear implants.
These implants were designed with electrodes positioned in the inner ear to transmit sound to the brain, and then in nineteen sixty-four, after those trials, the first cochlear implant was tested in a human volunteer. This was a huge step in helping us to see that electronic devices could be built to take sensory information, you know, such as sound, and translate it into a language that the brain could then process. I really had no idea it had been that long. And of course today there are much more sophisticated cochlear implants helping tens of thousands of new patients every year.

Yeah, we should probably quickly note that there are two general types of neural implants. First, you have input devices, you know, like we described with cochlear implants. These are the kind that take sensory information from the outside world and pass it along to our nervous system via electrical signals. And then you have retinal implants, which are also pretty amazing, and they're another form of input device.
And then you'll also find devices that are used to help control seizures, you know, from epilepsy, or maybe tremors caused by Parkinson's, as opposed to signals brought in from the outside world.

Sure, and the progress they're making on treating things like Parkinson's through deep brain stimulation implants is incredible. I didn't realize that there have now been over a hundred thousand people treated with these. And for our listeners, when Will and I were at Mental Floss, we actually teamed up with National Geographic to help demystify brain surgery. We did this show called Brain Surgery Live, where we followed the story of this wonderful man who had been suffering from Parkinson's tremors for over a decade. And we got to see how, when surgeons inserted electrodes into the patient's basal ganglia, which is the area of the brain that's most affected by Parkinson's, and then stimulated those electrodes with a battery, the patient's tremors stopped completely. It was one of the most miraculous things I've ever seen.
This gentleman, who couldn't easily hold a piece of paper because he was shaking so much, suddenly had these tremors turned off, and he had so much dexterity. Like, he sent a message to his family from the operating room on an iPad, and it was just incredible. Yeah, it definitely was.

All right, so those are all examples of input devices. So now let's talk about output devices, which have come along more recently. And you know, these are the devices that take information in the opposite direction. They read and record brain activity and then translate that into signals for some outside use, say controlling a prosthetic arm, for example. And I know this is much more recent, but we're still talking a couple of decades, right? Yeah, I'm pretty sure it was the year most people associate with the release of Mariah Carey and Boyz II Men's "One Sweet Day." I mean, you remember what a huge year that was because of that. But you know, it was also the year that researchers first implanted electrodes into the brain of a monkey and then helped it use a prosthetic arm.
And as we mentioned earlier, back in our Will We Ever Live Without Sleep? episode, we talked about how we've gotten so much better at observing brain patterns, and can actually see the same areas of the brain light up as we run through those memories again during sleep, and that helps us consolidate those memories. Well, in a similar kind of observation, when they placed electrodes on the motor cortex of these monkeys, they could then observe spikes in the activity of certain neurons. And as they observed these over the course of several studies, they began to figure out what patterns of spikes corresponded with certain arm motions, and over time the researchers figured out how to teach the monkeys to control a robotic arm, you know, just using their brain signals. And of course the next step was to figure out how to do this, you know, for people.
Yeah, so I actually read a good bit about this, especially as it related to helping those dealing with paralysis. And I think the first person to use a brain implant to both use certain functions on a computer screen and to gain some functionality from a prosthetic hand was this guy named Matthew Nagle, back in two thousand four. He was paralyzed from the neck down, so this was a big step. Though, with the brain being such a complicated organ, there's clearly a long way to go. Researchers are still working on ways to improve this technology, and as of now, patients still have to be connected to a computer for it to work. But it's still pretty remarkable what's happened in efforts to help those fighting paralysis.

Yeah, we were talking about some of these yesterday, and I actually think it would be helpful if you would just walk us through some of the most recent progress in this area. Sure. Well, several things have happened in the past year or two on the communications front.
We've placed electrodes inside the brain of a woman with ALS, allowing her to communicate simply by using her thoughts. Basically, she uses an eye tracker to spell words on a screen. But at some point, like many people with ALS, she may lose that ability as well. So she's participating in this other study where they've implanted an electrode system over the region of the brain that affects hand movement, and after a bit of training, just by imagining moving her hand, she was actually able to make selections on a screen. And she's apparently gotten this down to a high accuracy, which is just unbelievable. I just have a hard time wrapping my head around how this is even possible.

Yeah, it really is amazing. So in recent years we've also seen some major leaps in terms of motion and touch. So last year, this partially paralyzed man was able to pour liquid from a bottle, and even more impressively, he was able to play Guitar Hero, because they had an electrode sleeve that was connected to the motor cortex in his brain.
And in another study, scientists helped this quadriplegic man feel as though he was touching certain objects through a robotic arm. Like, he could actually feel objects, all by tapping into the somatosensory cortex. I mean, I can only imagine how strange and, I'm guessing, overwhelming that was, to regain this sense of touch.

Definitely, it all just seems so unreal. And there's also this electrode cap that's actually helped some paralyzed people begin walking again. It's a little different than the others because the cap is connected to this exoskeleton, which is on the person's legs, and as signals get sent from the cap to the exoskeleton, it allows the legs to move, which is awesome. But there have actually been some cases of paralyzed people learning to walk without the exoskeleton in recent years. Like, the cap sends signals to electrodes implanted into the person's own legs. That's so cool.
All right, so we've talked about input devices and output devices, and you might be wondering what the next step in the evolution is. It's something called bidirectional interfaces, and these combine the input and the output, and they could be huge in helping those dealing with damaged nervous systems. So, Will, let's say somebody had a stroke, and as a result of that, there are parts of their nervous system that are not really communicating or appropriately connected anymore. Through the use of a bidirectional implant, you might be able to re-establish this connection and give them the ability to use a body part that had effectively been paralyzed due to the stroke.

But what's even more fascinating is that there's a developing area that's still in very, very early days. So what's that? Well, it involves playing with memories. And apparently we might be able to restore memories by using an implant to replace the input and output flow from the hippocampus. And this is the area of the brain that's responsible for memory formation.
Well, let's talk a little bit more about memory and some of that research you mentioned at the very top of the show. I mean, the idea that scientists could basically transplant the memories of a bunch of rats into a single rat's brain and then watch it behave as though it had learned certain things through experience, even though it hadn't. I mean, that's just so crazy to me, and also just so hard to believe. And that's pretty much what Deadwyler said about the science community's response to his findings at first. I mean, he said, no one's going to believe this until I do a hundred control experiments.

But play this out for me a little. Like, what does all of this mean for people? Well, there's obviously a long way to go to apply this to the same type of neural implant in humans, but you know, the thinking is that something like this could play a very real role in helping somebody with Alzheimer's, or someone who's had a stroke, you know, get some of their brain function back.
I mean, the problem with memory loss is that it often results from damage in the brain that prevents the flow of information between two locations. And so if you could essentially create a way to bypass the damaged areas, you might be able to both create new memories and hopefully regain the ability to access old ones, which is a fascinating idea. Though obviously memory is this super complicated thing, right? I mean, while the hippocampus is where long-term memories are formed, you still have to consider all these other areas of the brain that are working together. That's true, so it's a very difficult task. But there are many researchers who believe that at some point it might be possible to put an implant in the hippocampus and actually be able to record memories as they come together. Of course, they then have to figure out, you know, the neural signals, or the codes, that are the indicators of certain memories.
Yeah, and of course no one's saying this will be an easy task, but I was reading about some researchers who are working to try and figure out how to crack the code around certain memories. And understand, this has to happen among the millions and millions of neurons firing. But scientists like Theodore Berger at the University of Southern California are actually making real progress toward this.

Yeah, and actually Berger was one of the scientists who teamed up with Deadwyler and Hampson on some of their rat studies. Specifically, in studies where they drugged rats to mimic amnesia, so they wouldn't be able to remember the whole lever-pushing thing. But then, after using electrodes to stimulate the same neural pattern, the rats were able to remember what to do. Yeah. So, I don't know about roofied rats, but the potential applications of this are truly staggering. But I guess one question I have is whether there's a neural code that applies to everyone, or whether everyone's is different. That's a very good point, and they're still trying to figure out all the specifics of this.
But I have to be honest, I can't wait to get our guest on the line to get his thoughts on some even more mind-blowing possibilities of this brain-machine connection. Yeah, let's do it.

So our guest today is a neuroscientist and professor at Johns Hopkins. He's also the founder and director of the Brain, Learning, Animation and Movement Laboratory, or BLAM, as Mango and I love to say. And his work in helping treat stroke patients is just fascinating, and we're thrilled to have him on today. John Krakauer, welcome to Part-Time Genius.

Thank you. Certainly, I'm only a part-time genius.

So you're joining us from a cafe in Lisbon, Portugal, is that right? Yeah. In Portugal I spend a month, you know, usually a month and a half, every year, visiting at a fantastic place. And speaking of names, it's called the Champalimaud Centre for the Unknown.

So, John, knowing that stroke is the leading cause of disability in the US and often causes complete arm and/or leg paralysis in people, this is obviously very important work to many, many people.
Yeah, but you've explained that one of the things we failed to understand for so long was just how critical it is to begin rehab as quickly as possible. So can you talk a little bit about this and the studies that led to this realization?

Yeah, so, um, it's a long story, really. In other words, stroke has been really interesting to neurologists for over a century, and they were very much interested in studying animals, particularly primates, and even non-primates, to get at some of the mechanisms of the deficits after strokes. And it's ironic that if you look at the very early studies in the early twentieth century, there was evidence that the animals had the potential to get better early on after the lesions were induced, especially if you encouraged them with training.
297 00:15:12,000 --> 00:15:16,920 Speaker 1: So it was there in the early literature, um, and 298 00:15:16,960 --> 00:15:22,360 Speaker 1: then it's not clear that ever sort of got through 299 00:15:22,440 --> 00:15:26,760 Speaker 1: to the clinicians and the therapists, and a certain nihilism 300 00:15:26,800 --> 00:15:34,800 Speaker 1: and a certain pessimism seconds um, and the general impetus 301 00:15:34,920 --> 00:15:39,720 Speaker 1: was to try to make people better early but not 302 00:15:39,840 --> 00:15:43,240 Speaker 1: with very high doses or to help people cope with 303 00:15:43,360 --> 00:15:46,280 Speaker 1: what they had left. So, John, your team developed a 304 00:15:46,280 --> 00:15:51,080 Speaker 1: system of therapy. They're involved in exo skeleton, robotics, brain stimulation, 305 00:15:51,360 --> 00:15:53,960 Speaker 1: and a game where you control a dolphin. Can Can 306 00:15:53,960 --> 00:15:55,320 Speaker 1: you tell us a little bit about the game you 307 00:15:55,400 --> 00:15:58,120 Speaker 1: developed and how things are going in you're testing well, 308 00:15:58,160 --> 00:16:00,280 Speaker 1: I should stay stay right from the beginning, it's been 309 00:16:00,320 --> 00:16:04,400 Speaker 1: a painful, low process. We are hoping to be able 310 00:16:04,440 --> 00:16:06,880 Speaker 1: to look at the data by the end of the year. 311 00:16:07,040 --> 00:16:10,760 Speaker 1: I cannot tell you any results. I don't know that yet. 312 00:16:11,160 --> 00:16:17,640 Speaker 1: Um Now, in terms of why, it's actually a bit 313 00:16:17,680 --> 00:16:20,720 Speaker 1: of a bit of a story, the major answers, how 314 00:16:20,760 --> 00:16:23,280 Speaker 1: do you get people to make hundreds, if not thousands, 315 00:16:23,280 --> 00:16:26,440 Speaker 1: of continuous movements day in day out. 
In other words, the animal work suggested you needed thousands of movements, of titrated, difficult movements. You know, you're not going to get somebody to pick up a glass of water five hundred times in a row. You're not going to get someone to use a knife and fork, you know, for thousands of repetitions. So it's not a trivial thing: how do you get people into a context where they're going to be making the kinds of movements you want them to use in everyday life, but in a way where you trick them into making them under conditions that are so much fun they don't realize they're practicing? I mean, I don't know if any of you have ever had rehabilitation for anything, you know, elbow surgery, shoulder surgery. I have, and it's so boring that I was a terrible patient. I didn't even do the ten minutes a day that I was meant to do; it was so dull. Now imagine that in the conditions of stroke. So one thing we know is that having an illness, being brain-damaged, isn't in and of itself an incentive to go to the gym or the equivalent.
So we had to find 336 00:17:26,720 --> 00:17:30,240 Speaker 1: a way to make people do stuff that was fun, 337 00:17:31,200 --> 00:17:34,200 Speaker 1: and we also wanted to do something where the movements 338 00:17:34,240 --> 00:17:38,000 Speaker 1: they made were general, they were useful for everyday life. 339 00:17:38,000 --> 00:17:39,960 Speaker 1: Because there's something, you know, in learning that's called the 340 00:17:40,000 --> 00:17:43,879 Speaker 1: curse of task specificity, where if you practice task A, 341 00:17:44,680 --> 00:17:47,479 Speaker 1: you only get good at task A, and it doesn't 342 00:17:47,600 --> 00:17:49,879 Speaker 1: make you any better at task B, C, or D. 343 00:17:50,680 --> 00:17:53,640 Speaker 1: And how did you guys come up with the dolphin concept? Well, okay, 344 00:17:53,720 --> 00:17:55,879 Speaker 1: so there's a lot of data to show that if 345 00:17:55,880 --> 00:17:59,080 Speaker 1: you put rats after brain injury into an enriched environment, 346 00:17:59,160 --> 00:18:01,240 Speaker 1: that is, you put them in a little cage full of 347 00:18:01,680 --> 00:18:06,320 Speaker 1: ramps and spinning wheels and balls and friends, they do 348 00:18:06,480 --> 00:18:08,480 Speaker 1: much better even if you don't train them on the 349 00:18:08,520 --> 00:18:11,199 Speaker 1: task you test them on. So that was the final 350 00:18:11,520 --> 00:18:15,120 Speaker 1: sort of clue: it needs to be an emotional, immersive, 351 00:18:15,640 --> 00:18:21,159 Speaker 1: motivating environment. And then I met these remarkable two people, 352 00:18:21,359 --> 00:18:26,040 Speaker 1: Promit Roy and Omar Ahmad, who were both graduate students, 353 00:18:26,119 --> 00:18:29,879 Speaker 1: well, they'd both been undergrad and grad at Hopkins, and 354 00:18:30,080 --> 00:18:34,360 Speaker 1: they were doing beautiful gaming, um, where they were simulating 355 00:18:34,480 --> 00:18:37,880 Speaker 1: animal movements.
And I had gone to the Hopkins campus 356 00:18:37,880 --> 00:18:43,680 Speaker 1: looking for young gamers and I found them, introduced myself to them, 357 00:18:43,720 --> 00:18:45,879 Speaker 1: and they showed me what they had done, and I 358 00:18:45,960 --> 00:18:48,800 Speaker 1: realized that I wanted to go a step further and 359 00:18:48,880 --> 00:18:54,720 Speaker 1: not to have people watch beautiful movement in a game, 360 00:18:54,960 --> 00:18:59,000 Speaker 1: but to take the decisive step of controlling it, 361 00:19:00,040 --> 00:19:03,120 Speaker 1: so you became the character in the game. So basically 362 00:19:03,160 --> 00:19:09,679 Speaker 1: imagine gaming meets Pixar meets you controlling the character. This fed 363 00:19:10,640 --> 00:19:13,080 Speaker 1: into the idea that if you had a dolphin, which 364 00:19:13,119 --> 00:19:16,280 Speaker 1: is a beautiful animal that we love to watch move. 365 00:19:16,320 --> 00:19:19,560 Speaker 1: That's why we love dolphins, we love their acrobatics, 366 00:19:19,600 --> 00:19:22,240 Speaker 1: and they move continuously in the water and there's no 367 00:19:22,359 --> 00:19:25,280 Speaker 1: stopping and starting. I mean you can be moving your 368 00:19:25,400 --> 00:19:28,840 Speaker 1: arms around continuously without starting and stopping, because even if 369 00:19:28,920 --> 00:19:32,240 Speaker 1: you stop moving your arm, the animal will continue to 370 00:19:32,280 --> 00:19:37,080 Speaker 1: move through the water. So we thought that we 371 00:19:37,160 --> 00:19:43,119 Speaker 1: could make patients babble like children do, moving in this 372 00:19:43,320 --> 00:19:45,720 Speaker 1: cloud of everyday life. Well, tell us what else you 373 00:19:45,720 --> 00:19:48,719 Speaker 1: guys are focused on at BLAM right now.
So you 374 00:19:48,760 --> 00:19:51,480 Speaker 1: can actually take somebody who's lost their hand through an 375 00:19:51,480 --> 00:19:55,399 Speaker 1: accident, a soldier for example, so they can have just a 376 00:19:55,480 --> 00:20:02,600 Speaker 1: stump where their hand was and they've been using a prosthetic for years. 377 00:20:03,160 --> 00:20:06,800 Speaker 1: And you can take the hands of someone who died 378 00:20:09,080 --> 00:20:14,479 Speaker 1: and basically reconnect them to someone else's body, so you can 379 00:20:14,560 --> 00:20:20,320 Speaker 1: basically have someone else's hand. There are many, many ways 380 00:20:20,359 --> 00:20:24,440 Speaker 1: that you could basically expand the repertoire of 381 00:20:24,520 --> 00:20:28,840 Speaker 1: movement by actually having more than just your hands. In 382 00:20:28,880 --> 00:20:31,760 Speaker 1: other words, the irony, right: on the one hand, you 383 00:20:31,800 --> 00:20:34,199 Speaker 1: want to use this approach to training just to have 384 00:20:34,280 --> 00:20:38,040 Speaker 1: a hand like everybody else, but in healthy people you 385 00:20:38,080 --> 00:20:40,600 Speaker 1: could go beyond the hand. Yeah, that was one of 386 00:20:40,600 --> 00:20:42,480 Speaker 1: the things that Will and I were both fascinated by 387 00:20:42,480 --> 00:20:45,080 Speaker 1: in that article, the idea that we could almost, like, 388 00:20:45,280 --> 00:20:48,359 Speaker 1: upgrade our boring old hands. We don't know what the 389 00:20:48,400 --> 00:20:51,600 Speaker 1: upper limit is. What if somebody was 390 00:20:51,640 --> 00:20:54,960 Speaker 1: to be born with eight arms? Because at the minute, 391 00:20:56,600 --> 00:21:01,639 Speaker 1: each of your muscle fibers might be the unit. You 392 00:21:01,680 --> 00:21:04,400 Speaker 1: see, many, many, many muscles make up your arm.
393 00:21:04,720 --> 00:21:08,159 Speaker 1: And we talk about just controlling a hand, but you're actually controlling 394 00:21:08,280 --> 00:21:12,000 Speaker 1: hundreds of the muscles in your arm. So 395 00:21:13,040 --> 00:21:16,760 Speaker 1: if you start thinking the way I'm laying it out now, 396 00:21:17,600 --> 00:21:21,119 Speaker 1: imagine what range of impact, in other words, you 397 00:21:21,160 --> 00:21:25,240 Speaker 1: could have if you were to scale combinatorially in 398 00:21:25,280 --> 00:21:28,240 Speaker 1: that way. Yeah, it's so crazy to think where your 399 00:21:28,240 --> 00:21:32,679 Speaker 1: brain might max out. Yeah, we could have 400 00:21:33,080 --> 00:21:38,159 Speaker 1: a New Yorker cartoon where the conductor, instead of 401 00:21:38,160 --> 00:21:43,080 Speaker 1: conducting all the musicians in the orchestra, is all the musicians. 402 00:21:44,560 --> 00:21:47,919 Speaker 1: That's pretty incredible. Well, John, we really appreciate the 403 00:21:47,920 --> 00:21:51,280 Speaker 1: work you're doing. It's both fascinating and obviously, you know, 404 00:21:51,359 --> 00:21:53,480 Speaker 1: life changing for a lot of people. So thank you 405 00:21:53,520 --> 00:21:55,360 Speaker 1: for that work and thank you for joining us today 406 00:21:55,400 --> 00:21:57,919 Speaker 1: on Part Time Genius. It was an absolute pleasure and 407 00:21:58,440 --> 00:22:14,920 Speaker 1: you guys are wonderful. Welcome back to 408 00:22:14,960 --> 00:22:17,360 Speaker 1: Part Time Genius. So Mango, before the break, we were 409 00:22:17,359 --> 00:22:20,440 Speaker 1: talking about the possibilities of using devices in our brains 410 00:22:20,480 --> 00:22:22,840 Speaker 1: to assist with memory. Yeah, and there are actually a 411 00:22:22,920 --> 00:22:24,680 Speaker 1: few other things related to this field that I wanted 412 00:22:24,720 --> 00:22:26,840 Speaker 1: to talk about before we move on.
All right, well, 413 00:22:26,880 --> 00:22:28,720 Speaker 1: go for it. Well, the first is something I was 414 00:22:28,760 --> 00:22:31,680 Speaker 1: reading about in New Scientist, and this is the possibility 415 00:22:31,760 --> 00:22:33,359 Speaker 1: that we could one day implant a chip in 416 00:22:33,440 --> 00:22:36,359 Speaker 1: people who had suffered some sort of brain damage, and 417 00:22:36,480 --> 00:22:38,800 Speaker 1: this chip would include code that would help these people 418 00:22:38,800 --> 00:22:41,479 Speaker 1: accomplish some of the basic things lost after a stroke 419 00:22:41,640 --> 00:22:44,360 Speaker 1: or some other form of damage. There's a quote from 420 00:22:44,400 --> 00:22:47,639 Speaker 1: Justin Sanchez, who works in neuroprosthetics at the University of 421 00:22:47,680 --> 00:22:51,440 Speaker 1: Miami in Florida, and it goes, before we can get 422 00:22:51,520 --> 00:22:53,919 Speaker 1: someone with brain damage back to work, we want to 423 00:22:53,920 --> 00:22:58,000 Speaker 1: return their capability to form those fundamental declarative memories. Yeah. 424 00:22:58,080 --> 00:23:00,720 Speaker 1: It's fascinating to think of that being a possibility, and 425 00:23:00,760 --> 00:23:02,800 Speaker 1: I can't even imagine how life changing it would be 426 00:23:02,840 --> 00:23:05,480 Speaker 1: for people struggling to tackle some of the basic life 427 00:23:05,480 --> 00:23:08,080 Speaker 1: skills they may have once had. Sure, and then 428 00:23:08,080 --> 00:23:10,280 Speaker 1: you take it a step further, right, because Sanchez also 429 00:23:10,359 --> 00:23:13,120 Speaker 1: told New Scientist, think of the guy coming back from 430 00:23:13,160 --> 00:23:16,919 Speaker 1: war who can't remember his wife's face. And that's heartbreaking 431 00:23:16,960 --> 00:23:19,479 Speaker 1: to think about.
But the science behind it and how 432 00:23:19,520 --> 00:23:23,160 Speaker 1: they're approaching the science, that's fascinating. And we can actually 433 00:23:23,200 --> 00:23:25,640 Speaker 1: come back to more studies from Deadwyler and Hampson 434 00:23:25,720 --> 00:23:28,920 Speaker 1: on this. So did you read about their studies on macaques? Yeah, 435 00:23:29,200 --> 00:23:31,840 Speaker 1: I did. So for the listeners, this is a study 436 00:23:31,880 --> 00:23:34,480 Speaker 1: where they showed the macaques an image on the screen 437 00:23:34,560 --> 00:23:36,320 Speaker 1: and then had them pick out that image once 438 00:23:36,320 --> 00:23:38,400 Speaker 1: it was part of a much bigger collection of images. 439 00:23:38,720 --> 00:23:41,480 Speaker 1: This happened a minute or so later, and during 440 00:23:41,520 --> 00:23:44,480 Speaker 1: this the researchers were monitoring their brains and observing the 441 00:23:44,480 --> 00:23:48,040 Speaker 1: signals involved in this process. And then they added drugs, 442 00:23:48,240 --> 00:23:50,399 Speaker 1: and these were to prevent the macaques from turning this 443 00:23:50,440 --> 00:23:53,480 Speaker 1: event into a long term memory. It effectively made them 444 00:23:53,520 --> 00:23:56,399 Speaker 1: forget it happened. But then the scientists had them perform 445 00:23:56,400 --> 00:23:59,080 Speaker 1: the task again, and when they did, they hit the 446 00:23:59,080 --> 00:24:02,040 Speaker 1: neurons with the same pattern of signals they observed earlier, 447 00:24:02,240 --> 00:24:04,880 Speaker 1: and the macaques seemed to know exactly what to look for. 448 00:24:05,560 --> 00:24:07,399 Speaker 1: You know, there are a couple of pieces of this 449 00:24:07,400 --> 00:24:10,080 Speaker 1: that are just super interesting to me.
So one is 450 00:24:10,119 --> 00:24:13,120 Speaker 1: what Deadwyler and Hampson and other researchers believe about how 451 00:24:13,160 --> 00:24:15,880 Speaker 1: memory works, and that is that the brain patterns 452 00:24:15,880 --> 00:24:19,520 Speaker 1: they observe aren't necessarily attached to an exact image. It's 453 00:24:19,520 --> 00:24:21,800 Speaker 1: really more that our brains tend to break things down 454 00:24:21,800 --> 00:24:24,840 Speaker 1: into features, so, you know, like by shape or color 455 00:24:25,080 --> 00:24:28,680 Speaker 1: or size, and then this collection of information, when that's 456 00:24:28,720 --> 00:24:31,440 Speaker 1: pieced together, that's what helps us recall an object or 457 00:24:31,440 --> 00:24:33,960 Speaker 1: a person specifically. That's crazy, like I never would have 458 00:24:34,000 --> 00:24:36,000 Speaker 1: thought about that. Well, and the other thing that's so 459 00:24:36,080 --> 00:24:39,080 Speaker 1: interesting is our brain's plasticity and its ability to learn 460 00:24:39,119 --> 00:24:41,760 Speaker 1: to work with these devices. So let's go back 461 00:24:41,760 --> 00:24:43,840 Speaker 1: for a minute to the neural implants used to control 462 00:24:43,880 --> 00:24:46,639 Speaker 1: a robotic arm. The interesting thing is the way that 463 00:24:46,680 --> 00:24:49,640 Speaker 1: the neurons work to control the robotic arm, that 464 00:24:49,640 --> 00:24:51,720 Speaker 1: they're not identical to the way they would move 465 00:24:51,800 --> 00:24:55,400 Speaker 1: a typical arm. But our brains adapt and they observe 466 00:24:55,560 --> 00:24:59,040 Speaker 1: and they learn, and then, because of this neurofeedback, they 467 00:24:59,200 --> 00:25:01,800 Speaker 1: master a task, even if it means doing that task 468 00:25:01,800 --> 00:25:04,800 Speaker 1: in a slightly different way.
Yeah, and the same thing 469 00:25:04,880 --> 00:25:07,800 Speaker 1: happens with the stimulation used to assist with memory. Like 470 00:25:07,840 --> 00:25:10,840 Speaker 1: our brain's plasticity helps us work with these devices 471 00:25:10,880 --> 00:25:13,280 Speaker 1: to learn. And now DARPA is getting more and more 472 00:25:13,280 --> 00:25:16,000 Speaker 1: interested in this type of research, and I always forget 473 00:25:16,000 --> 00:25:17,959 Speaker 1: what it stands for, so I wrote it down: that's the 474 00:25:18,040 --> 00:25:21,760 Speaker 1: US Defense Advanced Research Projects Agency, and they have something 475 00:25:21,760 --> 00:25:25,960 Speaker 1: they're calling the Restoring Active Memory Project. They're basically investing 476 00:25:26,080 --> 00:25:28,960 Speaker 1: significant sums to try and develop technology that could be 477 00:25:29,000 --> 00:25:32,200 Speaker 1: implanted to help a range of people dealing with brain injury. 478 00:25:32,400 --> 00:25:35,240 Speaker 1: So that means, like, the soldiers returning home with injuries, 479 00:25:35,280 --> 00:25:38,520 Speaker 1: to those battling Alzheimer's, to those who suffered strokes. 480 00:25:39,119 --> 00:25:42,560 Speaker 1: You know. And while the applications are truly incredible, I 481 00:25:42,640 --> 00:25:44,399 Speaker 1: do think we have to at least note some of 482 00:25:44,400 --> 00:25:47,040 Speaker 1: the ethical concerns and risks involved in all of this. 483 00:25:47,640 --> 00:25:49,640 Speaker 1: I mean, think about what it means to be able 484 00:25:49,680 --> 00:25:53,640 Speaker 1: to implant memories, especially memories that might not be our own.
485 00:25:54,000 --> 00:25:56,320 Speaker 1: I mean, some would argue that our collection of memories 486 00:25:56,400 --> 00:25:58,520 Speaker 1: is really at the core of who we are, and 487 00:25:58,880 --> 00:26:01,040 Speaker 1: if our memories, at some point, are more of 488 00:26:01,080 --> 00:26:03,720 Speaker 1: a computer algorithm than, you know, something our own 489 00:26:03,760 --> 00:26:07,639 Speaker 1: brains are producing, are we still us? I mean, that's 490 00:26:07,680 --> 00:26:09,199 Speaker 1: kind of deep to me, Mango. I don't know how 491 00:26:09,240 --> 00:26:11,479 Speaker 1: you feel about it. Well, I had to drop out 492 00:26:11,520 --> 00:26:15,639 Speaker 1: of a philosophy course for being too deep. Another concern is 493 00:26:15,680 --> 00:26:18,679 Speaker 1: just the possibility of certain flaws or downsides to the 494 00:26:18,680 --> 00:26:20,719 Speaker 1: way the chips work. So let's say they don't just 495 00:26:20,800 --> 00:26:23,399 Speaker 1: bring up things we want to remember, but things we 496 00:26:23,480 --> 00:26:26,720 Speaker 1: really don't and have moved on from. I'm not saying 497 00:26:26,720 --> 00:26:28,880 Speaker 1: it's not worth doing this for the people who we've 498 00:26:28,880 --> 00:26:31,199 Speaker 1: talked about. I mean, I think we absolutely should, but 499 00:26:31,560 --> 00:26:34,760 Speaker 1: there are definitely gonna be some hurdles ahead. Well, there's 500 00:26:34,800 --> 00:26:36,960 Speaker 1: one other weird thought about this. So let's say 501 00:26:37,000 --> 00:26:40,600 Speaker 1: we're using these devices, and because of regained access to memories, 502 00:26:40,640 --> 00:26:43,720 Speaker 1: we might behave a little bit differently, for better or worse. 503 00:26:44,160 --> 00:26:46,400 Speaker 1: Let's say in this case it's for worse, and then 504 00:26:46,400 --> 00:26:49,679 Speaker 1: there are consequences for that behavior.
Could you then have 505 00:26:49,720 --> 00:26:52,560 Speaker 1: somebody argue that the memories that led to 506 00:26:52,600 --> 00:26:55,359 Speaker 1: said behavior were, you know, not really their own, 507 00:26:55,440 --> 00:26:59,000 Speaker 1: and that they therefore shouldn't be the one facing the consequences? Yeah, 508 00:26:59,080 --> 00:27:00,760 Speaker 1: I mean, the legal stuff is gonna be this other 509 00:27:00,800 --> 00:27:03,199 Speaker 1: patch of problems. But I'm going to choose to be 510 00:27:03,240 --> 00:27:05,560 Speaker 1: optimistic here, especially in the way it will help 511 00:27:05,600 --> 00:27:08,480 Speaker 1: those dealing with brain injury and disease. And I just 512 00:27:08,520 --> 00:27:11,399 Speaker 1: think it's going to be fascinating to watch. Well, you 513 00:27:11,440 --> 00:27:13,560 Speaker 1: don't have to wait decades to be fascinated, because guess 514 00:27:13,640 --> 00:27:21,919 Speaker 1: what time it is? Time for the PTG Fact Off. Yeah, 515 00:27:26,280 --> 00:27:28,880 Speaker 1: I'm gonna kick this off here. It turns out 516 00:27:28,960 --> 00:27:32,040 Speaker 1: researchers are getting better at understanding why, after a night of drinking, 517 00:27:32,280 --> 00:27:35,000 Speaker 1: despite all those recent self promises you've made to eat 518 00:27:35,040 --> 00:27:37,720 Speaker 1: better and lose a few pounds, your hungry brain goes 519 00:27:37,720 --> 00:27:39,960 Speaker 1: into overdrive and you find yourself running for the border 520 00:27:40,000 --> 00:27:43,719 Speaker 1: to down a handful of Doritos tacos.
In a study 521 00:27:43,840 --> 00:27:47,000 Speaker 1: of mice, of course, they boozed them up and monitored 522 00:27:47,040 --> 00:27:50,520 Speaker 1: their brain activity, and when the mice were completely pickled, 523 00:27:50,800 --> 00:27:52,960 Speaker 1: they noticed a spike in activity in a group of 524 00:27:52,960 --> 00:27:55,720 Speaker 1: neurons called AgRP, and these are the 525 00:27:55,720 --> 00:27:58,600 Speaker 1: ones that are activated when our bodies are actually facing starvation, 526 00:27:59,000 --> 00:28:02,240 Speaker 1: and as a result, the mice ate more. But when 527 00:28:02,280 --> 00:28:04,440 Speaker 1: the scientists got them drunk again and blocked the 528 00:28:04,520 --> 00:28:07,520 Speaker 1: AgRP neurons with medication, the mice didn't eat as much. 529 00:28:07,920 --> 00:28:10,160 Speaker 1: And the thinking is these same neurons are the ones 530 00:28:10,200 --> 00:28:13,040 Speaker 1: responsible for our post drunken feasts. You know, I feel 531 00:28:13,040 --> 00:28:15,080 Speaker 1: like we've talked about drunk mice a couple of times 532 00:28:15,119 --> 00:28:17,760 Speaker 1: in this episode. All right, where do I want to start? 533 00:28:17,840 --> 00:28:20,280 Speaker 1: Let's see. Um. Well, as we've learned more about how 534 00:28:20,280 --> 00:28:23,480 Speaker 1: the brain works, I find those little tricks or shortcuts 535 00:28:23,480 --> 00:28:26,919 Speaker 1: that our brains use for making memories so interesting. And 536 00:28:26,960 --> 00:28:29,240 Speaker 1: there are some other ways our brains take shortcuts that 537 00:28:29,320 --> 00:28:31,760 Speaker 1: I also find pretty interesting to look at. And one 538 00:28:31,760 --> 00:28:34,560 Speaker 1: of these deals with peripheral vision.
So, according to a 539 00:28:34,600 --> 00:28:38,120 Speaker 1: study in the journal Psychological Science, researchers found that our 540 00:28:38,120 --> 00:28:40,920 Speaker 1: brains often make up things in our peripheral vision that 541 00:28:41,040 --> 00:28:43,920 Speaker 1: aren't actually there. And this is because our brains 542 00:28:43,960 --> 00:28:47,520 Speaker 1: focus on our central vision and tend to just make educated 543 00:28:47,560 --> 00:28:51,640 Speaker 1: guesses about our peripheral vision. I love that. So, despite 544 00:28:51,680 --> 00:28:53,680 Speaker 1: what your mother may have told you, it's a myth 545 00:28:53,760 --> 00:28:56,200 Speaker 1: that we're born with all the brain cells we'll ever have. 546 00:28:56,640 --> 00:28:58,760 Speaker 1: This study out of Sweden in the late nineties helped 547 00:28:58,800 --> 00:29:02,360 Speaker 1: scientists prove that the hippocampus forms new neurons pretty much 548 00:29:02,400 --> 00:29:05,680 Speaker 1: our entire lives. And in this other study, also out 549 00:29:05,720 --> 00:29:08,000 Speaker 1: of Sweden, Wow, Sweden is really doing it. Yeah, they're 550 00:29:08,040 --> 00:29:11,080 Speaker 1: killing it. A team of researchers showed that new brain 551 00:29:11,120 --> 00:29:13,920 Speaker 1: cells are formed in the striatum. It's a part of 552 00:29:13,920 --> 00:29:16,520 Speaker 1: the brain involved in motor control and decision making, among 553 00:29:16,560 --> 00:29:19,440 Speaker 1: other things. All right, well, here's another one. I think 554 00:29:19,480 --> 00:29:21,720 Speaker 1: we've all heard that exercise is good for the brain.
555 00:29:22,160 --> 00:29:24,680 Speaker 1: In fact, studies have shown that taking half hour walks 556 00:29:24,720 --> 00:29:27,680 Speaker 1: a few times a week helps our abstract reasoning skills 557 00:29:27,920 --> 00:29:29,880 Speaker 1: and even helps with the growth of new cells in 558 00:29:29,920 --> 00:29:32,120 Speaker 1: the hippocampus, kind of like you were just talking about. 559 00:29:32,560 --> 00:29:34,880 Speaker 1: But I didn't realize that the effect can happen in 560 00:29:34,920 --> 00:29:38,479 Speaker 1: the reverse direction as well. That is, mental exercise can 561 00:29:38,480 --> 00:29:41,719 Speaker 1: be good for your physique, too. One study done at 562 00:29:41,720 --> 00:29:44,320 Speaker 1: the Cleveland Clinic showed that those who spent fifteen minutes 563 00:29:44,320 --> 00:29:48,000 Speaker 1: a day thinking about exercising their biceps actually increased the 564 00:29:48,080 --> 00:29:52,479 Speaker 1: strength of their biceps over a three month period. 565 00:29:52,720 --> 00:29:55,960 Speaker 1: I'm gonna start thinking so hard about my biceps. Yeah, 566 00:29:56,000 --> 00:29:59,480 Speaker 1: that's awesome. Uh, so, did you know a bigger brain 567 00:29:59,600 --> 00:30:03,160 Speaker 1: doesn't necessarily mean a smarter brain? Uh, the average human 568 00:30:03,160 --> 00:30:06,160 Speaker 1: brain is three pounds and Einstein's was only two point 569 00:30:06,240 --> 00:30:09,160 Speaker 1: seven pounds, and I think that's pretty solid proof. Yeah, 570 00:30:09,160 --> 00:30:11,959 Speaker 1: I would agree with that. Well, your brain generates twenty 571 00:30:12,040 --> 00:30:14,520 Speaker 1: watts of power, which is actually enough to run a 572 00:30:14,560 --> 00:30:18,440 Speaker 1: regular sized LED bulb. You know how we talk about 573 00:30:18,480 --> 00:30:21,600 Speaker 1: people being auditory or visual learners?
Yeah. Well, while 574 00:30:21,600 --> 00:30:23,520 Speaker 1: it may be true that we all have our preferences 575 00:30:23,520 --> 00:30:25,880 Speaker 1: in how we learn, like you might prefer to read 576 00:30:25,960 --> 00:30:28,800 Speaker 1: something instead of hearing it in lecture form, there really 577 00:30:28,800 --> 00:30:31,480 Speaker 1: aren't studies to back up this idea. I mean, when 578 00:30:31,560 --> 00:30:34,800 Speaker 1: tested, students tended to perform similarly regardless of whether they 579 00:30:34,800 --> 00:30:37,040 Speaker 1: were taught in their preferred method or some other method. 580 00:30:37,400 --> 00:30:40,440 Speaker 1: That is surprising. Yeah, I've always just assumed that we 581 00:30:40,440 --> 00:30:43,000 Speaker 1: were either one or the other. So I think I'm 582 00:30:43,040 --> 00:30:45,640 Speaker 1: going to give you this week's Fact Off trophy. Congratulations. 583 00:30:46,000 --> 00:30:48,080 Speaker 1: And if there are any brain facts you feel we should know, 584 00:30:48,160 --> 00:30:50,120 Speaker 1: hit us up at part time genius at how stuff 585 00:30:50,160 --> 00:30:52,760 Speaker 1: works dot com. You can also find us on Facebook 586 00:30:52,840 --> 00:30:55,560 Speaker 1: or Twitter, or, as always, call our fact hotline, 587 00:30:55,960 --> 00:30:59,240 Speaker 1: one eight four four pt genius. It's a twenty four seven 588 00:30:59,240 --> 00:31:02,000 Speaker 1: fact hotline. I think it's still up and running. All right, so 589 00:31:02,120 --> 00:31:18,000 Speaker 1: call us there. Thanks so much for listening. Thanks again 590 00:31:18,040 --> 00:31:20,240 Speaker 1: for listening. Part Time Genius is a production of how 591 00:31:20,280 --> 00:31:23,120 Speaker 1: stuff works, and wouldn't be possible without several brilliant people 592 00:31:23,120 --> 00:31:26,040 Speaker 1: who do the important things we couldn't even begin to understand.
593 00:31:26,160 --> 00:31:28,960 Speaker 1: Tristan McNeil does the editing thing. Noel Brown made the 594 00:31:28,960 --> 00:31:31,640 Speaker 1: theme song and does the mixy mixy sound thing. Jerry 595 00:31:31,760 --> 00:31:34,680 Speaker 1: Rowland does the exec producer thing. Gabe Luzier is our 596 00:31:34,760 --> 00:31:38,080 Speaker 1: lead researcher, with support from the Research Army including Austin Thompson, 597 00:31:38,160 --> 00:31:40,640 Speaker 1: Nolan Brown and Lucas Adams, and Eves Jeffcoat gets 598 00:31:40,640 --> 00:31:42,800 Speaker 1: the show to your ears. Good job, Eves. If you 599 00:31:42,880 --> 00:31:44,760 Speaker 1: like what you heard, we hope you'll subscribe. And if 600 00:31:44,760 --> 00:31:46,640 Speaker 1: you really really like what you've heard, maybe you could 601 00:31:46,720 --> 00:31:48,600 Speaker 1: leave a good review for us. Do we, do we 602 00:31:48,600 --> 00:32:00,280 Speaker 1: forget Jason? Jason who?