Speaker 1: Pushkin. My guest today is a brain surgeon who also has a PhD in electrical engineering from MIT, which is to say he is extremely well prepared to figure out how to implant electronic devices in people's brains, which is what he's doing. And in fact, as it happens, he's actually been preparing to do this kind of his whole life.

Speaker 2: You know, I sort of was born into the business. My dad is a neurologist who started on his career as an electrical engineer. You know, electrophysiology, clinical neuroscience, neurology, and neurosurgery have been a part of my life forever, as far as I can remember. And you know, brain computer interfaces the way we talk about them today didn't exist in the nineteen eighties, but the fundamentals were there, and so that's been percolating in some way forever.

Speaker 1: I'm Jacob Goldstein, and this is What's Your Problem, the show where I talk to people who are trying to make technological progress. My guest today is Ben Rapoport. He's the co-founder and chief science officer at Precision Neuroscience. Ben's problem is this: can you build a device that allows someone who is paralyzed to use a computer with only their thoughts, and can you do it without sticking needles into their brain? Before he started Precision, Ben was a co-founder of Neuralink. Neuralink is probably the best known brain computer interface company, and it was founded in twenty sixteen, right around the moment when modern AI was just emerging. And Ben told me the AI revolution was really what inspired the founding of Neuralink.

Speaker 2: The kind of founding principles of Neuralink were, you know, here's a point in time when we're thinking broadly about how the human brain is going to interact with artificial intelligence. And if breakthroughs in artificial intelligence are scaling at an exponential rate, you know, how's the human brain going to keep up with that? How are we going to keep communicating with artificial intelligence in a way that is feasible and productive?
Speaker 1: So that's really different. That's not "how can we help people who are paralyzed." That's much more cognitive-centric. It's about, like, the nature of human thought in the context of AI.

Speaker 2: So that's kind of the raison d'être of Neuralink, and it was a little different from the human-focused, medically oriented focus that Precision has taken. And these different focuses can and will coexist in an ecosystem in which multiple brain computer interface core technologies have become widely available and become the standard of care. But it became clear to me that there was a need to also focus an effort within the community of brain computer interfaces on treating patients with untreatable diseases. That was the origin, you know: really bringing this science and technology to a point where people who today we think of as having really no treatment options, people with paralysis or an inability to speak, for example, from ALS, you know, really unlocking a world of possibilities for those people. But we really wanted to focus on those applications within brain computer interfaces, and doing that, in my view, required making a few different design decisions than what we'd made at Neuralink. You know, so those were the founding principles of Precision.

Speaker 1: You leave Neuralink to found Precision. Tell me about what you're setting out to create at Precision when you're launching the company. Like, what is it that you want to do, and how is it different than what everybody else is doing?

Speaker 2: Yeah. The goal then, and today, is to build a safe, scalable brain computer interface that can become the standard of care in the treatment of people with a variety of diseases of the brain and nervous system that today are untreatable. That includes various forms of paralysis and inability to communicate.
Speaker 1: And tell me about the technology. Like, tell me about the thing you're building and how it's different from what other people are building.

Speaker 2: Our philosophy has been that in order for a brain computer interface to really work in the real world and to unlock the potential of this technology for many millions of people, first, of course, it has to be incredibly safe. We see the use of the term minimally invasive a lot, but really, in my view, it has to not damage the brain.

Speaker 1: So what does that mean in practice?

Speaker 2: Yeah. The tissue interface with the electrode usually involves kind of like little needles: the electrodes are little needles, and they penetrate into the brain. And there's been a lot of innovation in trying to do that very safely. But in my view, the safest version of that is a version that just kind of caresses the brain but doesn't penetrate it. And at first, certainly when we founded Precision, many people thought that it was not possible to extract high quality signals from the brain without penetrating, and we and others have shown that, in fact, it's not only possible to do but has many advantages. So not that it's the only way, or necessarily better or worse, but from the standpoint of people who have untreatable diseases and already have a very low threshold for damage to the brain, not doing any incremental damage to the brain for us is very, very important. So that was sort of part one of Precision.

Speaker 1: Before we get to part two, is there a trade-off? I mean, do you lose some amount of sensitivity or resolution? Is that the basic trade-off?

Speaker 2: So we always get this question. No, it's absolutely a good question, right? And so there's this false dichotomy, I think, that more penetration into the brain equals higher quality signal, and if you don't do that, then you somehow sacrifice signal quality.
But it's really not as one-dimensional as that. If you're a neuroscientist, then there's a trade-off. If you care about recording from one neuron at a time, and you're studying the behavior of individual neurons, and you care about that, then you want what we call intracortical penetrating microelectrodes, the ones that can come up close to an individual neuron and listen to those individual action potentials. And that's something that neuroscientists care about. So for that, you don't want to use the same electrodes that we use at Precision. But if what you care about is treating paralysis or loss of communication, what you care about is stable, high quality signals over a long period of time. And in that arena, arguably, just based on the data, the cortical surface electrodes that we use at Precision are at least as good, if not better. And I'll tell you, because there are a few of these different systems now out there in the real world, what's really exciting is that this has come out of the laboratory, out of animal experiment territory, into human pilot clinical trials that we and Neuralink and Synchron and others are engaged in. And that's really where it's at.

Speaker 1: So tell me where you are now. I know you've done some amount of experimental work in people, right? What is the frontier of your work right now?

Speaker 2: Yeah. We've now implanted our electrode arrays in almost thirty patients over the last two years. These are pilot studies across four major medical centers in the US that are partnering with us, and all of those studies are really temporary placements of the electrodes. So these are studies that are run in patients who have volunteered to have the electrodes placed alongside clinical standard electrodes as part of a neurosurgical procedure that they're already undergoing.
And we've been using those opportunities to basically validate the quality of electrical activity that we can record on those electrodes, and to demonstrate that our algorithms can in fact basically decode intention and thought, as intended by essentially healthy volunteers.

Speaker 1: So the array itself, like, what does it look like?

Speaker 2: So the brain lives in the skull. It is a soft tissue that's kind of jelly-like in consistency, and so the best way to generally interface with it is with something that is also soft and flexible. And the surface of the brain, as many of us have seen in pictures, is curved, or undulating. And so our electrode array is a thin polymer that's many times thinner even than a human hair. So it's a film, and embedded in that film are tiny little dots of platinum, each one connected to a very, very, very thin platinum wire. And so that film, with the tiny little dots of platinum inside, can be placed over the brain surface, and it conforms to that curved surface. So each of those little platinum electrodes touches the surface of the brain at a very discrete point, and so it can record the electrical activity from the area of the brain just under where it's touching, basically.

Speaker 1: Okay. So in these trials, you put this implant on a patient's brain, and then what?

Speaker 2: So let me describe maybe one of the paradigms that we use at one of our partner sites. So Iahn Cajigas is the neurosurgeon at Penn who's our partner, and he is a surgeon who specializes in the treatment of Parkinson's disease. One of the ways of treating Parkinson's disease is a procedure called deep brain stimulation, in which electrodes are placed deep within the brain to stimulate those areas that are responsible for modulating the tremor.
Dr. Cajigas, among many others, performs these procedures at least partly awake, in order to make sure, effectively, that the exact right place is being targeted. And the brain doesn't feel pain, so it's not only possible but beneficial to do these procedures at least partially awake. So in those procedures, we take basically a fifteen minute window, and Dr. Cajigas places the Precision electrode directly over the motor cortex, a portion of the motor cortex that controls hand movement. And this has provided, for us and for the community, the highest resolution picture of the human motor cortex in the awake human ever in the history of the world. So, you know, the area of the motor cortex that controls hand movement is about the size of a postage stamp. And critically to understand: the neurons that are responsible for coordinating those movements all live within a two millimeter layer of tissue that's just at the surface of the brain. So all that critical computation and activity is happening very, very close to the surface. And so it's good...

Speaker 1: Good for you, good for us. So what actually happens? So you have a patient who's there, you put your array on the portion of their motor cortex that controls hand movements, and then you say, wiggle your finger?

Speaker 2: Exactly. So, basically, you know, we walk the patient through making a certain number of gestures, you know, open hand, close hand, make a peace sign, and we watch, and I say metaphorically watch: we watch what the patient is doing, and we watch what is happening on the surface of the brain. And here is where modern machine learning plays a tremendous role, because this is, you know, the AI portion of it. This is the so-called training data. So this is a calibration phase in which our algorithms learn what the brain's signals to the hand look like in a given patient.
So there's a characteristic signature, an electrical signature, that happens in the moments before an action is done, and it's a little bit different in each person. And learning that signature for that person allows us to recognize when the brain is telling the hand to make a particular gesture, when the fingers are supposed to move in a particular way, when the hand opens and closes. And after about three to five minutes of training, we then have a trained algorithm that can recognize not just movement, but the intention to move. And so we then use the balance of the time that we have with those patients to ask the patient to move, and validate that we're predicting the correct movement, and then to imagine movement without moving, and that, too, we can accurately predict. And so these procedures become basically the healthy volunteer test bed for patients who can't actually move, the paralyzed patients that we'll be treating within the next couple of years. So that's the nature of this first phase of pilot trials.
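[Editor's note: a minimal sketch in Python of the calibration-then-decode loop Ben describes: cue the patient through known gestures, learn each gesture's electrical signature, then predict intended movement from new activity. This is not Precision's actual pipeline; the sampling rate, channel count, feature choice (high-gamma band power), and classifier are illustrative assumptions, and the data here are random stand-ins.]

```python
# Toy gesture decoder: learn per-gesture signatures during a cued
# calibration phase, then classify intention from a fresh window.
# All shapes and parameters are hypothetical, not Precision's system.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000          # sampling rate in Hz (assumed)
N_CHANNELS = 1024  # number of surface electrodes (assumed)

def bandpower_features(window):
    """Log power per channel in the high-gamma band (70-150 Hz)."""
    spectrum = np.fft.rfft(window, axis=0)
    freqs = np.fft.rfftfreq(window.shape[0], d=1.0 / FS)
    band = (freqs >= 70) & (freqs <= 150)
    power = np.mean(np.abs(spectrum[band]) ** 2, axis=0)
    return np.log(power + 1e-12)

# Calibration: a few minutes of cued trials (random stand-in data here).
rng = np.random.default_rng(0)
windows = rng.standard_normal((60, 500, N_CHANNELS))  # trials x samples x channels
labels = np.repeat(["open", "close", "peace"], 20)    # the cued gestures

X = np.stack([bandpower_features(w) for w in windows])
decoder = LinearDiscriminantAnalysis().fit(X, labels)

# Decode: classify the intended gesture from a new half-second window.
new_window = rng.standard_normal((500, N_CHANNELS))
print(decoder.predict(bandpower_features(new_window)[None, :]))
```

On real recordings, the same trained decoder can then be scored against cued movements, and against imagined movements, which is the validation step described above.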
Speaker 1: You mentioned that each person is different in terms of the patterns of neuron activity for each hand motion. In this context, how different is it? Like, kind of like a Southern accent versus a New York accent? Is it like an entirely different language, if that kind of metaphor works?

Speaker 2: Yeah, no, that's a perfect metaphor. And it's kind of like that. So, you know, if you're trying to learn a new language or a dialect, you know that there are words, and you know that they're spoken in a particular frequency range, so you kind of know what to listen for, and you kind of know the cadence. So when there's a word, you know that's a word, but you might not know what it means until you listen in to conversation and you've seen the context.

Speaker 1: So, like, pretty different. Like, it wouldn't work to just make a generic algorithm and put it on my brain?

Speaker 2: It doesn't quite work to make a generic algorithm. But that's an area where there's been a lot of just fascinating development. And so a good example of this is, you know, Siri works out of the box for most people pretty well, right?

Speaker 1: Talk into your phone, right, it works.

Speaker 2: Right. It works pretty well, and then you need to train it to make it better, and then it listens to you in the background and gets even better. And so that's a good analogy. So it is possible for us to build, you know, a translation algorithm that works somewhat out of the box, but we build into it a calibration phase that knows something about the structure of brain signals and how they interact with and relate to movement or speech. And that's what basically allows us to use only relatively small amounts of calibration data. I mean, you know, we can do a lot with a small amount of calibration data.
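[Editor's note: the Siri analogy maps onto a standard machine learning pattern: pretrain a generic model on pooled data from many earlier sessions, then personalize it with a small per-person calibration set. A rough sketch below, under stated assumptions; the feature dimensions, model, and data are stand-ins, not Precision's method.]

```python
# Generic decoder pretrained on pooled recordings, then adapted to a
# new person with a few minutes of calibration data via online updates.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
classes = np.array(["open", "close", "peace"])

# Pooled "population" data from prior sessions (hypothetical features).
X_pool = rng.standard_normal((3000, 256))
y_pool = rng.choice(classes, size=3000)

# Train the out-of-the-box decoder with incremental passes.
generic = SGDClassifier(random_state=0)
for _ in range(5):
    generic.partial_fit(X_pool, y_pool, classes=classes)

# New person: only a small calibration set, like Siri learning your voice.
X_calib = rng.standard_normal((45, 256))
y_calib = rng.choice(classes, size=45)

# A few passes of incremental updates personalize the generic model
# without retraining from scratch.
for _ in range(5):
    generic.partial_fit(X_calib, y_calib, classes=classes)
```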
Speaker 1: So you're doing a sort of pilot study now. What's the next big step?

Speaker 2: So I want to be careful about what I say before it happens. But we do anticipate being able to, in the very near future, extend what are now short duration pilot studies, which last only the span of time that we have access to the brain within a standard neurosurgical procedure, which is relatively short. We anticipate having ways of extending that, with regulatory approval, to hopefully many days and weeks within the calendar year. And then, of course, this is all in the service of permanent implants that wirelessly communicate with the outside world, and that will be the basis of our pivotal clinical trial a couple of years hence.

Speaker 1: Still to come on the show: Ben and I discuss the possibility of using brain computer interfaces in healthy people, and also the meaning of consciousness.

Speaker 1: Just before the break, Ben mentioned that pivotal clinical trial that they're building up to, and so I asked him what exactly they're going to be doing in that trial.

Speaker 2: So the first clinical application is going to be the treatment of severe paralysis. And the device will be an implant that has the electrodes on the brain and an implant within the chest wall that provides power and data transfer to the outside world, to communicate with external devices like a computer. And that system will allow, for example, a person with a spinal cord injury really to hold a desk job: it will allow them to operate, effectively, a word processing program, email, surf the internet, have a Zoom conversation, operate an Excel spreadsheet, use PowerPoint, have the ability to re-enter the workforce with a level of personal and economic self-sufficiency that allows them, you know, certain freedoms that they don't have, and that are core to being a part of modern society. That is, for us, major goal number one. I'm quite sure that once the technology becomes proven safe and effective, other disorders and conditions that are perhaps less dramatic, you know, will benefit from this and from other forms of the technology. And part three is that there's a lot that I'm sure we're not even imagining right now. You know, the brain computer interface, at least the Precision system, is really in some ways a platform technology, because it translates the wet and difficult to access, delicate, you know, biological signals of the brain into robust digital bitstreams and allows us to compute on them in a scalable way. The brain computer interface is not a substitute for a keyboard and a mouse. It's not a substitute for a gestural interface or a voice interface. It's another kind of interface with the brain. Just like it would have been impossible to predict, based on the keyboard alone or the graphical user interface alone, all of the different applications that have emerged, I think, as long as we build a safe, reliable interface and make that responsibly available, kind of sky's the limit.
And I can't even hazard a guess at some of the things that will come next. So I think there's a whole generation of discovery and innovation waiting to happen after we get this across the line into patients and it becomes the standard of care.

Speaker 1: Could you imagine it being used in healthy people, for, you know, the computer-and-the-brain application?

Speaker 2: Yeah, I could eventually. In a sense, I would love that to be the case.

Speaker 1: I think I'm ambivalent about that one. Tell me why you'd love that to be the case.

Speaker 2: Well, because it will have meant that we've...

Speaker 1: Well, yes, it'll mean your thing works really well, it is wildly safe.

Speaker 2: Yes, that's true, right? Yeah. So we build with that goal in mind, in a way, right? Just because in order for something to be accepted by an able-bodied person, who has zero risk tolerance and basically only downside if something goes wrong or doesn't work properly, you need it to work in just a bulletproof way. That's the kind of system that we're trying to engineer.

Speaker 1: Yes. From that point of view, it makes perfect sense. If that is true, then you have built a wildly safe and effective device.

Speaker 2: Exactly. So if you and I were having this conversation and you said to me, gosh, I would love to, right? I mean, that would mean that all those doubts had been erased. And in order to erase those doubts, we have to prove certain things to the world, and that's really our job.

Speaker 1: Would you want, if you were healthy, would you want to have your device in your brain, if it were safe and effective?

Speaker 2: It would have to do certain things that the device can't do yet, but I definitely wouldn't rule it out when we get there.
And I mean, it's like, sometimes with technology it's hard to wrap your mind around what's going to happen in a generation, right? I've got two little kids, and we're always talking about, like, should the kids actually get to use an iPhone?

Speaker 1: Hold out for as long as you can, right? Because it's not exactly a choice, right? That's the thing you think, like, oh, an iPhone?

Speaker 2: That's my point. By the way, I'm very, very permissive.

Speaker 1: Me too. You know what finished me was COVID. Like, we held out really strong, and then COVID hit.

Speaker 2: But the reason I bring that up is that, you know, our parents could not even have conceived of that question, right?

Speaker 1: Yes. But I mean, the other way to think about that is, like, you know, I'm pro-progress and pro-technology, but, like, having kids makes me wish iPhones didn't exist, right? Makes me wish, like, sure, give them a flip phone so they can text their friends and call me if something goes wrong. But, well, you know, I don't know. But on the other hand, I make podcasts for a living.

Speaker 2: It's an interesting discussion, right? And, you know, we sometimes joke that somehow kids are born now knowing how to swipe and navigate the phone interface, right? So my point is that in twenty years it's going to be a different conversation. There are a lot of kids of people in the company, and they know what we're doing, you know. My girls know what we're doing, and their view of the technology is different. They see it as something that exists, and when you're born into it, you have kind of a different sense of what's okay and what's normal. And the generation that's growing up today is going to grow up with brain computer interfaces just being a normal thing.

Speaker 1: Yeah, maybe your grandkids will feel about brain computer interfaces the way your kids feel about iPhones.

Speaker 2: It's going to happen faster than that.
Speaker 1: We'll be back in a minute with the lightning round.

Speaker 1: Tell me about the metabolic factors limiting performance in marathon runners.

Speaker 2: Okay, right. So that was a paper that I wrote now more than a decade ago. So I'm a dedicated marathon runner, and I've run forty-something marathons over twenty-plus years. There's a longer story, which we don't have time for now, as to why I wrote that paper.

Speaker 1: Give me the short version of that story, of why you wrote the paper.

Speaker 2: The short version is, it shouldn't be metabolically possible to run a marathon. Everybody thinks... the paradox is that, you know, you can't eat enough pasta to get through twenty-six miles.

Speaker 1: Like, if you do the math, there's not enough energy stored in the body.

Speaker 2: If you do the simple math, there seems to be a paradox: you can't eat enough pasta to run the marathon. Everybody thinks you've got to eat pasta before you run the marathon. It turns out you can't really eat enough pasta to run a marathon. So how is it even possible? And the reason it's possible is that you're burning some fat as you go. And then everybody knows that there's this phenomenon of hitting the wall, where many runners collapse or hit a major slowdown at some point along the way, usually about two-thirds of the way through the race, where they just can't keep going, or can't keep going at the same pace that they started the race. And why does that happen? That happens because they're no longer burning carbohydrates as the fuel substrate, or they can't burn them at the same rate that they started the race. So how do you not hit the wall? How do you avoid that phenomenon? Basically, you need to run at a pace that burns both fuel substrates, fat and carbohydrate, at a rate such that you just exhaust your carbohydrate stores at mile twenty-six point two. So that's one of the core rate-limiting metabolic factors in the marathon.
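[Editor's note: the accounting Ben describes can be sketched in a few lines. This is a toy back-of-the-envelope model, not the formula from his paper; the glycogen store, the energy cost of running, and the carbohydrate-fraction curve below are rough assumed values, used only to show the "finish exactly as the carbohydrate runs out" logic.]

```python
# Toy model: find the hardest sustainable effort whose total carbohydrate
# cost over 26.2 miles still fits within the body's glycogen stores.
DISTANCE_KM = 42.195          # marathon distance
ENERGY_KCAL_PER_KG_KM = 1.0   # rough energy cost of running (assumed)
MASS_KG = 70.0                # assumed runner mass
GLYCOGEN_KCAL = 2500.0        # assumed usable carbohydrate stores

def carb_fraction(intensity):
    # Fraction of energy drawn from carbohydrate (vs. fat) rises with
    # effort; this linear curve is a stand-in for measured physiology.
    return min(1.0, 0.4 + 0.6 * intensity)

total_kcal = ENERGY_KCAL_PER_KG_KM * MASS_KG * DISTANCE_KM  # ~2954 kcal

# Scan from hardest effort down; stop at the first one that does not
# exhaust glycogen before mile 26.2 (i.e., that avoids "the wall").
for i in range(100, 0, -1):
    intensity = i / 100.0
    if total_kcal * carb_fraction(intensity) <= GLYCOGEN_KCAL:
        print(f"max sustainable effort ~ {intensity:.2f} of maximum")
        break
```

With these made-up numbers the script prints an effort around 0.74 of maximum; run harder than that and the carbohydrate runs out before the finish line.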
Speaker 1: And, I mean, so what was it that you figured out that got published in, whatever it was, PLOS?

Speaker 2: Yes. I figured that out, and I figured out how to model that mathematically.

Speaker 1: And did it affect the way people run marathons?

Speaker 2: Well, it affected the way I run marathons.

Speaker 1: And how did you change your running strategy based on your own research?

Speaker 2: I learned how to pace myself in a more quantitative way, and I learned how to structure my pre-race diet and my training diet in a way that was much better than I had in the years before that.

Speaker 1: Did you get faster?

Speaker 2: I got significantly faster.

Speaker 1: Yeah.

Speaker 2: I ran a bunch of sub-three-hour marathons around the time I figured that all out.

Speaker 1: That is a very fast marathon.

Speaker 2: And for a period of time, I don't know if it's still the case, but maybe embarrassingly, that was... it still is, I think, my only single author paper, and for a period of time it was my most cited paper.

Speaker 1: Well, you know, Einstein's most cited paper is the one where he describes entanglement and basically says, this proves that quantum is not a complete description of reality, because there's no way it could be true. And he was wrong.

Speaker 2: Right. Can't aspire to that, necessarily. But anyway.

Speaker 1: What's one tip that comes out of that? Like, is there a model I could plug in? I ran my first marathon this year. I did not know about your paper. Is there something you can tell me, just qualitatively from it, that I'm doing wrong?

Speaker 2: Yeah, take a look. There's a little formula there, basically, that allows the average person to estimate their optimal marathon pace.

Speaker 1: Boston Marathon or New York Marathon, which do you like better?

Speaker 2: Well, you know, I'm a native. I've run both many times.
I've run Boston for the last twenty-four years consecutively, and I've run New York, I think... I forget now how many times, more than ten. And I love them both. And I'm not going to say in public which one I love more. But they're very different. They're very different, and, yeah, that's all I'll say. But they are wonderful races, and there are a lot of special things about both.

Speaker 1: What is one thing we don't understand about the brain that you wish we understood?

Speaker 2: So the question of what is consciousness, I think, has been a big one in philosophy and neuroscience for a long, long time, right? You know, I think that the tools of brain computer interfaces probably have already given us, and certainly will be giving us in the next couple of years, ways to answer that in a really rigorous and quantitative way. And not just that, but I think to have an impact in disorders of consciousness. And so I think that's an area where brain computer interfaces are going to have perhaps a surprisingly major impact.

Speaker 1: What's a disorder of consciousness? I don't think I know that phrase. Like, help me, what does that mean?

Speaker 2: Well, you know, I think many people are familiar with a coma, right? So, people who are alive but not compos mentis in the ways that you and I are when we're talking. That's just a dramatic example of that.

Speaker 1: Has the work you've done, I mean, either as a brain surgeon or in developing brain computer interfaces, how has that changed the way you think about consciousness?

Speaker 2: If it has, I'm not sure it has yet, or at least not in a way I want to talk about in public. But I mean, watch this space carefully.

Speaker 1: Say one more thing about that. It's very intriguing to me. I feel like there's something you're thinking that you're not saying.

Speaker 2: I think a lot of it is public, and I think in a really, really interesting way.
So I'd highlight some recent work, or recently published work, by Niko Schiff and others demonstrating that some people who seem to be in a minimally conscious state actually have the ability to communicate if you give them the tools to do so. And that just has profound implications for the diagnosis of certain types of severe brain injury, for prognosticating, you know, the subsequent course of people who have such injuries, and for all kinds of philosophical, ethical, and, really most importantly, practical aspects of how we take care of people with that kind of severe brain injury, many of whom pose tremendously difficult questions to family and caregivers, who can't predict what's going to happen next and can't communicate with their loved ones. And there's always this question in such situations, you know: is the person we knew still there? And will that person come back, so to speak, or not? And answering that question is one aspect of getting at what is consciousness, and how does it fluctuate, and how do we quantify it, and how do we read or restore it when it's lost or damaged. So, you know, that has been the realm of philosophy for most of human history, and I think it is very exciting for me that now that's changed in the last several years. And I do think that the technology of brain computer interfaces is going to have an impact in making some of the discoveries that have come to light actionable.

Speaker 1: Ben Rapoport is the co-founder and chief science officer at Precision Neuroscience. Today's show was produced by Gabriel Hunter Cheng. It was edited by Lidia Jean Kott and engineered by Sarah Brumer. You can email us at problem at Pushkin dot FM. I'm Jacob Goldstein, and we'll be back next week with another episode of What's Your Problem.