Could you learn to fly a helicopter not by practicing, but instead by uploading the information directly into your brain? What would society do if kids no longer had to go to school? And what does any of this have to do with suntan booths, or nano robots, or twerking over a presidential address, or what a cowboy on a hill is simply not able to see?

Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and author at Stanford, and in these episodes we sail deeply into our three-pound universe to understand why and how our lives look the way they do.

Today's episode is about the possibility of really coming to understand the tangled forest of eighty-six billion neurons in your head and the trillions of connections between them. And if we could do that, could we upload information directly into your brain? Could we speed up education this way? Now, at the moment, this is all pure fantasy, because we simply don't have the technology to allow us to do that. But the question we're going to ask today is whether this is theoretically possible and something we can look forward to around the corner of the next century, and what are the caveats, the things to watch out for, and the unexpected complexities here.

So let's get started. Some hundreds of years ago, and still in many impoverished places in the world, children of the species Homo sapiens reproduce by the time they are young teens. But this situation is totally different in modern times and modern societies. Now young people go to school for their first eighteen years, or twenty-one years, and increasingly twenty-five or twenty-six years for an advanced degree. And in fields like medicine, they take another several years of internship and residency. And in a field like neuroscience research, people do a postdoctoral fellowship, and then they hope to become an assistant professor, and then an associate professor, and finally a full professor. And most people are in their forties by the time they get there.
So what accounts for this recent historical change? Why do we do so much schooling for so much of our lives now? Well, it's because we are a runaway species. We've gone off in a totally different direction than all our animal cousins, and we have made thousands of important discoveries about our world and produced so much art in various forms. And as a result, there's so much to learn, and so we need to spend decades in institutions of learning, not to mention reading books and listening to podcasts, to understand what millions of humans have devoted their lives to figuring out. But what if there were a way that we didn't have to do that? What if there were a way to simply upload the information, in other words, to put the information directly into your brain?

So let's harken back to this great scene in The Matrix where Neo and Trinity are being hotly pursued by the antagonist, Agent Smith. Our two heroes end up on top of a building, and there they spy a helicopter parked on the roof, and Neo asks Trinity, do you know how to fly that? And she replies, not yet. And she flips open her phone and she calls Tank, the operator, and she says, I need a pilot program for a B-212 helicopter. And we see the operator rotate his chair in front of his bank of computers and quickly type out a bunch of commands, and she closes her eyes, and one second later she turns confidently to Neo and says, let's go. So what happened is that Tank the operator had taken the expertise, the complicated know-how of flying a B-212 helicopter, and just uploaded it to her brain.

So the question we're going to ask today is: is that theoretically possible from a neuroscience perspective, and what will make that straightforward, and what will make that not straightforward to accomplish someday? Now, in some ways the whole idea sounds crazy, because it seems like we always have to earn things if we want changes to our brains or body.
You can't just get something for free. But of course, people for decades have been climbing into suntan booths instead of spending days outside, and people get Botox, which binds to receptors at the ends of peripheral nerves and changes the wrinkliness of your face. And people are increasingly doing things to not have to go to the gym, but instead to have their abdominal muscles built for them with electrical stimulation. You just lie on the table and your muscles contract over and over, and the idea is that your muscles can grow stronger and look better without you having to do a single sit-up. You just lie there.

So what would be the equivalent in the realm of education? Can we imagine a time when you don't have to bury yourself in a book to master some domain, where you don't have to spend hundreds of hours sitting in a flight simulator, but instead you hook something up to your brain and then it is as though you already knew quantum mechanics, or electrical engineering, or Persian history, or how to surf or hang glide, or how to repair that model of dishwasher, or whatever?

Now, how would you push information to the brain? We currently do this by sitting down dozens of children in front of someone who already has the information in their brain, and that person uses words or pictures, and the students attend to those stimuli and try to translate those words or pictures into changes in their own private jungle of billions of neurons. They try to convert what they're hearing or seeing into storage in their own internal model, in a way that makes sense to them. What learning means is that you very finely change the networks in your head. That's it. That's what we pay lots of tuition for and go off to college for: to get someone who already has information in their network to translate it, through the low-bandwidth channel of language, over to your network.
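To make that last point concrete, here is a minimal toy sketch (in Python, with every name and number invented for illustration) of what "learning as finely changing a network" looks like in the simplest artificial setting: a Hebbian-style rule that nudges connection weights whenever two units are active together. Real synaptic plasticity is vastly more complicated; this only illustrates the idea that the change in the wiring is the learning.

```python
import numpy as np

# A toy "network": 5 units, fully connected through a 5x5 weight matrix.
rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(5, 5))

def hebbian_update(weights, activity, learning_rate=0.01):
    """Strengthen connections between co-active units (a crude Hebbian rule).

    `activity` is a vector of firing rates; the outer product implements
    "units that fire together wire together." Purely illustrative.
    """
    return weights + learning_rate * np.outer(activity, activity)

# "Experiencing" the same pattern repeatedly changes the network a little each time.
pattern = np.array([1.0, 0.0, 1.0, 0.0, 1.0])   # an arbitrary, made-up stimulus
for _ in range(100):
    weights = hebbian_update(weights, pattern)

# The change in the weights *is* the learning: the connection between two
# co-active units (0 and 2) is now much stronger than one involving a silent unit.
print(weights[0, 2], weights[0, 1])
```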
So, just to be clear on this: before you know some fact or concept, your network is configured in some way, and then I tell you, oh, that dog's name is Nebula, and then you encode that information. This connection in your brain gets strengthened and this one gets weakened, and this synapse unplugs and replugs over there, and this happens over millions of synapses, and then you know something that you did not know before. And for deeper knowledge, like flying a B-212 helicopter, this requires not just the memory of a fact but of a procedure, and so those changes happen in different brain areas and they're more widespread. But what is required in all these forms of learning is simply changes in the patterns of your network, presumably just the synaptic connections, but maybe other details as well, like which neurotransmitter receptors are being expressed on the membranes, and whatever. But that's it; that's what it means to learn something.

So is there any way to implement those changes besides the old-fashioned way of sitting for a semester in a classroom or spending hours in the helicopter flight simulator? Well, there's been a lot of excitement about brain-machine interfaces, such as the brain electrodes that are implanted robotically by the company Neuralink. So I'll just take a quick moment to clarify the landscape of electrodes in the brain. Even though Neuralink hit the news recently, the first thing to note is that these brain-machine interfaces have been around for many decades, ever since people started inserting electrodes, which are just thin metal wires, into the brain. The idea is that you insert this electrode into the neural tissue and you listen to the electrical activity of the cells. And researchers pretty quickly figured out that if you send a little bit of electricity down the wire, down this electrode, you can stimulate the cell to make it active, where it pops off its own little electrical spikes that travel around.
So you put in some electricity, and off it goes. And this is the technology behind, for example, deep brain stimulation; you might have heard of this. Take Parkinson's disease. There's a tiny brain region called the subthalamic nucleus, and it was discovered, starting from work in the nineteen seventies, that you can insert your electrode into this area and zap it with a bit of electricity, and you get these amazing effects of the movement problems of Parkinson's essentially disappearing. And by the way, the reason you can stick an electrode into the brain is because the brain doesn't have any pain receptors, so you can just dunk the little metal wire right in there after you've opened a little portal in the skull.

So what's happening when you put these little bursts of electricity in is that the cells fire, which has effects on the rest of the network that those cells are connected to, and it also changes the electrical oscillations. And why this works so well in Parkinson's is still a bit of a mystery, but you get what you want out of it, and people have been using this sort of brain stimulation for all kinds of purposes. For example, my colleague Helen Mayberg puts electrodes directly into a very specific area near the cingulate gyrus, and she stimulates and can pull people out of deep clinical depression this way.

So there are many labs and clinics using the technique of stimulating individual cells in the brain, and the direction of the technology over the past couple of decades has been getting more and more electrodes implanted, so that you're not just hitting one or a few cells at the tip of the electrode, but you're instead exciting tens or hundreds or eventually thousands of cells by using a whole specific collection of electrodes. And companies like Neuralink have become famous in the public eye because of the idea of sewing these electrodes very finely into the brain and getting a thousand of them, and soon more than that.
And in all these cases, the electrodes can read and write; in other words, they can record the activity in the brain cells, but they can also stimulate the brain cells to put activity in there. So once you have the electrodes in there, could you just send in the right zaps of electricity in just the right pattern, spread over millions of neurons with precise timing, in such a way that you shape the network so that you can fly a helicopter?

Now, all that sounds pretty exciting as a theoretical possibility, but I think there are two major technical hurdles here to being able to stimulate lots and lots of cells in the brain in the way that you might want to upload helicopter instructions. The first is simply a physical challenge. The brain is very delicate, and so Mother Nature has surrounded it in the armored plating of the skull. So it's very, very hard to get at this fragile, delicate tissue of the brain, and so if you want to insert an electrode, you have to actually drill a small hole in the skull to expose the brain, and then you can put your electrode in. The difficulty is that there are eighty-six billion neurons, and at the moment, even with our fanciest technology, we can only get to, say, a thousand of these at any time, and so that is useless in terms of actually having access to the whole system. It would be equivalent to if you really wanted to say something to all eight billion people on the planet, but you only had one hundred followers on social media. The huge majority of the world will have no idea that you've ever said anything, or that you even exist. And that's the situation: when you zap a few hundred neurons, the other tens of billions of neurons don't even know that you're knocking on the door. So to actually insert information into the brain, you'd somehow need to access all, or at least most, of the neurons to make any meaningful change.
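To put rough numbers on that analogy, here is a quick back-of-the-envelope calculation; the electrode count is an assumed, order-of-magnitude figure rather than a spec for any particular device.

```python
total_neurons = 86_000_000_000   # ~86 billion neurons in a human brain
electrode_contacts = 1_000       # assumed order of magnitude for today's best implants

print(f"Fraction of neurons reached directly: {electrode_contacts / total_neurons:.1e}")
# ~1.2e-08, i.e. roughly one neuron in every hundred million

# The social-media analogy: 100 followers out of 8 billion people
print(f"Followers analogy:                    {100 / 8_000_000_000:.1e}")
# ~1.3e-08, a similarly tiny fraction
```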
Now, I'm not yet addressing how you would know what you want to change; I'll come back to that in a moment. Let's just imagine for now that you know exactly what you want to tweak in the brain. Now, I do think that in the future there may be a very different solution besides electrodes to this issue of manipulating the network, because I don't think the idea of dunking electrodes in there is ever going to be a long-term solution. When I squint into the future, I think the solution is something like nano robots.

So what are nano robots? The idea is that you use atomically precise 3D printing to make little molecular machines out of atoms. Essentially, you make little robots that carry out some functions, so they're like little robots, but they're microscopically small, built out of individual atoms, which, by the way, is what proteins are. Anyway, you could make these super durable, for example by printing them out of carbon, making them diamond robots. The idea, and this is probably not for several decades, the idea is that you swallow a pill with tens of billions of these little nano robots in there, and they float through your bloodstream, and you give them the right FedEx labels to pass the blood-brain barrier. And once they're in there in the brain, they wiggle their way into your neurons, where they can read the activity and they can cause the cell to spike, to fire a signal, whenever they need to. So with proper signaling between the nanobots, using, for example, mesh networking, you could in theory generate whatever patterns you needed to across the entire brain. And if your science is really advanced, then you hit the correct brain-wide patterns that will cement in the knowledge of how to fly a B-212.

Now, although this is not happening anytime soon, it certainly seems plausible that this could be in our future. But wait, there's actually a difficult twist to this story.
I said before there are two technical hurdles, and here comes the second. And that hurdle is that there won't be a single program for flying a B-212 helicopter. Why not? Because the brain inside each of us is totally unique. We each have a massive forest of eighty-six billion neurons, each with ten thousand connection points reaching out and interacting with other trees. And it's a living forest, such that each connection, every twig on every branch, finds its place in life based on the exact details of what you have seen and heard and experienced in your life. You were born in your hometown, with your family, your neighborhood, your culture, your moment in history. All those things determine the exact wiring of your brain. And your brain has a network that is different from his brain over there, and her brain over there, and everyone else's brain on the planet. And the exact wiring is what makes you you.

So in the proposed future of The Matrix, the operator Tank would have to specify that he wants a program to pilot a B-212 helicopter that is specified exactly for Trinity's brain, that is bespoke for her neural network only. And if Tank tried to upload the same program to Neo's brain or Morpheus's brain, who knows what that would result in. Because if the program alters the way that neuron nineteen million, three hundred fifty-six thousand, three hundred and two is talking to its neighbors, and it does this over a million other neurons with high specificity, that might teach Trinity how to fly a helicopter, but it certainly would not work for someone else whose brain is different.

So how do we get around that problem, the problem of everyone having a unique neural network? Well, the answer will have to rely on what is called system identification. This is an engineering approach where you have some complicated dynamic system and you measure lots of input-output pairs, as in: when I put this in, what happens?
Okay, now what happens if I put that in? So imagine you find a really complicated machine and you don't know exactly what it does. So you tap one of the keys and you see how it moves, and then you tap three of the keys at the same time and you look at what it does as its output, and then you hit a series of the keys and you see what results. And you do this over and over and over to try to figure out what is the structure under the hood.

This system identification approach is used in lots of fields. For example, in economics, let's say you want to figure out the guts of the stock market. So you take lots of inputs, like gross domestic product and inflation and unemployment and interest rates and blah blah blah, and you look at all these as inputs, and you look at the reaction of the market, and in this way you develop better and better mathematical models of what the machinery of the stock market is doing, even though you can't see it.

Okay, so the question is: could you do system identification on a human brain? No one's ever really done this, because there's no purpose for it now, but someday it might make sense. So the idea is you go into a super futuristic brain scanner and you get lots of inputs, and this sophisticated brain imaging device measures the outputs, in other words, which cells in your brain are responding. So you see a rapid series of images, and you hear words, and you feel touches on your body, and you smell smells, and you run through thousands or maybe millions of little micro experiences while your brain is getting measured.
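For anyone who wants the engineering idea spelled out, here is a minimal sketch of system identification on a made-up black box: probe it with many inputs, record the outputs, and fit a model of what is under the hood. A real brain is nonlinear and history-dependent, so treat this linear least-squares version purely as an illustration of the input-output logic; every name and number in it is invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# The hidden "machine" we are not allowed to open: a fixed linear mapping plus noise.
hidden_structure = rng.normal(size=(3, 10))      # 10 inputs -> 3 outputs (unknown to us)

def black_box(stimulus):
    """The system we can only probe from the outside."""
    return hidden_structure @ stimulus + rng.normal(scale=0.01, size=3)

# System identification: present many input patterns and record each response.
inputs = rng.normal(size=(5000, 10))               # thousands of "micro experiences"
outputs = np.array([black_box(x) for x in inputs]) # the measured responses

# Fit a model of the machinery from input-output pairs alone (ordinary least squares).
estimated, *_ = np.linalg.lstsq(inputs, outputs, rcond=None)

# The recovered structure closely matches the hidden one (error around the noise level).
print(np.max(np.abs(estimated.T - hidden_structure)))
```

The point is simply that, with enough input-output pairs, you can estimate structure you never see directly; the catch for a brain is that the number of pairs you would need, and the nonlinearity of the system, are astronomically beyond this toy case.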
And in theory, this is how a scientist could say, aha, Trinity's brain is organized like this, while Neo's brain is laid out like that, and Morpheus's brain has a slightly different pattern. And you might find that for teaching the operation of a B-212 helicopter, Trinity's brain thinks about it in analogy to riding a horse and controlling it, because, let's say, she grew up riding horses, while Neo's brain would learn the helicopter in analogy to the way a motorcycle feels, which is, let's say, how he grew up. And for Morpheus, the actions of piloting emerge from his deep knowledge of surfing, which is how he grew up and what is stored in his brain.

Now, it's not clear how many inputs you'd have to ping in there to get high enough resolution to make all the little changes you need, but presumably that would get figured out with enough experimentation. Okay, so let's say we as a society grow to a point where we can do system identification on an individual's brain and then use nanobots to upload knowledge of helicopter piloting. I need to emphasize that this is not right around the corner, but it certainly seems theoretically plausible. Another century of advancement, and suddenly the network that makes you you can get directed and shaped in a bespoke manner.

And if we come to a point where we can do it, that's possibly the biggest societal change I can imagine. You say to your three-year-old kid, okay, we're gonna upload first grade now. Great, now go play outside for an hour, and then we're gonna upload second grade after lunch. Imagine that by the end of the week, your three-year-old knows as much as a full professor does now. So what becomes of society and the way we run it now?
You may think the analogy here is to look at super smart, genius kids in our current world. But these kids often go off to attend college at twelve years old, and they very often end up lonely and socially misplaced, because really what they want is to play with their colleagues, other kids their age. But they get stuck with a bunch of older kids who have gone through puberty and are running deeply carved evolutionary programs that cause their brains to be taken over by sexuality, and that software hasn't yet turned on in the heads of these young geniuses. And as a result, they can't mesh with what is happening around them, and they can feel very lonely in these contexts. But the future scenario of uploading knowledge is totally different, because now every single kid can stay among colleagues.

But the question is, if education is uploaded, what do the kids do all day? Do they launch startups at the age of six? Do they write epic novels by the time they're eight years old? Do they return to reproducing as teenagers, like their distant ancestors did? And is it dangerous that they have all the knowledge of decades of schooling, but without the maturity? The most slowly developing part of the brain is the prefrontal cortex, and this underlies our ability to simulate possible futures and think about consequences. So imagine a kid with an undeveloped prefrontal cortex who has all the knowledge that Albert Einstein commanded at midlife. But this child lacks the ability to simulate consequences, so they think something like, wouldn't it be hilarious to build a small nuclear bomb and blow up my neighbor's porch? Or wouldn't it be a crackup to disrupt the presidential broadcast by hijacking the frequency and imposing a video of me twerking, or whatever? Because children don't yet have a fully developed prefrontal cortex, they can't simulate consequences the way an adult can, and this is why it could be dangerous to inject the knowledge of an adult into a child's brain.
Now, perhaps I'm being shortsighted here, and we could somehow upload maturity as well. We could figure out the learning that translates to morally complex situations and simulate those over and over, do the synaptic equivalent of working through the possibilities and feeling the consequences. Maybe you could massively speed up emotional learning that way. After all, as my father would always tell me, the wise person learns from experience, but the wiser person learns from the experience of others. So maybe there could be enough uploaded knowledge where a kid understands various possible scenarios and outcomes, and the good decision making simply results from a deep knowledge of previous examples, things that have happened to other people, all of which have been uploaded. So maybe the maturity problem could be taken care of, but still we're looking at massive societal shifts that would render our current civilization totally unrecognizable.

Now, we all like to be very thoughtful about the future, but it doesn't matter what we speculate about it, because we are guaranteed to be wrong. We can only envision what we're capable of, in this case a cartoonish version of a bunch of super intelligent kids running around while their parents go off to their jobs. But the world is likely to be very different by then. Presuming that everything is massively sped up by artificial intelligence, it seems very possible that society is going to evolve exponentially faster, at a pace that we really can't conceive of here in the first third of the twenty-first century. I mean, just imagine that AI knocks down scientific problems rapidly, such that we move from our current state of pretty widespread ignorance to perfect, wonderful models of everything in the cosmos. Just think about the incredibly slow pace between the Stone Age and the Bronze Age, and then the Bronze Age to the Silver Age. Now imagine this pace goes up by a thousandfold or a millionfold.
So we find ourselves a few decades from now in the Diamond Age, where we can manipulate carbon atoms however we like. And then a few years later we're past that and into a new era where we can entangle photons and find ourselves in the quantum age, and so on. Like everyone, I love to speculate about the future, but the truth is that it is impossible to picture what things will become, and how quickly. And I want to share an example.

Last month, here in Silicon Valley, I saw a black and white photograph from nineteen forty. It was a man on horseback ambling up a dirt road on a hillside, and there was nothing particularly special about this sandy hill with its scrub brush. So I was intrigued to read the caption and find out that this little dirt road was Sand Hill Road. Now, you may know that Sand Hill Road is nowadays a road almost as famous as Wall Street in New York. Sand Hill Road is where many of the world's most elite venture capitalists do their business. They invest hundreds of billions. This road is the mecca for startups who are seeking investment.

Now, the thing that was so striking to me is that for the horseman sauntering up this sandy hillside in nineteen forty, in the hot sun, there's no way he could have imagined that the lonely hoof prints he was leaving would in just sixty years mark this spot as one of the world's economic engines. And there's no way he could have envisioned what advances would get funded on that spot: the worldwide light-speed network that allows anyone on the planet to effortlessly communicate with anyone else, or rectangles that everyone would carry in their pocket like a handkerchief or a tobacco tin, but these rectangles would contain the accumulated knowledge of all humankind. Or satellites, or quantum computers, or blockchain cryptocurrencies, or large language models that could read every book ever written. None of these would be even vaguely imaginable to the cowboy in nineteen forty, moving slowly up the hill.
We are blind to the future. I often wish I could talk to whoever is listening to this historical podcast in the year twenty eighty-four, because the world will be so different by then, and I am incapable of imagining it. And it's not just that we are not being creative about extrapolating technology curves into the future. It's that there will be new technologies and novel sciences and new convergences that will make it intrinsically unpredictable. There will be serendipitous discoveries and socioeconomic changes and geopolitical events. While we always make guesses based on our current trends and research, the future is shaped by hundreds of things we just can't see. Not only that, but you've heard me speak before about our limited perspective, our inability to see past the fence line of what we already know. Our current knowledge and understanding are based on the technologies and paradigms that exist right now, so it's really hard for us to anticipate breakthroughs or paradigm shifts that are going to radically alter our society in the future. But this idea of putting information directly into the brain, that certainly seems like it could be a big shift.

So when we think about the future, it's more than just adults like us riding around on a spaceship with a robot or two. Things are guaranteed to be weirder than we expect. While brain uploads are science fiction right now, assuming we don't blow ourselves up, this seems like it will inevitably become science fact.

So let's wrap this up. This episode is not about what's going to happen anytime soon, but I think it is inevitably what will happen in the future. After all, the brain is made of billions of cells, each one of which is very complicated, and each is connected in very complicated patterns. But fundamentally, learning and memory take place in the changes of connectivity, and as far as we can tell, that's all learning is.
So what we talked about is the way that the jungle of neurons in your head is wired up differently than in your friend's head, because you have different genetic predispositions and, more importantly, you have different experiences in life. So in order to upload any changes into your network, we'd have to know your brain in exquisitely fine detail, and we'd have to know those patterns right now, because it's just a little bit different than it was yesterday. But in theory, if we had this information and understood the language of the connections, we could dial knobs here and there and in a million other spots, strengthening or weakening synaptic connections, tickling the genome to express a little more of a neurotransmitter receptor over here, a little less over there, and after that you might be able to suddenly possess some knowledge you didn't have before.

Now, obviously, society will have to be very careful about this technology when that century comes, because in theory you could use it to implant false memories, or to erase knowledge, or to do any number of nefarious things. So we will enter a very strange time, and like every technology, a whole raft of protections and legislation will grow up around it. Again, this is likely impossible to achieve in our generation because of the size of the problem. It would take about a zettabyte of information to store the detailed structure of one human brain, and that, by the way, would only tell you the structure of the forest of neurons, but wouldn't even tell you anything about their individual details, like which genes are getting expressed and which proteins are getting put where. So for us, the citizens of the twenty-first century, this is likely to be an unsolvably huge problem, to capture a detailed description of an individual brain. But as a species, we're in an interesting situation, because we can see that this is all coming, and we can speculate on the size of the changes this will have on society writ large.
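As a rough sanity check on that zettabyte figure, here is one way the back-of-the-envelope arithmetic could go. The neuron and synapse counts are the ones from this episode; the bytes-per-synapse values are pure assumptions for illustration, and the total swings by orders of magnitude depending on how much detail per connection you decide to store.

```python
neurons = 86e9                    # ~86 billion neurons
synapses_per_neuron = 1e4         # ~10,000 connections each
synapses = neurons * synapses_per_neuron   # ~8.6e14 synapses in total

# The big unknown: how many bytes to keep per connection. A bare wiring list needs
# only a few, while fine structural and molecular detail needs vastly more.
for bytes_per_synapse in (10, 1_000, 1_000_000):   # assumed values, for illustration
    total_bytes = synapses * bytes_per_synapse
    print(f"{bytes_per_synapse:>9} bytes/synapse -> {total_bytes:.1e} bytes "
          f"({total_bytes / 1e21:.5f} zettabytes)")
# Only the most detailed bookkeeping pushes the total toward the zettabyte scale.
```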
Now, what I find amazing is our guaranteed inability to correctly picture this future world, even though it will be populated by our own great-grandchildren. Given all this, I think the only specific prediction we can make is that we have more in common with our ancestors two million years ago than we do with our descendants two hundred years from now.

In the meantime, go to eagleman dot com slash podcast for more information and to find further reading. Send me an email at podcast at eagleman dot com with questions for discussion, and check out and subscribe to Inner Cosmos on YouTube for videos of each episode and to leave comments. Until next time, I'm David Eagleman, and this is Inner Cosmos.