1 00:00:05,840 --> 00:00:09,160 Speaker 1: How would we know if we're living in a simulation? 2 00:00:09,960 --> 00:00:12,360 Speaker 1: And what does that have to do with having a 3 00:00:12,520 --> 00:00:16,680 Speaker 1: dream that you're a butterfly or John Lennon or René 4 00:00:16,760 --> 00:00:21,040 Speaker 1: Descartes or freezing yourself in a vat of liquid nitrogen? 5 00:00:21,480 --> 00:00:24,040 Speaker 1: And how will we solve the problem that human bodies 6 00:00:24,120 --> 00:00:31,159 Speaker 1: can't do space travel? Welcome to Inner Cosmos with me, 7 00:00:31,320 --> 00:00:35,800 Speaker 1: David Eagleman. I'm a neuroscientist and an author at Stanford University, 8 00:00:36,000 --> 00:00:39,280 Speaker 1: and I've spent my career studying the intersection between how 9 00:00:39,320 --> 00:00:42,720 Speaker 1: the brain works and how we experience life. And what 10 00:00:42,760 --> 00:00:45,360 Speaker 1: I'm going to talk about today is the possibility of 11 00:00:45,880 --> 00:00:48,840 Speaker 1: living forever. What does that have to do with the brain? 12 00:00:49,360 --> 00:00:52,479 Speaker 1: Would we want to live forever? And how could we 13 00:00:52,560 --> 00:00:58,840 Speaker 1: get there? Humans have been obsessed with the idea of 14 00:00:59,040 --> 00:01:04,200 Speaker 1: immortality for ages. As far back as documented history goes, 15 00:01:04,720 --> 00:01:08,360 Speaker 1: there are stories of humans trying everything to unlock the 16 00:01:08,480 --> 00:01:14,520 Speaker 1: secret to eternal life.
In Arthurian literature, the Holy Grail 17 00:01:14,720 --> 00:01:18,119 Speaker 1: was said to have miraculous healing powers and it would 18 00:01:18,120 --> 00:01:22,400 Speaker 1: grant anyone who drank from it eternal youthfulness, and the 19 00:01:22,440 --> 00:01:27,520 Speaker 1: ancient Egyptians would painstakingly prepare the physical body for 20 00:01:27,680 --> 00:01:30,280 Speaker 1: the journey to the afterlife, and they believed you were 21 00:01:30,640 --> 00:01:35,520 Speaker 1: reborn again and again. And China's first emperor launched an 22 00:01:35,560 --> 00:01:39,480 Speaker 1: obsessive search for the elixir of life, and that obsession 23 00:01:39,720 --> 00:01:43,440 Speaker 1: continues in cultures all over the world today. While leaving 24 00:01:43,520 --> 00:01:46,880 Speaker 1: a legacy is the only thing close to immortality by 25 00:01:46,920 --> 00:01:50,920 Speaker 1: today's standards, Woody Allen wanted the real thing. He said, 26 00:01:51,320 --> 00:01:54,800 Speaker 1: I don't want to achieve immortality through my work. I 27 00:01:54,920 --> 00:01:59,320 Speaker 1: want to achieve immortality through not dying. He said, I 28 00:01:59,360 --> 00:02:01,600 Speaker 1: don't want to live on in the hearts of my countrymen. 29 00:02:02,000 --> 00:02:05,000 Speaker 1: I want to live on in my apartment. So what 30 00:02:05,280 --> 00:02:10,720 Speaker 1: are our options for achieving eternal life? Well, the first 31 00:02:10,760 --> 00:02:14,240 Speaker 1: option has to do with medical innovations, such that we 32 00:02:14,360 --> 00:02:18,240 Speaker 1: keep repairing things that go wrong with the body. And 33 00:02:18,280 --> 00:02:22,600 Speaker 1: in fact, it's rumored that life insurance companies assume in 34 00:02:22,639 --> 00:02:27,080 Speaker 1: their actuarial tables that children born now are going to 35 00:02:27,120 --> 00:02:30,200 Speaker 1: live to one hundred and fourteen years old.
And this 36 00:02:30,240 --> 00:02:35,359 Speaker 1: is based just on extrapolating medical innovation curves. In other words, 37 00:02:35,720 --> 00:02:39,280 Speaker 1: you take the pace of medical progress and you guess 38 00:02:39,360 --> 00:02:41,840 Speaker 1: what things are going to look like in a century 39 00:02:41,880 --> 00:02:45,440 Speaker 1: from now. But the fact is there are barriers to 40 00:02:45,680 --> 00:02:50,480 Speaker 1: physical immortality, like aging and disease, and then of course 41 00:02:50,560 --> 00:02:53,600 Speaker 1: there are accidents that can happen, like you fall off 42 00:02:53,600 --> 00:02:56,120 Speaker 1: a cliff, or you get in a car accident, or 43 00:02:56,120 --> 00:02:59,880 Speaker 1: a tornado gets you, or a volcano explodes, and on 44 00:03:00,080 --> 00:03:04,600 Speaker 1: and on. So although we can continue to make improvements 45 00:03:04,639 --> 00:03:09,240 Speaker 1: in medical science to expand the average lifespan, there's no 46 00:03:09,360 --> 00:03:12,919 Speaker 1: real guarantee that your lifespan is going to be any 47 00:03:12,960 --> 00:03:16,720 Speaker 1: better. Now, there's a second approach that some people have taken, 48 00:03:17,240 --> 00:03:20,560 Speaker 1: and that's about making a throw to the future, when 49 00:03:20,600 --> 00:03:22,760 Speaker 1: people will know things like 50 00:03:22,800 --> 00:03:24,119 Speaker 2: how to repair bodies. 51 00:03:24,560 --> 00:03:27,240 Speaker 1: The issue is that we just don't know how to 52 00:03:27,440 --> 00:03:31,080 Speaker 1: cure lots of problems now, but it's not impossible that 53 00:03:31,120 --> 00:03:33,760 Speaker 1: two hundred years from now a lot of this stuff 54 00:03:33,800 --> 00:03:36,800 Speaker 1: will be obvious. In two hundred years, we'll look back 55 00:03:36,800 --> 00:03:39,280 Speaker 1: at our textbooks and we'll think, my gosh, how did 56 00:03:39,320 --> 00:03:41,760 Speaker 1: we not know that?
How did we not know that 57 00:03:41,800 --> 00:03:43,760 Speaker 1: we can just reach in and tweak some things on 58 00:03:43,800 --> 00:03:47,640 Speaker 1: the telomeres and cure cancer easily with a few zaps 59 00:03:47,680 --> 00:03:52,560 Speaker 1: and essentially make ourselves live much longer, maybe forever. The 60 00:03:52,640 --> 00:03:54,960 Speaker 1: problem is that this is in the future and not 61 00:03:55,160 --> 00:03:57,720 Speaker 1: right now. And therefore we don't get any of the 62 00:03:57,760 --> 00:04:02,200 Speaker 1: benefits of that. So the idea is, perhaps you could 63 00:04:02,720 --> 00:04:07,640 Speaker 1: get there by pausing yourself now and rebooting yourself in that future. 64 00:04:08,320 --> 00:04:11,760 Speaker 1: And that's the approach of a company called Alcor. Alcor 65 00:04:11,880 --> 00:04:15,240 Speaker 1: is based in Arizona, and they have these giant silver 66 00:04:15,400 --> 00:04:20,600 Speaker 1: tanks called dewars that are filled with liquid nitrogen and 67 00:04:20,600 --> 00:04:23,520 Speaker 1: they're kept at a temperature of negative one hundred and 68 00:04:23,600 --> 00:04:27,719 Speaker 1: ninety six degrees Celsius. So when you're expiring on your 69 00:04:27,839 --> 00:04:32,040 Speaker 1: hospital bed, this company gets contacted either by your family 70 00:04:32,200 --> 00:04:34,680 Speaker 1: or because you're wearing a special bracelet, and as soon 71 00:04:34,720 --> 00:04:38,880 Speaker 1: as your heart stops, a team from Alcor rushes to 72 00:04:38,920 --> 00:04:43,400 Speaker 1: your bedside and starts the deep freezing process. Then they 73 00:04:43,400 --> 00:04:47,080 Speaker 1: transport your body to Arizona where they put it in 74 00:04:47,120 --> 00:04:50,520 Speaker 1: the giant vat with the liquid nitrogen. And you actually 75 00:04:50,600 --> 00:04:53,880 Speaker 1: have two choices here.
You can freeze your entire body, 76 00:04:54,520 --> 00:04:56,400 Speaker 1: or if you can't afford that, you can just have 77 00:04:56,520 --> 00:04:59,839 Speaker 1: your head removed and frozen. Because 78 00:05:00,000 --> 00:05:02,799 Speaker 1: while it's great to have your whole body, the real 79 00:05:03,000 --> 00:05:07,279 Speaker 1: representation of who you are is theoretically just kept in 80 00:05:07,320 --> 00:05:10,440 Speaker 1: the brain. Or suffice it to say, that's the densest 81 00:05:10,600 --> 00:05:12,240 Speaker 1: representation of who you are. 82 00:05:12,560 --> 00:05:14,119 Speaker 2: You just need the head. 83 00:05:14,680 --> 00:05:17,040 Speaker 1: Now, as a side note, you might wonder how much 84 00:05:17,120 --> 00:05:20,599 Speaker 1: this costs. Well, you don't actually have to pay anything upfront. Instead, 85 00:05:20,640 --> 00:05:24,400 Speaker 1: you can modify your life insurance policy to direct its 86 00:05:24,480 --> 00:05:28,200 Speaker 1: payout to the company. So this allows you the possibility 87 00:05:28,320 --> 00:05:32,520 Speaker 1: of living a second chapter. Now, if you're unfamiliar with 88 00:05:32,600 --> 00:05:34,919 Speaker 1: this slice of the world, it will blow your mind. 89 00:05:35,000 --> 00:05:38,560 Speaker 1: I visited Alcor and met the then CEO, Max 90 00:05:38,600 --> 00:05:41,200 Speaker 1: More, for my television show The Brain, and I got 91 00:05:41,240 --> 00:05:43,960 Speaker 1: to tour around the facility and it's quite striking.
There 92 00:05:43,960 --> 00:05:48,000 Speaker 1: are these giant silver vats of liquid nitrogen and they 93 00:05:48,040 --> 00:05:51,520 Speaker 1: currently have two hundred and eight people stored in there, 94 00:05:52,120 --> 00:05:56,320 Speaker 1: and the company has over fourteen hundred people who are 95 00:05:56,360 --> 00:05:58,640 Speaker 1: signed up for the service, but they haven't died yet 96 00:05:59,000 --> 00:06:02,040 Speaker 1: and they will eventually end up in these vats. Now, 97 00:06:02,080 --> 00:06:07,400 Speaker 1: although scientists don't currently know how to unfreeze and revivify 98 00:06:07,480 --> 00:06:11,200 Speaker 1: a body, the idea is that some generation in the 99 00:06:11,240 --> 00:06:14,840 Speaker 1: future will figure that out. Maybe that's a century from now, 100 00:06:14,880 --> 00:06:18,159 Speaker 1: or maybe that's a millennium. And then the idea is 101 00:06:18,160 --> 00:06:22,920 Speaker 1: that the frozen person will awaken into a new world. 102 00:06:23,600 --> 00:06:28,080 Speaker 1: So people who sign up for the service operate on hope. 103 00:06:28,240 --> 00:06:32,520 Speaker 1: They're making a deep football throw into the future, hoping 104 00:06:32,720 --> 00:06:36,120 Speaker 1: someone will be there to catch it, and some people 105 00:06:36,279 --> 00:06:40,680 Speaker 1: choose to cryopreserve their pets as well. The idea is 106 00:06:40,680 --> 00:06:44,559 Speaker 1: that if you awaken to a completely new reality, that's 107 00:06:44,720 --> 00:06:48,320 Speaker 1: likely to be disorienting, and if you have a familiar, 108 00:06:48,480 --> 00:06:51,320 Speaker 1: fuzzy face by your side, that could be really nice 109 00:06:51,320 --> 00:06:55,280 Speaker 1: and reassuring. Also, it turns out that Alcor allows you 110 00:06:55,320 --> 00:06:58,520 Speaker 1: to have a small memory box to go along with 111 00:06:58,680 --> 00:07:01,680 Speaker 1: your frozen body.
And the reason I mention this is 112 00:07:01,720 --> 00:07:05,000 Speaker 1: because sometimes people ask me to inscribe one of my 113 00:07:05,040 --> 00:07:08,400 Speaker 1: books to them. But I recently got a really interesting 114 00:07:08,440 --> 00:07:12,240 Speaker 1: request from a man who had just signed up for Alcor. 115 00:07:12,640 --> 00:07:14,760 Speaker 1: He said he'd been moved by my book The 116 00:07:14,800 --> 00:07:18,240 Speaker 1: Brain and wanted to keep that in his memory box 117 00:07:18,320 --> 00:07:21,280 Speaker 1: that goes along with the frozen body, so he'd have 118 00:07:21,400 --> 00:07:24,520 Speaker 1: it in his new world when he gets rebooted. So 119 00:07:24,640 --> 00:07:28,760 Speaker 1: he asked me to sign my book for his revivification 120 00:07:29,040 --> 00:07:32,120 Speaker 1: in a century or two. In other words, I would 121 00:07:32,600 --> 00:07:35,880 Speaker 1: sign the book to him for his re-entry into 122 00:07:35,920 --> 00:07:38,560 Speaker 1: the world sometime in the future. Now I've had a 123 00:07:38,600 --> 00:07:42,480 Speaker 1: lot of cool requests to sign books for special occasions, 124 00:07:42,760 --> 00:07:45,680 Speaker 1: but signing for someone who is possibly coming back in 125 00:07:45,720 --> 00:07:48,440 Speaker 1: a couple of centuries. That was a new one for me. 126 00:07:49,000 --> 00:07:53,680 Speaker 1: So here's what I wrote: Dear Steve, welcome back. I'm 127 00:07:53,720 --> 00:07:59,040 Speaker 1: so curious what secrets the world has divulged by now. Presumably 128 00:07:59,440 --> 00:08:03,480 Speaker 1: the textbooks have been thoroughly rewritten, and the current 129 00:08:03,600 --> 00:08:07,920 Speaker 1: understanding of science would intimidate and thrill a twenty-first 130 00:08:07,960 --> 00:08:12,520 Speaker 1: century mind. Presumably I am long gone and all that 131 00:08:12,560 --> 00:08:16,440 Speaker 1: remains of me are fading echoes of my genetic code.
132 00:08:17,200 --> 00:08:21,560 Speaker 1: But this book remains. It memorializes what we knew in 133 00:08:21,600 --> 00:08:25,760 Speaker 1: the early part of the millennium. Like all snapshots of science, 134 00:08:26,240 --> 00:08:30,720 Speaker 1: some notions will go stale, while others will prove more durable. 135 00:08:31,240 --> 00:08:35,920 Speaker 1: Whatever the case, these pages form a bridge. They allow 136 00:08:36,000 --> 00:08:40,400 Speaker 1: me to talk with you as though we still live contemporaneously. 137 00:08:41,040 --> 00:08:45,800 Speaker 1: They allow two brains to span an unknowably wide chasm 138 00:08:45,840 --> 00:08:46,360 Speaker 1: of time. 139 00:08:47,200 --> 00:08:48,800 Speaker 2: Enjoy the future, David. 140 00:08:51,000 --> 00:08:55,960 Speaker 1: So that's the method of freezing yourself until biology is 141 00:08:56,000 --> 00:08:59,440 Speaker 1: better understood. It's a Hail Mary throw, and you don't 142 00:08:59,440 --> 00:09:01,680 Speaker 1: know for sure that anyone's going to catch the ball. 143 00:09:02,040 --> 00:09:05,679 Speaker 1: For all you know, society will have collapsed, and maybe 144 00:09:05,720 --> 00:09:10,040 Speaker 1: your body is going to get cannibalized, or legislatively, something 145 00:09:10,080 --> 00:09:12,840 Speaker 1: might happen in the future where all these frozen bodies 146 00:09:13,200 --> 00:09:16,680 Speaker 1: don't get unfrozen, or we're long since blown up in 147 00:09:16,720 --> 00:09:19,400 Speaker 1: a nuclear war and there's nobody there to unfreeze you. 148 00:09:20,160 --> 00:09:22,640 Speaker 1: But people who sign up for this figure that at 149 00:09:22,720 --> 00:09:26,720 Speaker 1: least they have a non-zero chance of something happening here, 150 00:09:27,120 --> 00:09:30,360 Speaker 1: of getting to live a second chapter, and that's better 151 00:09:30,679 --> 00:09:34,360 Speaker 1: than having one hundred percent certainty that you're going to die.
152 00:09:35,000 --> 00:09:39,679 Speaker 1: But fundamentally, there's a problem that's probably intractable, which is 153 00:09:39,720 --> 00:09:43,760 Speaker 1: that we are made of biological pieces and parts, and 154 00:09:43,800 --> 00:09:46,760 Speaker 1: those things are going to wear down. Even if we're 155 00:09:46,800 --> 00:09:52,200 Speaker 1: able to successfully unfreeze people, the ticking clock of aging 156 00:09:52,240 --> 00:09:55,280 Speaker 1: will keep ticking, and we're always going to be racing 157 00:09:55,320 --> 00:09:59,920 Speaker 1: against entropy. As the poet William Butler Yeats said, things 158 00:10:00,200 --> 00:10:03,959 Speaker 1: fall apart, the center cannot hold, and this is what 159 00:10:04,080 --> 00:10:07,120 Speaker 1: aging and death are. And that brings us to a 160 00:10:07,240 --> 00:10:10,880 Speaker 1: question of how we might solve this for the long term, 161 00:10:10,960 --> 00:10:15,280 Speaker 1: forever, by just extracting the part of us that matters: 162 00:10:15,960 --> 00:10:21,640 Speaker 1: our consciousness. So could it be possible someday to upload 163 00:10:22,040 --> 00:10:26,000 Speaker 1: our consciousness out of our brains and onto a different 164 00:10:26,080 --> 00:10:30,120 Speaker 1: substrate like silicon and run it, and would that still 165 00:10:30,160 --> 00:10:30,440 Speaker 1: be you? 166 00:10:31,000 --> 00:10:32,240 Speaker 2: So here's the idea. 167 00:10:32,480 --> 00:10:35,680 Speaker 1: Maybe the hardware of the brain, all these cells, that's 168 00:10:35,720 --> 00:10:39,600 Speaker 1: not the important part. But instead it's the software of 169 00:10:39,640 --> 00:10:42,880 Speaker 1: the brain that matters. In other words, the algorithms that 170 00:10:42,920 --> 00:10:46,840 Speaker 1: are running. Maybe that's what makes a mind.
Maybe it's 171 00:10:46,880 --> 00:10:50,760 Speaker 1: the patterns by which these spikes, these zeros and ones 172 00:10:50,880 --> 00:10:51,600 Speaker 1: are running. 173 00:10:51,960 --> 00:10:53,800 Speaker 2: And if we could extract those 174 00:10:53,679 --> 00:10:58,679 Speaker 1: massive patterns and reproduce them on something different, maybe that 175 00:10:58,720 --> 00:10:59,760 Speaker 1: would be you. 176 00:11:00,840 --> 00:11:04,120 Speaker 2: This idea is called computational equivalence. 177 00:11:04,640 --> 00:11:07,320 Speaker 1: The idea is that the eighty-six billion cells of 178 00:11:07,360 --> 00:11:11,320 Speaker 1: the brain are just running algorithms that have been sussed 179 00:11:11,360 --> 00:11:15,880 Speaker 1: out by Mother Nature over billions of years. But fundamentally, 180 00:11:16,440 --> 00:11:19,720 Speaker 1: it's just code that's being run. And so the idea 181 00:11:19,800 --> 00:11:23,400 Speaker 1: of computational equivalence is that it doesn't matter if you 182 00:11:23,520 --> 00:11:28,280 Speaker 1: reproduce this system out of Legos or Tinkertoys or 183 00:11:28,400 --> 00:11:33,520 Speaker 1: ball bearings or silicon. If the system runs the same program, 184 00:11:33,960 --> 00:11:35,920 Speaker 1: you will get the same consciousness. 185 00:11:35,960 --> 00:11:37,280 Speaker 2: You will get you.
186 00:11:38,080 --> 00:11:41,920 Speaker 1: Now, we don't know if the theory of computational equivalence 187 00:11:41,960 --> 00:11:45,240 Speaker 1: is correct, but if it is, it implies that we 188 00:11:45,280 --> 00:11:50,600 Speaker 1: could shift ourselves off the degrading biological substrate of our bodies, 189 00:11:51,200 --> 00:11:55,720 Speaker 1: and with powerful enough computers simulating all these neural interactions, 190 00:11:55,760 --> 00:12:02,040 Speaker 1: we could upload our consciousness and exist digitally, circumventing the 191 00:12:02,240 --> 00:12:07,040 Speaker 1: inevitability of demise. That would be the single biggest leap 192 00:12:07,120 --> 00:12:10,320 Speaker 1: in the history of our species, launching us into an 193 00:12:10,360 --> 00:12:14,960 Speaker 1: era of transhumanism. Just imagine what it could look like 194 00:12:15,080 --> 00:12:18,440 Speaker 1: to leave your body behind and wake up in a 195 00:12:18,760 --> 00:12:23,800 Speaker 1: simulated world. Your simulated existence could look like anything you wanted. 196 00:12:24,240 --> 00:12:28,679 Speaker 2: Programmers could make any virtual world for you. If you've 197 00:12:28,679 --> 00:12:30,719 Speaker 2: ever wanted to fly 198 00:12:30,960 --> 00:12:35,120 Speaker 1: or breathe underwater, or inhabit an alien planet, you just 199 00:12:35,480 --> 00:12:38,560 Speaker 1: sign a contract for it, you pay, and that becomes 200 00:12:38,600 --> 00:12:42,800 Speaker 1: your new existence. It's your simulation, and your reality is 201 00:12:43,000 --> 00:12:46,280 Speaker 1: up to you. And as a side note, we could 202 00:12:46,320 --> 00:12:50,400 Speaker 1: in theory run virtual brains as fast or slow as 203 00:12:50,440 --> 00:12:54,520 Speaker 1: we wanted, so in seconds of computing time, you could 204 00:12:54,520 --> 00:12:57,960 Speaker 1: have thousands of years of experience.
In fact, even if 205 00:12:58,000 --> 00:13:01,800 Speaker 1: the cosmos were coming to an end tomorrow, the programmers 206 00:13:01,840 --> 00:13:04,559 Speaker 1: could simulate a billion more years for you 207 00:13:04,559 --> 00:13:05,200 Speaker 2: in that time. 208 00:13:07,960 --> 00:13:11,760 Speaker 1: So all that sounds great, but there are some technical 209 00:13:11,920 --> 00:13:17,359 Speaker 1: and theoretical hurdles. The first one is the enormous difficulty 210 00:13:17,480 --> 00:13:20,760 Speaker 1: of figuring out the secrets of the brain. And this 211 00:13:20,840 --> 00:13:23,400 Speaker 1: is a problem that shouldn't be underestimated. 212 00:13:23,960 --> 00:13:24,880 Speaker 2: As much as 213 00:13:24,760 --> 00:13:27,720 Speaker 1: we've discovered about how the brain works, we are a 214 00:13:28,000 --> 00:13:31,560 Speaker 1: long way off from understanding the big picture. It's a 215 00:13:31,760 --> 00:13:47,120 Speaker 1: very tough problem. You've heard me mention before on this 216 00:13:47,200 --> 00:13:50,880 Speaker 1: podcast that the brain is the most complex device we 217 00:13:50,920 --> 00:13:54,640 Speaker 1: have ever discovered. It has almost one hundred billion neurons. 218 00:13:55,000 --> 00:13:58,160 Speaker 1: Each of those is as complicated as the city of 219 00:13:58,200 --> 00:14:02,440 Speaker 1: New York, and you've got detailed connections between them that 220 00:14:02,559 --> 00:14:06,720 Speaker 1: number in the hundreds of trillions. So figuring out the 221 00:14:06,880 --> 00:14:09,880 Speaker 1: algorithms running in the system, or even just scratching the 222 00:14:09,920 --> 00:14:13,840 Speaker 1: surface of those is something that still, for the most 223 00:14:13,640 --> 00:14:16,120 Speaker 2: part, has completely eluded us. 224 00:14:16,880 --> 00:14:19,800 Speaker 1: Consider this experiment that a couple of my colleagues did 225 00:14:19,840 --> 00:14:23,120 Speaker 1: in twenty seventeen.
They asked, what would happen if we 226 00:14:23,240 --> 00:14:29,440 Speaker 1: used our best neuroscience approaches to understand a simple microprocessor, 227 00:14:29,480 --> 00:14:31,840 Speaker 1: just a computer chip. So if the brain is really 228 00:14:31,960 --> 00:14:35,280 Speaker 1: like a giant, complex computer chip, let's see how far 229 00:14:35,360 --> 00:14:38,240 Speaker 1: we can get by analyzing this very simple one. So 230 00:14:38,280 --> 00:14:42,680 Speaker 1: they picked an Atari microprocessor from nineteen eighty-one that 231 00:14:42,840 --> 00:14:46,800 Speaker 1: ran Donkey Kong. And it turns out that after measuring 232 00:14:47,240 --> 00:14:50,440 Speaker 1: all the input and output signals from the chip and 233 00:14:50,560 --> 00:14:53,720 Speaker 1: applying the type of analyses that we have in neuroscience, 234 00:14:54,360 --> 00:14:57,920 Speaker 1: they couldn't say much of anything about the function of 235 00:14:58,000 --> 00:15:00,520 Speaker 1: the chip. I mean, note that they already knew 236 00:15:00,600 --> 00:15:04,040 Speaker 1: what the chip accomplished, and they know how chips work. 237 00:15:04,560 --> 00:15:06,760 Speaker 1: But when you're looking at a ton of zeros and 238 00:15:06,840 --> 00:15:11,760 Speaker 1: ones streaming in and streaming out, it's really difficult to say, Okay, 239 00:15:11,800 --> 00:15:15,000 Speaker 1: here are the algorithms that the chip is implementing. As the 240 00:15:15,360 --> 00:15:18,480 Speaker 1: researchers put it in that paper, quote: In the case 241 00:15:18,520 --> 00:15:22,000 Speaker 1: of the processor, we know its function and structure, and 242 00:15:22,440 --> 00:15:26,160 Speaker 1: our results stayed well short of what we would call 243 00:15:26,520 --> 00:15:30,560 Speaker 1: a satisfying understanding. So figuring out what the brain is 244 00:15:30,600 --> 00:15:33,360 Speaker 1: implementing may turn out to be really hard.
245 00:15:33,880 --> 00:15:35,240 Speaker 2: So can we copy the brain? 246 00:15:35,440 --> 00:15:39,480 Speaker 1: Well, we're nowhere close to copying the human brain because 247 00:15:39,560 --> 00:15:43,600 Speaker 1: the detailed structure of it is so enormous. It's probably 248 00:15:44,000 --> 00:15:47,760 Speaker 1: a zettabyte of data, which is maybe around a tenth 249 00:15:48,080 --> 00:15:51,320 Speaker 1: of the computational capacity of our planet right now. But 250 00:15:51,400 --> 00:15:55,680 Speaker 1: this is fundamentally just a technology challenge, and the way 251 00:15:55,720 --> 00:15:58,200 Speaker 1: that things are going, we will probably get there at 252 00:15:58,240 --> 00:16:01,120 Speaker 1: some point. It may not happen in our lifetime, but 253 00:16:01,120 --> 00:16:04,640 Speaker 1: it's essentially guaranteed to happen in the future as our 254 00:16:04,720 --> 00:16:07,239 Speaker 1: computational power increases. 255 00:16:07,600 --> 00:16:09,160 Speaker 2: Okay, but this brings us 256 00:16:09,160 --> 00:16:13,040 Speaker 1: to a tougher hurdle, which might be a theoretical one, 257 00:16:13,720 --> 00:16:17,320 Speaker 1: which is that there may be physics happening in the brain 258 00:16:18,040 --> 00:16:20,920 Speaker 1: that doesn't make it as easy as copy-pasting a 259 00:16:20,960 --> 00:16:21,840 Speaker 1: giant document. 260 00:16:22,360 --> 00:16:24,040 Speaker 2: For example, the 261 00:16:24,000 --> 00:16:28,800 Speaker 1: brain might be exploiting quantum mechanical effects, and if so, 262 00:16:29,040 --> 00:16:31,560 Speaker 1: that means we can't pretend the brain is just a 263 00:16:31,560 --> 00:16:36,000 Speaker 1: big clockwork machine. Quantum mechanics is the science of the 264 00:16:36,200 --> 00:16:40,000 Speaker 1: very small. It explains the behavior of subatomic particles and 265 00:16:40,040 --> 00:16:42,360 Speaker 1: how they interact with each other and with light.
The 266 00:16:42,440 --> 00:16:46,600 Speaker 1: thing is that it's extraordinarily counterintuitive, and it's not really 267 00:16:46,680 --> 00:16:50,800 Speaker 1: anything like classical physics, and we have no idea how 268 00:16:50,840 --> 00:16:51,520 Speaker 1: to build a 269 00:16:51,560 --> 00:16:54,240 Speaker 2: quantum system the size of the brain. 270 00:16:54,480 --> 00:16:58,080 Speaker 1: Now I'll just mention there are many scientists, like Roger Penrose, 271 00:16:58,080 --> 00:17:01,200 Speaker 1: who recently won the Nobel Prize, who suggest that there 272 00:17:01,320 --> 00:17:04,800 Speaker 1: might be quantum mechanical effects in the brain, and there 273 00:17:04,800 --> 00:17:06,919 Speaker 1: are many more scientists on the other side of the 274 00:17:06,960 --> 00:17:10,000 Speaker 1: debate who say they think there aren't quantum effects in 275 00:17:10,040 --> 00:17:12,040 Speaker 1: the brain, but I just want to note that we 276 00:17:12,000 --> 00:17:13,240 Speaker 2: don't actually know. 277 00:17:14,240 --> 00:17:16,919 Speaker 1: Some people make fun of the quantum mechanics approach and 278 00:17:16,960 --> 00:17:20,359 Speaker 1: they say something like, well, quantum mechanics is mysterious and 279 00:17:20,400 --> 00:17:22,960 Speaker 1: the brain is mysterious, so maybe they're the same thing. 280 00:17:23,040 --> 00:17:24,159 Speaker 2: Ha ha, Okay. 281 00:17:24,400 --> 00:17:28,879 Speaker 1: Skepticism is always warranted until something is demonstrated to be true. 282 00:17:29,520 --> 00:17:32,719 Speaker 1: But science doesn't rule things in or out in advance 283 00:17:32,880 --> 00:17:36,200 Speaker 1: of having enough data. So when it comes to quantum mechanics, 284 00:17:36,240 --> 00:17:38,919 Speaker 1: we don't actually know yet.
Part of the problem that 285 00:17:38,960 --> 00:17:42,800 Speaker 1: people have accepting quantum mechanics is that it's poorly understood 286 00:17:42,880 --> 00:17:46,240 Speaker 1: and it's relatively new for us humans. But remember that 287 00:17:46,320 --> 00:17:50,240 Speaker 1: for Mother Nature, quantum mechanics has been around from the beginning. 288 00:17:50,600 --> 00:17:52,719 Speaker 1: And one thing that's clear is that if there's something 289 00:17:52,800 --> 00:17:56,720 Speaker 1: Mother Nature can take advantage of, she will. So maybe 290 00:17:56,800 --> 00:17:58,919 Speaker 1: quantum mechanics has nothing at all to do with the 291 00:17:58,920 --> 00:18:01,239 Speaker 1: brain or everything to do with the brain, but we 292 00:18:01,320 --> 00:18:04,160 Speaker 1: can't rule it out yet. And if it turns out 293 00:18:04,200 --> 00:18:08,760 Speaker 1: that the brain is not so straightforward but requires quantum 294 00:18:08,800 --> 00:18:12,840 Speaker 1: effects, or whatever we discover next century, like super hyper 295 00:18:12,920 --> 00:18:16,080 Speaker 1: quantum X effects, then this path of learning how to 296 00:18:16,160 --> 00:18:16,960 Speaker 1: reproduce the 297 00:18:16,920 --> 00:18:18,959 Speaker 2: brain might take much longer. 298 00:18:19,000 --> 00:18:22,160 Speaker 1: Maybe we'll need quantum computers or something we haven't even 299 00:18:22,240 --> 00:18:24,919 Speaker 1: thought of yet. And I want to mention a third 300 00:18:25,119 --> 00:18:29,800 Speaker 1: technical hurdle to successfully uploading our consciousness, which is that 301 00:18:29,840 --> 00:18:33,840 Speaker 1: the simulated brain has to be able to modify its 302 00:18:33,880 --> 00:18:37,480 Speaker 1: own structure. This is what's known as brain plasticity. 303 00:18:37,560 --> 00:18:40,080 Speaker 1: Some of my other episodes have been about this.
The 304 00:18:40,119 --> 00:18:44,840 Speaker 1: activity that runs through a brain modifies the brain. 305 00:18:45,320 --> 00:18:46,480 Speaker 2: It changes the brain. 306 00:18:46,520 --> 00:18:50,240 Speaker 1: That's why you have memory, for example, because everything you 307 00:18:50,359 --> 00:18:54,280 Speaker 1: experience is actually changing the physical structure of your brain, 308 00:18:54,400 --> 00:18:58,400 Speaker 1: so that activity runs through it differently next time. When 309 00:18:58,400 --> 00:19:00,760 Speaker 1: you learn that the name of this podcast is 310 00:19:00,840 --> 00:19:05,120 Speaker 1: Inner Cosmos, that physically changes the structure of your brain 311 00:19:05,200 --> 00:19:07,560 Speaker 1: so that when someone asks you, hey, what was the 312 00:19:07,640 --> 00:19:11,760 Speaker 1: name of that podcast, your brain can retrieve that information. 313 00:19:12,520 --> 00:19:16,440 Speaker 1: So we need to simulate not only the detailed structure 314 00:19:16,520 --> 00:19:18,960 Speaker 1: of the brain to run the software, but we also 315 00:19:19,000 --> 00:19:23,040 Speaker 1: need to understand the physics of the ongoing interactions and 316 00:19:23,119 --> 00:19:26,159 Speaker 1: how they change. For example, in the brain, you have 317 00:19:26,280 --> 00:19:30,040 Speaker 1: the activity of transcription factors that travel to the nucleus 318 00:19:30,080 --> 00:19:33,280 Speaker 1: and change gene expression. You have dynamic changes in the 319 00:19:33,320 --> 00:19:37,680 Speaker 1: location and strength of the synapses, the connections between neurons, 320 00:19:37,960 --> 00:19:43,520 Speaker 1: and so on. And unless your simulated experiences change the 321 00:19:43,600 --> 00:19:48,840 Speaker 1: structure of your simulated brain, you couldn't form new memories. 322 00:19:49,160 --> 00:19:52,880 Speaker 1: You'd have no sense of the passage of time.
Your 323 00:19:52,880 --> 00:19:57,840 Speaker 1: consciousness would be frozen at whatever point it was uploaded 324 00:19:57,920 --> 00:20:01,439 Speaker 1: into the simulation. Under those circumstances, would there be 325 00:20:01,600 --> 00:20:05,800 Speaker 1: any point to immortality? Okay, so let's imagine that two 326 00:20:05,920 --> 00:20:09,240 Speaker 1: hundred years go by and we manage to surmount all 327 00:20:09,320 --> 00:20:10,560 Speaker 1: those technical hurdles. 328 00:20:10,600 --> 00:20:12,440 Speaker 2: We have enough storage 329 00:20:11,960 --> 00:20:15,119 Speaker 1: capacity, we figure out any quantum effects, we make it 330 00:20:15,160 --> 00:20:19,280 Speaker 1: so that the simulation self-modifies based on its experience. 331 00:20:19,520 --> 00:20:23,480 Speaker 1: So great, we're there, and in this future world, uploading 332 00:20:23,480 --> 00:20:27,360 Speaker 1: our consciousness would be possible. Now what would that mean 333 00:20:27,480 --> 00:20:31,160 Speaker 1: for the human species? Well, among other things, it would 334 00:20:31,240 --> 00:20:35,119 Speaker 1: open up the possibility of getting to other solar systems. 335 00:20:35,280 --> 00:20:38,320 Speaker 1: There are at least one hundred billion other galaxies in 336 00:20:38,359 --> 00:20:42,120 Speaker 1: our cosmos, each of which contains one hundred billion stars. 337 00:20:42,640 --> 00:20:47,520 Speaker 1: We've already spotted thousands of exoplanets orbiting those stars. There 338 00:20:47,560 --> 00:20:50,800 Speaker 1: are planets that are like Earth in some way, and 339 00:20:50,880 --> 00:20:52,800 Speaker 1: some of those have conditions quite 340 00:20:52,600 --> 00:20:53,200 Speaker 2: like the Earth.
341 00:20:53,480 --> 00:20:57,200 Speaker 1: But the impossibility lies in the fact that our current 342 00:20:57,760 --> 00:21:03,320 Speaker 1: fleshy bodies will never get to those exoplanets, because there's 343 00:21:03,400 --> 00:21:05,400 Speaker 1: just no real way that we're going to be able 344 00:21:05,480 --> 00:21:09,280 Speaker 1: to travel those kinds of distances in space and in time. 345 00:21:09,760 --> 00:21:12,800 Speaker 2: But uploading would allow us 346 00:21:12,960 --> 00:21:16,760 Speaker 1: to transfer our minds into bodies that are built for 347 00:21:16,840 --> 00:21:21,000 Speaker 1: space travel, and that way we could travel between stars 348 00:21:21,040 --> 00:21:27,600 Speaker 1: and between galaxies with a human mind and a titanium body. Also, 349 00:21:27,880 --> 00:21:32,359 Speaker 1: note that with a simulated mind, you could pause the simulation, 350 00:21:32,920 --> 00:21:35,679 Speaker 1: you could shoot it out into space and reboot it 351 00:21:35,760 --> 00:21:38,880 Speaker 1: a thousand years later when it arrives at a planet, 352 00:21:39,600 --> 00:21:42,120 Speaker 1: so it would seem to your consciousness that you were 353 00:21:42,160 --> 00:21:45,359 Speaker 1: on Earth, you had a launch, and then you instantly 354 00:21:45,440 --> 00:21:48,960 Speaker 1: found yourself on a new planet. In other words, if 355 00:21:48,960 --> 00:21:51,960 Speaker 1: you could upload your brain into silicon, this would be 356 00:21:52,080 --> 00:21:56,720 Speaker 1: equivalent to the physics dream of finding a wormhole that 357 00:21:56,800 --> 00:21:59,159 Speaker 1: lets you get from one part of the universe to 358 00:21:59,240 --> 00:22:05,920 Speaker 1: another in a subjective instant.
Okay, so we've established that 359 00:22:06,000 --> 00:22:09,520 Speaker 1: if the algorithms are the important part of what makes 360 00:22:09,520 --> 00:22:13,560 Speaker 1: you who you are, rather than the biological physical stuff, 361 00:22:14,160 --> 00:22:17,200 Speaker 1: then it's a possibility we'll someday be able to copy 362 00:22:17,240 --> 00:22:20,920 Speaker 1: our brains and upload them onto silicon and run them there. 363 00:22:21,720 --> 00:22:25,000 Speaker 1: But there's an important question here, is it really you? 364 00:22:26,119 --> 00:22:28,160 Speaker 1: I was thinking about this the other day because Paul 365 00:22:28,280 --> 00:22:33,200 Speaker 1: McCartney announced an upcoming song with vocals by John Lennon 366 00:22:33,680 --> 00:22:37,640 Speaker 1: thanks to AI that brings Lennon's voice back to life. 367 00:22:38,280 --> 00:22:41,680 Speaker 1: But even though John Lennon is out there singing new 368 00:22:41,880 --> 00:22:46,240 Speaker 1: songs now and everyone's talking about his immortality, he doesn't 369 00:22:46,280 --> 00:22:50,200 Speaker 1: get to enjoy it. John Lennon doesn't know that he's 370 00:22:50,240 --> 00:22:53,439 Speaker 1: been brought back. It's just zeros and ones running on 371 00:22:53,480 --> 00:22:57,359 Speaker 1: a computer. John Lennon doesn't get anything out of that. 372 00:22:58,400 --> 00:22:59,520 Speaker 2: So even if there's 373 00:22:59,320 --> 00:23:03,719 Speaker 1: a complete simulation of you, is it 374 00:23:03,840 --> 00:23:07,560 Speaker 1: really you? Or is it just zeros and ones? Well, 375 00:23:07,640 --> 00:23:11,359 Speaker 1: I think this could be argued either way.
For example, 376 00:23:11,880 --> 00:23:14,399 Speaker 1: every night when you go to sleep, it's like you 377 00:23:14,440 --> 00:23:18,600 Speaker 1: are turning off, and then the consciousness that awakens on 378 00:23:18,640 --> 00:23:22,320 Speaker 1: your pillow in the morning inherits all of your memories 379 00:23:22,640 --> 00:23:25,439 Speaker 1: and we say, yeah, I'm the same person who climbed 380 00:23:25,480 --> 00:23:27,959 Speaker 1: into this bed last night. I turned off, and then 381 00:23:28,000 --> 00:23:30,720 Speaker 1: I turn on again and it's me and I'm getting 382 00:23:30,760 --> 00:23:34,800 Speaker 1: back to business. So maybe the process of transferring from 383 00:23:34,840 --> 00:23:39,000 Speaker 1: your physical body to a computer is just like that 384 00:23:39,160 --> 00:23:42,439 Speaker 1: where you open your eyes in the simulated world and 385 00:23:42,480 --> 00:23:44,119 Speaker 1: you think, cool, here I am. 386 00:23:44,280 --> 00:23:45,640 Speaker 2: Let's get back to business. 387 00:23:45,840 --> 00:23:48,520 Speaker 1: But there's another way to look at this too, which 388 00:23:48,560 --> 00:23:52,399 Speaker 1: is that possibly when some company scans your brain and 389 00:23:52,440 --> 00:23:56,120 Speaker 1: then uploads you into the computer, that's not you at all. 390 00:23:56,280 --> 00:24:02,080 Speaker 1: That's just a computer program that's running. That program happens 391 00:24:02,119 --> 00:24:06,160 Speaker 1: to feel confident that it's you. It has all your 392 00:24:06,240 --> 00:24:10,000 Speaker 1: memories and beliefs, and thinks that it was just there 393 00:24:10,040 --> 00:24:14,160 Speaker 1: standing outside the computer in your body, but its existence 394 00:24:14,320 --> 00:24:18,160 Speaker 1: inside the computer doesn't help you at all. 
Let's imagine 395 00:24:18,160 --> 00:24:21,240 Speaker 1: you just paid a million bucks to this immortality company 396 00:24:21,560 --> 00:24:23,679 Speaker 1: and they tell you, hey, it worked. You see this 397 00:24:23,760 --> 00:24:27,360 Speaker 1: little coordinate moving around on the screen. That's you. You're 398 00:24:27,400 --> 00:24:31,640 Speaker 1: living forever now. And then you leave the immortality company 399 00:24:31,800 --> 00:24:34,359 Speaker 1: and you drive home and sit on the couch and 400 00:24:35,000 --> 00:24:36,199 Speaker 1: your life is no different. 401 00:24:36,760 --> 00:24:38,679 Speaker 2: You are still heading towards death. 402 00:24:39,080 --> 00:24:41,040 Speaker 1: It's just that you know you've paid all this money 403 00:24:41,119 --> 00:24:43,919 Speaker 1: so that some computer simulation has a good afterlife. 404 00:24:44,359 --> 00:24:45,399 Speaker 2: But is that really you? 405 00:24:45,920 --> 00:24:49,119 Speaker 1: Did you gain anything out of paying this one million dollars? 406 00:24:49,520 --> 00:24:52,560 Speaker 1: The actual situation is that there are now two of you. 407 00:24:52,720 --> 00:24:55,600 Speaker 1: It's not like your consciousness is split or something, because 408 00:24:56,000 --> 00:25:00,639 Speaker 1: you immediately move off on different trajectories. With each new experience, 409 00:25:01,160 --> 00:25:04,920 Speaker 1: your brain and the computer simulation's brain are becoming different, 410 00:25:05,040 --> 00:25:09,240 Speaker 1: so it really is like two separate beings. Now, interestingly, 411 00:25:09,800 --> 00:25:14,200 Speaker 1: there's a philosophical question here about the timing. If the 412 00:25:14,280 --> 00:25:17,280 Speaker 1: company uploads a copy of your brain to the computer 413 00:25:17,680 --> 00:25:20,480 Speaker 1: and you go home, then it definitely seems like you 414 00:25:20,560 --> 00:25:22,280 Speaker 1: have not achieved immortality. 
415 00:25:22,720 --> 00:25:23,240 Speaker 2: But if the 416 00:25:23,200 --> 00:25:27,280 Speaker 1: company kills you and turns on the computer one second later, 417 00:25:28,080 --> 00:25:32,440 Speaker 1: then it's like a transfer. You've gone from being inside 418 00:25:32,480 --> 00:25:35,560 Speaker 1: your body to being inside the virtual world. It's like 419 00:25:35,680 --> 00:25:38,960 Speaker 1: waking up on your pillow. Now you may well say, yeah, 420 00:25:38,960 --> 00:25:43,760 Speaker 1: but I'm not actually sure that's me inside the virtual world. 421 00:25:43,800 --> 00:25:45,800 Speaker 2: It's like a recreation of me. 422 00:25:46,280 --> 00:25:48,880 Speaker 1: But I'm dead. I just got killed by the company. 423 00:25:49,320 --> 00:25:52,400 Speaker 1: But again, this is the situation when we turn off 424 00:25:52,440 --> 00:25:55,399 Speaker 1: at nighttime and wake up again in the morning. And 425 00:25:55,440 --> 00:25:59,240 Speaker 1: you can ask the same question about Captain Kirk beaming 426 00:25:59,320 --> 00:26:02,960 Speaker 1: himself up in Star Trek. One moment he's standing on 427 00:26:03,000 --> 00:26:06,560 Speaker 1: the surface of the planet, and then he gets completely disintegrated, 428 00:26:06,760 --> 00:26:10,040 Speaker 1: and then he gets reconstituted inside the ship. 429 00:26:10,400 --> 00:26:12,600 Speaker 2: But is that really him now 430 00:26:12,760 --> 00:26:15,919 Speaker 1: inside the ship, or was he actually killed on the 431 00:26:15,960 --> 00:26:20,200 Speaker 1: surface of the planet, torn apart into his constituent atoms 432 00:26:20,720 --> 00:26:25,040 Speaker 1: and some identical version of his structure gets rebuilt, but 433 00:26:25,119 --> 00:26:28,440 Speaker 1: it's a new creature, it's not really him? These are 434 00:26:28,440 --> 00:26:33,719 Speaker 1: all thorny questions that philosophers and neuroscientists wrestle with.
And 435 00:26:33,760 --> 00:26:36,480 Speaker 1: there are versions of these questions, like does it matter 436 00:26:36,520 --> 00:26:39,639 Speaker 1: if you capture just the structure of the atoms that 437 00:26:39,720 --> 00:26:43,560 Speaker 1: make up Captain Kirk and reproduce that structure with new 438 00:26:43,600 --> 00:26:47,720 Speaker 1: atoms in the spaceship or whether you take his actual 439 00:26:47,800 --> 00:26:51,560 Speaker 1: atoms and push those through space and rebuild him from 440 00:26:51,640 --> 00:26:55,560 Speaker 1: his original atoms? Does it make any difference? Did he 441 00:26:55,680 --> 00:26:58,040 Speaker 1: die in either case and it's just a rebuild of him. 442 00:26:58,320 --> 00:27:01,120 Speaker 1: And by the way, returning to this question of timing, 443 00:27:02,080 --> 00:27:05,960 Speaker 1: if the company kills you one second before turning on 444 00:27:06,040 --> 00:27:09,960 Speaker 1: your simulation, they can call that a transfer. But if 445 00:27:10,000 --> 00:27:13,840 Speaker 1: they kill you one second after turning on your simulation, 446 00:27:14,600 --> 00:27:18,680 Speaker 1: then it's murder because you have an independent existence from 447 00:27:18,680 --> 00:27:21,080 Speaker 1: that computer program and they have just taken that away 448 00:27:21,080 --> 00:27:21,399 Speaker 1: from you. 449 00:27:21,720 --> 00:27:23,280 Speaker 2: So the timing matters. 450 00:27:23,320 --> 00:27:24,960 Speaker 1: And what you can see is that these are all 451 00:27:25,080 --> 00:27:43,480 Speaker 1: tough philosophical problems. Now we've been conjecturing over whether you 452 00:27:43,600 --> 00:27:48,800 Speaker 1: could reproduce consciousness. But it's of course a possibility that 453 00:27:48,920 --> 00:27:51,800 Speaker 1: all of this conjecture is not conjecture at all. 454 00:27:52,000 --> 00:27:53,280 Speaker 2: Maybe we have had. 
455 00:27:53,080 --> 00:27:57,920 Speaker 1: these conversations a thousand years ago and already figured out 456 00:27:57,960 --> 00:28:03,120 Speaker 1: the key to successfully transfer our consciousness into a simulation, 457 00:28:03,600 --> 00:28:07,720 Speaker 1: and the idea is that our reality is actually already 458 00:28:07,760 --> 00:28:11,240 Speaker 1: a simulation. Now, thinking about these issues is not 459 00:28:11,320 --> 00:28:15,360 Speaker 1: a new idea. Two thousand three hundred years ago, the Chinese 460 00:28:15,440 --> 00:28:20,919 Speaker 1: philosopher Zhuangzi wrote that he once quote dreamt, I 461 00:28:21,119 --> 00:28:26,120 Speaker 1: was a butterfly, fluttering hither and thither, to all intents 462 00:28:26,160 --> 00:28:30,920 Speaker 1: and purposes a butterfly. I was conscious only of following 463 00:28:30,960 --> 00:28:35,480 Speaker 1: my fancies as a butterfly, and was unconscious of my 464 00:28:35,720 --> 00:28:40,880 Speaker 1: individuality as a man. Suddenly I awoke, and there I 465 00:28:41,000 --> 00:28:45,800 Speaker 1: lay, myself again. Now I do not know whether I 466 00:28:45,960 --> 00:28:49,960 Speaker 1: was then a man dreaming I was a butterfly, or 467 00:28:50,000 --> 00:28:53,840 Speaker 1: whether I am now a butterfly dreaming that I am 468 00:28:53,880 --> 00:28:58,280 Speaker 1: a man. And what this illustrates is the difficulty of 469 00:28:58,440 --> 00:29:03,239 Speaker 1: knowing precisely what reality we're in. The French philosopher René 470 00:29:03,320 --> 00:29:06,720 Speaker 1: Descartes wrestled with a different version of the same problem. 471 00:29:07,160 --> 00:29:10,760 Speaker 1: He wondered, how could we ever know if what we 472 00:29:10,920 --> 00:29:16,320 Speaker 1: experience is the real reality. So he proposed a thought experiment. 473 00:29:16,800 --> 00:29:19,360 Speaker 1: He asked, how do I know that I'm not a 474 00:29:19,440 --> 00:29:24,160 Speaker 1: brain in a vat?
Maybe some scientists are stimulating that 475 00:29:24,280 --> 00:29:26,560 Speaker 1: brain in just the right way to make me believe 476 00:29:27,160 --> 00:29:30,080 Speaker 1: that I'm here and that I'm eating this delicious food 477 00:29:30,120 --> 00:29:34,400 Speaker 1: and seeing these stunning colors, and I'm listening to this podcast. 478 00:29:34,720 --> 00:29:38,200 Speaker 1: And Descartes concluded there might not be any way to know. 479 00:29:38,960 --> 00:29:43,160 Speaker 1: But he also realized something else. There's some me at 480 00:29:43,200 --> 00:29:45,920 Speaker 1: the center trying to figure all this out. So whether 481 00:29:46,160 --> 00:29:49,840 Speaker 1: or not I'm a brain in a simulation, I'm pondering 482 00:29:50,080 --> 00:29:54,080 Speaker 1: the problem. I'm thinking about this, and therefore there is 483 00:29:54,120 --> 00:30:01,280 Speaker 1: some I that exists. Je pense, donc je suis. I think, therefore 484 00:30:01,480 --> 00:30:05,560 Speaker 1: I am, irrespective of whether I understand precisely what that 485 00:30:05,760 --> 00:30:09,360 Speaker 1: I is. Now, the modern version of the brain in 486 00:30:09,400 --> 00:30:12,040 Speaker 1: a vat question is how do I 487 00:30:12,040 --> 00:30:14,680 Speaker 2: know if I'm living in a computer simulation? 488 00:30:15,600 --> 00:30:19,520 Speaker 1: And in fact, some philosophers like Nick Bostrom have suggested 489 00:30:19,560 --> 00:30:22,320 Speaker 1: that it is more likely that we are 490 00:30:22,400 --> 00:30:26,800 Speaker 1: in a simulation than not.
His argument is that once 491 00:30:26,840 --> 00:30:30,760 Speaker 1: it becomes possible to create a computer simulation of reality, 492 00:30:31,360 --> 00:30:35,760 Speaker 1: then it's likely that many such simulations would be created, 493 00:30:36,160 --> 00:30:39,680 Speaker 1: and in this scenario, it is more probable that we 494 00:30:39,760 --> 00:30:42,760 Speaker 1: are living in one of these simulations rather than the 495 00:30:43,200 --> 00:30:47,680 Speaker 1: real reality. And in fact, there are several philosophical arguments 496 00:30:47,680 --> 00:30:50,920 Speaker 1: that have been put forward to support this idea. One 497 00:30:51,000 --> 00:30:55,320 Speaker 1: argument is that the universe appears to be exquisitely finely 498 00:30:55,400 --> 00:30:59,440 Speaker 1: tuned for life, with the laws of physics being very specific, 499 00:31:00,040 --> 00:31:04,560 Speaker 1: which suggests they might have been designed by an intelligent being. 500 00:31:04,600 --> 00:31:07,280 Speaker 1: And while some people use this as an argument for 501 00:31:07,440 --> 00:31:12,160 Speaker 1: religious creationism, some philosophers use it to argue that we are 502 00:31:12,240 --> 00:31:17,520 Speaker 1: perhaps the creation of a very convincing virtual reality, of 503 00:31:17,560 --> 00:31:18,520 Speaker 1: a simulation. 504 00:31:19,080 --> 00:31:20,920 Speaker 2: The question of whether we are 505 00:31:20,800 --> 00:31:24,920 Speaker 1: living in a simulation seems impossible, at least at the moment, 506 00:31:25,040 --> 00:31:29,680 Speaker 1: to address scientifically, but it certainly seems 507 00:31:29,680 --> 00:31:33,560 Speaker 1: like a possibility. I mean, we already know how easily 508 00:31:33,680 --> 00:31:38,160 Speaker 1: we can get fooled into accepting our reality.
Every night 509 00:31:38,200 --> 00:31:42,640 Speaker 1: we fall asleep and we have these bizarre dreams, and 510 00:31:42,720 --> 00:31:46,160 Speaker 1: while we're there, we believe in those worlds entirely, and 511 00:31:46,200 --> 00:31:48,480 Speaker 1: then we wake up and we think, oh, that wasn't 512 00:31:48,520 --> 00:31:49,560 Speaker 1: actually the real world. 513 00:31:49,600 --> 00:31:51,640 Speaker 2: Now I'm in the real world. So we know that 514 00:31:51,680 --> 00:31:53,600 Speaker 2: we are completely capable of 515 00:31:53,560 --> 00:31:57,840 Speaker 1: being in simulations and believing them entirely. In other words, 516 00:31:58,240 --> 00:32:02,760 Speaker 1: the mere existence of dreams may be sufficient evidence that 517 00:32:02,840 --> 00:32:06,600 Speaker 1: it is possible we are living in a simulation. Now, 518 00:32:06,680 --> 00:32:10,160 Speaker 1: if we are living in a simulation, could we escape 519 00:32:10,160 --> 00:32:12,760 Speaker 1: from it like they do in The Matrix? Or are 520 00:32:12,800 --> 00:32:16,000 Speaker 1: we trapped in it? Would we have any power to 521 00:32:16,160 --> 00:32:19,880 Speaker 1: change the simulation? What's the purpose of the simulation? Who 522 00:32:19,920 --> 00:32:23,800 Speaker 1: exactly created the simulation? Okay, so those are tough questions 523 00:32:23,840 --> 00:32:26,120 Speaker 1: and we really have no way of tackling them, but 524 00:32:26,160 --> 00:32:28,160 Speaker 1: I want to pile on one more. 525 00:32:28,440 --> 00:32:29,880 Speaker 2: I think there's an open 526 00:32:29,680 --> 00:32:35,200 Speaker 1: question for us about the usefulness of immortality. Would you 527 00:32:35,520 --> 00:32:38,560 Speaker 1: actually want to live forever? And let's say, with your 528 00:32:38,640 --> 00:32:42,040 Speaker 1: uploaded brain on a new substrate, you don't even require sleep.
529 00:32:42,560 --> 00:32:46,240 Speaker 1: So for four hundred years you're looking for ways to 530 00:32:46,360 --> 00:32:47,520 Speaker 1: occupy yourself. 531 00:32:48,160 --> 00:32:49,520 Speaker 2: Time can be painful. 532 00:32:50,160 --> 00:32:54,640 Speaker 1: Imagine for hundreds of years you're looking for entertainment like 533 00:32:54,680 --> 00:32:57,680 Speaker 1: the best new series on whatever the streamers are at 534 00:32:57,680 --> 00:33:01,120 Speaker 1: that point, or you're scrolling through your social media with 535 00:33:01,280 --> 00:33:04,920 Speaker 1: infinite scroll and it really is close to infinite. Maybe 536 00:33:04,920 --> 00:33:08,520 Speaker 1: you actually reach the end of the Internet. Is there 537 00:33:08,560 --> 00:33:12,640 Speaker 1: a time when you say, Okay, it's been four hundred 538 00:33:12,680 --> 00:33:15,560 Speaker 1: and seventy years, I am ready to wrap this up now. 539 00:33:15,720 --> 00:33:18,440 Speaker 1: So I'll give you a sense of this from a 540 00:33:18,800 --> 00:33:21,760 Speaker 1: short story that I wrote in my book Sum. So in 541 00:33:21,800 --> 00:33:25,400 Speaker 1: this story, you become a famous medical visionary, and here's 542 00:33:25,400 --> 00:33:29,080 Speaker 1: how it goes. You argue that there's no such thing 543 00:33:29,120 --> 00:33:32,240 Speaker 1: as a natural death, and you raise millions to fund 544 00:33:32,280 --> 00:33:36,640 Speaker 1: your research. You program computers to calculate all possible mutations 545 00:33:36,680 --> 00:33:40,760 Speaker 1: of viruses before they happen, and you design prophylactic treatments 546 00:33:40,800 --> 00:33:44,880 Speaker 1: against them. You compute the exact effects of every medication 547 00:33:45,080 --> 00:33:48,960 Speaker 1: on the normal cycles of the body. Your aggressive anti-death 548 00:33:49,040 --> 00:33:53,040 Speaker 1: program is a success.
After the final breath of 549 00:33:53,200 --> 00:33:57,440 Speaker 1: an incurably ill elderly woman, you are able to announce 550 00:33:57,840 --> 00:34:03,480 Speaker 1: that hers represented the last natural death. Great celebrations ensue. 551 00:34:03,960 --> 00:34:06,760 Speaker 1: People begin to live forever, healing just as they would 552 00:34:06,800 --> 00:34:10,040 Speaker 1: when they were young, free at last from the overhanging 553 00:34:10,200 --> 00:34:16,000 Speaker 1: cloud of mortality. You are greatly admired, but eventually your 554 00:34:16,080 --> 00:34:20,320 Speaker 1: success begins to lose its shine. People come to discover 555 00:34:20,440 --> 00:34:23,040 Speaker 1: that the end of death is the end of motivation. 556 00:34:23,440 --> 00:34:26,319 Speaker 1: Too much life, it turns out, is the opiate of 557 00:34:26,400 --> 00:34:31,200 Speaker 1: the masses. There's a noticeable decline in accomplishment. People take 558 00:34:31,239 --> 00:34:35,400 Speaker 1: more naps. There's no great rush. In an attempt to 559 00:34:35,520 --> 00:34:39,920 Speaker 1: salvage their once dynamic lives, people begin to set suicide 560 00:34:40,000 --> 00:34:43,080 Speaker 1: dates for themselves. It is a welcome echo of the 561 00:34:43,120 --> 00:34:47,120 Speaker 1: old days of finite life spans, but superior because of 562 00:34:47,160 --> 00:34:50,960 Speaker 1: the opportunity to say goodbye and complete your estate planning. 563 00:34:51,320 --> 00:34:54,920 Speaker 1: That works well for a while, rekindling the incentive to 564 00:34:55,000 --> 00:34:58,920 Speaker 1: live strongly, but eventually people begin to take the system 565 00:34:58,960 --> 00:35:02,560 Speaker 1: with less than the appropriate seriousness, and if some large 566 00:35:02,600 --> 00:35:06,120 Speaker 1: new development occurs, such as a new relationship, they simply 567 00:35:06,360 --> 00:35:11,600 Speaker 1: postpone the suicide date.
Whole cadres of procrastinators grow. When 568 00:35:11,640 --> 00:35:15,600 Speaker 1: they reschedule a new date, others ridicule them by calling 569 00:35:15,640 --> 00:35:19,759 Speaker 1: it a death threat. There develops enormous social pressure to 570 00:35:19,840 --> 00:35:23,719 Speaker 1: follow through with the suicides. At long last, after many 571 00:35:23,760 --> 00:35:27,000 Speaker 1: abuses of the system, it is legislated that there's no 572 00:35:27,280 --> 00:35:31,400 Speaker 1: changing a preset death date. But eventually it comes to 573 00:35:31,440 --> 00:35:35,279 Speaker 1: be appreciated that not just the finitude of life, but 574 00:35:35,440 --> 00:35:39,800 Speaker 1: also the surprise timing of death is critical to motivation, 575 00:35:40,680 --> 00:35:44,000 Speaker 1: so people begin to set ranges for their death dates. 576 00:35:44,440 --> 00:35:48,319 Speaker 1: In this new framework, their friends throw surprise parties for 577 00:35:48,400 --> 00:35:51,520 Speaker 1: them like birthday parties, except they jump out from behind 578 00:35:51,560 --> 00:35:54,160 Speaker 1: the couch and kill them. Since you never know when 579 00:35:54,200 --> 00:35:58,080 Speaker 1: your friends are going to schedule your party, it reinstills 580 00:35:58,200 --> 00:36:03,080 Speaker 1: the carpe diem attitude of former years. Unfortunately, people begin 581 00:36:03,200 --> 00:36:07,320 Speaker 1: to abuse the surprise party system to extinguish their enemies 582 00:36:07,440 --> 00:36:11,640 Speaker 1: under the protection of necro-legislation. In the end, great 583 00:36:11,680 --> 00:36:15,279 Speaker 1: masses of rioters break into your medical complex, kick the 584 00:36:15,320 --> 00:36:18,640 Speaker 1: plugs out of the computers, and once again have a 585 00:36:18,680 --> 00:36:24,239 Speaker 1: great celebration to mark the end of the last unnatural life.
So, 586 00:36:24,360 --> 00:36:29,120 Speaker 1: although there's been a millennia-long reach for immortality, I 587 00:36:29,160 --> 00:36:31,759 Speaker 1: think it's worth exploring this question of would it be 588 00:36:31,800 --> 00:36:32,120 Speaker 1: worth it? 589 00:36:32,520 --> 00:36:34,160 Speaker 2: So let's wrap this up for today. 590 00:36:34,560 --> 00:36:37,520 Speaker 1: In the coming years, we are going to discover more 591 00:36:37,600 --> 00:36:40,719 Speaker 1: about the human brain than we can describe with our 592 00:36:40,760 --> 00:36:44,239 Speaker 1: current theories and frameworks. And at the moment we are 593 00:36:44,280 --> 00:36:47,640 Speaker 1: surrounded with mysteries, many that we recognize and many we 594 00:36:47,760 --> 00:36:51,320 Speaker 1: haven't even yet registered as a field. We have vast 595 00:36:51,520 --> 00:36:55,000 Speaker 1: uncharted waters ahead of us. As always in science, the 596 00:36:55,040 --> 00:36:57,960 Speaker 1: important thing is to run the experiments and assess the results. 597 00:36:58,120 --> 00:37:01,240 Speaker 1: Some of the approaches are going to be blind alleys 598 00:37:01,239 --> 00:37:03,440 Speaker 1: and others are going to move us farther down the 599 00:37:03,520 --> 00:37:08,640 Speaker 1: road of understanding the blueprints of our own minds and consciousness. 600 00:37:09,280 --> 00:37:11,000 Speaker 2: But one thing is certain, which 601 00:37:10,760 --> 00:37:13,799 Speaker 1: is that our species is just at the beginning of 602 00:37:13,840 --> 00:37:16,759 Speaker 1: something and we don't fully know what it is and 603 00:37:16,800 --> 00:37:21,200 Speaker 1: where it's going.
We are at an unprecedented moment in history, 604 00:37:21,719 --> 00:37:26,520 Speaker 1: one in which brain science and technology are co-evolving, 605 00:37:26,640 --> 00:37:31,840 Speaker 1: and what happens at this intersection is poised to change 606 00:37:32,120 --> 00:37:35,920 Speaker 1: who we are and how we think about life and immortality. 607 00:37:36,239 --> 00:37:40,120 Speaker 1: For thousands of generations, humans have lived the same life 608 00:37:40,200 --> 00:37:44,240 Speaker 1: cycle over and over. We're born, we control a fragile body, 609 00:37:44,640 --> 00:37:48,120 Speaker 1: we enjoy a small strip of sensory reality, and then 610 00:37:48,160 --> 00:37:51,640 Speaker 1: we die. And science might give us the tools to 611 00:37:51,880 --> 00:37:56,640 Speaker 1: transcend that evolutionary story, because we can now hack our 612 00:37:56,640 --> 00:38:00,879 Speaker 1: own hardware, and as a result, our brains don't need 613 00:38:00,920 --> 00:38:04,880 Speaker 1: to remain as we have inherited them. If we are 614 00:38:04,920 --> 00:38:09,319 Speaker 1: able to upload our consciousness, eventually we're gonna be able 615 00:38:09,360 --> 00:38:13,600 Speaker 1: to shed our physical forms altogether. So our species is 616 00:38:13,880 --> 00:38:17,520 Speaker 1: just now discovering the tools to shape our own destiny, 617 00:38:18,000 --> 00:38:25,960 Speaker 1: and who we become is yet to be imagined. To 618 00:38:26,040 --> 00:38:28,480 Speaker 1: find out more and to share your thoughts, head over 619 00:38:28,520 --> 00:38:32,359 Speaker 1: to eagleman dot com slash podcasts and send me an 620 00:38:32,400 --> 00:38:37,399 Speaker 1: email at podcast at eagleman dot com with questions or discussions, 621 00:38:37,600 --> 00:38:39,680 Speaker 1: and I'll be making an episode soon in which I 622 00:38:39,719 --> 00:38:43,439 Speaker 1: address those.
Until next time, I'm David Eagleman, and this 623 00:38:43,719 --> 00:38:44,920 Speaker 1: is Inner Cosmos.