Welcome to Tech Stuff, a production from iHeartRadio.

Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It's time for a Tech Stuff Classic episode. This episode originally published back in July two thousand fifteen. It is titled Who Wants to Live Forever? It's really about digital immortality, and the great Josh Clark of Stuff You Should Know joins the show on this episode. Hope you enjoy.

I have a feeling the title of this episode will be Who Wants to Live Forever?, because I'm a big fan of Queen. Yeah, Who Wants to Live Forever, that came from the soundtrack to the hit film Highlander. Oh, cult classic. Did Queen do the whole soundtrack? They did Highlander and Flash Gordon. They did? I did not know that. And I saw Highlander the other day and I was like, this does not hold up. No, the movie is one of those that I wish we could just wipe from history and redo, because the concept is amazing. Yeah, but that's not what we're gonna talk about. Although there are immortals in Highlander. Yeah, yeah, there are. Yeah, I mean, that's the connection, right? I guess so. Or Queen, your love of Queen, and Queen doing the Highlander soundtrack. Yeah, ultimately it all goes back to, uh, Tesla. No, we're gonna be talking about digital immortality, this concept of using technology to extend our lifespans indefinitely. Yeah, to immortality. Yeah, to the point where essentially, until the sun burns out, right, and the great heat death of the universe. Yeah, I mean, yeah, because in theory, if you had digital immortality, there's nothing stopping you from hopping on a spaceship and hightailing it somewhere else, you know, or being transmitted at near the speed of light. Yep. Yeah, you could be beamed from one point to another. Sure. I wonder what that experience would be like. Well, maybe that's the future of space travel.
Physical space travel as digital beings rather than... Maybe that's the thing. What we keep banging up against is the physical limitations, and then that will finally unbridle us and allow us to really do, like, interstellar travel, intergalactic travel. Though presumably you would have to have something you're beaming into. Well, if you're just purely digital, then you have to have something to house that information. I mean, I guess you could just be literally information beaming around, but I don't know how. I wonder what that experience would be like. I wonder if it would be like going to sleep. Think about a laser. The laser doesn't have any sort of infrastructure, it's just beaming, and you're transmitting light, information, from one place to another at the speed of light. Right. Well, what if we figured out a way to digitize ourselves, as we'll talk about, um, and we were able to beam ourselves in much the same way that a laser beams light? Right. But the question is then, because if we are digitizing ourselves, we're usually talking about that with the understanding that that digital information rests on top of some physical architecture, right, just as software needs hardware to run off of. Right, you need, like, fiber optics. Now, yeah, I'm saying, what if you remove that? If we figure out a way to remove it, then there's no... Yeah. If you can get to a point where we become pure information and there is no need for physical infrastructure beneath that, then we're golden. There's no limit. Then I guess we would need some sort of receptacle to beam into on the other end, even if we don't have something connecting the two points. That's what I was kind of thinking. If you're just gonna send someone ahead, like, all right, Bill, it's your job to set up all these, these CPUs. Bill, do not let us down. Make sure they're all plugged in, and please use one of those uninterruptible power supplies, because if there's a blackout, we don't want, you know... We lost Lucy, and she never made it over.
Please don't smoke while you're setting them up, Bill, because we could smell it last time. It stunk up the whole place. Right, right. So, to get down to what we're actually talking about, you probably picked up on this: the idea of digital immortality largely revolves around this concept of somehow transferring human consciousness and experience into a digital format. Usually the way we describe it is uploading your brain into a computer. That's kind of the easiest way to explain it. And there are a lot of really smart people who have been talking about this possibility beyond saying it's hypothetical, saying it will be possible or it will happen. A lot of people strut around like they're just cock of the walk, saying it's going to happen, and sometimes they even put, like, dates on things like this. Oh yeah, you know, the guy we've got to talk about would be, at the least, Ray Kurzweil, uh, famous for his futurism predictions, including the idea that we will reach what is called the singularity. That's the point at which technology is evolving so quickly that there is no meaningful way to describe the present, because it's changing that fast. And the way I always think about singularities, usually it's also the moment where, um, one of two things has just happened. Either, um, an AI has awakened and become conscious, right, and therefore it is now the master of the universe as far as we're concerned, or it's the moment biology merges with technology at a point where we're able to, um, remove ourselves from the limitations of evolution and chart our own course from that point on. Yeah, that's pretty accurate. I would argue that there's also, uh, there's the possibility of developing, um, technology that allows us to genetically alter ourselves without having to directly incorporate, like, computers or electronics into our systems. That can also be... It's transhumanism, is what we're talking about here. We're like right there.
Yeah, it's kind of happening, like, very crudely, but we're like right there. As far as that last definition, yeah, yeah, we're there. Well, even with the incorporation of technology, we're getting there. You look at things like cochlear implants, and while this technology is specifically meant to give people who have either lost or never developed a particular, uh, sense, or maybe some other form of neurological process... you know, right now it's meant to address that. In the future, it could be meant to augment, not just to repair damage or to address a loss of something. Right. Like, the defining characteristic of transhumanism is that, um, you don't want a blade prosthetic leg because the one you were born with was removed. You want a blade prosthetic leg because you want to be able to run faster. Right. It's not to make up for a loss, it's to go further. It's to go to the next step. Exactly. So, uh, this singularity idea is very closely related to digital immortality, and largely because of Ray Kurzweil, because, as it turns out, I think it's fair to say Ray Kurzweil has an issue with the concept of mortality. Yeah, I was wondering, like, I don't know that much about Kurzweil. I mean, I'm slightly familiar, but you clearly know a lot more about him than I do. And I was wondering if he is, like, a fretful Fannie. Like, does he constantly worry about misstepping and dying? You know, people die in really weird, random, mundane ways every day. Yeah. And I wonder if he just lives in literal mortal fear of that. Well, he is certainly taking great precautions to extend his life, because he does believe firmly that we will reach this point, in which technology will allow us to extend our lifespans indefinitely, within his lifetime, if he takes care of himself. So he is determined, he doesn't...
I mean, you would kind of feel like a, like a doofus, if you, you know, if you were capable of feeling, if you died the day before they invented digital immortality. Right, it's like the last guy to die in a war, right before the ceasefire. Right. Yeah, there's a great, um... Have you ever seen... there's a British sketch comedy show called That Mitchell and Webb Look. You've told me about it, though. Yeah, it's two comedians, uh, David Mitchell and Robert Webb, who do this series. And one of the bits they have is just supposed to be an off-the-cuff conversation between the two, so it's not in the context of a sketch, so basically like what we're doing, kind of what we're doing now, except it's obviously scripted and ours is not. But in that case, they have a conversation where David Mitchell is very upset with the thought that his generation is going to be the last generation to die, and he is spiteful of the next generation. He's mad at them for being able to live forever while he has to die. Webb is like, you could just be happy for them. Same sort of thing, I think, with Kurzweil: he's, um, taking great pains to take care of himself. He's an advocate for a healthy diet and exercise, which is fantastic. He takes something like a hundred and fifty dietary supplements. I'm going to have to correct you, and this is from the article that you wrote. Yeah, two fifty. On top of the advice, he is constantly taking pills, you know. Yeah. So that's, uh... And there are plenty of studies that have suggested that unless you are suffering a deficiency of some sort, these supplements are not actually helpful. Well, um, it's kind of like, um, vitamin A. I believe vitamin A, uh, is known to help you see better. Pretty sure it's vitamin A, um.
And it's been shown that if your night vision, especially, is a little deficient, then if you eat some carrots, your night vision will improve. So carrots do help you. But if it's already up to whatever your baseline night vision level is, you can eat all the carrots in the world and it's not going to help it. As a matter of fact, you will turn orange. My wife turned a little orange because she liked carrots so much when she was a kid. But she couldn't see any better beyond her baseline night vision level. And so I think it's the same thing as what you're saying. Same thing with vitamin C, right. Once you hit a certain level of vitamin C, anything beyond that, you're just going to pee away. And in fact, vitamins can become toxic. Too much of anything is toxic to the human body, because it seeks homeostasis. Right. So I'm wondering if Kurzweil... well, surely he's smart enough to know, like, maybe I should cut this one out, or maybe I'm taking too much of this. Well, it's also possible that the reported number of supplements that he takes has been, you know, exaggerated as it's been reported over and over again. I am personally a little skeptical that he takes that many. But at any rate, the whole point is that he wants to make certain to live long enough to see the day when his prediction comes true, that we will have the technological ability to port a person's mind into some kind of electronic construct.

We'll be back with more of this classic episode of Tech Stuff after this quick break.

I have, just while you were speaking, pulled out my calculator, and Ray Kurzweil takes a pill every five point seven six minutes, assuming he stays up all twenty-four hours in a day, and assuming again that that number of supplements is in fact accurate. Not that I completely trust your math. Let's talk a little bit about some of the concepts here, about how this could in theory happen.
Now, obviously, we are not at the point where we can create any kind of hardware and/or software that would allow us, uh, to migrate an intelligence out of our meaty brains, right. And that's a huge problem, what you just said. We are dealing in something called software and hardware, when the substrate that our brains and consciousness exist on is what you would term wetware. Yeah, biological material. Yeah. And wetware is not necessarily analogous to a computer. Even though people tend to think of the brain as such, that doesn't mean that it is the same thing. That's absolutely correct. I mean, let's take memory for an example. Memory is a great way to illustrate the difference between a computer system and the brain. All right. So in a computer system, you end up designating a certain space on some medium, like on magnetic tape, or, you know, it all depends on whatever the form is that you're saving it to. But at any rate, it all ends up being zeros and ones, and it is unaltered. If you call up a file, and you know you haven't done anything to it since the last time you looked at it, it's going to be exactly the same there, unless there's some sort of corruption in the file, or you have made changes to it and then saved it again. It's gonna be the same experience every time. Human memory, totally different. A memory is... and we only sort of understand memory, uh, we don't have a full grasp on how memory works. But based upon what we do know, when you experience something, your brain creates a certain neural pathway in response to the stimuli you are experiencing. So, for example, right here in this room, my brain is taking in your hair, the heat in this room, the light in this room, little things, and I'm not noticing everything. That hair does look pretty good, doesn't it? Again, a stark contrast with all the rest of the experience. It is amazing hair.
So these pathways are forming in my brain. Later on, assuming that I have converted this particular experience to long-term memory, which is a pretty big assumption... honestly, I can't remember a podcast I recorded about two weeks ago. Yeah, I think my hair is going to make it into your long-term memory. Well, the more you say it, the more likely it's gonna happen. When I think back on it, my brain will reconstruct that same pathway. So the memory is essentially representative in the physical relationship between the various synapses that light up when I have this experience. Right. So there is a physical pathway that is retraced when you recall it. Yeah, right. Yeah. But it's not like your memory of how great my hair looks is sitting in one little spot of your brain like it would be on a computer's magnetic tape. It's distributed, and it's faulty, because when I remember, in the process of remembering, sometimes that pathway doesn't form exactly the way it did, and sometimes it adds new stuff. Exactly. I might fill in some gaps. Like, imagine if you opened a PowerPoint presentation that you've made, and, uh, there are a few slides missing, but then there's some new stuff, and maybe it was a little bit better than before. But you haven't done anything. I don't remember this transition at all... all right, we'll go with it. Just the very act of retrieving it from your computer's memory and opening it again changes it. Right. That doesn't happen in a computer, but it does in human memories. Right, right, exactly. That is exactly what I'm saying. And the reason why I say it is that that's a problem, because if we are ever to move from wetware, in essence our brains, to hardware and software in the digital realm, unless we factor that in somehow, like we create an algorithm that mimics the experience of remembering something, the experience is going to be fundamentally different. The experience of remembering will be totally different.
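To make that contrast concrete, here is a minimal, purely illustrative Python sketch (not anything described in the episode): a computer "memory" is stored as fixed bytes and reads back identically every time, while a toy reconstructive recall rebuilds the memory from fragments on each call and can drop details or weave in new ones, loosely mimicking the kind of drift described above. The fragments and the drop/add probabilities are made-up assumptions for illustration only.

```python
import random

# Computer-style storage: the retrieved value is bit-for-bit what was saved.
stored_file = b"Josh's hair looked great in the studio."

def read_file() -> bytes:
    # Reading does not alter the stored bytes; every call returns the same data.
    return stored_file

# Toy "reconstructive" recall: the memory is rebuilt from fragments each time,
# and the act of recalling can drop details or weave in new ones.
memory_fragments = ["the studio", "the heat in the room", "the light", "great hair"]
possible_confabulations = ["a joke about Queen", "the smell of coffee"]

def recall(p_drop: float = 0.2, p_add: float = 0.3) -> list:
    recalled = [f for f in memory_fragments if random.random() > p_drop]
    if random.random() < p_add:
        new_detail = random.choice(possible_confabulations)
        recalled.append(new_detail)
        memory_fragments.append(new_detail)  # recalling re-stores the altered memory
    return recalled

if __name__ == "__main__":
    print(read_file())   # identical on every run
    print(recall())      # varies from call to call, and drifts over time
    print(recall())
```

The point is only the shape of the behavior: the file read is deterministic, while each recall both varies and quietly rewrites what is stored.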
I mean, one of the reasons why I very much argue against eyewitness testimony for things, especially for crimes that might have happened a long time in the past, is that our memories are faulty. Now, if we were in this other experience, where we had moved to hardware and software and our memories were more analogous to computer memory, that would not be an issue. Eyewitness testimony would be solid. But that's just one illustration of how this is a tricky thing. It is tricky. And you say that, you know, comparatively speaking, it sounded like your take on it was that human memory is faulty compared to computer memory. I would posit that there's also another way to look at it: that, um, human memory is much more robust and rich than computer memory. Because think about it, when you, say, smell something for the first time, and then you smell it again and again, that memory of what something smells like is going to become more detailed. There's going to be more to it, it will become more refined, and it will be totally different from that first scent memory that you created of whatever it was you smelled. And so I would posit again, sorry to use that word twice, but it makes me sound pretty smart when I do, um, that that additional adding of new material, adding new stuff to it when you recall things or when you experience something, the ability to make your memory more robust and more rich and to be able to refine it just through recall, to me, is superior to just straight, here's the information that a computer will give you, and it should be exactly what you had before. And also with memories, we can associate stuff that previously was not connected in our brains, whereas with computers, the way you do that is through metadata. You tag stuff, right. You're like, okay, well, let's tag this piece of information with all the metadata we can think of that describes what this information is really about.
And then if I want to associate things, I have to look for similar tags, whereas my brain just does it automatically, and it does it in ways that you cannot necessarily anticipate, which can lead to things like innovation, creativity. Yes, precisely. And you also kind of hinted at something that's the big problem, as I see it, in the idea of uploading ourselves onto the internet, Strick. It is that, with memory, we can figure out memory; we will eventually figure out how human memory works. Exactly. And that's what... there's a philosopher called David Chalmers. That's what he's pointed out as the easy problem of consciousness. We're going to understand how the mind functions. Sometime down the road, we will figure that out. The hard problem, which Chalmers has also pointed out, is figuring out how phenomenal experience, our experience of reality, is produced from those processes. That is the big issue facing us in trying to upload ourselves onto the internet. It's like when you talked about metadata. The computer is not writing metadata itself. It might be able to simulate memory retrieval in its own way, but it's not writing its own tags. It's not making these connections. It takes a human consciousness to do that. And not only do we not know how to make a computer simulate that, we don't even know how we do that. We may never know. There are a lot of philosophers out there who say we may never figure out the hard problem of consciousness.

We've got more to say in this classic episode of Tech Stuff after these quick messages.

Neuroscientists would say that clearly the mind, which is what we could probably, you know, use as an umbrella term for things like consciousness and experience and intelligence, is the kind of stuff that emerges from the physical construct of the brain, because you can observe changes to the mind when someone suffers an illness or injury that damages the brain.
And therefore it stands to reason that the mind, in fact, is a product of the brain. So if you could figure out how to simulate a brain to a significant level of sophistication, hypothetically you could have intelligence emerge naturally from that simulation. Hypothetically. Hypothetically, because we can't do it yet. The best we can do right now is to simulate a few thousand neurons. But there are, you know, we're talking about billions of neurons and synapses in the human brain. Yeah, from what I saw, the low but average estimate is something like eighty-six billion in a normal human brain. I'm sorry, not synapses, neurons. Neurons, and it's trillions of synapses. Right. So it's incredibly complicated. And in fact, there are some people who suggest that to truly simulate a human brain, you may have to go down to the molecular level, at which point the computational requirements for simulating that brain are going to be so vast as to be impractical or impossible to achieve. Well, you mentioned the Blue Brain Project in this article that you wrote, um, and I was just kind of skimming their website, and they mentioned that in their simulations it requires about a laptop's worth of computing power. They didn't say what kind of RAM or hard drive for storage or anything it had, they just said a laptop's worth, so you can kind of let your imagination run with it. But that was what was required just to power one individual neuron. Yeah. So we're talking about eighty-six billion laptops, which, you know, should be great news for... Exactly, any hardware manufacturers out there.
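As a rough sanity check on the numbers tossed around here, a few lines of Python. The eighty-six billion neurons and the "one laptop per simulated neuron" figure come from the conversation above; the synapses-per-neuron average is a commonly cited ballpark assumed here for illustration, not a number from the episode.

```python
# Back-of-the-envelope scale check using the ballpark figures from the conversation.
NEURONS_IN_HUMAN_BRAIN = 86_000_000_000   # ~86 billion neurons (figure cited above)
LAPTOPS_PER_SIMULATED_NEURON = 1          # Blue Brain's rough "one laptop per neuron"
SYNAPSES_PER_NEURON = 10_000              # assumed rough average; not a figure from the episode

laptops_needed = NEURONS_IN_HUMAN_BRAIN * LAPTOPS_PER_SIMULATED_NEURON
total_synapses = NEURONS_IN_HUMAN_BRAIN * SYNAPSES_PER_NEURON

print(f"Laptops for a whole-brain simulation at this fidelity: {laptops_needed:,}")
print(f"Synapses to account for: roughly {total_synapses:,}")  # hundreds of trillions
```

Even as a crude estimate, the point stands: the hardware requirements at this level of fidelity are wildly impractical, which is why researchers debate how much detail a simulation actually needs.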
Um, there are actually quite a few different, uh, projects out there that are attempting to simulate brains for one reason or another, not necessarily so that we can port consciousness to them, but also just to study things like, uh, you know, how our brains work, how we might be able to treat brain damage or illnesses that damage the brain, how certain medications might react with our brains, by building these very complex simulations. So, some of them: MIT has a course on the emergent science of connectomics. I've seen that lately too. It sounds so full of BS, but apparently it's the real deal, and once you look into it, it makes total sense. It's just a terrible name. Connectomics is all about the connections that happen within the brain. And yeah, it does... connectomics sounds like it's some sort of weird economics course, or like maybe an L. Ron Hubbard book. Yeah, like Dianetics part two: Connectomics. Yeah. So that's an example. There's a US brain project, there's an EU brain project, there's the Google project. Yes, and there's the Google Brain project. They hired Ray Kurzweil. Yeah, he's their chief engineer... director of engineering. Yeah, specifically for the Google Brain project. I mean, clearly Google has just put their cards on the table. They're like, we're putting some serious resources behind figuring out how to get people onto digital consciousness. Right. It's one thing to think about this in kind of, you know, an armchair computer scientist, neuroscientist sort of approach, but they're really putting actual money towards research and development on this stuff, including hiring another guy named Geoff Hinton, who is a British computer scientist who specializes in neural networks. So they're looking at using neural networks for lots of stuff, not just to simulate a human brain. I mean, that might be part of it too, but neural networks can be really useful for processing different types of information.
For all sorts of applications, right. True. And also, I mean, if you think about it, just figuring out some of the efficiencies that the human brain has evolved to include as far as networking goes, if you could even just get some insight or inspiration from that, that could help tremendously. Yeah, absolutely. There are some other great things I can mention. There's, um, Ted Berger, who is a professor at the University of Southern California's Center for Neural Engineering, who built a prosthetic of the hippocampus. Now, the hippocampus is, uh... The hippocampus is largely... Yeah, it's largely associated with the formation of memories, also with the incorporation of emotion. But memory is a big part of what the hippocampus is involved in. So I think it also, um, takes in sensory information, determines what region it should be transmitted to, if it should go into long-term memory, that kind of stuff. It's kind of like a big engineer, in this case. And so in two thousand eleven he came up with a proof-of-concept hippocampal prosthesis and tested it in live rats. In two thousand twelve he tested it in non-human primates, and supposedly sometime this year they're going to test it in people. Man, that is amazing. So, like, if you have some sort of damage to your hippocampus and you're no longer able to form memories, then this would be the thing for you. Kind of, yeah. I mean, this could end up being, depending upon the nature of the problem... I mean, it could potentially be a treatment for things like Alzheimer's. Um, whether or not that turns out to be the case, we'll still have to wait and see, but it is very promising. Have you ever heard of Henry Molaison? I have not.
He is, like, one of the more fascinating, one of the more famous patients, or, to save time, you could just say one of the more "fatients," um, as far as memory studies go, because he had, I believe, epilepsy, and some old-timey doctor gave him some brain surgery and messed up his hippocampus, and the guy was unable to form new memories from that point on. He could remember everything up to the point of the surgery. Then after that, it was almost like his brain refreshed every, I think, something like thirty seconds, and he just lived in an institution and was fortunately taken care of by a few doctors that, like, really studied him, but also really kept him from the public limelight. His name wasn't published until after he died. But he yielded a lot of information about how memories are formed thanks to the hippocampus. But it sounds like he would have been a great candidate for that. Yeah. I'm reminded of, and I have to trust other people's, uh, details of this, because I have no memory of it... There was a time where I had a kidney stone. It was so bad that I had to go to the hospital, and they treated me with a very powerful painkiller that just knocked my hippocampus out of commission. I couldn't remember things. I had no short-term memory. Well, it makes sense. Like, also, when you're drinking, um, your hippocampal function is messed with. You are not forming new memories, and you require the hippocampus to do this. So if you're doing something, if you're on drugs, if you have some sort of structural damage, if you have been drinking, like, that's why you're not forming new memories. That accounts for a blackout, that accounts for amnesia. Your hippocampus is just not functioning properly. Exactly. Uh, there's another expert I want to mention, Anders Sandberg of the Future of Humanity Institute at Oxford University. I'm a huge fan of that institute.
Yeah, yeah, one of my favorite people in the world works there. His name's Nick Bostrom. Yeah, that guy. I know of Nick Bostrom. So Sandberg had said, and this is a quote: "The point of brain emulation is to recreate the function of the original brain." So this is talking about actually creating a copy of a particular person's brain, not just the concept in general, but in the specific case of this person's brain, we're going to recreate it. "If run, it will be able to think and act as the original brain. We are now able to take small brain tissue samples and map them in 3D. These are at exquisite resolution, but the blocks are just a few microns across. We can run simulations of the size of a mouse brain on supercomputers, but we do not have the total connectivity yet. As methods improve, I expect to see automatic conversion of scanned tissue into models that can be run. The different parts exist, but so far there is no pipeline from brains to emulations." Now, he thinks that it may be very difficult to ever simulate memory in a computer the way that humans do it, for the very reasons we mentioned earlier. Um, he also points out that there is a problem with this particular approach, as the scanning essentially damages or destroys the brain tissue, because there's not a non-invasive way to do it. You've got to pretty much crack the noggin open and mush around in the gray stuff to, you know, really scan it and get that resolution. This scanning would either kill you, or you need a freshly dead person, in which case there's no longer consciousness. Right, right, exactly. That's the problem. So you can make a copy of a dead brain, which, as you point out, is not really that useful, or you could make a copy of a living brain, but in the process you kill the living brain. You are left with the copy.
Now, theoretically, this copy would think and react in a way that would be exactly the way the original person thought and reacted. But the original person is still dead. So, Josh, if you had this done, there would be a computer Josh, Josh Bot two thousand, and Josh Bot would think like you, would have quips like you... With even better hair. With even better hair than you, and it would feel somewhat smug about it. Meanwhile, Josh Clark the human being would be no more. And this comes to another big problem in the concept of digital immortality, which is continuity. Sure. So, continuity being the continuous experience of you as Josh Clark, whether you are in your meat body or ported over to some digital format. I don't think that's that big of a problem, really. Think about it, man. Every day we have gaps in continuity. We go to sleep and then we wake up. But you're talking about functional continuity. There's also a physical continuity, and there's the real problem. So functional continuity is exactly what you're talking about. It's the experience that we are having, and it does have interruptions, whether it's when we go to sleep or we are put under for... Exactly, all of that. It could end up being a break in our functional continuity. We can recover from that because of the physical continuity. The stuff that's in our brains is still there, so that even though we have that reset, we can come back and everything will be fine. If the physical continuity is destroyed, as in the actual brain dies, then you have a problem. Now, an interesting thing is that I've looked at some neuroscientists, uh, and their work and what they have to say about this, and it was really interesting to me. There's a guy named Steven Novella. He's a neuroscientist who works at Yale. He has a great podcast called The Skeptics' Guide to the Universe, um, and he is a critical thinker and a skeptic, uh. He has talked a lot about this as well.
He's blogged about it, and his idea, or his perspective, the way he communicates it, is that as humans, we have brains that are divided into two hemispheres. Now, through drugs or through surgery, you can have one of those hemispheres separated from the other; it essentially is rendered inactive. But the two hemispheres are largely copies of each other, so even if this does happen, you can have a relatively normal experience. You might find that some things are now very hard to do, like math, if your corpus callosum isn't... Exactly. Yeah. So, he says, these two halves, which individually can act as a single brain, work together, and, you know, even if you have the one shut down, the other one can continue to work. You're still you, largely you. So he says, what if we then extend this, and we make the assumption that, yes, we have created the hardware and software that will allow for the simulation of a brain in some way. We connect that to a person's brain so that it becomes an extension. It's another part of the brain, kind of like a third hemisphere, I guess. And so this one is starting to form pathways that mimic what your brain does naturally, so over time it helps you think the way you think already. It also starts to build in redundant memories, so it's essentially backing up your memories, and gradually it's going to act like another hemisphere of your brain. And it could even be more powerful. Potentially you could do things like include algorithms that, like, make it way easier for you to do math. You'd be a math genius. I would hope that if I were uploaded onto the internet, my math skills would just automatically improve. I would expect that. Yeah, there are certain little, like, base assumptions you want to make, right. That's one of them. It would be funny to be digitally immortal but crap at math. I guess you'd get made fun of by all the other digital immortals.
Very likely. You know, the Kurgan is just taunting you before cutting off your digital head. Uh, so, his point being that over time you would be relying more heavily on the AI version of your brain, that even while your meat brain goes to sleep, your AI brain could stay awake, so that, you know, you, as you, could remain active all day long, because it's your organic brain that's sleeping, but your AI brain takes over. And it could get to a point where you don't even really notice that part of you is asleep, and you could theoretically reach a point where your AI brain is doing the vast majority of the work, so that the time when your organic brain dies is a non-event to you.

Well, I hope that you really got a lot out of that classic episode of Tech Stuff. If you didn't, don't worry, Josh Clark will be back next week, because this ended up being a two-parter, so we will continue our discussion about digital immortality on that episode next week. If you have suggestions for topics we should cover on episodes of Tech Stuff, please reach out to me. The best way to do that is on Twitter. The handle for the show is TechStuffHSW. I'll talk to you again really soon.

Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.