Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for a TechStuff classic episode. This episode originally published in July 2015. It is titled Digital Immortality, Part Two. The great Josh Clark of Stuff You Should Know joined the show for this one. Hope you enjoy.

Say you could conceivably create a really great simulation of an individual person's brain, and it started making connections and it became a better version of the organic brain. That's a separate thing. How is it connected to the organic brain, so that you don't just have the experience of the organic brain and the experience of the AI brain? Right. How are they connected so that they're sharing experience? Because without that connection, without that shared experience, you have, for all intents and purposes, two separate individuals. Right. It's bringing them together so that we really are translated into a digital version of ourselves. That is the step that guys like Kurzweil and everybody else talking about this just completely walk past.

Well, yeah. I mean, if you nailed them down and said, can you explain this, I think the explanation would be kind of a non-explanation, which is that by the time you reach the sophistication, the technological sophistication, capable of simulating a brain to the extent that it would be useful in this way, we would have a brain-computer interface that would allow for the bidirectional communication. I realize that you do not curse on this show. We don't curse on my show either. But that's horse hockey. Yes, yeah. That's what they call, Strickland, magical thinking, I would argue. And I realize... I wouldn't go so far as to call it magical thinking, in the sense that we have brain-computer interfaces now; they're just very primitive.
So the question is, could we get to a level of sophistication where we shared consciousness? Yeah, that's what we're talking about here. And I realize that it would be a disadvantage to humanity to nail down the futurists and be like, no, no, no, don't you dare make a prediction unless you can exactly lay out how it's going to happen. That's not what they do. So I don't necessarily have a problem with futurists. I have a problem with the people who write about what futurists are saying as if it's going to happen regardless.

Because here's one, and this drives me crazy. It was a Kurzweil prediction. Everything that guy says gets press, right? And understandably so. He's an interesting dude, and he has interesting thoughts and interesting visions. But he said recently that humans will be cyborgs. Our brains will be jacked into the Internet. We'll be able to say we want to think about a Wikipedia page, and we will be there. It will be in our minds, that Wikipedia page. We will have that level of brain-computer interface, such that we have the sum total of human knowledge.

Yes. Well, somebody did try to nail him down, or he volunteered it in this talk, whatever it was; I just read an article on it. And he says that there are going to be DNA-based nanorobots that will basically enter our cortexes and connect us to an artificial cortex that the Internet is based on. That's how it will happen. It's like, well, that is really neat and dandy. How? If you say that something's gonna happen fifteen years out, just saying, well, DNA nanobots, that's how? That's at the same level as taking a human brain now, plugging it into a USB cable, and expecting something to happen.
I would totally agree with you on that. Absolutely agree with you on that. Because, and here's another problem: we've got people who are very brilliant, they're clearly intelligent people. However, they're intelligent and focused on specific parts of general knowledge, and don't necessarily possess other elements of knowledge that would be really important to have before you make definitive statements and predictions. Like, the physicists may say one thing, but neuroscientists believe another thing. Physicists say maybe there are some quantum effects going on in the brain, and neuroscientists say that's not really what we think is happening. And because there's so much we do not know, making any sort of prediction of something that is definitely going to happen, when you say it's definitely gonna happen, is kind of foolish. It again probably falls under the category of wishful thinking, particularly in the case of Kurzweil, who, again, brilliant guy. But I agree with you. I think trying to explain it by saying "nanobots" is almost akin to saying "magic."

Yes, especially since right now, as far as we know, nanobots are probably well beyond our ability to make on any kind of truly sophisticated level. The nanotechnology we talk about today, we're talking about nanoparticles that you can guide externally through stuff like ultrasonic frequencies or magnetic fields. But they're not robots. They're not autonomous, or even remote-controlled items that can go to where you want them to go and do what you want them to do. That level of sophistication, I think, is well beyond fifteen years out. I can't imagine us getting there.

And also, I think another issue I have is that a lot of these futurists base their assumptions on things like Moore's law. Yes! Strickland, I could hug you right now, man. Yes, you can't base any sort of arrogant prediction on Moore's law.
It has as much veracity as Murphy's law. Yeah, man. Moore's law was originally just an observation. It wasn't a law; it was an observation, and a prediction meant to hold for a decade. Yeah. You've got Gordon Moore, who said, it looks like every two years we're going to reach the sophistication and the economic conditions where it'll be possible to halve the size of transistors, to double the number of transistors on a square inch of silicon. And how you put that depends on whether you're a glass-half-full or glass-half-empty kind of person. But he was looking at it as saying, you know, it's not just the technological sophistication. It's also the fact that our manufacturing ability will reach a point where it makes sense to do it. And it was never meant to be something that would exist in perpetuity. It was more like: at least for the foreseeable future, this is going to continue, until we run up to some sort of fundamental block. And we're getting real close to that fundamental block now. In fact, a lot of engineers, including Moore himself, have said that the days of Moore's law are close to an end.

But they have been saying that for a very long time now. They have. And part of the reason that they've been able to push it off is that we've kind of redefined Moore's law. It's no longer the number of transistors. Now we sort of understand it as: the processing power of a chip tends to double every two years, which you can do both with architecture and with the optimization of that architecture. So that's kind of what Intel does. They'll shrink down elements in one move, and then the next move, they figure out what's the best architecture for those elements, to take the best advantage of what they've done. So it's what they call the tick-tock approach.
The tick is where they shrink everything down, the tock is where they optimize it, and then the next tick they shrink stuff down again, next tock they optimize it. Yeah. So it helps extend Moore's law, because they don't always have to get smaller and smaller. And once you get to a certain size, you run into quantum effects that totally ruin the way electronics work. So the reason why we're even bringing this up is that you cannot count on the technological sophistication to continue at the same rate it has been going. No. But these guys who are all about digital immortality are basing all of their assumptions on the idea that Moore's law will keep going like this.

Yeah. Well, and not only that, they're applying the same sort of idea of Moore's law to other disciplines, and they're treating it as if it is an actual scientific law. They're kind of bandying it about as if it is, and as if it proves that their predictions will come true. Which, I mean, I love Moore's law, but I would feel a lot better about it if it was called, like, Moore's Really Good Idea, or just the General Observation of Moore. Because it really, again, was an observation, and then a prediction based on that observation. But it wasn't Gordon Moore who called it Moore's law. No, he didn't care for that. He was like, let's not call it a law, guys. It's too late.

Well, that's one of the big things: if Moore's law doesn't hold true, then this sets everything back. And you can't apply Moore's law to things like neuroscience anyway; the discoveries in neuroscience happen on a totally different time scale. Yeah, inspiration doesn't happen on any kind of plotted chart. So that's another thing to keep in mind: maybe their predictions are sort of accurate, but not for the time scale. It may be that it's much further out than what they anticipate.
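To make the shape of that kind of extrapolation concrete, here is a back-of-the-envelope sketch. It is not from the episode or from any futurist's actual model: the clean doubling period and the 1,000x capability gap are both invented for illustration, just to show how a Moore's-law-style projection works and how much a milestone slips if the doubling merely slows down.

```python
# Hypothetical numbers throughout; this only illustrates the arithmetic of
# "assume capability doubles every N years" that the futurists lean on.
from math import log2

def years_until_target(current, target, doubling_period_years=2.0):
    """Years for `current` capability to reach `target`, doubling each period."""
    doublings = log2(target / current)        # doublings needed to close the gap
    return doublings * doubling_period_years  # each doubling costs one period

# Made-up premise: brain-scale simulation needs 1,000x today's capability.
print(years_until_target(1, 1000))       # ~19.9 years at a 2-year doubling
print(years_until_target(1, 1000, 3.0))  # ~29.9 years if doubling slows to 3
```

The point of the second call is the one made above: the prediction's date is only as good as the assumed doubling period, and a modest slowdown pushes the milestone out by a decade.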
In fact, I would be shocked if the timelines aren't much further out than the futurists anticipate, based upon what little I know about these other disciplines.

We'll be back with more of this classic episode of TechStuff after this quick break.

Okay. Well, let's go ahead and just say, Josh, just for the sake of argument, let's say that this world does come about, where digital immortality is possible. It's a thing; we can do it. The trouble doesn't stop there. There are some other big issues happening. So now, can you imagine a world where there is an option for you to move yourself over into digital immortality? Maybe it doesn't even involve leaving your body. Maybe it involves getting an implant, and then that takes over, and your body will live for as long as it can. And then after that, you could even port that intelligence, which is you, into some other body, whether it's a robot, or maybe even sort of free-form, where you could flow into all sorts of different electronics. I even saw one suggestion saying, in this science fiction future, you could get behind the wheel of a car, but you don't really get behind the wheel of a car. You become the car. That sort of stuff. All right. Do you imagine that would be available to everybody day one?

Yeah, and you make a good point in this article. Talk about some sort of disparity of experience here on Earth between the haves and the have-nots. Yeah. You talk about the huge gap between haves and have-nots that already exists. Imagine a world where not only is that gap there, but it's exacerbated by the fact that now the haves live forever. Yeah. Think about the grim feeling that people have every time Dick Cheney gets a new heart and gets, like, an extra ten, twenty years. That's just a new heart.
Imagine if people like that, or anybody who is wealthy, were able to become immortal, and it was only available to them because it was so prohibitively expensive, which, as I think you rightly point out, would definitely be the case at first. And, you know, not to get too political, but I would imagine that this would be a world where you would see an even stronger push towards protecting the status of those who have things, because now they're in it forever. Right. It's not even "I want this protected for as long as I live" anymore; it's "I want this to be protected forever."

You could also make the argument that maybe those people would become way more interested in taking better care of the planet, because they're going to be there forever. I think that would be one of the benefits of this: it would extend long-term thinking about the future of humanity, the future of Earth. I think that would just be an inevitable byproduct of it. Yeah, because you're going to experience it. If you're in that elite, then you are going to experience that. So it does benefit you to think about these things. You can't just focus on the short-term gain anymore. Right now it's like, I've only got so many years; who cares about anything after that? I'm dead. Who cares? Yeah, right. And for me, I'm like, well, I don't have kids, so I don't even have that to think about. You barely care about this moment. Honestly, I stopped caring about five minutes into this episode. No, just kidding.

But yeah, that's one of those things: we're talking about a potential divide, not just in the haves and have-nots within a single country. Let's say that this technology is developed in one part of the world. That immediately is going to cause issues too, because you've got the rest of the world.
Nobody wants to... I think it's pretty safe to say that most people don't want to die. Most people like the idea of being able to at least live as long as they want to live, right? They'd get the option to say, maybe eternity is not what I really want, but I would like the opportunity to live as long as I want to live before deciding to no longer live. Right. Even that could be a case. It could be that when we talk about immortality, we're really just talking about: you get to decide when you go.

Well, you know, I read an interesting article about that, saying, if we are going to do this, don't forget to also include some sort of suicide switch. Because if you become digitally immortal and you have no way of ceasing to exist, what happens when you want to cease to exist and you can't do anything about it? Yeah, that would be a totally new type of torture. Sure. And that was another part of that same article, I think it was in The Atlantic, where they said, well, what is a life sentence? Yeah. If a future generation decides that you've done something wrong and you should be punished, well, you're still around for that future generation to punish you. And how long will that last? And what does it look like?

Other considerations: who owns the brain? Because if you are porting yourself over into some digital format, presumably there's some hardware and software involved. Who owns that experience? Does it revert to the person that was the original individual? Does it revert to the hardware that it exists upon? Does it revert to the software that makes it possible? This sounds like a silly question, but it's not.
There have been people who were able to patent genes, human genes... let me correct myself before you get a bunch of listener mail: who were able to patent genetic processes that are the results of genes. Because the human genes question was eventually settled as no, you can't patent a gene. But through the late eighties I would have been right. Yeah.

And also, going back to cultural issues: even if you assume that everyone has access to this, what does that mean for religions? A large part of many religions happens to be about the experience that comes after death. In fact, a lot of religions suggest that the life here on Earth is merely preparation for what comes next. But if you extend that life indefinitely, what does that mean from a religious standpoint? It's a tough question to answer. I can't answer it, but it's another one of those philosophical problems that come in.

There are other ones too. Let's say that you are able to extend your lifespan indefinitely. What happens to population? I saw this in this article, and I don't think that there's that much of an issue here, because think about it: if we all live in a non-corporeal form... That's the key, though: that's non-corporeal. But digital immortality might be allowed if we have, like I said, an implant in our heads. Oh well, then bodies become the issue. I mean, that will be your status symbol, that you own your own body and there aren't a hundred other people in your body with you. Yeah, all of them trying to make decisions about where to go to dinner, what to get. It just becomes a really, really extreme version of All of Me. The Steve Martin movie, yes. That was a good one. Lily Tomlin was in that.
Yeah. So there are a lot of philosophical questions that we would have to come to grips with if this were in fact ever to become a reality. It may never happen. Yeah. And I'm not pooh-poohing the possibility. I'm not saying that it definitely is not going to happen. I don't believe that we know enough about things to say that it never will. I also don't believe we know enough about things to say it definitely will, or especially that it definitely will by some particular date. Yeah. Now, in his defense, he was saying that we would be cyborgs and that we would be able to interface with the Internet just using our minds, as opposed to being truly immortal. He's talking more about that. Yeah, and even then, that's still one eighth, maybe, of that singularity. Right.

We've got more to say in this classic episode of TechStuff after these quick messages.

I ran into a couple of things that were kind of interesting while getting ready for this podcast, and I kind of wanted to conclude on this. So, while we're nowhere close to the point where we can pour the brain over into some digital format and live forever, where continuity really isn't an issue yet because we don't even have the basis to work with, let alone figuring out how to port stuff over, there are people working on ways to share who you are beyond your lifespan. And one of them I ran into; in fact, I've signed up for the beta, but it's still in the alpha stage, so the beta is down the road. It's a service called eterni.me. So eterni.me is a website, and it essentially is going to scrub your social presence on the web to get an idea of who you are, the idea being that eventually it would create an avatar that could act as you do, guided by all the experiences you share with it as you just be you online. The person behind it, I believe the name is Sunshine, actually,
says that it would probably take ten years for the algorithm to kind of learn how you are you, before it could react in a way that seemed to be similar. At least ten years; I think they say the longer, the better is the impression. And there's actually kind of a sad part of the story too: he's been contacted by a lot of people who are suffering from terminal illnesses, and they're looking at this as a way of being able to share memories with other people after they have gone. He says it's really hard to deal with those requests, because honestly, the technology isn't at a level of sophistication where it could just adapt that quickly. But it is an interesting notion. Now, in this case, obviously, the immortality is more about your influence here on the planet after you have gone. You are still gone, but what you leave behind continues to influence others.

And that's a very poetic way of looking at immortality. It's not the way Woody Allen looked at it, you know. He always said, I don't want to achieve immortality through my work; I want to achieve it through not dying. But immortality through work, or through your interactions with other people, that is something. And you could argue, like the poem, was it Ozymandias, where essentially you're like, look at this enormous monument I built that was going to stand the test of time and show how powerful I am, and now it's fallen over and it's crumbled, and thus the hubris of the past proves to be unfounded. I still think that this is a very poetic way of at least showing people that you care for them, even if it is a little weird having an artificial intelligence doing it on your behalf. You will never get rid of me and my Facebook posts.
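For a sense of the bare mechanics behind "an avatar that posts in your style," here is a deliberately toy sketch. It has nothing to do with eterni.me's actual system, which was never published; the posts are invented, and the model is just a word-level Markov chain, about the simplest possible stand-in for learning "how you are you" from what you've written.

```python
# Toy illustration only: build bigram transitions from a few made-up posts,
# then random-walk them to generate text vaguely in the author's style.
import random
from collections import defaultdict

posts = [  # hypothetical stand-ins for someone's social media history
    "had a great coffee this morning",
    "this morning run was rough but worth it",
    "coffee first then the rest of the day",
]

# word -> list of words observed to follow it
transitions = defaultdict(list)
for post in posts:
    words = post.split()
    for a, b in zip(words, words[1:]):
        transitions[a].append(b)

def fake_post(start="coffee", max_words=8):
    """Walk the chain from `start`, stopping at a dead end or the word cap."""
    out = [start]
    while len(out) < max_words and transitions[out[-1]]:
        out.append(random.choice(transitions[out[-1]]))
    return " ".join(out)

print(fake_post())  # e.g. "coffee first then the rest of the day"
```

A real system would need years of data and far richer modeling, which is exactly the founder's point about ten years of learning above; this only shows why more history gives the avatar more to imitate.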
Yeah, and I wrote in the article about Brian Brushwood, and I even had Brushwood on this show and asked him about this. Yeah, this was years ago, and it was an idea he had had. I don't know that he ever actually implemented it, but his idea was to create a very simple algorithm that would copy previous posts he had made online and post on his behalf after he dies. The idea being that every year, on his birthday, he would be prompted to send a message to this account, and if he did not do it within a certain amount of time, it would activate and start posting for him, so that even after he had died, the ghost-in-the-machine version of him would continue to update Twitter or Facebook.

So I wonder how often the appropriateness and the context of it would be so off that it would be like a Horse_ebooks tweet or something. Yeah, especially if something really tragic had happened, like a big newsworthy tragic event happens, and then... yeah, exactly. Or let's say that there was a huge fire that broke out in Texas, which is where he lives. Let's say there's an enormous fire, and then it's just, "Man, it sure is hot today," or something like that. That would be in incredibly poor taste. Although, to be fair, Brian would probably think that was hilarious.

There's already another service out there, or at least there was a few years back, I don't know if they're still around, but it was called Death Switch, and it wasn't meant to keep going like that.
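Both Brushwood's scheme and Death Switch boil down to the same dead man's switch pattern: check in on a schedule, get a few grace chances, and only then release what's been held back. Here is a minimal sketch of that logic; the intervals, retry counts, and message stubs are all invented, since neither idea was ever published as code.

```python
# Minimal dead man's switch sketch; all values are hypothetical.
from datetime import datetime, timedelta

CHECK_IN_PERIOD = timedelta(days=365)  # e.g. once a year, on your birthday
GRACE_RETRIES = 3                      # extra chances before triggering
GRACE_GAP = timedelta(days=2)          # spacing between those chances

def should_trigger(last_check_in: datetime, now: datetime) -> bool:
    """True once the yearly check-in plus all grace chances have lapsed."""
    deadline = last_check_in + CHECK_IN_PERIOD + GRACE_RETRIES * GRACE_GAP
    return now > deadline

def run_switch(last_check_in, held_messages, now=None, send=print):
    """Release the held-back messages only if the owner has gone silent."""
    now = now or datetime.now()
    if should_trigger(last_check_in, now):
        for msg in held_messages:
            send(msg)

# Last check-in mid-2020, so by 2022 the switch fires and sends everything.
run_switch(datetime(2020, 6, 1),
           ["Dear boss, ...", "I always loved you."],
           now=datetime(2022, 1, 1))
```

The one design point worth noticing is that the whole thing hinges on a single signal: as long as you keep checking in, nothing ever goes out.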
With Death Switch, you'd write letters, emails, Facebook posts, that kind of thing, and they're all just held back. Because every year, much like what Brushwood was describing, you get an email that you have to click and answer a little bit, to prove you're still alive. And the first year that doesn't happen, it gives you two or three more chances over the next couple of days, and then it sends out these emails: everything from "Dear boss, here's what I have always thought about you," to your sweet wife getting "I know I've died, and I want you to remember I always loved you." There are sweet things you can do, there are mean things you can do, there's just whatever, like "here's the combination to the fire safe," all that kind of stuff. But it was all being held back by this one thing, which was every year you kept it from triggering. Right. Which is pretty cool. Yeah.

No, it's an interesting idea. It's not immortality at all. No, but it's again the idea of extending your presence digitally after life is extinguished. Isn't that extremely egotistical? It could be, but it could also be that you have the fear of, what about this stuff I want to express to somebody? What if I never get around to doing it? Some people would say that you wasted that part of your life. Yeah, that's part of life: you were given a finite amount of time, a finite amount of time to make decisions or let them pass by. There's also something where you might think, I want this person's final memory of me to be this thing, because they mean whatever to me, and I want them to be aware of that.
And I like to think of it more on the sweet side than the nasty side, but obviously it would apply to both. The idea being that, well, I can't guarantee that I won't get hit by a bus tomorrow, and so I want the ability for this thought I'm having, this expression, to be sent to this person in that event. I have no problem with the Death Switch service. My problem is the idea of becoming immortal. Yeah, digital immortality is a little pathetic and desperate, if you ask me. Just the whole concept: "I really can't die, I can't possibly die, don't let me die." Okay, I guess that's part one of it. But then a dumbed-down, watered-down version of it, where an AI of you survives you? Definitely. That, to me, is beyond egotistical. Because if you think about it, our concept of immortality now, Strickland, is what you leave behind in the form of your work and your memories, and you leave yourself to be judged and appreciated, hated, whatever, by the people who come after you, by the people you've touched, who keep you alive with their thoughts. Sure. That's immortality now. It's not invasive. It's up to the people who are left to decide how they think of you, or whether they even want to think of you. With this AI version of yourself, you're just insinuating yourself into their lives beyond your own physical death, in a way that you're not even getting any sort of satisfaction from.

Have I ever told you what my plan is for after I die? Okay, so this is going to give you a look into my psyche and explain exactly why this type of service is perfect for me. Just lock the door. I've told my wife this. I didn't say this is my plan for after you die; I've got that plan too. My plan for when I die. And I'm not being genuine when I say this.
This was something I told my wife just kidding around, but I said, what I want to have happen is, I want to be cremated, and then part of the stuff I leave behind is going to be a certain amount of money to hire somebody who, for the rest of my wife's life, takes the urn of ashes that has me in it, hides it somewhere in the house, and she cannot go to bed until after she finds me, because I want to irritate her as much in death as I have in life. And she looked at me and she's like, "The scary thing is, I could totally see you doing that." I kind of can't, now. Honestly, I would never do that. I plan on donating my body to science.

Yeah, you know, supposedly you have to make a note, if this kind of thing bothers you, that you don't want, say, a rhinoplasty practice to be carried out on your body. I don't care. I don't care. Whatever helps people in the long run. Whether you judge that help as being superficial or as being really meaningful, ultimately my goal is, I want to leave the world better than it was when I came in. That's how my wife approaches it too. She wants to leave her body to science, and I'm like, even plastic surgery? She's like, I don't care, it's fine; I'm leaving it to science. Well, what if there's someone who, through either an inherited defect or whatever... Yeah, their lives could truly be transformed in ways I cannot even conceive of. I know what you mean. And of course there's that option. I feel like people put down the dermatologists, like, "Oh, you save lives?" And she's like, "Yes, through skin cancer. You could get skin cancer." I didn't think of it that way. Well, or what about the scenario where there's a housewife in Beverly Hills and her nose is just a little too big?
You know, actually, if her nose is a little too big, my nose is the last one in the world to judge. Same with us both. I can't believe we've both fit in here. I know; we almost had to build out an alcove in this podcast studio just for the nasal passages. This has been a lot of fun, dude. Thank you for having me. Thank you for coming on. Of course. Obviously, the toilets episode... Yeah, right. Yeah, I didn't make any puns in that sense. You know, there are tons of immortality puns I could have made, but I didn't do it. I speak for everyone listening when I say we appreciate that. I'm sure.

Well, I hope you enjoyed that episode. I might not be digitally immortal, but I sure hope my show is, because then a little part of me will live on forever, irritating people with puns and, if they go back far enough, with loud, loud listener mail sound effects. If you have suggestions for topics I should cover in future episodes of TechStuff, please reach out to me. The best way to do that is to send me a message on Twitter. The handle for the show is TechStuffHSW. I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.