Welcome to TechStuff, a production of iHeartRadio's HowStuffWorks. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. Today we're going to examine a classic episode of TechStuff. It's a Friday, so it's time for another classic, and today's topic is: are we living in a computer simulation? This is a philosophical question that has received a lot of attention, and a while back Lauren Vogelbaum and I thought it'd be a lot of fun to explore that whole concept. So enjoy this classic episode: do we live in a computer simulation? In fact, the reason we're even talking about this at all is that a few years ago a philosopher (and boy, is that a surprise to no one) by the name of Nick Bostrom, who works at a little... well, tiny in academic circles, they have some swagger... it's the University of Oxford. Yeah, yeah, as far as the boffins go, it's Swagger City, right? Anyway, he works at the University of Oxford, and he's a philosopher employed there.
So his job is to sit around and think about the nature of reality, and he presented an interesting thought experiment. He asked: do we live in a computer simulation? Is everything that we experience, and everything that's around us, actually just the product of some sort of computer program in a universe larger than our own? We'll talk about what led him into this line of thinking, and his arguments about the likelihood that we are actually in a computer simulation. But before we get into that, I thought it'd be interesting to look back quite a ways, because this idea, that reality as we understand it is not real, is not exactly new at all. We've been pondering, for basically as long as we could ponder things, whether or not our experience of reality is reality. Yeah, and part of that is understandable. I mean, we know for a fact that reality consists of stuff that is beyond our perception, right? Oh, sure, absolutely.
Like, I don't know when the last time you looked into the infrared spectrum was, but the last time for me was never, because it's outside my visual acuity. I can't see in the infrared spectrum. All right. And there are lots of things above and below our range of hearing. If you talk to sharks, they can hear different stuff; my dogs can hear totally different stuff than I can. Yeah, and even before we had as much scientific data about that as we do now (today, if we've done the research, we can tell you the exact spectrum that we can see within, but outside of "visual acuity," which is a pretty good word for it), people were pondering these questions long, long before that, in all kinds of literature and philosophy. Right, right. One of the famous examples of this kind of philosophy was proposed, way back, by a fellow named René Descartes, about whom Monty Python had some rude things to sing, if you know the Philosophers' Song.
But Descartes, the "I think, therefore I am" fellow, wrote something called the Meditations on First Philosophy, and in it he presented a very similar thought exercise to the one Nick Bostrom raised. His was called the Evil Demon, or sometimes people refer to it as the evil genius problem. He said: well, what if everything that I, René "I think, therefore I am" Descartes, experience is actually just an illusion generated by some evil force? In his case, it was an evil demon. So there's this malevolent creature that is capable of creating everything Descartes is experiencing. So while he thinks he's walking around being really smart, and chatting with other smart people, and having a croissant, in reality he's not. He's just a consciousness being manipulated by this evil demon, and everything that's happening to him is an illusion created by it. And this is unfalsifiable, right? That means there's no way you can prove that it's wrong.
It's like if I say there's a six-foot invisible bunny walking around behind me all the time; it makes no noise, it has no scent, and you can't touch it. You cannot touch it. Yeah. Well, there's no way for me to prove that you're wrong. I will sit here and I will think that you are crazy, or Donnie Darko. Or Donna Darko? I don't know. Or the character in Harvey. There you go, yes. But you know, there's no way for me to prove that there's not a six-foot invisible, untouchable, unsmellable, unhearable bunny behind you. So that's what's called unfalsifiable, and that means it is not scientific. Scientific principles, premises, these sorts of things, they are falsifiable, meaning that there should be a set of criteria under which you would say, "this is not true." Now, that does not mean that what you're saying isn't true. It just means the disproof has to be possible; it has to be within the realm of the possible. In order for something to be proved, it has to be able to be disproved, exactly.
And so if it's unfalsifiable, it's not scientific. Now, I should also stress: if it's unfalsifiable and unscientific, that also does not mean it's not true. It could be true. It could absolutely be true. String theory is a great example of this. We call it a theory, but some people argue it's a philosophy. Well, yeah, it's not a mathematical theory, and I get into arguments sometimes with people on Facebook about this, because "theory" is a word that has many, many meanings. The mathematical meaning is something that has been proven true, whereas there's the whole "scientific theory" versus "I have a theory," which is really more like "I have an idea of why this is the way it is." A scientific theory and that kind of theory are two different things. But anyway, string theory would say that the entire universe is made up of these tiny little vibrating strings. And we're talking tiny, as in tinier than subatomic particles, and the way they vibrate is what makes stuff what it is.
Well, mathematically this makes sense, but there is no way we can observe it or test it, so therefore it's unfalsifiable and unscientific, using that particular definition. So, same sort of thing here with René Descartes and his theory. And again, this was not the first time the idea had popped up, but it's one of the really famous examples, and that's from way back. So if we look at the modern version, you've got Nick Bostrom talking about a computer simulation, and his whole argument hinges on this idea of transhumanism, or the singularity. So we kind of have to talk about what the singularity is so that we can finally get around to this whole computer simulation problem. Right, and also talk about "transhuman," because that's basically a fancy term for... yeah, it's a fancy term for what happens when we reach a point, through advances in science and medicine and biology, where we can transform ourselves so much that we are no longer, strictly speaking, human.
Right. Like, we have altered ourselves on some fundamental level, and we wouldn't be human as we recognize it today. Right, our technology has bonded to us in some way, or our science has bonded to us in some way. Right, we've understood genetics well enough that we're all mutants like in the X-Men, or we have all become cyborgs, or we have all transmitted our consciousness into computers and there are no physical versions of us anymore. These are all different ways that we could, in theory, become transhuman. So it's a very generic term that spreads across multiple possibilities. Right, there's no one "transhuman," because really, the thing is that we don't know; we're not there yet. It hasn't happened yet, as it turns out. I mean, there are a few of us who are a little wacky, but that doesn't mean some of us aren't clearly better than others. I mean, okay, we're not naming names, but we're pretty awesome. But no, that's the area of transhumanism.
The singularity is sort of one of the pathways by which we could reach this transhuman future. And the singularity is this idea, which several futurists have proposed, that technological advances are continuing at a faster and faster pace each year. Really, exponentially. Exponentially, yeah. You've got things like Moore's law, where, depending on how you're defining it, essentially computing power is doubling every two years, but other technological advances are happening at an even faster rate, so that we will eventually come to a point where we are advancing continuously, with no break between generations. Right, right. Think of operating systems and how they come out, like Windows 7 and then the next Windows. Then you think, all right, down the road it might be every six months, and then it might be every three months, and then it might just be that every single day there's something new being incorporated, or every second, et cetera, right.
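To make "doubling every two years" concrete, here's a quick sketch in Python. The numbers are purely illustrative, not figures from the episode; they just show how fast that kind of growth compounds.

```python
# Moore's-law-style growth: if capacity doubles every `period` years,
# then after `years` years it has grown by a factor of 2 ** (years / period).

def growth_factor(years: float, period: float = 2.0) -> float:
    """Return the total growth multiple after `years` of regular doubling."""
    return 2 ** (years / period)

for t in (2, 10, 20):
    print(f"after {t:2d} years: {growth_factor(t):,.0f}x")
```

After twenty years of two-year doublings, that's a factor of about a thousand, which is the intuition behind "faster and faster each generation."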
So at that point, things are changing so fast that you cannot even define the era you are in, because within one moment you have changed so much that it's pointless to try and define a series of moments. Right. At this point in our future, we would hit what is called the singularity. And one of the defining features of the singularity is that it's impossible for us to say what will happen once we hit that point, because by its very nature it will evolve so fast that we cannot conceive of it. I mean, it's kind of pointless to talk about it too much. Doesn't stop me, but it's pointless. It doesn't stop philosophers either. Doesn't stop me, and there are many pointless discussions that you cannot stop me from having; this is one of them. Anyway, so the singularity could happen in various ways. Again, biology, science, technology: these are all different avenues that could lead to the singularity, or could be part of it, you know. Yeah. And that forms the very crux of Bostrom's argument.
Let's take a quick break for a word from our sponsor. And now, back to the show. So Nick Bostrom's argument is worded this way; this is from a paper he wrote on the subject: "A technologically mature post-human civilization would have enormous computing power. Based on this empirical fact, the simulation argument shows that at least one of the following propositions is true: (1) the fraction of human-level civilizations that reach a post-human stage is very close to zero; (2) the fraction of post-human civilizations that are interested in running ancestor simulations is very close to zero; (3) the fraction of all people with our kind of experiences that are living in a simulation is very close to one." Now, what that essentially means is that if we are able to reach a post-human phase, this transhuman phase where we have at our fingertips practically limitless resources, because things are so magical and rainbows are popping out of everything.
If that's the case, then we should be able to create a computer simulation of the universe that is, within the realm of the simulation itself, extremely realistic. And we could also create, within this universal simulation, simulated intelligent beings, so that these created beings would have sentience, they'd have consciousness. They might be very much like modern humans. Yes, they'd be self-aware, but they would exist within the context of this created universe, and so they would only be able to see the things that are within that universe. Anything outside of it, they would be incapable of perceiving. So within that universe, it would seem like they were, quote unquote, the real people, right? They were the people who were there because of whatever forces caused the universe to be created in the first place. And for these people within the simulated universe, it might be completely impossible to detect anyone outside of it, "anyone" being us, the ones creating it.
So his argument is that if we in fact reached that stage, we would use our technology to create simulations so that we could see how civilizations develop, and see how interesting bits of the universe work, because we're intrinsically curious, right? So his argument is: if it's possible, we would do it. Yes. And if it's possible, and we would do it, that means we, the current people living in this universe, are almost surely a computer simulation. So if it can happen, and if we are interested in ourselves, and we obviously are, then it's almost a guarantee that we're in a computer simulation. And the reason for that argument is that in order for us to not be a computer simulation, we would have to be the first ones, right? We would have to be heading toward that timeline and just not have gotten there yet. Mr. Fusion! Whereas in every other case, some other post-human civilization has already gotten there and made at least one level of simulated universe. Which means that in only one case out of effectively infinity are we the first, and in every other case we're number two or lower. So yeah, that's kind of the crux of his argument, and again, it's really a philosophical argument. It's not meant to say, "right, we're in a computer game, because somebody set it up." Yeah, you know, if you're saying that it's a near-hundred-percent, nines-ad-infinitum certainty that we're a computer simulation, you're placing the burden of proof on the remaining point-whatever percent of the population who might think you're wrong. But it's an interesting question. It's, again, unfalsifiable, at least right now. But here's the funny thing: Bostrom makes this statement purely from a philosophical point of view, right? It's not from a physics point of view, it's not from a science point of view. It's philosophy.
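The "we'd have to be the first ones" bookkeeping can be sketched numerically. This is a toy model, my own illustrative assumption rather than Bostrom's exact formalism: suppose one base-level civilization runs some number of ancestor simulations, each containing as many observer-experiences as base reality, and ask what fraction of all observers are simulated.

```python
# Toy bookkeeping for the simulation argument (an illustrative
# assumption, not Bostrom's exact formula): with one un-simulated
# "base" population and n_sims equally populous simulations, a
# randomly chosen observer is simulated with probability
# n_sims / (n_sims + 1).

def fraction_simulated(n_sims: int) -> float:
    """Fraction of all observers who live inside a simulation."""
    return n_sims / (n_sims + 1)

for n in (1, 10, 1_000_000):
    print(f"{n:>9} simulations -> {fraction_simulated(n):.6f} simulated")
```

As the number of simulations grows, the fraction approaches one, which is the intuition behind proposition (3): being the single un-simulated case becomes the one-out-of-many exception.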
That has not stopped other people from looking at this from a scientific point of view. And that, to me, is another interesting aspect of this argument, because usually you would look at an argument like this and say, okay, that's an interesting philosophical question, but ultimately there's nothing I can do about it one way or the other, and then you go about your merry little way, right? But a bunch of nice (I'm assuming) quantum scientists have gotten together. Are you assuming they're nice, or assuming they're quantum scientists? Well, I'm assuming both. I'm going to go ahead and give them the benefit of the doubt. They have been doing some research into quantum chromodynamics, which is a theory about one of the four fundamental forces in our universe. The four fundamental forces are the strong nuclear force, electromagnetism, the weak nuclear force, and gravity, and that is, in fact, in order of how powerful they are. Right, gravity is the weakest, and it's the one that we're having the most trouble incorporating into our model of the universe.
The strong nuclear force deals with, you know, those nuclei that exist inside atoms. You know how they have stuff that's stuck together, like protons and neutrons? And on the face of it we're not sure why, because hypothetically two protons should push each other apart; you'd know this if you've ever played with magnets. Well, the strong nuclear force is the name for the reason why these things stay stuck together. And for them to stick together, clearly it has to be an incredibly strong force. An incredibly strong force that only kicks in at incredibly short distances; we're talking on the atomic scale. So the strong nuclear force is what quantum chromodynamics is all about studying. And one of the ways that quantum chromodynamics, or QCD, which would be way easier to say, looks at this is as part of... well, it looks at reality as four spacetime dimensions. So, four dimensions.
We mentioned string theory earlier. Some versions of string theory require no fewer than eleven dimensions for the theory to work, which is basically beyond my comprehension entirely. I get x, y, z, and time; I can picture a three-dimensional object moving through time. Yeah, that's cool. Beyond that, it gets a little wonky. Yeah, yeah, wibbly-wobbly. I'm right there with you. And this plays back to that discussion we had at the very beginning, where we talked about how the universe consists of stuff that's beyond our perception, but we have to filter all of it through our brains, and our brains are acting as a middleman between our consciousness and reality. So the things that we experience may very well be, in their fundamental nature, extremely different from the way we think of them, because they're being filtered; because we have to model them in a... this concept, we shouldn't have done this. This philosophy makes me sad.
I just realized that my brain is what's making me sad. You sit there, and if you have a bad day, you think: wait a minute, part of the reason I'm having a bad day is because my brain is filtering things a certain way. It's my fault! And then it becomes this circular loop of "everything is terrible because of my brain, which is making things terrible." A little glimpse into Jonathan Strickland, folks; that's how I get just before I have to go to CES. I know this episode publishes after I come back, but trust me, it's a terrible life, having to fly around the country and look at really shiny technological objects. You're not helping this vicious cycle you've got going on. Getting back to QCD: part of this is looking at reality as four spacetime dimensions, using computers that are really, really powerful, and creating something that's called a lattice gauge... it's lattice gauge theory, actually, and that is sort of the framework within which QCD tries to explain the strong nuclear force.
329 00:20:08,040 --> 00:20:11,560 Speaker 1: And uh, this part of this means that we try 330 00:20:11,600 --> 00:20:17,760 Speaker 1: and simulate an incredibly tiny simulated universe, right, right, 331 00:20:17,800 --> 00:20:20,000 Speaker 1: And this is on the this is actually on the 332 00:20:20,000 --> 00:20:23,360 Speaker 1: femto scale. Am I saying that right? That is correct, excellent, um, 333 00:20:23,520 --> 00:20:27,000 Speaker 1: And and a femtometer is one quadrillionth of a meter. 334 00:20:27,600 --> 00:20:30,359 Speaker 1: A nanometer, for for reference, is one billionth of a meter. And 335 00:20:30,400 --> 00:20:33,280 Speaker 1: that's also very small. Yes, yeah, when you talk about 336 00:20:33,320 --> 00:20:38,000 Speaker 1: nanotechnology and that super small technology, that is enormous compared 337 00:20:38,040 --> 00:20:41,960 Speaker 1: to the femto scale. We're talking really really really tiny. Well, 338 00:20:42,080 --> 00:20:46,000 Speaker 1: they build this sort of lattice structure to contain the 339 00:20:46,119 --> 00:20:51,320 Speaker 1: simulated universe, and within this simulated universe, they are examining 340 00:20:51,840 --> 00:20:55,919 Speaker 1: the elements that make up the strong nuclear force so 341 00:20:55,920 --> 00:20:58,720 Speaker 1: that we can understand what it is and how it 342 00:20:58,760 --> 00:21:01,240 Speaker 1: works better. Right, that's the whole purpose. One of the 343 00:21:01,240 --> 00:21:04,320 Speaker 1: big driving forces of the universe. We want to understand it. 344 00:21:04,400 --> 00:21:07,320 Speaker 1: But these these physicists said, hey, wait a minute.
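The femto-versus-nano comparison in the conversation can be sanity-checked with a quick back-of-the-envelope calculation. This is just an illustrative sketch using the SI prefixes mentioned above, not anything from the episode itself:

```python
# SI prefixes from the conversation:
# femto = one quadrillionth (1e-15), nano = one billionth (1e-9).
femtometer = 1e-15  # meters
nanometer = 1e-9    # meters

# A nanometer is a million femtometers, so even "nano" scale objects
# are enormous compared to the femto scale of a lattice QCD simulation.
ratio = nanometer / femtometer
print(f"1 nanometer = {round(ratio):,} femtometers")
```

In other words, the simulated patch of universe is a million times smaller across than the scale nanotechnology works at.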
If 345 00:21:07,320 --> 00:21:12,080 Speaker 1: this is how we are simulating an entirely fake universe, 346 00:21:12,160 --> 00:21:16,840 Speaker 1: a very tiny universe, doesn't it stand to reason that 347 00:21:17,000 --> 00:21:19,320 Speaker 1: some other like if we are and if we became 348 00:21:19,440 --> 00:21:23,320 Speaker 1: hyper advanced, if if we went trans human, couldn't those 349 00:21:23,320 --> 00:21:27,359 Speaker 1: trans humans in fact use the same technology to simulate 350 00:21:27,680 --> 00:21:31,000 Speaker 1: an entire universe on a big scale, on a really 351 00:21:31,080 --> 00:21:35,400 Speaker 1: large scale, universe scale, right, or or at least for us. 352 00:21:35,400 --> 00:21:38,560 Speaker 1: I mean, you could also argue that just to populate 353 00:21:38,600 --> 00:21:43,879 Speaker 1: a femto sized universe with with even smaller individual units 354 00:21:43,920 --> 00:21:47,000 Speaker 1: within that femto sized universe, the idea being that, well 355 00:21:47,720 --> 00:21:50,760 Speaker 1: you could you could create a simulation that is a 356 00:21:50,800 --> 00:21:55,600 Speaker 1: true universe with inhabitants and intelligent inhabitants, and that if 357 00:21:55,640 --> 00:21:58,520 Speaker 1: in fact we are in a computer simulation, then perhaps 358 00:21:58,520 --> 00:22:01,800 Speaker 1: there's some way that we could detect if this lattice 359 00:22:01,840 --> 00:22:06,879 Speaker 1: structure is around our own universe. Um. Yeah, this is 360 00:22:06,880 --> 00:22:09,320 Speaker 1: where it's getting to the point where it's hard for 361 00:22:09,359 --> 00:22:12,920 Speaker 1: me to actually explain, because it's stretching... both of our 362 00:22:13,320 --> 00:22:16,880 Speaker 1: grasps on quantum mechanics are are perhaps not as strong 363 00:22:16,920 --> 00:22:19,600 Speaker 1: as they could be, and cosmology for that matter.
For 364 00:22:19,640 --> 00:22:22,320 Speaker 1: that matter, because we're talking about things like cosmic rays. 365 00:22:24,280 --> 00:22:28,680 Speaker 1: The physicists suggested that perhaps we could um observe cosmic 366 00:22:28,800 --> 00:22:32,560 Speaker 1: rays and and really study them in depth and see 367 00:22:32,560 --> 00:22:36,320 Speaker 1: how they behave within our universe and look for evidence 368 00:22:36,520 --> 00:22:40,560 Speaker 1: of a lattice structure, which would indicate that some other 369 00:22:41,040 --> 00:22:44,159 Speaker 1: larger universe had used the same techniques we used to 370 00:22:44,240 --> 00:22:46,960 Speaker 1: create the femto universes we're making to look at the 371 00:22:46,960 --> 00:22:49,840 Speaker 1: strong nuclear force to make our own universe, and then 372 00:22:49,880 --> 00:22:52,320 Speaker 1: we'd say, hey, look, there's evidence we are in a 373 00:22:52,359 --> 00:22:57,240 Speaker 1: computer simulation. Crap. There are there are some issues with this. 374 00:22:57,400 --> 00:23:02,119 Speaker 1: One is that, uh, it presumes that any post human 375 00:23:02,160 --> 00:23:08,520 Speaker 1: civilization would use the exact same techniques. So uh, there's another assumption 376 00:23:08,680 --> 00:23:13,439 Speaker 1: that said post human society would allow us to be 377 00:23:13,600 --> 00:23:16,119 Speaker 1: able to find out, right, that you know, they 378 00:23:16,119 --> 00:23:18,480 Speaker 1: wouldn't paint in a nice little backdrop that would prevent 379 00:23:18,560 --> 00:23:21,280 Speaker 1: us from seeing behind the scenes, like to put in a 380 00:23:21,359 --> 00:23:25,520 Speaker 1: patch or to reset us. Let's go back to the Bronze Age, 381 00:23:25,520 --> 00:23:30,400 Speaker 1: folks. Control alt delete um.
There's also the argument that, well, 382 00:23:30,520 --> 00:23:34,240 Speaker 1: what if our universe is so large that it is 383 00:23:34,280 --> 00:23:36,720 Speaker 1: a bounded universe, because that's the other thing. A lattice 384 00:23:37,200 --> 00:23:40,080 Speaker 1: structure would also indicate that our universe does have finite 385 00:23:40,320 --> 00:23:44,119 Speaker 1: bounds, it's a finite universe. What if that finite universe is 386 00:23:44,119 --> 00:23:46,159 Speaker 1: still too big for us to ever be able to 387 00:23:46,160 --> 00:23:49,720 Speaker 1: see the edges, or what if the universe just is 388 00:23:49,800 --> 00:23:52,399 Speaker 1: finite anyway. Yeah, it could be that the universe is 389 00:23:52,440 --> 00:23:54,960 Speaker 1: finite anyway and has nothing to do with the lattice structure. 390 00:23:55,040 --> 00:23:56,879 Speaker 1: And so there are a lot of a lot of 391 00:23:56,960 --> 00:24:00,600 Speaker 1: objections that people have brought up. But mainly what this 392 00:24:00,720 --> 00:24:03,359 Speaker 1: approach would allow us to do is, if we saw 393 00:24:03,560 --> 00:24:06,480 Speaker 1: the lattice structure, we could maybe draw some conclusions. But 394 00:24:06,560 --> 00:24:08,520 Speaker 1: in any other case, like if we didn't see the 395 00:24:08,600 --> 00:24:10,680 Speaker 1: lattice structure, it doesn't answer any questions. All right, it 396 00:24:10,720 --> 00:24:13,520 Speaker 1: doesn't mean that it doesn't exist. Yeah, it doesn't mean 397 00:24:13,520 --> 00:24:17,440 Speaker 1: that we're not in a computer simulation.
So this so 398 00:24:17,520 --> 00:24:20,320 Speaker 1: it's interesting. But again it's just sort of an extension 399 00:24:20,480 --> 00:24:24,280 Speaker 1: of this thought experiment where, uh, you know, we're kind 400 00:24:24,280 --> 00:24:27,000 Speaker 1: of getting round to the point that I'm most interested 401 00:24:27,040 --> 00:24:30,520 Speaker 1: in in this whole discussion, that all right, let's let's 402 00:24:30,520 --> 00:24:33,119 Speaker 1: say that let's say that we are in a computer simulation, 403 00:24:33,880 --> 00:24:36,080 Speaker 1: whether we know it or not. It doesn't it doesn't matter. 404 00:24:36,119 --> 00:24:38,200 Speaker 1: We don't necessarily need to know that we are. But 405 00:24:38,280 --> 00:24:40,520 Speaker 1: let's just let's just assume assuming that we are that 406 00:24:40,600 --> 00:24:45,800 Speaker 1: we are. Does that matter on a day to day scale? 407 00:24:46,000 --> 00:24:48,400 Speaker 1: Would it matter if we were in a computer simulation 408 00:24:49,280 --> 00:24:52,040 Speaker 1: in some ways? No, if we don't know, it doesn't 409 00:24:52,040 --> 00:24:54,680 Speaker 1: matter at all. I mean, you know, if we if yeah, 410 00:24:54,880 --> 00:24:56,879 Speaker 1: you know, if if we had absolute proof of it, 411 00:24:56,960 --> 00:24:58,760 Speaker 1: then that would be that would be huge. That would 412 00:24:58,760 --> 00:25:01,760 Speaker 1: be shattering. Yeah, that would that would probably cause wars. Yeah, 413 00:25:01,920 --> 00:25:05,560 Speaker 1: for for all kinds of philosophies and religions and and 414 00:25:05,600 --> 00:25:08,680 Speaker 1: just interpersonal I mean, I mean within my own head, 415 00:25:08,760 --> 00:25:11,159 Speaker 1: I would probably need to spend a few days just 416 00:25:11,160 --> 00:25:16,160 Speaker 1: just drooling because, talking about my normal week, this 417 00:25:16,240 --> 00:25:19,080 Speaker 1: is what I do.
Alright, Fine, Okay, so now I 418 00:25:19,119 --> 00:25:23,200 Speaker 1: know something different between me and Lauren. Learned new stuff. 419 00:25:23,400 --> 00:25:27,280 Speaker 1: Um No, I I agree with that. And if we 420 00:25:27,320 --> 00:25:31,199 Speaker 1: don't know, there's no difference, because the rules that 421 00:25:31,359 --> 00:25:34,680 Speaker 1: we have created for ourselves, based on our cultures 422 00:25:34,720 --> 00:25:38,000 Speaker 1: and our society, those haven't changed. Like like if I 423 00:25:38,040 --> 00:25:41,480 Speaker 1: found out somehow, like if if knowledge were given to 424 00:25:41,560 --> 00:25:46,280 Speaker 1: me personally, everyone who's alive, that yes, in fact, you 425 00:25:46,320 --> 00:25:49,040 Speaker 1: live in a computer simulation. After I had that moment 426 00:25:49,040 --> 00:25:53,119 Speaker 1: where I I upped my drooling capacity for my daily 427 00:25:53,119 --> 00:25:56,600 Speaker 1: allowance of drooling, I would sit there and think, well, ultimately, 428 00:25:56,640 --> 00:26:00,280 Speaker 1: this doesn't change anything. I mean, my life still 429 00:26:00,280 --> 00:26:02,679 Speaker 1: has meaning within the context of the world I live in. 430 00:26:03,080 --> 00:26:05,840 Speaker 1: We still need to go get lunch, we still we 431 00:26:05,880 --> 00:26:08,600 Speaker 1: still get married, we still die, and I still I 432 00:26:08,640 --> 00:26:11,919 Speaker 1: still laugh, I still cry. I still find That Mitchell 433 00:26:11,960 --> 00:26:13,800 Speaker 1: and Webb Look to be absolutely hilarious, and I would 434 00:26:13,840 --> 00:26:15,600 Speaker 1: really like it if Netflix streaming would bring it back 435 00:26:15,640 --> 00:26:17,919 Speaker 1: for me. I mean, you know, all these sort of 436 00:26:17,960 --> 00:26:22,560 Speaker 1: things would still be true.
So I don't think that 437 00:26:22,720 --> 00:26:27,679 Speaker 1: ultimately it changes anything unless it were something where we 438 00:26:27,760 --> 00:26:32,760 Speaker 1: could definitively prove it, and then that would change major things, 439 00:26:33,119 --> 00:26:35,800 Speaker 1: like essentially a lot of people would have to answer 440 00:26:35,880 --> 00:26:41,119 Speaker 1: some very tough questions in regards to philosophy and religion particularly, 441 00:26:41,240 --> 00:26:43,919 Speaker 1: but other things as well. But those two in particular. Some 442 00:26:44,040 --> 00:26:46,640 Speaker 1: very nice philosophy departments would be more or less out 443 00:26:46,640 --> 00:26:50,880 Speaker 1: of jobs. Yeah, well, you know, actually they're a step away 444 00:26:50,880 --> 00:26:52,439 Speaker 1: from being out of a job anyway. Come on, we're 445 00:26:52,440 --> 00:26:56,040 Speaker 1: talking about philosophy here. Snap. This is coming from a 446 00:26:56,119 --> 00:27:02,080 Speaker 1: literature major. Okay. Did you see that? This is 447 00:27:02,119 --> 00:27:04,560 Speaker 1: off topic, but there was a list I think of 448 00:27:04,640 --> 00:27:09,920 Speaker 1: the most unemployable majors that came out. Philosophy I think was 449 00:27:10,200 --> 00:27:13,320 Speaker 1: in the top three. Yeah, literature was in the top ten. 450 00:27:13,560 --> 00:27:16,000 Speaker 1: Was it was it? Was it creative writing? Do you 451 00:27:16,000 --> 00:27:19,080 Speaker 1: have any creative writing? I think they didn't even put 452 00:27:19,320 --> 00:27:24,520 Speaker 1: it in there. It's like, are you Neil Gaiman? No? Um, hey, 453 00:27:24,560 --> 00:27:27,200 Speaker 1: I'm an editor. I'm working in my field, folks. That's true, 454 00:27:27,400 --> 00:27:31,080 Speaker 1: that's true. I'm I don't know what my field would be. 455 00:27:31,720 --> 00:27:33,480 Speaker 1: So I was a medieval literature guy.
I guess I'd be 456 00:27:33,480 --> 00:27:35,919 Speaker 1: teaching medieval literature if I were in my field. But 457 00:27:36,359 --> 00:27:40,639 Speaker 1: it's funny that I went into technology podcasting instead. So anyway, ultimately, 458 00:27:40,800 --> 00:27:43,440 Speaker 1: I don't think it would really matter. There's no way 459 00:27:43,480 --> 00:27:47,240 Speaker 1: of knowing currently one way or the other. So from 460 00:27:47,240 --> 00:27:51,200 Speaker 1: that perspective, thought exercise, right. Yeah, hey there, it's Jonathan 461 00:27:51,240 --> 00:27:54,399 Speaker 1: of twenty nineteen again. Time to take another quick break 462 00:27:54,640 --> 00:28:05,639 Speaker 1: to thank our sponsor. So, ultimately, uh, it may or 463 00:28:05,680 --> 00:28:08,040 Speaker 1: may not matter if we're in a computer simulation. But 464 00:28:08,359 --> 00:28:11,639 Speaker 1: the other question to ask is how feasible is it? 465 00:28:11,680 --> 00:28:15,800 Speaker 1: How would it be possible to actually create a universe 466 00:28:15,840 --> 00:28:17,879 Speaker 1: on this... what would it take to create a universe 467 00:28:17,920 --> 00:28:20,840 Speaker 1: on this scale? Right? And I mean because simulating something 468 00:28:20,920 --> 00:28:24,640 Speaker 1: is actually very much more complicated than just doing it. Um. 469 00:28:24,720 --> 00:28:27,520 Speaker 1: For for example, UM, I was reading one article that 470 00:28:27,520 --> 00:28:30,400 Speaker 1: that cited a number where if you were to take 471 00:28:30,400 --> 00:28:32,840 Speaker 1: a hard drive and you wanted to create a simulation 472 00:28:32,920 --> 00:28:36,240 Speaker 1: of that entire hard drive, you would have to simulate 473 00:28:36,280 --> 00:28:40,240 Speaker 1: every single atom in it, record its position, its its time scale, 474 00:28:40,400 --> 00:28:43,000 Speaker 1: everything about it.
That would be, you know... and and 475 00:28:43,000 --> 00:28:45,440 Speaker 1: and every single atom... a hard drive is maybe ten 476 00:28:45,440 --> 00:28:48,520 Speaker 1: to the power of twenty four atoms, a couple, a 477 00:28:48,560 --> 00:28:52,960 Speaker 1: bunch, um. And but in order to simulate it, you 478 00:28:53,000 --> 00:28:54,760 Speaker 1: would have to have about a hundred bits of 479 00:28:54,840 --> 00:28:57,760 Speaker 1: information at least on each of those atoms. Right. So, 480 00:28:57,960 --> 00:29:00,840 Speaker 1: so when you think about that, if the world is 481 00:29:01,000 --> 00:29:03,360 Speaker 1: or this universe is a computer simulation, that means 482 00:29:03,360 --> 00:29:06,960 Speaker 1: the simulation has to take into account every single object 483 00:29:07,240 --> 00:29:09,600 Speaker 1: within that world, whether that object is part of something 484 00:29:09,640 --> 00:29:13,720 Speaker 1: else or or an individual object, its position, it's uh, 485 00:29:13,760 --> 00:29:17,680 Speaker 1: if it's moving, its relationship to every other object within 486 00:29:17,720 --> 00:29:22,200 Speaker 1: that universe, how that object's behavior, you know, however you 487 00:29:22,200 --> 00:29:25,560 Speaker 1: want to define that, affects other objects. It's it's... are 488 00:29:25,600 --> 00:29:28,120 Speaker 1: you are you going down to the atomic scale? Are 489 00:29:28,120 --> 00:29:30,360 Speaker 1: you going to the sub atomic scale? How how deep 490 00:29:30,400 --> 00:29:32,840 Speaker 1: does the rabbit hole go? Right? Right?
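That back-of-the-envelope estimate (ten to the twenty-fourth atoms, at least a hundred bits each) can be written out in a few lines. This is just a sketch of the arithmetic; the one-terabyte comparison at the end is my own illustrative assumption, not a figure from the episode:

```python
# Estimate from the conversation: ~10**24 atoms in a hard drive,
# and at least ~100 bits of simulation state per atom.
atoms = 10**24
bits_per_atom = 100
state_bits = atoms * bits_per_atom  # 10**26 bits of simulation state

# Illustrative comparison (my assumption): a 1 TB drive stores 8 * 10**12 bits.
drive_capacity_bits = 8 * 10**12

# The simulation's bookkeeping dwarfs the thing being simulated.
overhead = state_bits // drive_capacity_bits
print(f"simulating the drive takes ~{overhead:,}x its own storage capacity")
```

The point the hosts are making falls out of the numbers: the description of the object is some thirteen orders of magnitude bigger than the object's own capacity.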
Because because the 491 00:29:32,880 --> 00:29:35,720 Speaker 1: other problem here is that as we, the human beings 492 00:29:35,720 --> 00:29:37,880 Speaker 1: who live right here and now, whether it's a computer 493 00:29:37,920 --> 00:29:40,400 Speaker 1: simulation or not, get more advanced, we get to learn 494 00:29:40,400 --> 00:29:42,320 Speaker 1: more about our environment and we get to look even 495 00:29:42,360 --> 00:29:44,840 Speaker 1: deeper than we could before. So like, let's go back 496 00:29:44,840 --> 00:29:48,239 Speaker 1: to the Stone Age, no microscopes, nothing. Like, anything that 497 00:29:48,320 --> 00:29:50,920 Speaker 1: was beyond our ability to actually see did not 498 00:29:51,080 --> 00:29:54,200 Speaker 1: exist in our minds. And then we got more and 499 00:29:54,200 --> 00:29:56,360 Speaker 1: more advanced, and we were able to suddenly start seeing 500 00:29:56,360 --> 00:29:58,680 Speaker 1: things that are tinier and tinier. And then we get 501 00:29:58,680 --> 00:30:01,040 Speaker 1: to the point where we've got electron tunneling microscopes and 502 00:30:01,080 --> 00:30:03,640 Speaker 1: we can move individual atoms into place and we can 503 00:30:03,680 --> 00:30:05,920 Speaker 1: see things that are really far away. That would mean 504 00:30:05,960 --> 00:30:08,480 Speaker 1: that the simulation would have to take into account the 505 00:30:08,480 --> 00:30:14,000 Speaker 1: ability for us to see well beyond what we first could, right? 506 00:30:14,320 --> 00:30:17,680 Speaker 1: So that's kind of incredible to think about that way. 507 00:30:17,720 --> 00:30:19,600 Speaker 1: Like, how much power would you have to have to 508 00:30:19,640 --> 00:30:22,400 Speaker 1: generate this?
Is it something that would be added over 509 00:30:22,480 --> 00:30:25,960 Speaker 1: time, so that like, you know, someone's someone's checking in 510 00:30:26,000 --> 00:30:28,840 Speaker 1: every now and then, like, oh, they can see atoms now, 511 00:30:29,000 --> 00:30:31,560 Speaker 1: all right, well we got to build the next level down, guys, 512 00:30:32,160 --> 00:30:34,960 Speaker 1: or or like fire up that extra server. We're gonna 513 00:30:35,000 --> 00:30:37,760 Speaker 1: need it too. Exactly. Like, you know, someplace 514 00:30:37,800 --> 00:30:42,160 Speaker 1: in this other larger universe's North Carolina, there's another server 515 00:30:42,240 --> 00:30:45,600 Speaker 1: farm being built. Yeah, I mean it's it's... It blows 516 00:30:45,600 --> 00:30:47,680 Speaker 1: the mind. It becomes one of these things where, 517 00:30:47,880 --> 00:30:50,000 Speaker 1: as you start to think about it, you're like, would 518 00:30:50,000 --> 00:30:53,520 Speaker 1: that even be possible? Now, from a futurist's argument, they 519 00:30:53,600 --> 00:30:56,400 Speaker 1: might say that in the future we'll be able to 520 00:30:56,440 --> 00:30:59,440 Speaker 1: do things like harness the power of black holes to 521 00:30:59,560 --> 00:31:03,080 Speaker 1: do computing, in which case the limitations of computing suddenly 522 00:31:03,160 --> 00:31:08,000 Speaker 1: seem like a non problem. Sure, even with quantum computers, 523 00:31:08,000 --> 00:31:10,760 Speaker 1: which they are already experimenting with, we've gotten up to 524 00:31:10,840 --> 00:31:14,680 Speaker 1: sixteen... sixteen qubits was the last reliable one, but 525 00:31:14,720 --> 00:31:17,240 Speaker 1: there have been larger ones.
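The "sixteen qubits" figure gives a feel for why simulating quantum systems on classical machines blows up so fast: the classical description doubles with every added qubit. A minimal sketch of that arithmetic (the byte count assumes double-precision complex amplitudes, which is my choice for illustration):

```python
# An n-qubit register is described by 2**n complex amplitudes.
n = 16
amplitudes = 2**n            # 65,536 amplitudes for 16 qubits
bytes_per_amplitude = 16     # complex128: two 8-byte floats (assumption)
memory = amplitudes * bytes_per_amplitude

print(f"{n} qubits -> {amplitudes:,} amplitudes, {memory // 1024} KiB")

# Each extra qubit doubles the classical simulation cost.
assert 2 ** (n + 1) == 2 * 2 ** n
```

Sixteen qubits is still easy for a laptop, but the doubling means a few hundred qubits would already exceed any conceivable classical memory.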
The problem, of course, 526 00:31:17,320 --> 00:31:19,480 Speaker 1: with quantum computers is that as soon as you really observe 527 00:31:19,560 --> 00:31:23,360 Speaker 1: the state therein, it collapses, it decoheres, and you end 528 00:31:23,440 --> 00:31:27,480 Speaker 1: up with a classical computer that is severely underpowered compared to 529 00:31:27,560 --> 00:31:29,959 Speaker 1: any other normal classical computer. But there are people who 530 00:31:30,000 --> 00:31:32,960 Speaker 1: are working on that problem. So yeah, I mean that 531 00:31:33,360 --> 00:31:38,080 Speaker 1: it may be that reaching such a level of computing 532 00:31:38,120 --> 00:31:40,160 Speaker 1: power is not possible. And in fact, that was part 533 00:31:40,160 --> 00:31:42,800 Speaker 1: of Bostrom's point. He was not necessarily saying we 534 00:31:42,840 --> 00:31:45,040 Speaker 1: live in a computer simulation. He was saying, there's another 535 00:31:45,080 --> 00:31:47,000 Speaker 1: way of looking at this. We can look at this 536 00:31:47,120 --> 00:31:50,160 Speaker 1: as saying trans human or post human, in the sense 537 00:31:50,200 --> 00:31:53,560 Speaker 1: that the futurists have defined it, is an impossibility, that 538 00:31:54,320 --> 00:31:56,160 Speaker 1: we will never get to a point where we have 539 00:31:56,200 --> 00:31:59,760 Speaker 1: computing power so vast as to be able to simulate 540 00:31:59,800 --> 00:32:04,880 Speaker 1: an entire universe down to the tiniest detail and have 541 00:32:05,040 --> 00:32:08,120 Speaker 1: it populated with intelligent creatures, right, or even beyond the 542 00:32:08,680 --> 00:32:12,080 Speaker 1: feasibility of that, the idea that probably we're going to 543 00:32:12,160 --> 00:32:15,160 Speaker 1: kill all of each other way way before that. Actually 544 00:32:15,240 --> 00:32:16,960 Speaker 1: that that was another point.
One of his points is 545 00:32:17,040 --> 00:32:19,480 Speaker 1: saying that it's way more likely that any any sort 546 00:32:19,520 --> 00:32:24,480 Speaker 1: of civilization reaching post human or trans human uh status 547 00:32:24,840 --> 00:32:27,440 Speaker 1: would end up wiping itself out in some sort of 548 00:32:27,520 --> 00:32:31,440 Speaker 1: cataclysmic event, whether on purpose or by accident. So we 549 00:32:31,480 --> 00:32:35,080 Speaker 1: could have: hey, we have reached this level of superiority. 550 00:32:35,120 --> 00:32:37,840 Speaker 1: Now we are going to force our ideological values upon 551 00:32:37,880 --> 00:32:40,200 Speaker 1: everybody else. And everybody else says no, you're not, and 552 00:32:40,240 --> 00:32:42,840 Speaker 1: then we all kill each other. Or we say, hey, 553 00:32:43,040 --> 00:32:47,440 Speaker 1: what does this button do? Zombie outbreak. Those are the only 554 00:32:47,440 --> 00:32:51,920 Speaker 1: two possibilities. That's a lie. But no, Bostrom's point 555 00:32:51,960 --> 00:32:54,360 Speaker 1: is that that could also be a case. It's really 556 00:32:54,440 --> 00:32:57,560 Speaker 1: a depressing case. But you could say humankind could hit 557 00:32:57,640 --> 00:33:01,120 Speaker 1: extinction before we hit post human. More fun to think about 558 00:33:01,160 --> 00:33:03,840 Speaker 1: that other thing. But overall, zombie outbreak. I agree, 559 00:33:04,600 --> 00:33:10,040 Speaker 1: Walking Dead, man, right down the street from me. That's 560 00:33:10,120 --> 00:33:13,080 Speaker 1: just Atlanta in normal traffic. That's true. That's true. 561 00:33:14,120 --> 00:33:16,640 Speaker 1: It's it's pretty much it's the office before we get 562 00:33:16,640 --> 00:33:20,480 Speaker 1: to the coffee machine. It is. It gets ugly, folks. 563 00:33:20,520 --> 00:33:24,640 Speaker 1: Walkers, a lot of walkers.
But yeah, that's I mean, 564 00:33:24,680 --> 00:33:27,080 Speaker 1: that's a good point, is that, you know, we 565 00:33:27,200 --> 00:33:32,080 Speaker 1: could, we humankind could go extinct. That's one downside. We 566 00:33:32,200 --> 00:33:34,320 Speaker 1: might not ever be able to reach that level of 567 00:33:34,400 --> 00:33:38,000 Speaker 1: computational achievement in order to to ever simulate a universe. 568 00:33:38,200 --> 00:33:42,680 Speaker 1: That's, depending upon whom you ask, also a downside. Or 569 00:33:42,720 --> 00:33:47,840 Speaker 1: we just might never know. So yeah, sometimes it leads 570 00:33:47,920 --> 00:33:50,720 Speaker 1: us to to explore the question further. Really, yeah, and 571 00:33:51,040 --> 00:33:55,080 Speaker 1: we've we've seen that explored in multiple venues, not just philosophy, 572 00:33:55,160 --> 00:33:57,520 Speaker 1: not just science, but entertainment all over the place. Yeah, 573 00:33:57,520 --> 00:33:59,400 Speaker 1: I mean going back to a lot of Shakespeare stories 574 00:33:59,440 --> 00:34:01,600 Speaker 1: we're talking about really, like, like, what's the difference between 575 00:34:01,640 --> 00:34:03,720 Speaker 1: dreams and reality? Midsummer Night's Dream is kind of a 576 00:34:03,720 --> 00:34:07,239 Speaker 1: big one. We are such stuff as dreams are made on. 577 00:34:08,640 --> 00:34:12,960 Speaker 1: Thank you, Jonathan. Oh, brave new world to have such people, 578 00:34:13,040 --> 00:34:20,000 Speaker 1: and 'tis new to thee. That's The Tempest right there. 579 00:34:20,400 --> 00:34:25,560 Speaker 1: That's that's our that's our literature major. Yeah here, Yeah, well, 580 00:34:25,560 --> 00:34:27,600 Speaker 1: you know, I use the degree once in a while, 581 00:34:28,800 --> 00:34:31,279 Speaker 1: but more perhaps more modern things. A few a few 582 00:34:31,320 --> 00:34:34,160 Speaker 1: of you have probably seen The Matrix.
I know kung 583 00:34:34,160 --> 00:34:38,279 Speaker 1: fu. I'll keep quoting it. Let's just keep going. I 584 00:34:38,320 --> 00:34:41,160 Speaker 1: got it. I got this. Vanilla Sky. Oh no, I 585 00:34:41,160 --> 00:34:44,920 Speaker 1: haven't seen that. Yeah, I can't do that. The documentary 586 00:34:44,960 --> 00:34:47,640 Speaker 1: or the Spanish version? So, do you 587 00:34:47,640 --> 00:34:56,600 Speaker 1: speak Spanish? There we go. That was my Total Recall. Inception. Um, 588 00:34:56,600 --> 00:35:02,480 Speaker 1: that's that's that's every other short. Not not Inception at all. 589 00:35:03,920 --> 00:35:09,399 Speaker 1: The Inception one would be... that's Inception. Love, love these movies, 590 00:35:09,440 --> 00:35:11,120 Speaker 1: by the way. I'm making a lot of fun, but no, 591 00:35:11,280 --> 00:35:13,200 Speaker 1: these are good points. I mean you do you do 592 00:35:13,239 --> 00:35:15,280 Speaker 1: see this theme come up over and over again. Total 593 00:35:15,320 --> 00:35:18,080 Speaker 1: Recall: what is real, what is being imagined? Is the 594 00:35:18,239 --> 00:35:21,400 Speaker 1: entire story actually only happening in Quaid's head? That was his 595 00:35:21,480 --> 00:35:26,520 Speaker 1: name, right? Uh? Or Inception: are you still within 596 00:35:26,600 --> 00:35:29,000 Speaker 1: a dream? Are you one level down in a dream? 597 00:35:29,000 --> 00:35:31,120 Speaker 1: Are you two levels down? At the very end of 598 00:35:31,160 --> 00:35:36,640 Speaker 1: the movie, is the is the main character's dream, right? Which... 599 00:35:37,160 --> 00:35:39,520 Speaker 1: And of course that brings into question something else. This 600 00:35:39,600 --> 00:35:42,759 Speaker 1: this is a good argument that falls within the same 601 00:35:42,800 --> 00:35:44,759 Speaker 1: sort of thing about being able to tell versus not 602 00:35:44,760 --> 00:35:48,000 Speaker 1: being able to tell. Uh.
This is big spoilers for 603 00:35:48,040 --> 00:35:51,040 Speaker 1: Inception here, obviously, but one of the one of the 604 00:35:51,320 --> 00:35:53,680 Speaker 1: things they set up in an inception is that one 605 00:35:53,719 --> 00:35:56,120 Speaker 1: way you can tell if you are in a dream 606 00:35:56,280 --> 00:35:58,600 Speaker 1: is you have a totem that is specific to you 607 00:35:58,680 --> 00:36:02,239 Speaker 1: and behaves a certain way in the real world, but 608 00:36:02,360 --> 00:36:06,080 Speaker 1: in dream space it does not behave that way. Why 609 00:36:06,520 --> 00:36:10,640 Speaker 1: who knows, But it's convenient physics. The main character has 610 00:36:10,680 --> 00:36:12,960 Speaker 1: a top, so if he spins the top and the 611 00:36:13,000 --> 00:36:15,720 Speaker 1: top stops spinning, he knows he's in the real world. 612 00:36:15,960 --> 00:36:17,960 Speaker 1: If he's a layer down into a dream world and 613 00:36:18,000 --> 00:36:20,640 Speaker 1: he spins the top, it just keeps spinning, and that's 614 00:36:20,640 --> 00:36:24,279 Speaker 1: how he can look at a moment and know if 615 00:36:24,320 --> 00:36:27,520 Speaker 1: he's in a dream or if he's awake. Uh. There's 616 00:36:27,560 --> 00:36:30,120 Speaker 1: a problem with this though. If the whole thing is 617 00:36:30,160 --> 00:36:33,360 Speaker 1: within a dream, there's nothing to stop the dream world 618 00:36:33,400 --> 00:36:36,560 Speaker 1: from saying at a certain level, this top behaves a 619 00:36:36,560 --> 00:36:39,280 Speaker 1: different way. 
So, in other words, even if the top 620 00:36:39,440 --> 00:36:43,000 Speaker 1: topples over, which at the end of the movie gives you this kind 621 00:36:43,000 --> 00:36:47,200 Speaker 1: of idea, is it gonna spin...? But even if the 622 00:36:47,200 --> 00:36:49,440 Speaker 1: top were to topple over, that would not answer the 623 00:36:49,520 --> 00:36:51,160 Speaker 1: question of whether or not you're in a dream, because 624 00:36:51,200 --> 00:36:55,359 Speaker 1: it only depends upon if that level is real or not. 625 00:36:55,520 --> 00:36:57,640 Speaker 1: And if it's not real, if that's just another part 626 00:36:57,640 --> 00:36:59,120 Speaker 1: of the dream, then all it means is that the 627 00:36:59,120 --> 00:37:01,320 Speaker 1: behaviors in one level of the dream behaved 628 00:37:01,320 --> 00:37:05,000 Speaker 1: differently than the others. And and similarly, if if our 629 00:37:05,040 --> 00:37:07,799 Speaker 1: if our future trans human selves decided to make it 630 00:37:07,840 --> 00:37:10,799 Speaker 1: so that we cannot detect whether or not we are 631 00:37:10,840 --> 00:37:13,879 Speaker 1: in a dream, then yeah, we're kind of stuck. Yeah, 632 00:37:13,920 --> 00:37:15,960 Speaker 1: the same thing with the Matrix. The whole idea of 633 00:37:16,000 --> 00:37:20,360 Speaker 1: creating a universe that is not ideal, because if we 634 00:37:20,480 --> 00:37:23,840 Speaker 1: created an ideal universe, humans would go, this can't be real. 635 00:37:24,520 --> 00:37:28,200 Speaker 1: My life stinks. There's no way my life is this awesome. 636 00:37:28,960 --> 00:37:31,160 Speaker 1: And uh, and then we have a universe filled with 637 00:37:31,239 --> 00:37:37,479 Speaker 1: Jonathan Stricklands, and nobody wants that. Nobody, nobody wants that, not 638 00:37:37,760 --> 00:37:41,320 Speaker 1: even Jonathans. And look, I'm enough to deal with already. 639 00:37:42,560 --> 00:37:44,959 Speaker 1: But yeah, I know there's there's tons of examples.
There's 640 00:37:44,960 --> 00:37:48,799 Speaker 1: there's a lot in science fiction, obviously, obviously especially since 641 00:37:49,080 --> 00:37:52,359 Speaker 1: since William Gibson and the entire cyberpunk thing became a thing, 642 00:37:52,840 --> 00:37:55,040 Speaker 1: you know, and that's that's sort of our our entrance 643 00:37:55,120 --> 00:37:57,080 Speaker 1: into into... how did I just make up a word 644 00:37:57,080 --> 00:37:58,600 Speaker 1: in French? I don't think that that's an actual thing. 645 00:37:58,920 --> 00:38:05,000 Speaker 1: Um Ever, ever since cyberpunk happened, we've been we've been 646 00:38:05,000 --> 00:38:08,120 Speaker 1: looking at technology in in this fearful way, which we 647 00:38:08,160 --> 00:38:09,840 Speaker 1: always kind of do in horror films. I think that 648 00:38:09,880 --> 00:38:11,759 Speaker 1: fiction is a really terrific way for us to work 649 00:38:11,800 --> 00:38:14,520 Speaker 1: out our anxieties, because we can go, like, well, in 650 00:38:14,560 --> 00:38:18,000 Speaker 1: this scary world, all of these terrible things... I think 651 00:38:18,120 --> 00:38:20,279 Speaker 1: all of these terrible things will happen. However, we're going 652 00:38:20,320 --> 00:38:22,560 Speaker 1: to have some attractive people in pleather who are going 653 00:38:22,600 --> 00:38:24,520 Speaker 1: to take care of it for us. And that's... or 654 00:38:24,640 --> 00:38:28,719 Speaker 1: die, and die in numerous ways. Yeah, like you said, yeah, 655 00:38:28,760 --> 00:38:30,960 Speaker 1: it's science fiction and horror, the two genres I think 656 00:38:31,000 --> 00:38:33,040 Speaker 1: of the most when I think about this kind of mentality, 657 00:38:33,040 --> 00:38:36,359 Speaker 1: because I also think of things like other movies. Here's 658 00:38:36,360 --> 00:38:40,400 Speaker 1: another spoiler.
Guys, if you're a Joss Whedon fan, this 659 00:38:40,520 --> 00:38:43,040 Speaker 1: is a spoiler, so spoiler for Joss Whedon fans, you 660 00:38:43,080 --> 00:38:46,720 Speaker 1: can skip ahead. But Cabin in the Woods is another example 661 00:38:46,880 --> 00:38:51,000 Speaker 1: of, like, what is real and what is artifice? Yes, 662 00:38:51,080 --> 00:38:54,040 Speaker 1: what has been simulated. So there you go, same sort 663 00:38:54,040 --> 00:38:56,680 Speaker 1: of thing. I don't think that's a huge spoiler 664 00:38:56,719 --> 00:39:00,320 Speaker 1: because that's revealed early in the movie. But still, 665 00:39:00,360 --> 00:39:03,360 Speaker 1: at any rate, we've explored this 666 00:39:03,440 --> 00:39:07,440 Speaker 1: idea in multiple forms of media, and 667 00:39:07,520 --> 00:39:13,000 Speaker 1: in multiple disciplines: entertainment, science, philosophy. And I 668 00:39:13,000 --> 00:39:15,760 Speaker 1: don't think it's gonna go away anytime soon, because obviously 669 00:39:15,800 --> 00:39:18,640 Speaker 1: there's no way for us to answer this question, 670 00:39:18,640 --> 00:39:21,640 Speaker 1: not unless we detect that lattice structure. And even then 671 00:39:21,680 --> 00:39:24,279 Speaker 1: you're like, well, then who created us? That could 672 00:39:24,360 --> 00:39:27,040 Speaker 1: mean anything. And also it could be 673 00:39:27,080 --> 00:39:31,719 Speaker 1: like Inception. We might be levels down from reality. It 674 00:39:31,760 --> 00:39:36,040 Speaker 1: could be like that marble in Men in Black.
Yeah, exactly, yes, 675 00:39:36,120 --> 00:39:38,320 Speaker 1: we could be a mote of dust 676 00:39:39,239 --> 00:39:42,719 Speaker 1: upon the nose of a dog in another universe, which 677 00:39:42,719 --> 00:39:44,919 Speaker 1: in itself is a mote of dust upon the nose 678 00:39:44,960 --> 00:39:47,440 Speaker 1: of another dog in another universe, which in itself is 679 00:39:47,480 --> 00:39:50,200 Speaker 1: a mote of dust upon Douglas Adams's pencil as he 680 00:39:50,400 --> 00:39:53,400 Speaker 1: giggles maniacally and writes another book, because in that universe 681 00:39:53,400 --> 00:39:56,560 Speaker 1: he's still around. I like that universe. I do too. 682 00:39:56,719 --> 00:39:58,080 Speaker 1: I want to go 683 00:39:58,120 --> 00:40:04,320 Speaker 1: there. But yeah, it's a really interesting question. So, 684 00:40:04,360 --> 00:40:06,799 Speaker 1: if you've heard about this whole thing, it's because it 685 00:40:06,920 --> 00:40:10,080 Speaker 1: came up in conversations towards the end of two thousand twelve, 686 00:40:10,120 --> 00:40:13,839 Speaker 1: which is kind of interesting because the first 687 00:40:13,880 --> 00:40:19,040 Speaker 1: publication of Bostrom's work was in the early two thousands, I think. 688 00:40:19,960 --> 00:40:22,160 Speaker 1: But in this case, it was because the physicists had said, hey, 689 00:40:22,200 --> 00:40:23,680 Speaker 1: there might be a way for us to find out, 690 00:40:23,880 --> 00:40:28,120 Speaker 1: maybe, possibly, probably not, but it could happen. And 691 00:40:28,160 --> 00:40:29,480 Speaker 1: even then, we're going to have to get to a 692 00:40:29,560 --> 00:40:33,759 Speaker 1: level of technological advancement that is beyond what we 693 00:40:33,800 --> 00:40:36,319 Speaker 1: have now before we could even hope to detect these 694 00:40:36,320 --> 00:40:40,120 Speaker 1: cosmic rays.
And that wraps up this classic episode of 695 00:40:40,160 --> 00:40:42,440 Speaker 1: tech Stuff. Hope you guys liked that one. If you 696 00:40:42,480 --> 00:40:46,000 Speaker 1: have a suggestion for a future episode, send me an email. 697 00:40:46,040 --> 00:40:49,319 Speaker 1: The address is tech stuff at how stuff works dot com. 698 00:40:49,719 --> 00:40:52,719 Speaker 1: Drop me a line on Facebook or Twitter. The handle for 699 00:40:52,800 --> 00:40:56,759 Speaker 1: both of those is tech Stuff H S W. You 700 00:40:56,800 --> 00:41:00,000 Speaker 1: can also visit our website, that's tech Stuff 701 00:41:00,120 --> 00:41:03,440 Speaker 1: podcast dot com. You'll find an archive of every episode 702 00:41:03,480 --> 00:41:05,960 Speaker 1: ever published. You will also find a link to our 703 00:41:06,000 --> 00:41:08,840 Speaker 1: online store. So if you've been itching to get a 704 00:41:09,000 --> 00:41:12,799 Speaker 1: tech stuff mug or an "I bought this T-shirt for 705 00:41:12,840 --> 00:41:17,080 Speaker 1: a princely sum" shirt, then you can do that. Those 706 00:41:17,120 --> 00:41:19,680 Speaker 1: are both there, along with some other designs. Go check 707 00:41:19,719 --> 00:41:21,600 Speaker 1: that out. Every purchase you make goes to help the show. 708 00:41:21,640 --> 00:41:24,719 Speaker 1: We greatly appreciate it, and I will talk to you 709 00:41:24,760 --> 00:41:31,560 Speaker 1: again really soon. Tech Stuff is a production of I 710 00:41:31,640 --> 00:41:34,600 Speaker 1: Heart Radio's How Stuff Works. For more podcasts from I 711 00:41:34,719 --> 00:41:38,359 Speaker 1: Heart Radio, visit the I Heart Radio app, Apple Podcasts, 712 00:41:38,480 --> 00:41:40,440 Speaker 1: or wherever you listen to your favorite shows.