1 00:00:00,320 --> 00:00:03,800 Speaker 1: Hey, everybody, Josh and I are going on tour again 2 00:00:03,960 --> 00:00:08,640 Speaker 1: to basically wrap up twenty twenty three on the road 3 00:00:08,920 --> 00:00:12,440 Speaker 1: in Orlando, Florida first, then Nashville, Tennessee, and then wrapping 4 00:00:12,520 --> 00:00:14,000 Speaker 1: it up here in Atlanta, Georgia. 5 00:00:14,400 --> 00:00:17,320 Speaker 2: Yeah, and you can listen to next Tuesday's episode 6 00:00:17,400 --> 00:00:19,840 Speaker 2: for details on when we'll be there and where to 7 00:00:19,840 --> 00:00:21,919 Speaker 2: get tickets and all that stuff. But we just wanted 8 00:00:21,960 --> 00:00:23,520 Speaker 2: to give you guys a heads up. And if you 9 00:00:23,520 --> 00:00:26,119 Speaker 2: don't feel like listening to Tuesday's episode, you can still 10 00:00:26,120 --> 00:00:29,120 Speaker 2: get all of your info at linktree slash sysk, or 11 00:00:29,160 --> 00:00:33,120 Speaker 2: our website Stuff you Should Know dot com. 12 00:00:33,360 --> 00:00:41,080 Speaker 1: Welcome to Stuff you Should Know, a production of iHeartRadio. 13 00:00:43,000 --> 00:00:45,680 Speaker 2: Hey, and welcome to the podcast. I'm Josh Clark, and 14 00:00:45,680 --> 00:00:50,400 Speaker 2: there's Charles W. Chuck Bryant, and here too is Jerry Roland. 15 00:00:50,880 --> 00:00:54,760 Speaker 2: If Jerry Roland exists and this is Stuff. 16 00:00:54,440 --> 00:00:54,920 Speaker 3: You Should Know? 17 00:00:56,520 --> 00:00:59,880 Speaker 1: Can I make a just a quick thank you? Yeah, 18 00:01:00,080 --> 00:01:03,800 Speaker 1: make a thank you? I met Baker, Thank you? 19 00:01:03,840 --> 00:01:05,440 Speaker 2: Sorry, Sure, build a thank you. 
20 00:01:06,920 --> 00:01:09,720 Speaker 1: I you know, I mentioned in that one episode that 21 00:01:09,760 --> 00:01:12,240 Speaker 1: I was going to put pictures of my new podcast 22 00:01:12,240 --> 00:01:15,640 Speaker 1: studio up, and then that episode came out thank you. 23 00:01:15,720 --> 00:01:18,039 Speaker 1: And I put it up and everyone was just so 24 00:01:18,200 --> 00:01:20,959 Speaker 1: nice and sweet, and I just want to say thanks. 25 00:01:20,959 --> 00:01:23,480 Speaker 1: And also I want to say sorry to you because 26 00:01:23,560 --> 00:01:27,440 Speaker 1: that post now just supplanted a picture of you and 27 00:01:27,640 --> 00:01:32,200 Speaker 1: I as my most popular ever post on Instagram. 28 00:01:32,240 --> 00:01:33,640 Speaker 2: Really, I'm sick of that post. 29 00:01:33,920 --> 00:01:35,040 Speaker 3: It just passed it today. 30 00:01:35,080 --> 00:01:37,440 Speaker 1: That picture when we did our secret mission out to 31 00:01:37,480 --> 00:01:40,360 Speaker 1: the desert in January was my previous most popular post ever. 32 00:01:41,200 --> 00:01:43,840 Speaker 2: Okay, but you should have probably taken down this new 33 00:01:43,840 --> 00:01:46,360 Speaker 2: post just before it topped it. I'm not sure why 34 00:01:46,400 --> 00:01:46,759 Speaker 2: you didn't. 35 00:01:47,920 --> 00:01:49,960 Speaker 3: No, I had a counter. I was like, here it comes, 36 00:01:50,040 --> 00:01:50,400 Speaker 3: here it. 37 00:01:50,400 --> 00:01:53,320 Speaker 2: Comes, and then you had like a confetti and the 38 00:01:53,320 --> 00:01:54,279 Speaker 2: little noise maker. 39 00:01:54,600 --> 00:01:57,400 Speaker 3: That's right, and then the picture of us just dissolved. 40 00:02:00,120 --> 00:02:02,800 Speaker 2: No, we just aged and turned into mummies in the picture. 41 00:02:03,680 --> 00:02:05,400 Speaker 3: That's right, because it's not reality. 42 00:02:05,520 --> 00:02:08,760 Speaker 2: Right, it isn't reality. That. 
That's a great segue, Chuck, 43 00:02:08,800 --> 00:02:12,560 Speaker 2: because I have a question to kick this one off. Sure, Chuck, 44 00:02:13,720 --> 00:02:15,440 Speaker 2: are you hallucinating right now? 45 00:02:16,720 --> 00:02:20,960 Speaker 3: Unfortunately no, but philosophers might say that I am. 46 00:02:21,280 --> 00:02:29,560 Speaker 2: Yeah, not just philosophers, neuroscientists, physicists, biologists, maybe evolutionary biologists 47 00:02:29,560 --> 00:02:32,160 Speaker 2: in particular, especially ones that are hipped to this whole thing, 48 00:02:32,760 --> 00:02:37,200 Speaker 2: would say, yeah, you're hallucinating right now, and so are you, Josh, 49 00:02:37,200 --> 00:02:39,440 Speaker 2: and so are you Jerry. If you do exist, you're 50 00:02:39,440 --> 00:02:43,120 Speaker 2: hallucinating every single thing you're looking at, smelling, hearing, touching. 51 00:02:43,680 --> 00:02:48,400 Speaker 2: That none of this is real. And we're talking today 52 00:02:48,400 --> 00:02:53,840 Speaker 2: about whether reality is real or not, and there's a 53 00:02:53,919 --> 00:02:58,360 Speaker 2: very deep, like mind blowing aspect to it. And I 54 00:02:58,400 --> 00:03:00,120 Speaker 2: feel like that's where a lot of people leave it. But 55 00:03:00,200 --> 00:03:02,440 Speaker 2: it's just like, this is the craziest stuff these people 56 00:03:02,440 --> 00:03:05,079 Speaker 2: are talking about, but there's more to it than that. 57 00:03:05,320 --> 00:03:11,000 Speaker 2: And like, I've realized that in investigating the nature of reality, 58 00:03:11,440 --> 00:03:14,639 Speaker 2: we end up learning more about ourselves than we do 59 00:03:14,720 --> 00:03:19,680 Speaker 2: about reality. And I just find that endlessly fascinating. 60 00:03:20,440 --> 00:03:26,400 Speaker 1: Welcome to Jurassic Park. I know what you mean because 61 00:03:26,440 --> 00:03:29,160 Speaker 1: this article. 
And by the way, big shout out to 62 00:03:29,240 --> 00:03:33,120 Speaker 1: Dave Ruse, because you threw him this topic as if 63 00:03:33,120 --> 00:03:37,000 Speaker 1: it was just like, hey, do one on elephants, and 64 00:03:37,080 --> 00:03:39,680 Speaker 1: it was tough, and Dave even had to take a 65 00:03:39,760 --> 00:03:44,640 Speaker 1: rare second stab at it, because it's just really hard 66 00:03:44,680 --> 00:03:47,920 Speaker 1: to nail something like this down, this sort of it's 67 00:03:47,920 --> 00:03:52,680 Speaker 1: hard to not delve into philosophical navel gazing with stuff 68 00:03:52,720 --> 00:03:57,040 Speaker 1: like this. And we've covered some philosophy stuff here and there. Sure, 69 00:03:57,360 --> 00:03:59,360 Speaker 1: and it's always kind of fun. But you know, my 70 00:03:59,480 --> 00:04:03,200 Speaker 1: deal with philosophy was I did pretty I took one 71 00:04:03,200 --> 00:04:04,840 Speaker 1: class in college and did pretty well, and I think 72 00:04:04,840 --> 00:04:08,800 Speaker 1: I made an A or B. But the same thing happened 73 00:04:08,880 --> 00:04:11,320 Speaker 1: in that class, as it happens every time we tackle it, 74 00:04:11,400 --> 00:04:15,600 Speaker 1: is I'm lost and then annoyed, and then eventually I 75 00:04:15,680 --> 00:04:18,160 Speaker 1: kind of come around and think it's cool and understand 76 00:04:18,200 --> 00:04:18,880 Speaker 1: it a little bit. 77 00:04:19,440 --> 00:04:20,520 Speaker 2: So that's where you are with this. 78 00:04:21,400 --> 00:04:21,679 Speaker 3: Yeah. 79 00:04:21,720 --> 00:04:24,080 Speaker 1: I think it's like the fifth or sixth time I 80 00:04:24,080 --> 00:04:25,800 Speaker 1: went through it, I was like, all right, this is 81 00:04:25,800 --> 00:04:28,000 Speaker 1: actually kind of cool, whereas before I was like. 82 00:04:28,040 --> 00:04:28,880 Speaker 3: I hate all this stuff. 
83 00:04:28,880 --> 00:04:32,000 Speaker 1: Of course everything is real, that apple is real and I can see it, 84 00:04:32,000 --> 00:04:32,680 Speaker 1: I can taste it. 85 00:04:32,720 --> 00:04:34,040 Speaker 3: But now I kind of get it. 86 00:04:34,160 --> 00:04:35,440 Speaker 2: Now you know that you're dead wrong? 87 00:04:36,160 --> 00:04:37,280 Speaker 3: Yeah? Maybe? 88 00:04:37,560 --> 00:04:39,680 Speaker 2: Yeah. And I want to point out Dave didn't have 89 00:04:39,760 --> 00:04:41,800 Speaker 2: to do a second attempt of this. We didn't ask 90 00:04:41,880 --> 00:04:44,280 Speaker 2: him to, but we had taken so long to get 91 00:04:44,279 --> 00:04:46,520 Speaker 2: around to doing it. He's like, here, I 92 00:04:46,680 --> 00:04:49,239 Speaker 2: revised this. Maybe you can say, is that how it went down? Yeah? 93 00:04:49,279 --> 00:04:49,719 Speaker 2: For sure? 94 00:04:50,240 --> 00:04:50,760 Speaker 3: Yeah, gotcha. 95 00:04:50,960 --> 00:04:54,120 Speaker 2: We definitely didn't say like I know. I was just like, okay, 96 00:04:54,160 --> 00:04:56,120 Speaker 2: this is you know, thanks for doing this, Dave, and 97 00:04:56,160 --> 00:04:57,160 Speaker 2: we'll do it when we can. 98 00:04:57,640 --> 00:05:00,000 Speaker 3: And he's like, I've noticed you haven't read that. 99 00:05:00,680 --> 00:05:03,479 Speaker 2: Very much, very much, So thank you to Dave for sure. 100 00:05:04,120 --> 00:05:08,359 Speaker 2: So people have been thinking about whether what we think 101 00:05:08,480 --> 00:05:12,120 Speaker 2: and see and feel is real for a really long time. 102 00:05:13,440 --> 00:05:16,880 Speaker 2: It's probably one of the first things we ever thought 103 00:05:16,920 --> 00:05:20,600 Speaker 2: of that was you know, kind of deep, since we 104 00:05:20,640 --> 00:05:24,080 Speaker 2: started eating mushrooms and developed consciousness. 
105 00:05:24,480 --> 00:05:28,840 Speaker 1: Yeah, And as we go through this, you know, it's 106 00:05:28,640 --> 00:05:31,000 Speaker 1: it makes sense that as we go through it, it's 107 00:05:31,040 --> 00:05:33,760 Speaker 1: it's kind of been a timeline of different philosophers and 108 00:05:34,000 --> 00:05:36,479 Speaker 1: as we learned more about science, things were tweaked and 109 00:05:36,600 --> 00:05:38,960 Speaker 1: changed all the way up and then eventually we end 110 00:05:39,040 --> 00:05:42,640 Speaker 1: up with some modern day like Ted Talkers or one 111 00:05:42,640 --> 00:05:45,640 Speaker 1: in particular. So it kind of makes sense that things 112 00:05:45,640 --> 00:05:49,680 Speaker 1: would morph and change philosophically over time as we talk 113 00:05:49,720 --> 00:05:51,440 Speaker 1: about things like, hey, is time even real? 114 00:05:52,040 --> 00:05:54,680 Speaker 2: Yeah, but if you really look at it, and especially 115 00:05:54,680 --> 00:05:57,800 Speaker 2: if you're paying attention later on, as we get more 116 00:05:57,839 --> 00:06:01,640 Speaker 2: into modern interpretations, they bear a very striking resemblance to some of 117 00:06:01,680 --> 00:06:06,040 Speaker 2: the first cracks at explaining whether reality is real to us. 118 00:06:06,400 --> 00:06:08,560 Speaker 2: And one of the first people that we know of 119 00:06:08,600 --> 00:06:12,480 Speaker 2: who really tried to tackle this was Plato, and he 120 00:06:12,520 --> 00:06:15,880 Speaker 2: came up with a very famous allegory of the cave 121 00:06:16,880 --> 00:06:20,719 Speaker 2: where people are he calls them prisoners. They're situated in 122 00:06:20,760 --> 00:06:23,520 Speaker 2: a cave, they're chained up, they're not able to turn around, 123 00:06:23,560 --> 00:06:26,160 Speaker 2: so all they can do is face the back wall 124 00:06:26,200 --> 00:06:29,359 Speaker 2: of the cave. 
Behind them is a fire, and then 125 00:06:30,960 --> 00:06:34,240 Speaker 2: on the other side of the fire. No, I already 126 00:06:34,279 --> 00:06:36,839 Speaker 2: messed it up, sorry, Plato. Behind them is a fire, 127 00:06:36,880 --> 00:06:39,839 Speaker 2: and then between them and the fire are people who 128 00:06:39,880 --> 00:06:43,800 Speaker 2: can move around, they're puppeteers. They can cast shadows from 129 00:06:43,839 --> 00:06:46,960 Speaker 2: the campfire onto the wall, and all the prisoners have 130 00:06:47,040 --> 00:06:50,320 Speaker 2: ever experienced are the shadows on the wall. So to them, 131 00:06:50,560 --> 00:06:54,640 Speaker 2: that's real. But in reality, what's actually real are the 132 00:06:54,640 --> 00:06:57,000 Speaker 2: puppets that the people are showing behind them that they 133 00:06:57,040 --> 00:06:59,680 Speaker 2: can't see. So what they think is real is actually 134 00:06:59,760 --> 00:07:03,400 Speaker 2: just a simulacrum, like kind of a distilled version of 135 00:07:03,440 --> 00:07:06,400 Speaker 2: the actual reality. And that was Plato's take on the 136 00:07:06,400 --> 00:07:06,840 Speaker 2: whole thing. 137 00:07:07,400 --> 00:07:11,520 Speaker 3: Yeah, like, here's a bunny, this is Richard Nixon. H 138 00:07:12,640 --> 00:07:13,120 Speaker 3: what else? 139 00:07:13,600 --> 00:07:15,720 Speaker 2: That? But that? But yeah, that that what we think 140 00:07:15,760 --> 00:07:20,720 Speaker 2: of as Richard Nixon is a distilled form of what 141 00:07:21,080 --> 00:07:22,840 Speaker 2: actually is Richard Nixon. 142 00:07:23,320 --> 00:07:25,360 Speaker 3: Yeah. And the key to doing Richard Nixon is and 143 00:07:25,360 --> 00:07:26,360 Speaker 3: the knuckles. 144 00:07:26,960 --> 00:07:28,520 Speaker 2: I used to do his knuckles a lot. 145 00:07:28,880 --> 00:07:30,760 Speaker 3: No, no, talking about shadow puppets. 146 00:07:32,960 --> 00:07:34,480 Speaker 2: The trick is a man's knuckle. 
147 00:07:35,120 --> 00:07:37,720 Speaker 1: I can oh, man, I can actually do a few 148 00:07:37,760 --> 00:07:38,800 Speaker 1: shadow puppets pretty well. 149 00:07:38,840 --> 00:07:40,200 Speaker 2: Oh which ones? Which ones? 150 00:07:40,680 --> 00:07:41,320 Speaker 3: I can do. 151 00:07:41,280 --> 00:07:46,080 Speaker 1: A I just well, one's kind of a rabbity gargoyle. 152 00:07:46,720 --> 00:07:49,040 Speaker 1: And then I can do like an alligator, and I 153 00:07:49,080 --> 00:07:53,320 Speaker 1: can do a some some other long snouted animal with 154 00:07:53,360 --> 00:07:54,600 Speaker 1: an ear, I mean dog. 155 00:07:55,280 --> 00:07:56,440 Speaker 3: Sure, I can do a few things. 156 00:07:56,680 --> 00:07:57,640 Speaker 2: That's really great, man. 157 00:07:57,880 --> 00:07:58,920 Speaker 3: I used to muck around with it. 158 00:07:58,960 --> 00:08:01,280 Speaker 1: And then a few years ago, when Ruby was little, 159 00:08:01,480 --> 00:08:03,240 Speaker 1: there was a light casting and I was like, hey, 160 00:08:03,440 --> 00:08:05,280 Speaker 1: check this out, and her mind was blown. 161 00:08:05,280 --> 00:08:06,280 Speaker 3: I was like, still got it. 162 00:08:06,560 --> 00:08:09,880 Speaker 2: Yeah, she's where you like Plato would have loved you, kid. 163 00:08:10,680 --> 00:08:12,000 Speaker 3: Yeah, And she was like who. 164 00:08:13,040 --> 00:08:17,280 Speaker 1: So anyway, Plato basically is saying that the material world 165 00:08:17,320 --> 00:08:20,320 Speaker 1: around us and how we perceive it is not a 166 00:08:20,360 --> 00:08:23,200 Speaker 1: reliable thing. And what he believed in as the truth 167 00:08:23,280 --> 00:08:26,800 Speaker 1: is something he called forms or maybe ideas. But we're 168 00:08:26,800 --> 00:08:29,080 Speaker 1: going to use the word form as we move forward 169 00:08:29,160 --> 00:08:30,120 Speaker 1: into Aristotle. 
170 00:08:30,440 --> 00:08:34,760 Speaker 2: Yeah, as Dave puts it, our perceived reality, according 171 00:08:34,760 --> 00:08:37,360 Speaker 2: to Plato, is just the shadow of an objective higher 172 00:08:37,360 --> 00:08:41,400 Speaker 2: truth. Makes sense, Yes, But what he's saying is what 173 00:08:41,440 --> 00:08:43,319 Speaker 2: we said, what you're like, what's in front of you, 174 00:08:43,440 --> 00:08:46,679 Speaker 2: Richard Nixon, your apple? It's not actually real. It's just 175 00:08:46,720 --> 00:08:50,120 Speaker 2: a version of that. And along came Aristotle, who was 176 00:08:50,120 --> 00:08:54,960 Speaker 2: a contemporary of Plato. I think he learned directly from Plato, 177 00:08:55,120 --> 00:08:58,439 Speaker 2: if I'm not mistaken, But in this case he disagreed 178 00:08:58,480 --> 00:09:02,880 Speaker 2: with Plato. He said, Plato was kind of on the right track. 179 00:09:02,920 --> 00:09:05,560 Speaker 2: But these things are not totally separate in the way 180 00:09:05,559 --> 00:09:10,200 Speaker 2: that a shadow is not truly related to the object 181 00:09:10,240 --> 00:09:14,000 Speaker 2: that's casting the shadow, something detached from it. These 182 00:09:14,040 --> 00:09:20,200 Speaker 2: things are attached, like, yes, there's an actual objective, real form, 183 00:09:20,240 --> 00:09:23,559 Speaker 2: but the thing that we perceive is somehow tied to it, 184 00:09:23,880 --> 00:09:27,480 Speaker 2: and it's tied to it through what Aristotle called forms. 185 00:09:27,480 --> 00:09:31,120 Speaker 2: So the form, in a human, was our soul, whereas the 186 00:09:31,280 --> 00:09:34,600 Speaker 2: organic body is the matter: matter and forms. 187 00:09:35,040 --> 00:09:38,080 Speaker 1: Yeah, and you know, eventually science would come into the 188 00:09:38,120 --> 00:09:42,080 Speaker 1: picture slowly. 
So if we go, you know a few 189 00:09:42,080 --> 00:09:46,600 Speaker 1: centuries ahead in time, the nature of reality and truth 190 00:09:46,679 --> 00:09:50,360 Speaker 1: as people knew it started to change once science started 191 00:09:50,480 --> 00:09:52,560 Speaker 1: saying you're kind of wrong about a lot of stuff. 192 00:09:52,960 --> 00:09:54,960 Speaker 1: And a good example Dave gives here is, you know, 193 00:09:55,000 --> 00:09:57,160 Speaker 1: we thought the Earth was the center of the universe 194 00:09:57,200 --> 00:10:00,640 Speaker 1: for a long time. Astronomy comes along and math comes 195 00:10:00,679 --> 00:10:03,520 Speaker 1: along and says, no, that's actually not true. So now 196 00:10:03,520 --> 00:10:09,040 Speaker 1: there's actually a basis for saying, hey, what you think 197 00:10:09,120 --> 00:10:10,920 Speaker 1: is true and what you think is reality might not 198 00:10:10,960 --> 00:10:12,960 Speaker 1: actually be the case. And we're starting to prove that 199 00:10:13,040 --> 00:10:16,920 Speaker 1: a little bit. And Galileo steps in and this is 200 00:10:16,960 --> 00:10:20,680 Speaker 1: in, I guess, the sixteenth and seventeenth centuries. 201 00:10:21,440 --> 00:10:24,840 Speaker 1: And he comes along basically and says, and this is 202 00:10:24,840 --> 00:10:28,520 Speaker 1: an actual quote, I think that tastes, odors, colors, and 203 00:10:28,559 --> 00:10:31,360 Speaker 1: so on are no more than mere names so far 204 00:10:31,400 --> 00:10:33,320 Speaker 1: as the object in which we place them is concerned, 205 00:10:33,480 --> 00:10:37,080 Speaker 1: that they reside only in the consciousness. Hence, if the 206 00:10:37,120 --> 00:10:40,200 Speaker 1: living creatures were removed, all these qualities would be 207 00:10:40,240 --> 00:10:44,520 Speaker 1: wiped away and annihilated. 
Meaning in other words, and this 208 00:10:44,600 --> 00:10:46,800 Speaker 1: is something we're going to kind of repeat here and there. 209 00:10:47,000 --> 00:10:49,040 Speaker 1: It's kind of like if a tree falls in the forest, 210 00:10:49,080 --> 00:10:53,840 Speaker 1: does it make a sound. What Galileo's saying is it's 211 00:10:53,880 --> 00:10:56,880 Speaker 1: only because we've assigned these things color and odor and 212 00:10:56,960 --> 00:10:59,959 Speaker 1: taste that they have color and odor and taste. 213 00:11:00,240 --> 00:11:04,680 Speaker 2: Right, they don't inherently possess these qualities. Something exists, but 214 00:11:04,800 --> 00:11:07,240 Speaker 2: it doesn't exist in the form that we perceive it as. 215 00:11:07,280 --> 00:11:11,080 Speaker 2: It's probably far more complex or at the very least different. 216 00:11:11,280 --> 00:11:14,840 Speaker 2: What's interesting is that quote could have been written by 217 00:11:15,360 --> 00:11:20,200 Speaker 2: anyone today researching that, Like, that's a very contemporary understanding 218 00:11:20,240 --> 00:11:23,000 Speaker 2: of what's going on, and that was Galileo back in 219 00:11:23,040 --> 00:11:27,360 Speaker 2: the sixteenth and seventeenth century. It's pretty cool. John Locke 220 00:11:27,400 --> 00:11:29,280 Speaker 2: was the next one to really kind of contribute to it, 221 00:11:29,360 --> 00:11:32,720 Speaker 2: and what he came up with was similar to Plato 222 00:11:32,760 --> 00:11:37,320 Speaker 2: and Aristotle's take. He said that everything has 223 00:11:38,080 --> 00:11:41,360 Speaker 2: two types of qualities: primary qualities, which is the actual 224 00:11:41,400 --> 00:11:45,559 Speaker 2: objective reality of the thing, and apples just for some 225 00:11:45,600 --> 00:11:49,280 Speaker 2: reason keep coming up, it's the easiest example for some reason. 
But 226 00:11:49,400 --> 00:11:53,920 Speaker 2: an apple has a form, it has like size, 227 00:11:54,040 --> 00:11:57,480 Speaker 2: it has shape, it has bulk to it. It cannot move. 228 00:11:57,559 --> 00:12:02,360 Speaker 2: That's a big primary characteristic of it. Like the apple 229 00:12:02,400 --> 00:12:07,240 Speaker 2: itself isn't inherently red. There's a certain arrangement of electrons 230 00:12:07,280 --> 00:12:10,840 Speaker 2: inside the apple skin or that make up the apple skin, 231 00:12:11,280 --> 00:12:15,600 Speaker 2: that absorb some kinds of wavelengths of light and reflect 232 00:12:15,760 --> 00:12:18,439 Speaker 2: back the red wavelength of light. And that's what we 233 00:12:18,480 --> 00:12:20,720 Speaker 2: see that hits our eye. But that doesn't mean that 234 00:12:20,800 --> 00:12:24,040 Speaker 2: the apple itself is red in any way whatsoever. It's 235 00:12:24,120 --> 00:12:25,400 Speaker 2: just that's what we perceive. 236 00:12:26,280 --> 00:12:28,040 Speaker 1: Yeah, it doesn't even hit our eye. It hits the 237 00:12:28,120 --> 00:12:31,560 Speaker 1: receptors in our eye, those rods and cones, yep. And 238 00:12:31,600 --> 00:12:34,959 Speaker 1: it just sends dumb data messages to our smart brain, 239 00:12:35,080 --> 00:12:35,840 Speaker 1: right right. 240 00:12:36,640 --> 00:12:42,439 Speaker 2: But then there's secondary qualities, things like its taste, its color, shininess, 241 00:12:42,720 --> 00:12:46,000 Speaker 2: and that the secondary qualities are the things that we 242 00:12:46,120 --> 00:12:48,400 Speaker 2: lay on top of it. That that's our perception. But 243 00:12:48,440 --> 00:12:50,880 Speaker 2: that if you took away our perception, what would be 244 00:12:51,040 --> 00:12:54,120 Speaker 2: left are the size, the bulk, the inability of the 245 00:12:54,160 --> 00:12:57,120 Speaker 2: apple to move. Those things are objective and unchanging. 
246 00:12:57,840 --> 00:13:01,560 Speaker 1: Yeah, and he labeled those. One was called extension, and that's 247 00:13:01,800 --> 00:13:04,160 Speaker 1: the fact that it just takes up physical space in 248 00:13:04,200 --> 00:13:07,160 Speaker 1: a place. And then permanence, it does that at a 249 00:13:07,200 --> 00:13:10,000 Speaker 1: specific time, so it exists in time. And then the 250 00:13:10,080 --> 00:13:13,080 Speaker 1: last one is that it interacts with other objects. And 251 00:13:13,120 --> 00:13:17,080 Speaker 1: he called that causal powers, and that can be anything 252 00:13:17,120 --> 00:13:19,600 Speaker 1: from just the air around it to the desk that 253 00:13:19,600 --> 00:13:22,760 Speaker 1: it's sitting on or whatever, sure, which are also constructs. 254 00:13:23,640 --> 00:13:27,319 Speaker 2: Yes, we'll get to that, and then I think the 255 00:13:27,440 --> 00:13:32,319 Speaker 2: last contributor to the historical understanding of reality, at least 256 00:13:32,360 --> 00:13:35,960 Speaker 2: one of the big name, big shot ones, was Immanuel Kant. 257 00:13:37,280 --> 00:13:41,600 Speaker 2: He was a German mathematician and philosopher from the Enlightenment era. 258 00:13:41,679 --> 00:13:45,400 Speaker 2: I believe, yeah, And he basically he wasn't so much 259 00:13:45,440 --> 00:13:48,559 Speaker 2: after like, okay, what is the nature of reality? His 260 00:13:49,640 --> 00:13:52,640 Speaker 2: question was even more basic than that. It was can 261 00:13:52,679 --> 00:13:58,840 Speaker 2: we even possibly perceive actual reality? And after thinking about it, 262 00:13:58,920 --> 00:14:00,880 Speaker 2: thinking about it, really kind of humming on it for 263 00:14:00,920 --> 00:14:02,920 Speaker 2: a little bit, he said no, no, I don't believe 264 00:14:02,920 --> 00:14:06,600 Speaker 2: we ever possibly can, and that forms the basis for 265 00:14:06,920 --> 00:14:12,400 Speaker 2: the modern exploration of reality. 
266 00:14:13,000 --> 00:14:13,360 Speaker 3: Yeah. 267 00:14:13,400 --> 00:14:15,760 Speaker 1: He was one of those that pushed it even further 268 00:14:15,960 --> 00:14:18,800 Speaker 1: and said, all right, so I'm digging what you're saying 269 00:14:18,880 --> 00:14:22,560 Speaker 1: that that red apple, that color isn't really real and 270 00:14:22,600 --> 00:14:25,000 Speaker 1: the shape isn't really real. So, Locke, what you were 271 00:14:25,040 --> 00:14:29,760 Speaker 1: saying about these primary qualities, even that it exists in 272 00:14:29,840 --> 00:14:34,040 Speaker 1: time and space, like dude, that is in your head 273 00:14:34,120 --> 00:14:37,480 Speaker 1: as well, like those don't even exist. They exist in 274 00:14:37,520 --> 00:14:41,400 Speaker 1: our minds, and so we can't even conceive of anything. 275 00:14:42,080 --> 00:14:43,600 Speaker 3: We can't know really anything. 276 00:14:43,840 --> 00:14:45,960 Speaker 2: Yeah, he went so so far as to say, like 277 00:14:46,040 --> 00:14:50,200 Speaker 2: science and math, which describes the basic laws of the 278 00:14:50,320 --> 00:14:55,840 Speaker 2: universe quite accurately, these are constructs themselves, Like what we're 279 00:14:55,880 --> 00:14:59,960 Speaker 2: actually describing are hallucinations that we all share in common 280 00:15:00,400 --> 00:15:00,720 Speaker 2: m hm. 281 00:15:01,400 --> 00:15:03,400 Speaker 3: Or he called them appearances, yeah. 282 00:15:03,120 --> 00:15:05,680 Speaker 2: Yeah. And in fact, he called science and math appearances 283 00:15:05,680 --> 00:15:08,840 Speaker 2: of appearances. And he was saying, like, there's we're never 284 00:15:08,960 --> 00:15:10,920 Speaker 2: going to be able to figure this out. And luckily 285 00:15:11,000 --> 00:15:13,040 Speaker 2: Kant was wrong, because we do seem to be kind 286 00:15:13,040 --> 00:15:15,320 Speaker 2: of on a track of figuring things out a little more. 
287 00:15:16,200 --> 00:15:19,640 Speaker 3: All right, Man, that was a robust, like thirteen minutes. 288 00:15:19,680 --> 00:15:20,960 Speaker 2: I think I think so too. 289 00:15:21,520 --> 00:15:24,840 Speaker 3: So let's take a break and we'll be right back. 290 00:15:28,040 --> 00:15:48,960 Speaker 2: Stuff you should know, Josh, and show stuff you should know? Okay, So, uh, 291 00:15:49,200 --> 00:15:50,800 Speaker 2: we've been kind of teasing it, and I think we 292 00:15:50,920 --> 00:15:53,800 Speaker 2: were probably saying it outright too. Some of these early 293 00:15:53,840 --> 00:15:56,120 Speaker 2: philosophers really kind of hit the nail on the head 294 00:15:56,120 --> 00:15:59,640 Speaker 2: as far as our current understanding goes. And one of 295 00:15:59,680 --> 00:16:03,240 Speaker 2: the big contributions or one of the big contributors of 296 00:16:03,360 --> 00:16:07,960 Speaker 2: the twentieth and especially twenty first century to exploring what 297 00:16:08,080 --> 00:16:13,760 Speaker 2: the basis of reality is was neuroscience. Neuroscience has said, okay, 298 00:16:13,800 --> 00:16:16,720 Speaker 2: well wait a minute, there's a there's something that we 299 00:16:16,760 --> 00:16:21,440 Speaker 2: all need to kind of explore. If all of this 300 00:16:21,520 --> 00:16:25,720 Speaker 2: is just in our minds, which is what's suggested, we 301 00:16:25,840 --> 00:16:28,600 Speaker 2: have ways of looking into the mind, so let's figure 302 00:16:28,600 --> 00:16:29,800 Speaker 2: out what's actually going on. 303 00:16:31,120 --> 00:16:34,520 Speaker 1: Yeah, and this, you know, I realized that as I 304 00:16:34,600 --> 00:16:37,560 Speaker 1: have had problems with some deep philosophical things like this, 305 00:16:38,280 --> 00:16:40,840 Speaker 1: listeners to some of this stuff might too. Like, it's not 306 00:16:40,880 --> 00:16:43,120 Speaker 1: for everyone, you know what I mean? 
Sure, So, like 307 00:16:43,160 --> 00:16:45,360 Speaker 1: I get it if some people are listening and being like, 308 00:16:45,400 --> 00:16:47,640 Speaker 1: wait a minute, what have you been talking about when 309 00:16:47,680 --> 00:16:49,800 Speaker 1: we say things like our eyes don't see and our 310 00:16:49,840 --> 00:16:52,480 Speaker 1: ears don't hear? But we're going to explain it in 311 00:16:52,520 --> 00:16:55,720 Speaker 1: a sciencey way that I think grounds it as 312 00:16:55,760 --> 00:16:58,520 Speaker 1: others have before us. This is not like our ideas, right, 313 00:16:58,560 --> 00:17:01,400 Speaker 1: but it is true that our eyes don't actually really see 314 00:17:02,040 --> 00:17:05,639 Speaker 1: and our tongues don't actually taste. What we have is 315 00:17:05,840 --> 00:17:09,919 Speaker 1: a system, which is our body and our brain working together. 316 00:17:10,080 --> 00:17:15,040 Speaker 1: So we have all these receptors that capture this data basically, 317 00:17:15,400 --> 00:17:18,119 Speaker 1: and we send it to the brain. And 318 00:17:18,160 --> 00:17:21,359 Speaker 1: we're going to reiterate this too: the brain does great work. 319 00:17:21,400 --> 00:17:24,520 Speaker 1: But the brain is inside the skull. It's trapped in there. 320 00:17:24,880 --> 00:17:28,439 Speaker 1: The brain isn't eyes and ears and tongue and stuff 321 00:17:28,480 --> 00:17:32,320 Speaker 1: like that. It just works with whatever sensory data is 322 00:17:32,359 --> 00:17:37,480 Speaker 1: sent to it from those receptors, from those organs and says, 323 00:17:37,680 --> 00:17:41,359 Speaker 1: all right, here's what I think is going on in 324 00:17:41,400 --> 00:17:43,879 Speaker 1: a way that will make sense to you walking around 325 00:17:43,880 --> 00:17:44,399 Speaker 1: in the world. 
326 00:17:44,600 --> 00:17:48,800 Speaker 2: Yeah, And that's what produces our conscious experience, that translation 327 00:17:49,160 --> 00:17:55,800 Speaker 2: of electromagnetic waves and acoustic waves just hitting our raw receptors. 328 00:17:56,880 --> 00:17:59,480 Speaker 2: The way that I got this finally was to start 329 00:17:59,520 --> 00:18:03,119 Speaker 2: thinking of eyes and ears and tongues like antenna on 330 00:18:03,200 --> 00:18:06,520 Speaker 2: a bug, same thing, right, and that the brain just 331 00:18:06,520 --> 00:18:09,520 Speaker 2: puts all that sensory information together. And one of the 332 00:18:09,520 --> 00:18:14,040 Speaker 2: ways that we've we've shown like indisputably that this happens 333 00:18:14,280 --> 00:18:18,200 Speaker 2: are through optical illusions. There's so many optical illusions out there. 334 00:18:18,359 --> 00:18:22,240 Speaker 2: One of the most famous is the checkerboard, where 335 00:18:22,280 --> 00:18:25,480 Speaker 2: there's a cylinder casting a shadow across the checkerboard and 336 00:18:25,520 --> 00:18:30,000 Speaker 2: there's like gray, gray squares and white squares, and if 337 00:18:30,040 --> 00:18:32,440 Speaker 2: you link these two squares together, you realize they're actually 338 00:18:32,480 --> 00:18:36,040 Speaker 2: the same color. It's just the brain sees a shadow 339 00:18:36,119 --> 00:18:39,520 Speaker 2: being cast, so it's darkening one of those squares where 340 00:18:39,560 --> 00:18:42,720 Speaker 2: really it's the exact same color. And so what neuroscience 341 00:18:42,720 --> 00:18:46,320 Speaker 2: did was to step up and say, okay, let's investigate 342 00:18:46,359 --> 00:18:51,760 Speaker 2: exactly where this weird illusion is happening. And what they 343 00:18:51,800 --> 00:18:56,760 Speaker 2: found is that in cases of visual illusions, optical illusions, the 344 00:18:56,960 --> 00:19:00,800 Speaker 2: eyes are sending the correct data. 
They're seeing the illusion 345 00:19:00,800 --> 00:19:03,440 Speaker 2: for what it is. They can see that those checkerboards 346 00:19:03,480 --> 00:19:06,560 Speaker 2: are the same color. It's when it gets transferred to 347 00:19:06,600 --> 00:19:09,280 Speaker 2: the frontal lobe and the frontal lobe starts putting together 348 00:19:09,320 --> 00:19:13,560 Speaker 2: a picture of reality based on past experience and physical 349 00:19:13,640 --> 00:19:16,560 Speaker 2: laws and things like that, it says, no, that's not possible. 350 00:19:16,720 --> 00:19:19,960 Speaker 2: This is actually two different colors and produces the illusion 351 00:19:20,000 --> 00:19:21,480 Speaker 2: in our conscious experience. 352 00:19:22,280 --> 00:19:25,119 Speaker 1: Yeah, and they've shown this with the Wonder machine, with 353 00:19:25,160 --> 00:19:31,000 Speaker 1: the fMRI machine and experiments. It's literally just the brain saying, well, 354 00:19:31,000 --> 00:19:34,040 Speaker 1: that ain't right. So I'm gonna tell you that it's 355 00:19:34,160 --> 00:19:36,639 Speaker 1: this based on everything that you've ever seen in your 356 00:19:36,640 --> 00:19:38,080 Speaker 1: life that really makes sense. 357 00:19:37,960 --> 00:19:42,199 Speaker 2: Right, And so the upshot of all that is we see, 358 00:19:42,600 --> 00:19:45,800 Speaker 2: we can demonstrate that the brain does not give us 359 00:19:45,920 --> 00:19:49,480 Speaker 2: any sort of accurate picture of reality. It gives us 360 00:19:49,560 --> 00:19:52,560 Speaker 2: a rough sketch, a good enough sketch of reality to 361 00:19:52,640 --> 00:19:55,840 Speaker 2: allow us to navigate the universe. So we know for 362 00:19:55,960 --> 00:19:58,760 Speaker 2: a fact that what we see and perceive is not 363 00:19:58,960 --> 00:20:02,960 Speaker 2: actual reality.
The question then becomes is just how removed 364 00:20:03,000 --> 00:20:07,520 Speaker 2: from actual reality is our conscious experience as human beings. 365 00:20:07,960 --> 00:20:11,400 Speaker 1: Yeah, and you know what, I haven't gone through it yet, 366 00:20:11,400 --> 00:20:14,639 Speaker 1: but we have an upcoming episode at some point that 367 00:20:14,680 --> 00:20:19,760 Speaker 1: I've been avoiding on stereograms, hm, the hidden eye pictures 368 00:20:19,800 --> 00:20:21,600 Speaker 1: that were such a big deal in the nineties. So, 369 00:20:22,480 --> 00:20:25,000 Speaker 1: and I'm sure that all of this stuff is in there, 370 00:20:25,040 --> 00:20:27,159 Speaker 1: because that's kind of what you're talking about, or what 371 00:20:27,160 --> 00:20:28,280 Speaker 1: we're talking about here. 372 00:20:28,200 --> 00:20:33,160 Speaker 2: Right, Yeah, Yeah, the frontal lobe taking perfectly good, legitimate 373 00:20:33,200 --> 00:20:36,000 Speaker 2: information and putting it together in wacky ways. And yeah, 374 00:20:36,040 --> 00:20:38,119 Speaker 2: I would guess that would be the basis of a 375 00:20:38,160 --> 00:20:39,040 Speaker 2: stereogram too. 376 00:20:39,400 --> 00:20:42,040 Speaker 3: Yeah, there's a sailboat there. You don't see it, look harder. 377 00:20:42,280 --> 00:20:44,560 Speaker 2: Yeah, there's the Tasmanian Devil. 378 00:20:45,359 --> 00:20:46,399 Speaker 3: Did you ever see a Tasmanian Devil one? 379 00:20:47,359 --> 00:20:50,359 Speaker 2: I don't know if I'm just imagining that because he 380 00:20:50,440 --> 00:20:53,160 Speaker 2: was huge at the same time. But I'm sure there 381 00:20:53,200 --> 00:20:56,040 Speaker 2: was a Tasmanian devil stereogram. 382 00:20:55,480 --> 00:20:57,800 Speaker 1: Or was it a mud flap of the Tasmanian devil 383 00:20:57,800 --> 00:20:59,000 Speaker 1: and it said back off.
384 00:20:59,000 --> 00:21:02,000 Speaker 2: That was just somebody saying, oh, that's right, he had 385 00:21:02,080 --> 00:21:03,959 Speaker 2: the two guns. 386 00:21:04,560 --> 00:21:06,560 Speaker 1: I even had him in my brain. But you know, 387 00:21:06,760 --> 00:21:07,760 Speaker 1: is that even real? 388 00:21:08,359 --> 00:21:08,880 Speaker 2: Very nice? 389 00:21:08,960 --> 00:21:10,480 Speaker 3: Chuck, Okay, okay. 390 00:21:11,480 --> 00:21:15,000 Speaker 1: So here's to me where it gets really really interesting. 391 00:21:15,960 --> 00:21:19,760 Speaker 1: We've sort of laid the groundwork. It all makes sense 392 00:21:19,760 --> 00:21:22,320 Speaker 1: to me, hopefully, hopefully it's making sense to listeners. 393 00:21:22,480 --> 00:21:25,040 Speaker 2: I feel like, yeah, we've been laying it down pretty clearly. 394 00:21:25,560 --> 00:21:26,159 Speaker 3: Yeah, I think so. 395 00:21:26,680 --> 00:21:29,600 Speaker 1: But here's where it gets interesting to me, because why 396 00:21:30,200 --> 00:21:34,240 Speaker 1: is this happening? And the reason is, like what you 397 00:21:34,280 --> 00:21:36,840 Speaker 1: have to do is you have to. You don't have to, 398 00:21:36,920 --> 00:21:39,360 Speaker 1: but it's very beneficial, I think, to look at almost 399 00:21:39,400 --> 00:21:43,440 Speaker 1: everything that we are able to do through a lens 400 00:21:43,480 --> 00:21:46,560 Speaker 1: of evolution and natural selection, so like there has to 401 00:21:46,560 --> 00:21:50,280 Speaker 1: be, because in that lies a fundamental reason why your 402 00:21:50,280 --> 00:21:52,040 Speaker 1: brain is doing this. There has to be a reason 403 00:21:52,040 --> 00:21:54,720 Speaker 1: why this is happening. And it turns out when you 404 00:21:54,720 --> 00:21:57,119 Speaker 1: look at it through that lens, it makes sense.
And 405 00:21:58,040 --> 00:21:59,840 Speaker 1: like I don't know if this is a I mean, 406 00:21:59,840 --> 00:22:03,000 Speaker 1: this is still philosophical stuff, but like it all makes 407 00:22:03,040 --> 00:22:03,760 Speaker 1: total sense to me. 408 00:22:04,320 --> 00:22:07,200 Speaker 2: Yeah, the basis of any time you bring evolution or 409 00:22:07,280 --> 00:22:11,480 Speaker 2: natural selection into the picture, you're basically saying, Okay, whatever's 410 00:22:11,560 --> 00:22:15,480 Speaker 2: going on actually improves our chances of survival. So there's 411 00:22:15,520 --> 00:22:19,240 Speaker 2: a psychologist from, I think, UC Berkeley. Don't quote 412 00:22:19,280 --> 00:22:24,480 Speaker 2: me on that. His name's Donald Hoffman, and he is 413 00:22:24,560 --> 00:22:27,320 Speaker 2: one of the I guess leading researchers into the nature 414 00:22:27,359 --> 00:22:30,720 Speaker 2: of reality right now. His hypotheses seem to be pretty 415 00:22:31,160 --> 00:22:32,280 Speaker 2: au courant, right? 416 00:22:32,760 --> 00:22:33,560 Speaker 3: I bet it's Berkeley. 417 00:22:34,200 --> 00:22:37,720 Speaker 2: It's gotta be, it's gotta be. The basis of his 418 00:22:37,960 --> 00:22:43,360 Speaker 2: interpretation is that we see a rough sketch of the 419 00:22:43,400 --> 00:22:49,760 Speaker 2: world around us, because that's the version of reality that 420 00:22:49,920 --> 00:22:53,040 Speaker 2: is most likely to allow us to survive, or was 421 00:22:53,160 --> 00:22:56,960 Speaker 2: over the millions of years of our evolution to this point. 422 00:22:57,480 --> 00:23:01,919 Speaker 1: By the way, he's from UC Irvine, so close. 423 00:23:02,480 --> 00:23:04,239 Speaker 1: It seemed very much like a Berkeley kind of thing 424 00:23:04,280 --> 00:23:05,960 Speaker 1: to do, for sure. Irvine, who knew? 425 00:23:06,040 --> 00:23:08,840 Speaker 2: At least I didn't say Davis. 426 00:23:11,840 --> 00:23:12,600 Speaker 3: Or San Berdoo.
427 00:23:14,480 --> 00:23:15,840 Speaker 2: I didn't even want to bring that up. 428 00:23:17,560 --> 00:23:18,520 Speaker 3: What was the last thing you said? 429 00:23:18,560 --> 00:23:20,760 Speaker 2: I'm sorry. I said that he was saying, like, the 430 00:23:20,840 --> 00:23:23,480 Speaker 2: reason that we have a rough sketch of reality is 431 00:23:23,520 --> 00:23:27,000 Speaker 2: because natural selection has said that is the 432 00:23:27,119 --> 00:23:30,240 Speaker 2: version of reality that will keep humans alive most likely. 433 00:23:31,040 --> 00:23:34,240 Speaker 1: Right, Okay, so you have to remember what we said earlier, 434 00:23:34,280 --> 00:23:36,800 Speaker 1: because this all, you know, ties together. We 435 00:23:36,880 --> 00:23:41,359 Speaker 1: got to reiterate: the brain is in its skull. It 436 00:23:41,440 --> 00:23:45,159 Speaker 1: is only receiving these messages that it's given from these receptors. 437 00:23:45,920 --> 00:23:49,320 Speaker 1: Evolution is the same thing. Evolution is also blind in 438 00:23:49,359 --> 00:23:54,760 Speaker 1: a sense. Natural selection isn't favoring one thing over another 439 00:23:55,400 --> 00:23:58,080 Speaker 1: to try to get what's accurate as far as reality goes. 440 00:23:58,960 --> 00:24:00,000 Speaker 3: It's unbiased. 441 00:24:00,000 --> 00:24:03,000 Speaker 1: Natural selection is only going to favor the reality 442 00:24:03,160 --> 00:24:06,440 Speaker 1: that's going to give you that chance to survive. And 443 00:24:06,840 --> 00:24:10,520 Speaker 1: this is the point where I got fairly confused. 444 00:24:11,040 --> 00:24:14,280 Speaker 1: But then it all came back around with the desktop analogy. 445 00:24:14,840 --> 00:24:18,440 Speaker 1: But I do have to admit this, before the desktop thing, 446 00:24:18,800 --> 00:24:20,800 Speaker 1: I was pretty lost right here and still sort of 447 00:24:20,800 --> 00:24:21,359 Speaker 1: am Okay.
448 00:24:21,400 --> 00:24:25,720 Speaker 2: So one of the examples that I've seen Hoffman use 449 00:24:25,800 --> 00:24:28,560 Speaker 2: to describe what he's talking about at this point is, 450 00:24:29,200 --> 00:24:32,639 Speaker 2: let's say we had developed the ability to see oxygen 451 00:24:33,080 --> 00:24:37,000 Speaker 2: or levels of concentrations of oxygen in the air. Right, Okay, 452 00:24:37,000 --> 00:24:40,359 Speaker 2: we need oxygen to breathe. So in his example, the 453 00:24:40,400 --> 00:24:43,439 Speaker 2: greener the air, the more oxygen there is. The redder 454 00:24:43,480 --> 00:24:46,600 Speaker 2: the air, the less oxygen there is. Right. So, if 455 00:24:46,640 --> 00:24:48,919 Speaker 2: we had just been gifted with a view of the 456 00:24:49,000 --> 00:24:52,000 Speaker 2: actual reality, so we saw the gradient that was present 457 00:24:52,040 --> 00:24:54,719 Speaker 2: in any given parcel of air that we were standing around, 458 00:24:55,480 --> 00:24:58,680 Speaker 2: that doesn't mean that we know what gradient we want 459 00:24:59,119 --> 00:25:02,480 Speaker 2: or that will help us survive. Instead, what we were 460 00:25:02,560 --> 00:25:05,840 Speaker 2: gifted with in this analogy was the ability to see 461 00:25:06,000 --> 00:25:08,439 Speaker 2: red and stay away from that because it will kill us, 462 00:25:08,560 --> 00:25:10,600 Speaker 2: or green and go to that because that was the 463 00:25:10,680 --> 00:25:12,840 Speaker 2: amount of oxygen that we 464 00:25:12,880 --> 00:25:15,600 Speaker 2: know we need to survive. Right, We don't know how 465 00:25:15,680 --> 00:25:17,959 Speaker 2: much oxygen is in the air, and as far as 466 00:25:18,600 --> 00:25:21,800 Speaker 2: natural selection is concerned, it doesn't matter if we know 467 00:25:21,880 --> 00:25:24,119 Speaker 2: how much oxygen is in the air.
We just need to 468 00:25:24,160 --> 00:25:26,879 Speaker 2: know that the green air is where we want to be, 469 00:25:27,119 --> 00:25:29,199 Speaker 2: the red air is where we want to stay away from. 470 00:25:29,359 --> 00:25:32,320 Speaker 2: Or back to the apple example, we know that the 471 00:25:32,359 --> 00:25:34,920 Speaker 2: red apple is the one we want to eat, the 472 00:25:35,280 --> 00:25:37,520 Speaker 2: black rotted apple is the one we want to stay 473 00:25:37,520 --> 00:25:40,760 Speaker 2: away from. But then you have to take it back 474 00:25:40,800 --> 00:25:44,439 Speaker 2: to the beginning. That apple's not actually red. So somewhere 475 00:25:44,480 --> 00:25:47,399 Speaker 2: along the way, our brains and natural selection got together 476 00:25:47,800 --> 00:25:52,040 Speaker 2: to allow us to see colors. And because we could 477 00:25:52,080 --> 00:25:54,560 Speaker 2: see colors, that was the way we began to interact 478 00:25:54,600 --> 00:25:57,000 Speaker 2: with the world. Because we can taste things, that's the 479 00:25:57,000 --> 00:25:59,040 Speaker 2: way we interact with the world. There are plenty of 480 00:25:59,080 --> 00:26:01,359 Speaker 2: other ways to interact with the world. There are plenty 481 00:26:01,359 --> 00:26:03,640 Speaker 2: of things that we're missing about the world because we 482 00:26:03,680 --> 00:26:07,320 Speaker 2: only have these particular five senses. But that's all humans 483 00:26:07,400 --> 00:26:10,440 Speaker 2: needed to survive as a species. That's why we don't 484 00:26:10,440 --> 00:26:15,760 Speaker 2: see the full picture of reality. You did it, thank god, 485 00:26:16,040 --> 00:26:17,960 Speaker 2: because that is the hardest part for sure. 486 00:26:18,920 --> 00:26:21,600 Speaker 1: That's the part that kept breaking my brain. You were, 487 00:26:21,359 --> 00:26:23,320 Speaker 1: you had that oxygen thing in your hip pocket.
You 488 00:26:23,320 --> 00:26:24,800 Speaker 1: didn't let me know about that, so I. 489 00:26:24,760 --> 00:26:28,160 Speaker 2: I did, my friend, I sent it to you. You did, yeah, 490 00:26:28,200 --> 00:26:30,720 Speaker 2: But it was a flurry of emails, for sure, so 491 00:26:31,040 --> 00:26:31,440 Speaker 2: I did. 492 00:26:32,440 --> 00:26:33,879 Speaker 3: It was behind the curtain everyone. 493 00:26:33,880 --> 00:26:35,879 Speaker 1: There's always a few things leading up to an 494 00:26:35,880 --> 00:26:38,800 Speaker 1: episode here and there that we try to lock in 495 00:26:38,880 --> 00:26:41,600 Speaker 1: as early as possible. But this one was just like, uh, 496 00:26:41,920 --> 00:26:45,600 Speaker 1: and I think this and probably this will help. It 497 00:26:45,640 --> 00:26:47,760 Speaker 1: was kind of like akin to like, as someone is 498 00:26:47,800 --> 00:26:50,520 Speaker 1: shoving us out on stage, they're like, and just remember, guys, 499 00:26:50,600 --> 00:26:51,520 Speaker 1: this is the key to it. 500 00:26:51,480 --> 00:26:54,080 Speaker 2: All right. They shove us out on stage, but they 501 00:26:54,080 --> 00:26:57,000 Speaker 2: make sure to flash that shepherd's hook that they've got 502 00:26:57,040 --> 00:26:58,360 Speaker 2: ready if we screw up, right. 503 00:26:59,200 --> 00:27:02,119 Speaker 1: All right, So I guess now we're going 504 00:27:02,160 --> 00:27:04,040 Speaker 1: to talk about what I mentioned before, which really brings 505 00:27:04,040 --> 00:27:08,160 Speaker 1: it home in a very understandable way, is the desktop analogy. 506 00:27:08,240 --> 00:27:10,040 Speaker 1: I'm having a hard time saying that for some reason. 507 00:27:10,359 --> 00:27:14,120 Speaker 1: And this is Hoffman again from Irvine and by way 508 00:27:14,119 --> 00:27:17,919 Speaker 1: of Berkeley, and here's the analogy.
All you have to 509 00:17:17,920 --> 00:28:21,000 Speaker 1: do is look at your laptop and your desktop screen, 510 00:27:21,680 --> 00:27:24,160 Speaker 1: and you've got icons all over it. You've got those 511 00:27:24,880 --> 00:27:27,760 Speaker 1: blue folders that have all the things that 512 00:27:27,840 --> 00:27:31,120 Speaker 1: might have like a Word document or an 513 00:27:31,240 --> 00:27:33,520 Speaker 1: MP three file or whatever's on your desktop. 514 00:27:33,200 --> 00:27:36,160 Speaker 2: Ooh, an MP three file? Did you get it off Napster? 515 00:27:37,320 --> 00:27:39,240 Speaker 3: They did. MP three is not even anything anymore, 516 00:27:39,240 --> 00:27:40,000 Speaker 3: I don't think so. 517 00:27:40,359 --> 00:27:42,160 Speaker 2: I don't think they call them MP three. 518 00:27:42,640 --> 00:27:43,920 Speaker 3: I don't even know what are they now. 519 00:27:44,160 --> 00:27:45,600 Speaker 2: I think they just call them songs. 520 00:27:46,119 --> 00:27:51,760 Speaker 1: Okay, just listen to music. Oh boy, So here's the deal. 521 00:27:52,000 --> 00:27:54,199 Speaker 1: You see all that stuff, You know that you're supposed 522 00:27:54,200 --> 00:27:56,280 Speaker 1: to click on that blue folder and click on that 523 00:27:56,320 --> 00:27:57,960 Speaker 1: Word document if you want to get your 524 00:27:57,840 --> 00:27:58,680 Speaker 3: Word file up. 525 00:27:59,640 --> 00:28:02,440 Speaker 1: But all that stuff is just a user interface, 526 00:28:02,520 --> 00:28:06,640 Speaker 1: a graphical user interface that we know how to interact with. 527 00:28:07,080 --> 00:28:12,040 Speaker 1: What's really going on is there's a system in the 528 00:28:12,080 --> 00:28:14,399 Speaker 1: guts of that computer that is hard at work with 529 00:28:14,440 --> 00:28:15,360 Speaker 1: ones and zeros. 530 00:28:15,960 --> 00:28:17,000 Speaker 3: But we wouldn't know.
531 00:28:17,160 --> 00:28:19,040 Speaker 1: How to make heads or tails of that stuff if 532 00:28:19,080 --> 00:28:23,119 Speaker 1: we didn't have these icons that represented the things that 533 00:28:23,160 --> 00:28:27,240 Speaker 1: we want to interact with on that desktop. And those icons, 534 00:28:27,440 --> 00:28:31,000 Speaker 1: my friends, are the same thing as that apple on 535 00:28:31,080 --> 00:28:34,680 Speaker 1: the desk. The apple is an icon, the same way 536 00:28:34,760 --> 00:28:37,520 Speaker 1: as that blue folder on your desktop is an icon. 537 00:28:37,640 --> 00:28:41,640 Speaker 1: It's just something that we have assigned so that we 538 00:28:41,680 --> 00:28:42,640 Speaker 1: can interact with it. 539 00:28:42,880 --> 00:28:46,240 Speaker 2: Yeah, because we couldn't possibly get done what we want 540 00:28:46,280 --> 00:28:49,200 Speaker 2: to get done by interacting with real reality. It's not 541 00:28:49,240 --> 00:28:53,560 Speaker 2: how we see things. We see things as like shadows 542 00:28:53,600 --> 00:28:58,200 Speaker 2: on the cave wall. Right. I love the desktop analogy, man. 543 00:28:58,320 --> 00:29:02,200 Speaker 2: There's another part of the desktop icon analogy too, that's 544 00:29:02,240 --> 00:29:07,520 Speaker 2: a consequence of this whole hypothesis. Right, That is, when 545 00:29:07,560 --> 00:29:13,240 Speaker 2: you turn your computer off, that folder icon ceases to exist. 546 00:29:13,680 --> 00:29:16,360 Speaker 2: It doesn't keep running in the background. It's gone. It 547 00:29:16,440 --> 00:29:22,600 Speaker 2: does not exist. The circuitry, the software, the operating system 548 00:29:22,800 --> 00:29:27,160 Speaker 2: that produces that desktop icon, that continues to exist, and 549 00:29:27,200 --> 00:29:31,040 Speaker 2: when you turn the computer back on, the icon exists again, 550 00:29:32,200 --> 00:29:35,440 Speaker 2: but in the meantime it ceases to exist.
And that 551 00:29:35,600 --> 00:29:40,640 Speaker 2: is analogous to this interpretation of reality, that when you 552 00:29:40,680 --> 00:29:43,960 Speaker 2: stop looking at an apple, that apple ceases to exist. 553 00:29:44,480 --> 00:29:48,120 Speaker 2: The thing that produces that apple, whether it's some grand 554 00:29:48,240 --> 00:29:51,960 Speaker 2: circuitry that we're unaware of that's actually base reality, 555 00:29:52,360 --> 00:29:57,440 Speaker 2: or it's some data combined with a simple algorithm that 556 00:29:58,040 --> 00:30:02,520 Speaker 2: produces our experience of reality. Whatever produces it is still there, 557 00:30:02,800 --> 00:30:05,080 Speaker 2: just like the circuitry and the operating system in the 558 00:30:05,120 --> 00:30:08,920 Speaker 2: computer is still there. But the apple doesn't exist any longer 559 00:30:09,120 --> 00:30:12,680 Speaker 2: because there's no human around to experience it. Because apples 560 00:30:12,840 --> 00:30:17,400 Speaker 2: only exist the way we see them in the reality 561 00:30:17,400 --> 00:30:20,280 Speaker 2: that humans experience. That's the only place they exist. 562 00:30:21,680 --> 00:30:24,000 Speaker 1: By the way, you said operating system, I think we 563 00:30:24,040 --> 00:30:25,440 Speaker 1: call that an OS now. 564 00:30:25,440 --> 00:30:27,560 Speaker 2: Sorry, mister MP three. 565 00:30:28,480 --> 00:30:31,640 Speaker 1: By the way, we should totally have t shirts that 566 00:30:31,720 --> 00:30:33,920 Speaker 1: say stuff you should know on the front and on 567 00:30:33,960 --> 00:30:37,440 Speaker 1: the back. It just says everything is an icon. I 568 00:30:37,440 --> 00:30:40,040 Speaker 1: think that's a great idea. Well, that might be 569 00:30:40,200 --> 00:30:42,560 Speaker 1: confusing though, they might think we mean icon as in 570 00:30:43,920 --> 00:30:44,920 Speaker 1: an iconoclast.
571 00:30:45,040 --> 00:30:47,640 Speaker 2: Well, we could put in parentheses: listen to the reality 572 00:30:47,680 --> 00:30:51,200 Speaker 2: episode and you'll know what we're talking about. Yeah, exactly, 573 00:30:51,240 --> 00:30:54,400 Speaker 2: and then we'll put okay question mark because we don't 574 00:30:54,400 --> 00:30:55,680 Speaker 2: want to boss anybody around. 575 00:30:56,760 --> 00:31:00,520 Speaker 1: So if you were confused by what Josh 576 00:31:00,640 --> 00:31:03,080 Speaker 1: was just talking about, we have to look at it 577 00:31:03,120 --> 00:31:07,840 Speaker 1: again through that lens of natural selection and evolution because 578 00:31:07,920 --> 00:31:12,360 Speaker 1: our brains have, you know, let's talk about the apple again. 579 00:31:12,880 --> 00:31:15,400 Speaker 1: Our brains evolved to see that color, like you said, 580 00:31:15,440 --> 00:31:18,200 Speaker 1: as something that is ripe and delicious and that will 581 00:31:18,240 --> 00:31:21,920 Speaker 1: give us nutrition to a certain degree. But our brains, 582 00:31:21,960 --> 00:31:25,720 Speaker 1: like, weren't evolving in isolation. Everything else was evolving along 583 00:31:25,800 --> 00:31:29,080 Speaker 1: with it, including that apple, and that apple evolved to 584 00:31:29,280 --> 00:31:32,240 Speaker 1: be red so we would eat it and eventually spread 585 00:31:32,240 --> 00:31:35,360 Speaker 1: those seeds so it could survive as well and grow 586 00:31:35,440 --> 00:31:36,120 Speaker 1: more apples. 587 00:31:36,680 --> 00:31:41,360 Speaker 3: So evolution itself is that desktop. 588 00:31:41,200 --> 00:31:44,760 Speaker 2: Right, That's what created that desktop. It's not like our brains 589 00:31:44,800 --> 00:31:47,360 Speaker 2: just came up with this kind of thing.
590 00:31:47,400 --> 00:31:50,239 Speaker 2: It was like working in conjunction with evolution, that's just 591 00:31:50,240 --> 00:31:53,560 Speaker 2: what we evolved to experience. And so in that sense, 592 00:31:53,640 --> 00:31:57,400 Speaker 2: this to me was super reassuring when I realized this, Yeah, 593 00:31:57,480 --> 00:32:01,480 Speaker 2: that means that there's no big mystery, there's 594 00:32:01,480 --> 00:32:05,080 Speaker 2: no purposeful veil that like God or the universe or 595 00:32:05,080 --> 00:32:08,240 Speaker 2: somebody cast over us to prevent us from seeing real reality. 596 00:32:08,680 --> 00:32:11,600 Speaker 2: The reason we don't see real reality is because we 597 00:32:11,800 --> 00:32:15,680 Speaker 2: just didn't evolve to see reality that way. We evolved 598 00:32:15,680 --> 00:32:18,440 Speaker 2: to see reality in a different way. And that 599 00:32:18,720 --> 00:32:21,480 Speaker 2: even though we know that there's other aspects of reality 600 00:32:21,520 --> 00:32:25,680 Speaker 2: we don't sense, like, that doesn't mean that there's something 601 00:32:26,480 --> 00:32:30,320 Speaker 2: forever beyond our grasp, like Immanuel Kant suggested. 602 00:32:31,040 --> 00:32:34,800 Speaker 3: I agree. All right, uh, Neo, I think you go 603 00:32:34,840 --> 00:32:36,880 Speaker 3: take the red pill or the blue pill. 604 00:32:37,800 --> 00:32:40,480 Speaker 2: I always say, why not both, exactly. 605 00:32:40,840 --> 00:32:41,600 Speaker 3: We'll be right back. 606 00:32:44,720 --> 00:32:53,560 Speaker 2: Stuff you should know and shock stuff you should know.
607 00:33:05,560 --> 00:33:10,360 Speaker 1: All right, So quick recap of Hoffman from Irvine basically 608 00:33:10,400 --> 00:33:13,880 Speaker 1: saying that, like everything that we're perceiving around us is 609 00:33:14,280 --> 00:33:20,840 Speaker 1: a construct from a combined process of these evolutionary forces 610 00:33:20,840 --> 00:33:26,200 Speaker 1: that are blind working in cooperation with the brain. And this, 611 00:33:26,440 --> 00:33:28,360 Speaker 1: you know, this can be hard to swallow for some 612 00:33:28,360 --> 00:33:31,960 Speaker 1: people. It might sound kind of goofy and ridiculous. People 613 00:33:32,000 --> 00:33:37,800 Speaker 1: have come along, certainly, and this was pre-Hoffman, of course, 614 00:33:37,800 --> 00:33:40,560 Speaker 1: but people have come along through the years to poo-poo 615 00:33:40,560 --> 00:33:44,880 Speaker 1: all this. Great thinkers, even someone like Samuel Johnson. He 616 00:33:44,960 --> 00:33:47,800 Speaker 1: was an essayist in the eighteenth century and a great writer. 617 00:33:48,840 --> 00:33:52,520 Speaker 1: He had it out with a contemporary of his, a philosopher 618 00:33:52,600 --> 00:33:57,720 Speaker 1: named Bishop Berkeley, pronounced Barclay, not Berkeley like UC Berkeley, and he 619 00:33:57,880 --> 00:34:02,080 Speaker 1: was basically like, dude, you can't tell me that these 620 00:34:02,080 --> 00:34:05,600 Speaker 1: things don't exist outside of the mind. Like look at 621 00:34:05,600 --> 00:34:08,600 Speaker 1: this rock right here. And he went and kicked it 622 00:34:08,680 --> 00:34:11,960 Speaker 1: and said, I refute it thusly. In other words, how 623 00:34:12,000 --> 00:34:14,080 Speaker 1: can you tell me that rock is just a construct 624 00:34:14,080 --> 00:34:16,000 Speaker 1: of my mind when I just went and kicked it 625 00:34:16,040 --> 00:34:18,000 Speaker 1: and it made a sound, and it was heavy and 626 00:34:18,040 --> 00:34:18,799 Speaker 1: it hurt my toe.
627 00:34:19,040 --> 00:34:22,560 Speaker 2: That's where we come back to, like John Locke and 628 00:34:22,600 --> 00:34:26,840 Speaker 2: Plato and Aristotle and Galileo especially getting it right, like 629 00:34:26,960 --> 00:34:30,400 Speaker 2: basically out of the gate, that yes, these objects 630 00:34:30,400 --> 00:34:34,280 Speaker 2: that we interact with in the universe, they have bulk, 631 00:34:34,440 --> 00:34:37,040 Speaker 2: they have mass, They move or they don't move. They 632 00:34:37,080 --> 00:34:41,680 Speaker 2: have primary characteristics, right, So yes, if you kick a rock, 633 00:34:42,640 --> 00:34:47,799 Speaker 2: apparently mass is part of the rock's primary characteristic, right, Yeah, 634 00:34:47,920 --> 00:34:51,040 Speaker 2: but say the color of the rock, or the shape 635 00:34:51,040 --> 00:34:54,319 Speaker 2: of the rock, or the shininess of the rock, that 636 00:34:54,480 --> 00:35:00,799 Speaker 2: is not necessarily part of reality. Yeah, Okay, I think 637 00:35:00,800 --> 00:35:02,799 Speaker 2: we've got it. It is mind-blowing for sure. But at 638 00:35:02,800 --> 00:35:06,279 Speaker 2: the same time, it's just there's more to reality than 639 00:35:06,560 --> 00:35:09,720 Speaker 2: we see. But it doesn't mean that there's some great 640 00:35:09,840 --> 00:35:13,240 Speaker 2: mystery necessarily. I feel like we solved that. The mystery 641 00:35:13,280 --> 00:35:16,880 Speaker 2: doesn't actually exist. It's just there's other parts of the 642 00:35:16,960 --> 00:35:19,839 Speaker 2: universe we just don't sense and there's nothing to it 643 00:35:19,880 --> 00:35:21,919 Speaker 2: other than we didn't evolve to sense it that way.
644 00:35:22,760 --> 00:35:25,960 Speaker 1: Yeah, And well, we have great concrete examples of that, 645 00:35:26,719 --> 00:35:30,080 Speaker 1: and that is the fact that when we see things, 646 00:35:30,160 --> 00:35:32,960 Speaker 1: what we're seeing is just a small portion of what 647 00:35:33,000 --> 00:35:36,960 Speaker 1: there is. Yeah, we see what's called visible light on 648 00:35:37,080 --> 00:35:40,879 Speaker 1: the spectrum, like, it's pretty narrow in comparison 649 00:35:41,239 --> 00:35:44,840 Speaker 1: to the entire spectrum. But there are also gamma rays, 650 00:35:44,840 --> 00:35:47,440 Speaker 1: and there are X rays, and there are radio waves, 651 00:35:47,800 --> 00:35:49,760 Speaker 1: and there are all these things that we can't see 652 00:35:50,760 --> 00:35:54,200 Speaker 1: and detect with our human eyeballs, but we still know 653 00:35:54,239 --> 00:35:59,160 Speaker 1: they're there because we have built machines and systems to 654 00:35:59,239 --> 00:36:01,440 Speaker 1: allow us to interact with those things, like X 655 00:36:01,520 --> 00:36:05,040 Speaker 1: ray machines or radios that allow you to hear what's 656 00:36:05,040 --> 00:36:07,839 Speaker 1: happening on those radio waves, but you can't actually see 657 00:36:07,840 --> 00:36:11,360 Speaker 1: that stuff. So it's a good way of illustrating, like 658 00:36:13,080 --> 00:36:15,640 Speaker 1: what you know is just a very small portion of 659 00:36:15,920 --> 00:36:17,240 Speaker 1: what there really is out there.
660 00:36:17,440 --> 00:36:19,640 Speaker 2: Yeah, But also one of the other cool things about 661 00:36:19,680 --> 00:36:22,640 Speaker 2: it is we know that they're out there, and we've 662 00:36:22,840 --> 00:36:26,640 Speaker 2: learned to like build machines that can detect things that 663 00:36:26,680 --> 00:36:30,399 Speaker 2: we can't perceive with our senses, pretty amazing, and then 664 00:36:30,560 --> 00:36:33,400 Speaker 2: we've built more machines to figure out how to interact 665 00:36:33,840 --> 00:36:37,000 Speaker 2: with those parts of reality that we can't sense. So, 666 00:36:37,320 --> 00:36:41,160 Speaker 2: if you see an image from a James Webb telescope 667 00:36:41,600 --> 00:36:46,319 Speaker 2: picture, right, what you're seeing there is an 668 00:36:46,480 --> 00:36:51,000 Speaker 2: infrared telescope picture. So the James Webb telescope sees in 669 00:36:51,160 --> 00:36:54,120 Speaker 2: infrared. We can't see infrared, but part of 670 00:36:54,120 --> 00:36:59,160 Speaker 2: its software converts infrared into the visible light spectrum, which 671 00:36:59,200 --> 00:37:02,239 Speaker 2: we can see. So for all intents and purposes, when 672 00:37:02,280 --> 00:37:04,360 Speaker 2: you're looking at a picture of a star that the 673 00:37:04,440 --> 00:37:07,880 Speaker 2: James Webb telescope took, you're seeing that star in the 674 00:37:07,920 --> 00:37:10,480 Speaker 2: same way we would see it if we could see 675 00:37:10,640 --> 00:37:16,040 Speaker 2: infrared. Right, So it's not like reality is forever 676 00:37:16,160 --> 00:37:20,040 Speaker 2: out of our grasp. We're becoming smart enough to learn 677 00:37:20,120 --> 00:37:23,640 Speaker 2: ways to sense it in other ways, to convert it 678 00:37:23,960 --> 00:37:25,680 Speaker 2: into things that we can sense.
679 00:37:26,960 --> 00:37:28,880 Speaker 1: Yeah, and you know this is when I thought of, 680 00:37:29,000 --> 00:37:31,640 Speaker 1: and it sounds kind of silly, but I actually got 681 00:37:32,000 --> 00:37:37,640 Speaker 1: a deeper appreciation of the movie Predator, right, Yeah, 682 00:37:37,800 --> 00:37:40,200 Speaker 1: from this, because when I was a kid, I saw 683 00:37:40,200 --> 00:37:41,920 Speaker 1: it and I was like, Oh, that's cool. The Predator 684 00:37:41,960 --> 00:37:46,000 Speaker 1: thing can see heat, or thermo, whatever it would be, 685 00:37:46,040 --> 00:37:51,080 Speaker 1: thermal properties. It can see heat, let's just say that, 686 00:37:51,480 --> 00:37:56,400 Speaker 1: and cool and stuff like that, can see temperature. I 687 00:37:56,440 --> 00:37:59,680 Speaker 1: was trying to be all fancy, which is true, and 688 00:37:59,719 --> 00:38:01,120 Speaker 1: when you're a kid you're like, oh cool, that thing can 689 00:38:01,120 --> 00:38:03,440 Speaker 1: see temperature. But this made me think of it in 690 00:38:03,560 --> 00:38:05,600 Speaker 1: like a more philosophical way, that this thing is so 691 00:38:05,719 --> 00:38:11,200 Speaker 1: advanced that it has gained a new maybe not consciousness, 692 00:38:11,239 --> 00:38:15,240 Speaker 1: but a new ability to see the unseen, or. 693 00:38:15,160 --> 00:38:19,640 Speaker 2: It evolved under a different type of pressure that favored 694 00:38:20,000 --> 00:38:23,880 Speaker 2: being able to see infrared, so it can see the temperature 695 00:38:23,920 --> 00:38:24,360 Speaker 2: of things. 696 00:38:24,920 --> 00:38:25,640 Speaker 3: Yeah, exactly. 697 00:38:25,680 --> 00:38:27,600 Speaker 2: You know, it doesn't necessarily make it more advanced in 698 00:38:27,600 --> 00:38:31,400 Speaker 2: the same way that you know, butterflies can see UV. 699 00:38:31,480 --> 00:38:33,800 Speaker 2: We can't see UV.
But that doesn't necessarily mean the 700 00:38:33,840 --> 00:38:37,359 Speaker 2: butterfly's more advanced than we are. It's just it evolved 701 00:38:37,520 --> 00:38:40,640 Speaker 2: to sense the world differently, right. Yeah, it's as simple 702 00:38:40,680 --> 00:38:42,960 Speaker 2: as that. But it also kind of brings back a 703 00:38:42,960 --> 00:38:46,520 Speaker 2: certain amount of humility to us that we like, we 704 00:38:46,760 --> 00:38:50,680 Speaker 2: just can't interact with parts of reality because we didn't 705 00:38:50,680 --> 00:38:53,080 Speaker 2: evolve that way, and it just kind of, I don't know, 706 00:38:53,080 --> 00:38:55,720 Speaker 2: it knocks us down a peg. I think in my estimation, 707 00:38:55,800 --> 00:38:58,479 Speaker 2: it kind of reminds us like, hey, yeah, we're pretty great. 708 00:38:58,520 --> 00:39:00,200 Speaker 2: We do a lot of really neat stuff, but we're 709 00:39:00,200 --> 00:39:03,839 Speaker 2: still animals. Don't forget that part. Yeah, I like that, 710 00:39:04,040 --> 00:39:06,080 Speaker 2: And then I think, to me, Chuck, the fact of 711 00:39:06,120 --> 00:39:11,880 Speaker 2: the podcast, oh, is that if you take Hoffman's argument, 712 00:39:12,960 --> 00:39:16,160 Speaker 2: there's an answer to that Zen question of if 713 00:39:16,160 --> 00:39:18,200 Speaker 2: a tree falls in the woods and no one's around 714 00:39:18,239 --> 00:39:20,800 Speaker 2: to hear it, like you mentioned, does it make a sound? 715 00:39:21,400 --> 00:39:23,840 Speaker 2: The answer is no, it does not make a sound. 716 00:39:24,080 --> 00:39:26,960 Speaker 2: And then even further, there's not even a tree if 717 00:39:26,960 --> 00:39:29,680 Speaker 2: there's no human around to see it or hear it. 718 00:39:29,760 --> 00:39:31,719 Speaker 3: Yeah, oh boy, I love that. 
719 00:39:31,800 --> 00:39:34,640 Speaker 2: Like this, it's like, you know, it's like Bart Simpson saying, 720 00:39:34,640 --> 00:39:38,920 Speaker 2: what's the sound of one hand clapping. M right, you 721 00:39:38,960 --> 00:39:41,799 Speaker 2: know they figured it out. I love it. 722 00:39:41,840 --> 00:39:43,200 Speaker 3: Oh boy, I tell you what. 723 00:39:43,360 --> 00:39:46,960 Speaker 1: Man, every time we and we haven't done these many times, 724 00:39:46,960 --> 00:39:51,640 Speaker 1: but every time we tackle something philosophical like this, I'm 725 00:39:51,719 --> 00:39:53,920 Speaker 1: upset for a little while, and I always come out 726 00:39:53,920 --> 00:39:56,680 Speaker 1: on the other side I think better for it. 727 00:39:56,760 --> 00:39:57,759 Speaker 2: So same here. 728 00:39:57,840 --> 00:39:59,479 Speaker 3: I'm glad you think of these things because I certainly 729 00:39:59,480 --> 00:40:00,640 Speaker 3: wouldn't land on these topics. 730 00:40:00,719 --> 00:40:01,960 Speaker 2: Oh, thank you. I appreciate that. 731 00:40:02,520 --> 00:40:04,600 Speaker 3: Yeah, I'd be like, hey, what about elephants? 732 00:40:05,320 --> 00:40:07,920 Speaker 2: That was a good one. Remember baby elephants suck their 733 00:40:07,920 --> 00:40:10,359 Speaker 2: trunk like baby humans suck their thumbs. 734 00:40:10,800 --> 00:40:13,960 Speaker 3: They have no trunk, Yeah, they do. Trunk. Trunks don't exist. 735 00:40:15,160 --> 00:40:17,440 Speaker 2: Man, you keep getting me with that one. I know 736 00:40:18,640 --> 00:40:23,520 Speaker 2: you got anything else. No, I don't either. So since 737 00:40:23,520 --> 00:40:26,359 Speaker 2: we have nothing else about this, it's time for listener mail. 738 00:40:28,800 --> 00:40:31,400 Speaker 3: All right, I'm gonna call this Smell of Vision. This 739 00:40:31,480 --> 00:40:32,120 Speaker 3: is a pretty good one. 
740 00:40:33,400 --> 00:40:35,120 Speaker 1: Hey, guys, couldn't help it, but I just finished the 741 00:40:35,120 --> 00:40:37,759 Speaker 1: Smell Vision episode, and I think you've missed a real 742 00:40:37,800 --> 00:40:42,759 Speaker 1: opportunity there, or they did rather by calling it Centemah. 743 00:40:43,400 --> 00:40:44,920 Speaker 2: Yeah, I saw that. That was a great idea. 744 00:40:45,520 --> 00:40:46,160 Speaker 3: That was a good one. 745 00:40:46,239 --> 00:40:48,160 Speaker 1: I love the show, the wide range of topics you cover, 746 00:40:48,640 --> 00:40:51,000 Speaker 1: the fun jokes and the banter, and even the occasional 747 00:40:51,440 --> 00:40:56,120 Speaker 1: Chuckers reference. Obviously, the learning beneficial component is a huge plus. 748 00:40:56,719 --> 00:40:59,319 Speaker 1: But I cannot find a term that I heard in 749 00:40:59,360 --> 00:41:02,200 Speaker 1: one of your previous shows, and I need your help. Essentially, 750 00:41:02,320 --> 00:41:05,319 Speaker 1: Josh mentioned a term that spoke to humans or any 751 00:41:05,360 --> 00:41:10,360 Speaker 1: life form will take themselves after a certain period of time. 752 00:41:10,760 --> 00:41:14,439 Speaker 1: What if humans don't take ourselves out of the event, 753 00:41:14,480 --> 00:41:15,200 Speaker 1: take ourselves out? 754 00:41:15,280 --> 00:41:15,360 Speaker 2: Oh? 755 00:41:15,360 --> 00:41:18,160 Speaker 1: Okay, Oh yeah, I missed the words take themselves out. So, 756 00:41:18,200 --> 00:41:20,719 Speaker 1: in other words, if humans don't take ourselves out in 757 00:41:21,160 --> 00:41:24,399 Speaker 1: X time, we will see a greater opportunity for long 758 00:41:24,480 --> 00:41:25,799 Speaker 1: term human existence. 759 00:41:26,040 --> 00:41:29,320 Speaker 3: It's driving me crazy. I hope you have a reply. 760 00:41:30,680 --> 00:41:31,400 Speaker 3: Do you know what that is? 761 00:41:31,840 --> 00:41:35,759 Speaker 2: Yeah? 
I think what they're talking about is technological maturity. 762 00:41:35,880 --> 00:41:37,880 Speaker 2: If we if we make it through what's considered the 763 00:41:37,880 --> 00:41:40,400 Speaker 2: great filter, which is all like all the ways that 764 00:41:40,400 --> 00:41:44,160 Speaker 2: we could possibly wipe ourselves out using technology before we 765 00:41:44,280 --> 00:41:48,800 Speaker 2: learn to use it wisely. If we can make it 766 00:41:48,840 --> 00:41:52,000 Speaker 2: through that, then we'll emerge into technological maturity and we'll 767 00:41:52,040 --> 00:41:54,560 Speaker 2: basically live forever as a species. 768 00:41:55,239 --> 00:41:56,399 Speaker 3: Okay? I bet you. That's it. 769 00:41:56,400 --> 00:42:00,520 Speaker 1: It's gotta be and he finishes off. My wife is 770 00:42:00,560 --> 00:42:03,239 Speaker 1: seven months pregnant with our first child. If you can spare 771 00:42:03,239 --> 00:42:05,000 Speaker 1: a moment for a shout out to Stephanie and baby 772 00:42:05,040 --> 00:42:08,040 Speaker 1: Cora coming in August, it'd make a great birthday present. 773 00:42:08,120 --> 00:42:11,520 Speaker 1: But Darren, we don't do shout outs on the show. 774 00:42:11,520 --> 00:42:14,080 Speaker 1: We get so many requests for shout outs that we 775 00:42:14,160 --> 00:42:14,680 Speaker 1: just can't. 776 00:42:14,520 --> 00:42:16,360 Speaker 3: Get to them all. So we are certainly not going to. 777 00:42:16,360 --> 00:42:19,799 Speaker 1: Shout out Stephanie and your amazing baby Cora that's coming 778 00:42:19,800 --> 00:42:22,759 Speaker 1: in August. We're just not going to mention them. No, 779 00:42:22,840 --> 00:42:26,520 Speaker 1: not at all, so don't ask. Thanks for all the positivity, 780 00:42:26,600 --> 00:42:28,200 Speaker 1: joy and laughter that you spread in the world. 781 00:42:28,239 --> 00:42:33,080 Speaker 3: That's much needed. Your Floridian friend, Darren Nutting. 
782 00:42:33,239 --> 00:42:35,400 Speaker 2: Very nice, Darren, Thank you, and thank you also to 783 00:42:35,440 --> 00:42:37,880 Speaker 2: Stephanie and Cora, who we're not going to mention. And 784 00:42:37,920 --> 00:42:39,600 Speaker 2: if you want to get in touch with us, like 785 00:42:39,719 --> 00:42:42,360 Speaker 2: Darren did, you can send us an email to Stuff 786 00:42:42,400 --> 00:42:47,840 Speaker 2: podcast at iHeartRadio dot com. 787 00:42:48,000 --> 00:42:50,319 Speaker 3: Stuff you Should Know is a production of iHeartRadio. 788 00:42:50,800 --> 00:42:54,000 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, 789 00:42:54,200 --> 00:43:01,560 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.