1 00:00:01,480 --> 00:00:04,960 Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. 2 00:00:11,160 --> 00:00:13,640 Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's 3 00:00:13,760 --> 00:00:17,760 Speaker 2: Chuck and Jerry's not here. We are and we're pretty 4 00:00:17,800 --> 00:00:20,400 Speaker 2: sure that both of us are here. But it's possible 5 00:00:21,000 --> 00:00:23,880 Speaker 2: it could just be me here, and this is Stuff You Should Know. 6 00:00:24,640 --> 00:00:31,760 Speaker 3: That's right. Diving into philosophy again, this one solipsism, perhaps 7 00:00:31,880 --> 00:00:37,319 Speaker 3: the most navel-gazy, sort of unintelligible aspect of all 8 00:00:37,400 --> 00:00:41,280 Speaker 3: of them, which is that sort of old classic stoner 9 00:00:41,400 --> 00:00:44,960 Speaker 3: college dorm room thing. How do we know if anything 10 00:00:45,040 --> 00:00:47,320 Speaker 3: is real? What if it's all in this part 11 00:00:47,320 --> 00:00:48,920 Speaker 3: of it? What if it's all just a simulation? 12 00:00:50,000 --> 00:00:52,559 Speaker 2: Yeah, that's one example of what it could be. But 13 00:00:54,000 --> 00:00:56,760 Speaker 2: there's a couple of things about this one, as annoying 14 00:00:56,840 --> 00:01:01,240 Speaker 2: as it is, because if you're if you're arguing for solipsism, 15 00:01:01,560 --> 00:01:04,240 Speaker 2: and I don't want to say you're a solipsist, because there's 16 00:01:04,240 --> 00:01:10,040 Speaker 2: basically nobody out there who's an actual solipsist. Like you 17 00:01:10,080 --> 00:01:12,440 Speaker 2: can just keep saying like, but yeah, how do you know? 18 00:01:12,760 --> 00:01:15,520 Speaker 2: But yeah, how do you know? There doesn't seem to 19 00:01:15,560 --> 00:01:22,720 Speaker 2: be any more reductive argument in all of philosophy; all 20 00:01:22,840 --> 00:01:28,520 Speaker 2: other philosophy can essentially be argued against by solipsism.
And 21 00:01:28,560 --> 00:01:32,360 Speaker 2: the reason why is because the basis of solipsism is 22 00:01:32,400 --> 00:01:38,240 Speaker 2: that there is no reality. It's just you, you hearing this. 23 00:01:38,680 --> 00:01:42,160 Speaker 2: I don't exist, Chuck doesn't exist, this podcast doesn't exist. 24 00:01:42,880 --> 00:01:47,360 Speaker 2: Nothing exists except for your mind. And that's the basis 25 00:01:47,400 --> 00:01:50,680 Speaker 2: of everything that you think is real. And none of 26 00:01:50,760 --> 00:01:54,440 Speaker 2: us are actually doing anything that you're not projecting out 27 00:01:54,440 --> 00:01:58,320 Speaker 2: of your mind. That's solipsism, and it sounds mind-blowing, 28 00:01:58,880 --> 00:02:01,520 Speaker 2: but like I said, it's also annoying, and it's also 29 00:02:01,600 --> 00:02:05,040 Speaker 2: extremely simple, so much so that it can deceive you 30 00:02:05,160 --> 00:02:08,720 Speaker 2: into thinking that it means more than it does. It doesn't. 31 00:02:08,960 --> 00:02:12,920 Speaker 2: It's it's as basic as that. But again, as annoying 32 00:02:12,960 --> 00:02:15,440 Speaker 2: as it is, it is in some ways a useful 33 00:02:15,560 --> 00:02:18,680 Speaker 2: argument, because if you really want to make a philosopher 34 00:02:18,880 --> 00:02:22,320 Speaker 2: rigorous in their argument, have them take on solipsism or 35 00:02:22,320 --> 00:02:23,080 Speaker 2: some form of it. 36 00:02:23,720 --> 00:02:23,919 Speaker 1: Yeah. 37 00:02:23,960 --> 00:02:27,920 Speaker 3: I think Olivia found something online that said, like, no modern, 38 00:02:28,639 --> 00:02:32,480 Speaker 3: like legitimate philosopher even takes part in these arguments, because 39 00:02:32,520 --> 00:02:35,400 Speaker 3: it's just such like, hey, get off the couch with 40 00:02:35,480 --> 00:02:40,160 Speaker 3: your bong and maybe talk about some real philosophy. Well,
41 00:02:40,200 --> 00:02:42,040 Speaker 2: Also, the other part of it too is if you 42 00:02:42,040 --> 00:02:45,760 Speaker 2: were a genuine philosopher and you genuinely believed that nothing 43 00:02:45,919 --> 00:02:49,200 Speaker 2: was real except in your mind, there's zero point for 44 00:02:49,280 --> 00:02:53,720 Speaker 2: you to do anything like write a philosophy paper for solipsism, 45 00:02:54,720 --> 00:02:57,760 Speaker 2: because nobody's out there to read it in actuality. So 46 00:02:57,800 --> 00:03:00,880 Speaker 2: what's the point? You're by yourself in the entire universe, 47 00:03:01,200 --> 00:03:04,560 Speaker 2: so much so that the universe doesn't even exist outside 48 00:03:04,560 --> 00:03:08,520 Speaker 2: of your mind. You're by yourself in some incomprehensible form 49 00:03:08,600 --> 00:03:12,520 Speaker 2: of existence. It's just weird and depressing in a lot 50 00:03:12,560 --> 00:03:13,000 Speaker 2: of ways. 51 00:03:13,320 --> 00:03:16,959 Speaker 3: Yeah, for sure. We should mention the skeptics a little 52 00:03:17,000 --> 00:03:19,359 Speaker 3: bit, because that sort of lays the 53 00:03:19,400 --> 00:03:25,120 Speaker 3: groundwork for solipsism a little bit. The skeptics started around 54 00:03:25,120 --> 00:03:28,760 Speaker 3: the third century BCE.
The word itself came from the 55 00:03:28,760 --> 00:03:33,600 Speaker 3: Greek term meaning inquiry or examination, and they were basically like, hey, 56 00:03:34,360 --> 00:03:39,440 Speaker 3: it's not possible to have like some knowledge to make 57 00:03:39,480 --> 00:03:44,240 Speaker 3: definitive judgments, arguing against the Stoics, who said, no, you 58 00:03:44,280 --> 00:03:48,520 Speaker 3: should be able to test claims using, you know, stuff 59 00:03:48,520 --> 00:03:51,920 Speaker 3: that we can see and hear, using our senses. And 60 00:03:52,000 --> 00:03:53,680 Speaker 3: the skeptics, and this sort of laid the ground of 61 00:03:53,680 --> 00:03:56,119 Speaker 3: what was to come for solipsism, basically said like, hey, 62 00:03:56,720 --> 00:03:59,640 Speaker 3: we can all be deceived, though. What about the case 63 00:03:59,640 --> 00:04:02,800 Speaker 3: of identical twins? You could you could be deceived there, 64 00:04:03,360 --> 00:04:05,560 Speaker 3: or you could have a sensory experience, if you're talking 65 00:04:05,600 --> 00:04:08,840 Speaker 3: about trusting your senses, that isn't connected to reality. Like 66 00:04:08,880 --> 00:04:11,600 Speaker 3: that's what a dream is. Yeah. And dreams sort of 67 00:04:11,600 --> 00:04:13,960 Speaker 3: play into the whole thing, at least at the beginning. 68 00:04:14,480 --> 00:04:18,200 Speaker 2: Yeah, the skeptics said, you ever had a dream, dingus? Yeah, 69 00:04:18,200 --> 00:04:19,560 Speaker 2: how are you gonna how are you going to trust 70 00:04:19,600 --> 00:04:23,640 Speaker 2: your senses with that? And so skepticism, like you said, 71 00:04:23,720 --> 00:04:27,080 Speaker 2: is kind of an extension of it, a basis of it. 72 00:04:27,080 --> 00:04:31,679 Speaker 2: It's not quite there, but enough that solipsism is often 73 00:04:31,720 --> 00:04:35,320 Speaker 2: thought of as an extreme form of skepticism.
Sometimes it's 74 00:04:35,360 --> 00:04:39,920 Speaker 2: also called global skepticism, like you're skeptical of everything, and then uh, 75 00:04:40,040 --> 00:04:42,400 Speaker 2: it's also sometimes called mondo scepto. 76 00:04:44,640 --> 00:04:47,000 Speaker 3: Uh. Speaking of the dreams, though, there was a Taoist 77 00:04:47,040 --> 00:04:50,919 Speaker 3: philosopher named Zhuangzi, and in this also sort of 78 00:04:50,960 --> 00:04:54,120 Speaker 3: like the groundwork was, hey, if I wake up from 79 00:04:54,160 --> 00:04:57,800 Speaker 3: a dream and I was, you know, a 80 00:04:57,880 --> 00:05:00,440 Speaker 3: wildebeest in that dream, how do I know that 81 00:05:00,480 --> 00:05:03,839 Speaker 3: I'm not really a wildebeest and that this is the dream? 82 00:05:04,200 --> 00:05:06,880 Speaker 2: Yeah, and all this again, yes it does seem kind 83 00:05:06,920 --> 00:05:12,359 Speaker 2: of bongish or bong adjacent for sure. But this is 84 00:05:12,400 --> 00:05:16,279 Speaker 2: the kind of stuff that philosophers care about. Like it's 85 00:05:16,360 --> 00:05:20,920 Speaker 2: called epistemology. Epistemiology. I like to add a lot of 86 00:05:21,360 --> 00:05:23,800 Speaker 2: syllables sometimes, so I'm not quite sure which one. But 87 00:05:24,120 --> 00:05:27,160 Speaker 2: it's the basis of how we know what we know, 88 00:05:27,320 --> 00:05:31,560 Speaker 2: how we gain knowledge. And the point of this, of 89 00:05:31,600 --> 00:05:34,160 Speaker 2: all this stuff, as people were kind of building on it, 90 00:05:34,200 --> 00:05:36,920 Speaker 2: is to say, like we need to kind of figure 91 00:05:36,920 --> 00:05:39,000 Speaker 2: out how we do know, because if you really stop 92 00:05:39,040 --> 00:05:41,800 Speaker 2: and think about it, we're not quite sure exactly how 93 00:05:41,839 --> 00:05:44,720 Speaker 2: we know anything.
And that whole thing was picked up 94 00:05:44,720 --> 00:05:48,920 Speaker 2: in the seventeenth century by Descartes, and his very famous 95 00:05:49,360 --> 00:05:53,279 Speaker 2: quote I think, therefore I am came out of this right. 96 00:05:53,720 --> 00:05:57,920 Speaker 3: Yeah. And that's, you know, basically saying like, hey, I 97 00:05:58,000 --> 00:06:02,960 Speaker 3: know that I am. I'm doing the thinking. I have 98 00:06:03,000 --> 00:06:06,960 Speaker 3: a brain, but that's about all I know. 99 00:06:07,080 --> 00:06:08,840 Speaker 3: You know, Descartes was the first one that came in 100 00:06:08,880 --> 00:06:11,440 Speaker 3: and said, hey, maybe we should get a system called 101 00:06:11,480 --> 00:06:16,240 Speaker 3: methodic doubt. Great band name. To determine if, like, hey, 102 00:06:16,680 --> 00:06:19,000 Speaker 3: you're saying something is truth, like one of the truths, 103 00:06:19,360 --> 00:06:23,920 Speaker 3: we should be able to test this. But things are fallible, 104 00:06:24,000 --> 00:06:26,520 Speaker 3: like math, you can make mistakes in. You can't look 105 00:06:26,520 --> 00:06:30,760 Speaker 3: at the tradition of a culture, because, you know, people might 106 00:06:31,000 --> 00:06:34,279 Speaker 3: disagree with that kind of thing. And then the idea 107 00:06:34,320 --> 00:06:39,440 Speaker 3: of an evil demon coming in and basically kind of 108 00:06:39,480 --> 00:06:42,400 Speaker 3: taking hold of your consciousness, and saying that you're 109 00:06:42,440 --> 00:06:47,760 Speaker 3: having all these illusions and that it's inhabited inside of you. 110 00:06:48,560 --> 00:06:51,080 Speaker 2: Yeah, so that evil demon thing too. It's like, as 111 00:06:51,080 --> 00:06:54,040 Speaker 2: we'll see, it's been updated in much more modern form. 112 00:06:54,080 --> 00:06:56,160 Speaker 2: But Descartes was the first one to really kind of 113 00:06:56,200 --> 00:07:01,279 Speaker 2: say it's possible.
Especially, we should say Descartes believed in God, 114 00:07:02,480 --> 00:07:05,400 Speaker 2: that the extension of that I think, therefore I am 115 00:07:06,040 --> 00:07:10,800 Speaker 2: is also that anything I can just intuitively know is real, 116 00:07:11,280 --> 00:07:15,480 Speaker 2: like God is real. So he believed that there definitely 117 00:07:15,640 --> 00:07:19,440 Speaker 2: was God. So he was arguing like, okay, if we 118 00:07:19,520 --> 00:07:23,480 Speaker 2: believe in God, then we have to entertain the 119 00:07:23,520 --> 00:07:28,240 Speaker 2: possibility that it's not just, you know, our minds that 120 00:07:28,320 --> 00:07:31,920 Speaker 2: are projecting this, but that we're we're being deluded, that 121 00:07:31,960 --> 00:07:34,920 Speaker 2: we're like an entire universe is being created for us 122 00:07:34,920 --> 00:07:39,600 Speaker 2: by this evil demon. This is his seventeenth century application 123 00:07:39,720 --> 00:07:42,240 Speaker 2: of it, but it's it's like we, like I said, 124 00:07:42,240 --> 00:07:45,160 Speaker 2: it kind of formed these, or it's been updated in 125 00:07:45,240 --> 00:07:49,120 Speaker 2: modern forms, and that really kind of that's where it 126 00:07:49,160 --> 00:07:52,320 Speaker 2: gets super tough, because it's like, okay, yeah, it's ridiculous 127 00:07:52,360 --> 00:07:55,000 Speaker 2: that you're you're the only person who exists and all 128 00:07:55,040 --> 00:07:59,640 Speaker 2: the rest of us don't exist, or even more creepy, 129 00:07:59,680 --> 00:08:02,440 Speaker 2: this is where it gets to me.
Okay, like 130 00:08:02,840 --> 00:08:06,320 Speaker 2: when you start to try to argue against solipsism, one 131 00:08:06,320 --> 00:08:09,440 Speaker 2: of the ways that you're going to go is that 132 00:08:09,760 --> 00:08:14,280 Speaker 2: other people have experiences and thoughts and emotions too, so 133 00:08:14,400 --> 00:08:19,320 Speaker 2: that totally discounts the idea that you're the only entity, 134 00:08:19,400 --> 00:08:22,680 Speaker 2: you're the only self in the entire universe, and that 135 00:08:22,760 --> 00:08:26,760 Speaker 2: all of this is just in your mind. But then 136 00:08:26,800 --> 00:08:28,440 Speaker 2: you have to ask, like, well, wait a minute, how 137 00:08:28,440 --> 00:08:31,040 Speaker 2: do you know other people have experiences and thoughts and 138 00:08:31,080 --> 00:08:34,120 Speaker 2: feelings like you do? There's no way for you to 139 00:08:34,360 --> 00:08:38,000 Speaker 2: know that, and there's actually no way for them to 140 00:08:38,120 --> 00:08:41,480 Speaker 2: get that across to you in any provable way. And 141 00:08:41,520 --> 00:08:43,200 Speaker 2: then you just kind of go like, oh, it's a 142 00:08:43,200 --> 00:08:46,800 Speaker 2: little scary. At least I do. It's the kind of 143 00:08:46,840 --> 00:08:48,240 Speaker 2: thing that keeps me up at night. 144 00:08:50,440 --> 00:08:52,520 Speaker 3: Well, the word itself, if you want to break it down, 145 00:08:53,240 --> 00:08:57,000 Speaker 3: it first appeared in eighteen sixty nine from Kant, and 146 00:08:58,360 --> 00:09:02,880 Speaker 3: I think it's Latin, from solus, meaning alone, and ipse, 147 00:09:03,559 --> 00:09:07,000 Speaker 3: meaning self. And this isn't the kind of thing where 148 00:09:07,040 --> 00:09:09,000 Speaker 3: like at the beginning, people are like, wow, this 149 00:09:09,080 --> 00:09:12,600 Speaker 3: really holds a lot of water.
From the beginning, it 150 00:09:12,640 --> 00:09:16,800 Speaker 3: was pretty ridiculous, and philosophers thought it was pretty ridiculous. Oh, 151 00:09:16,800 --> 00:09:21,040 Speaker 3: here's a quote: no great philosopher has espoused solipsism. That 152 00:09:21,120 --> 00:09:25,840 Speaker 3: is the quote that Olivia found. Because, you know, if 153 00:09:25,920 --> 00:09:29,880 Speaker 3: you believe that there's nothing, then you can't have 154 00:09:29,920 --> 00:09:32,520 Speaker 3: an argument about anything, because, like you said at the beginning, 155 00:09:33,080 --> 00:09:35,320 Speaker 3: a solipsist would just come in and go like, well, 156 00:09:35,160 --> 00:09:36,080 Speaker 1: How do you know? Prove it. 157 00:09:36,080 --> 00:09:38,680 Speaker 3: You can't prove that, because even your proof isn't 158 00:09:38,679 --> 00:09:40,040 Speaker 3: proof, because it's not real. 159 00:09:40,960 --> 00:09:43,079 Speaker 2: Yeah. One of the other things too, just to kind 160 00:09:43,120 --> 00:09:46,520 Speaker 2: of get this into perspective, is like you can't even 161 00:09:46,559 --> 00:09:49,000 Speaker 2: say that you have a brain, because everything you know 162 00:09:49,080 --> 00:09:52,680 Speaker 2: about a brain, you've you've basically, you're not born with 163 00:09:52,720 --> 00:09:55,120 Speaker 2: the concept of a brain. You learn that from the 164 00:09:55,200 --> 00:09:59,640 Speaker 2: external world, and if the external world doesn't exist, then 165 00:09:59,679 --> 00:10:03,080 Speaker 2: maybe brains don't exist. Like maybe you just can't 166 00:10:03,080 --> 00:10:06,120 Speaker 2: even conceive of who you are, and that's the ultimate problem. 167 00:10:06,200 --> 00:10:08,240 Speaker 2: You just you can just keep reducing it, like you 168 00:10:08,320 --> 00:10:12,679 Speaker 2: can't prove how you know what you know.
And I 169 00:10:12,720 --> 00:10:15,840 Speaker 2: feel like that's really kind of set us up for 170 00:10:16,320 --> 00:10:17,840 Speaker 2: a break. What do you think? 171 00:10:18,240 --> 00:10:19,320 Speaker 1: Yeah, we'll be right back. 172 00:10:42,640 --> 00:10:44,840 Speaker 2: So one of the things I kind of talked about 173 00:10:45,000 --> 00:10:49,040 Speaker 2: earlier before the break, Chuck, was that, as far 174 00:10:49,080 --> 00:10:51,640 Speaker 2: as philosophy goes, like if you're trying to actually apply 175 00:10:51,720 --> 00:10:55,320 Speaker 2: this to philosophical arguments or maybe like real world kind 176 00:10:55,320 --> 00:10:59,440 Speaker 2: of stuff, it has to do with other 177 00:10:59,559 --> 00:11:03,720 Speaker 2: minds and the fact that we can't ever fully understand 178 00:11:04,280 --> 00:11:07,320 Speaker 2: what other people are thinking. And then as it relates to solipsism, 179 00:11:07,440 --> 00:11:11,880 Speaker 2: we can't really prove that other people are thinking. And 180 00:11:11,920 --> 00:11:17,439 Speaker 2: there's actually some not just philosophers, but neuroscientists who've kind 181 00:11:17,480 --> 00:11:20,360 Speaker 2: of investigated this, because it is an interesting question. Like 182 00:11:21,080 --> 00:11:24,000 Speaker 2: it's that same kind of question, like how do I 183 00:11:24,080 --> 00:11:28,240 Speaker 2: know that we both experience the same color green in 184 00:11:28,320 --> 00:11:31,000 Speaker 2: the exact same way, and that what you call green 185 00:11:31,080 --> 00:11:35,160 Speaker 2: I actually think is blue? Like, I experience it as 186 00:11:35,240 --> 00:11:37,680 Speaker 2: what you would experience blue, but I call it green 187 00:11:37,679 --> 00:11:39,720 Speaker 2: because I think that's what you're talking about too.
188 00:11:40,440 --> 00:11:44,200 Speaker 3: Yeah. And in terms of neuroscience, you know, you're talking 189 00:11:44,240 --> 00:11:47,760 Speaker 3: about maybe a technology where you could brain-splice and 190 00:11:47,800 --> 00:11:51,320 Speaker 3: you could literally maybe get someone inside someone else's head. 191 00:11:52,480 --> 00:11:55,160 Speaker 3: But even then, it's not like some sort of foolproof 192 00:11:55,200 --> 00:11:59,960 Speaker 3: solipsistic argument, because even if you were sending signals 193 00:12:00,080 --> 00:12:02,840 Speaker 3: from one brain to another, it's still going to be 194 00:12:03,240 --> 00:12:06,959 Speaker 3: a subjective experience, and you wouldn't have any idea, even 195 00:12:07,000 --> 00:12:09,080 Speaker 3: though you're getting the signals from their brain, like the 196 00:12:09,200 --> 00:12:12,920 Speaker 3: subjective nature of it. Like you can't you can't gauge 197 00:12:13,080 --> 00:12:14,760 Speaker 3: subjectivity scientifically. 198 00:12:15,559 --> 00:12:20,320 Speaker 2: Yeah. And there's this example of like, okay, one person 199 00:12:20,360 --> 00:12:23,320 Speaker 2: that you're connected to the brain of is saying like, 200 00:12:23,360 --> 00:12:26,079 Speaker 2: I'm thinking of a red apple, and the other person 201 00:12:26,720 --> 00:12:29,160 Speaker 2: with the other connected brain is like, yep, I can 202 00:12:29,200 --> 00:12:32,440 Speaker 2: see the red apple that you're thinking of. But again, 203 00:12:32,640 --> 00:12:35,680 Speaker 2: to that person, red is what the other person 204 00:12:35,720 --> 00:12:38,120 Speaker 2: would think of as blue, and you can't possibly know 205 00:12:38,200 --> 00:12:40,720 Speaker 2: that that person is thinking of what would be actually 206 00:12:40,720 --> 00:12:43,400 Speaker 2: a blue apple and calling it red.
But the thing 207 00:12:43,480 --> 00:12:45,920 Speaker 2: is you can you can just be like, okay, person 208 00:12:46,000 --> 00:12:48,840 Speaker 2: number two. Now you think of a red apple, and 209 00:12:48,880 --> 00:12:51,079 Speaker 2: we'll see what person number one thinks of it if 210 00:12:51,080 --> 00:12:54,600 Speaker 2: it matches their conception. It'd be really easy to find 211 00:12:54,600 --> 00:12:55,599 Speaker 2: that out if you ask me. 212 00:12:56,240 --> 00:12:58,560 Speaker 3: Yeah, and if that was a test subject, they'd say, 213 00:12:58,559 --> 00:12:59,920 Speaker 3: why is Chuck walking into traffic? 214 00:13:01,640 --> 00:13:02,360 Speaker 1: What's going on? 215 00:13:02,920 --> 00:13:05,079 Speaker 2: Yeah? I mean we should say here, like all of 216 00:13:05,160 --> 00:13:08,000 Speaker 2: this does require brain implants, and I just don't feel 217 00:13:08,040 --> 00:13:12,200 Speaker 2: like there's anybody trustworthy to put an implant in your 218 00:13:12,200 --> 00:13:12,920 Speaker 2: brain right now. 219 00:13:13,160 --> 00:13:14,480 Speaker 3: Yeah, that's true. 220 00:13:14,679 --> 00:13:18,040 Speaker 2: There's also this concept of a philosophical zombie, right, They're 221 00:13:18,080 --> 00:13:20,600 Speaker 2: called p zombies, and it kind of ties in with 222 00:13:20,600 --> 00:13:25,240 Speaker 2: what I was saying, Like, we can't ever say that 223 00:13:25,320 --> 00:13:28,560 Speaker 2: somebody else is thinking or emoting, because we can conceive 224 00:13:28,600 --> 00:13:31,680 Speaker 2: of something that looks like a human, acts like a human, 225 00:13:32,120 --> 00:13:36,280 Speaker 2: has all the same thought processes of a human, maybe 226 00:13:36,320 --> 00:13:39,319 Speaker 2: even has emotions and all that stuff, but they're missing 227 00:13:39,960 --> 00:13:43,400 Speaker 2: what it means to be a human, which is the 228 00:13:43,480 --> 00:13:48,160 Speaker 2: experience of experiencing something. 
Right. So, like, that person can 229 00:13:48,200 --> 00:13:50,680 Speaker 2: eat an apple and taste what an apple tastes like, 230 00:13:51,000 --> 00:13:54,680 Speaker 2: but they will never feel what it feels like to 231 00:13:54,720 --> 00:13:56,839 Speaker 2: taste an apple that's like really delicious, you know what 232 00:13:56,880 --> 00:13:59,199 Speaker 2: I'm saying? And so people came 233 00:13:59,280 --> 00:14:02,000 Speaker 2: up with the idea of a p zombie, a philosophical 234 00:14:02,120 --> 00:14:04,840 Speaker 2: zombie, to try to investigate like what it is that 235 00:14:04,920 --> 00:14:07,440 Speaker 2: makes humans humans. And that's kind of what they came 236 00:14:07,559 --> 00:14:07,800 Speaker 2: up with. 237 00:14:08,720 --> 00:14:10,960 Speaker 3: Yeah, and, you know, as AI comes along 238 00:14:11,520 --> 00:14:13,760 Speaker 3: more and more, and I know you tackled some of 239 00:14:13,800 --> 00:14:17,280 Speaker 3: this in The End of the World, your special podcast 240 00:14:17,320 --> 00:14:22,760 Speaker 3: series for that. But the idea of like AI becoming 241 00:14:22,840 --> 00:14:25,680 Speaker 3: sentient or conscious or whatever. Like, how are we going 242 00:14:25,760 --> 00:14:28,480 Speaker 3: to know if that's even happening? Because it's not just 243 00:14:28,560 --> 00:14:31,480 Speaker 3: if it knows so much stuff. It can, you know, 244 00:14:31,800 --> 00:14:36,640 Speaker 3: AI can learn facts and things, but it's that 245 00:14:36,640 --> 00:14:40,880 Speaker 3: subjectivity of a human, or I guess just 246 00:14:40,920 --> 00:14:43,760 Speaker 3: an experience, because it wouldn't be human. And like, how 247 00:14:43,800 --> 00:14:45,440 Speaker 3: do we know if that's happening to an AI?
248 00:14:45,920 --> 00:14:48,640 Speaker 2: Yeah, or a person too, you know, like it's just 249 00:14:49,480 --> 00:14:51,400 Speaker 2: and again, like I think you kind of nailed it 250 00:14:51,440 --> 00:14:53,480 Speaker 2: on the head, like all this seems like navel gazing, 251 00:14:53,520 --> 00:14:56,520 Speaker 2: but there is like some utility to it. 252 00:14:56,840 --> 00:15:01,200 Speaker 3: Well, let's talk about some of the, I guess, varieties 253 00:15:01,200 --> 00:15:03,360 Speaker 3: of solipsism that they've come up with over the years. 254 00:15:04,480 --> 00:15:10,720 Speaker 3: There's one called metaphysical solipsism. That's basically that an individual, 255 00:15:11,360 --> 00:15:13,720 Speaker 3: like yourself, is all that there is; nothing else 256 00:15:14,120 --> 00:15:16,000 Speaker 3: has any independent reality at all. 257 00:15:17,320 --> 00:15:18,040 Speaker 1: Then there is 258 00:15:18,680 --> 00:15:23,600 Speaker 3: epistemological solipsism, and that is that it is not even possible 259 00:15:24,200 --> 00:15:28,000 Speaker 3: to know whether anything outside our individual consciousness exists or 260 00:15:28,040 --> 00:15:28,640 Speaker 3: is real. 261 00:15:29,560 --> 00:15:31,400 Speaker 2: And that one is actually like kind of a step 262 00:15:31,440 --> 00:15:34,600 Speaker 2: down from metaphysical. They're like, we're not saying that nothing 263 00:15:34,640 --> 00:15:40,080 Speaker 2: else in the universe exists but your mind, but we're saying, like, 264 00:15:40,080 --> 00:15:41,640 Speaker 2: like you and I were just talking about with p 265 00:15:41,840 --> 00:15:46,880 Speaker 2: zombies and AI, like we can't prove that anybody else 266 00:15:46,920 --> 00:15:49,400 Speaker 2: has those thoughts and feelings besides the thinker. 267 00:15:49,880 --> 00:15:54,320 Speaker 3: Right, not the famous statue, but you know, a real thinker. 268 00:15:55,480 --> 00:16:01,160 Speaker 3: There's methodological, methodological solipsism.
I got that extra ow in there, 269 00:16:01,160 --> 00:16:04,360 Speaker 3: and that means it's not possible to even start to 270 00:16:04,640 --> 00:16:09,280 Speaker 3: analyze the world except through your own individual consciousness and lens. 271 00:16:09,280 --> 00:16:10,440 Speaker 1: Which, that makes sense. 272 00:16:10,600 --> 00:16:14,000 Speaker 2: Yeah, it does. But I saw that it really gets 273 00:16:14,000 --> 00:16:18,760 Speaker 2: tricky with research, because at base, methodological solipsism says you 274 00:16:18,760 --> 00:16:21,600 Speaker 2: don't need to mess with data or other people's research, 275 00:16:21,840 --> 00:16:24,760 Speaker 2: just what do you think about the subject. And that 276 00:16:24,800 --> 00:16:29,400 Speaker 2: doesn't really hold water for like a research paper, because, yeah, 277 00:16:29,440 --> 00:16:31,560 Speaker 2: I mean, that's a good place to start. You can't 278 00:16:31,640 --> 00:16:34,400 Speaker 2: just dive in, or I guess you can, but it's 279 00:16:34,400 --> 00:16:37,280 Speaker 2: also like, what are your conceptions about this, and let's 280 00:16:37,320 --> 00:16:39,840 Speaker 2: start from there and then go figure out if that's correct. 281 00:16:40,480 --> 00:16:42,720 Speaker 2: Like this is just sticking with the what do you 282 00:16:42,760 --> 00:16:45,200 Speaker 2: think about this? and write the research paper. So it's 283 00:16:45,240 --> 00:16:48,240 Speaker 2: not a really good idea, frankly. And then also, Chuck, 284 00:16:48,680 --> 00:16:53,720 Speaker 2: there's just a straight-up bad idea: ethical solipsism, also 285 00:16:53,800 --> 00:16:57,480 Speaker 2: called POS solipsism. Yeah.
286 00:16:57,560 --> 00:17:01,320 Speaker 3: I think there's a professor from MIT, a 287 00:17:01,400 --> 00:17:05,640 Speaker 3: philosopher, or rather maybe teaches philosophy, named Caspar Hare, 288 00:17:07,040 --> 00:17:09,760 Speaker 3: and he had a book in two thousand and nine 289 00:17:09,760 --> 00:17:14,200 Speaker 3: called On Myself, and Other, Less Important Subjects, where he 290 00:17:14,560 --> 00:17:18,840 Speaker 3: was he was arguing a lesser version of ethical solipsism, 291 00:17:18,880 --> 00:17:21,639 Speaker 3: which is the idea that other things and people might exist, 292 00:17:22,040 --> 00:17:24,840 Speaker 3: but we have no obligation to any of those people 293 00:17:25,080 --> 00:17:27,000 Speaker 3: or ideas except for our own. 294 00:17:27,119 --> 00:17:29,560 Speaker 2: Now, and I think ultimately at the end he's like, 295 00:17:29,640 --> 00:17:34,080 Speaker 2: but for us, for the person, the individual, to lead 296 00:17:34,119 --> 00:17:36,520 Speaker 2: a fuller life, you kind of do need people here 297 00:17:36,600 --> 00:17:39,119 Speaker 2: or there. So you don't want to just completely screw 298 00:17:39,160 --> 00:17:42,960 Speaker 2: over everybody for yourself. But that's the basis of what's 299 00:17:42,960 --> 00:17:46,720 Speaker 2: called ethical solipsism, that you have no moral obligation to 300 00:17:47,000 --> 00:17:50,640 Speaker 2: anyone except yourself. And then the other thing that really 301 00:17:50,640 --> 00:17:53,160 Speaker 2: stuck out to me, Chuck, was that you have no 302 00:17:53,240 --> 00:17:56,959 Speaker 2: moral obligation to anybody but yourself right now. So you 303 00:17:56,960 --> 00:17:59,040 Speaker 2: don't even have to look out for your future self. 304 00:17:59,160 --> 00:18:01,600 Speaker 2: All you need to care about is your present self. 305 00:18:02,240 --> 00:18:05,000 Speaker 2: And that's why I call it POS solipsism.
306 00:18:05,359 --> 00:18:08,520 Speaker 1: Yeah, piece of what? Piece of s. 307 00:18:10,320 --> 00:18:12,920 Speaker 3: There are also lots of little sort of side ideas 308 00:18:12,920 --> 00:18:15,280 Speaker 3: that come along if you're gonna gaze at your navel 309 00:18:15,760 --> 00:18:19,600 Speaker 3: about solipsism. And one is the famous brain in a 310 00:18:19,680 --> 00:18:24,560 Speaker 3: vat, or the Futurama or The Matrix idea, which is 311 00:18:24,760 --> 00:18:27,879 Speaker 3: all you are maybe is a brain floating in a 312 00:18:27,960 --> 00:18:30,240 Speaker 3: jar with some life-sustaining liquid, and it's hooked up 313 00:18:30,280 --> 00:18:33,520 Speaker 3: to a computer and everything you see is a simulation. 314 00:18:34,280 --> 00:18:36,840 Speaker 2: Yeah, and this is where we kind of get into 315 00:18:37,000 --> 00:18:41,800 Speaker 2: the modern updated versions of Descartes's evil demon, right? Like, 316 00:18:41,840 --> 00:18:45,240 Speaker 2: what's keeping you in the brain in a vat? What's running 317 00:18:45,320 --> 00:18:50,280 Speaker 2: that simulation for you? Yeah. There's also the simulation hypothesis, 318 00:18:50,320 --> 00:18:53,600 Speaker 2: which came from Nick Bostrom, which I did do a 319 00:18:53,600 --> 00:18:56,600 Speaker 2: whole episode of The End of the World on, because it 320 00:18:56,800 --> 00:19:00,000 Speaker 2: just fascinates me. But a lot of people confuse 321 00:19:00,160 --> 00:19:01,959 Speaker 2: it with the brain in a vat.
But it's different, 322 00:19:02,000 --> 00:19:08,680 Speaker 2: because in the simulation hypothesis, the idea is that if 323 00:19:09,200 --> 00:19:13,280 Speaker 2: civilization becomes advanced, like say we're their ancestors and they're 324 00:19:13,280 --> 00:19:17,600 Speaker 2: our descendants, and they just keep getting more and more technologically advanced, 325 00:19:17,680 --> 00:19:23,200 Speaker 2: then they can invent simulations that are indistinguishable from reality, 326 00:19:24,000 --> 00:19:27,080 Speaker 2: and they run a bunch of simulations over and over again, 327 00:19:27,160 --> 00:19:30,480 Speaker 2: like say they sell copies of the simulation game, so 328 00:19:30,840 --> 00:19:34,400 Speaker 2: one hundred million simulations are ever created over the course 329 00:19:34,440 --> 00:19:38,919 Speaker 2: of history. Then, mathematically speaking, since we can't distinguish between 330 00:19:38,960 --> 00:19:43,440 Speaker 2: reality and a simulation, it makes it much likelier that 331 00:19:43,520 --> 00:19:46,439 Speaker 2: you and I exist in a simulation rather than the 332 00:19:46,600 --> 00:19:51,239 Speaker 2: actual one version of reality that the simulations are based on. Right. 333 00:19:51,880 --> 00:19:54,000 Speaker 2: And the thing that people get mixed up with the 334 00:19:54,000 --> 00:19:56,639 Speaker 2: brain in a vat is that with the brain in the 335 00:19:56,720 --> 00:19:59,560 Speaker 2: vat, your brain is in a vat in reality; in the 336 00:19:59,560 --> 00:20:04,600 Speaker 2: simulation, your reality is simulated. But to you it's reality. 337 00:20:04,800 --> 00:20:07,879 Speaker 2: There's nothing different. There's no other, like, realer reality that you 338 00:20:07,880 --> 00:20:11,600 Speaker 2: could wake up to; it's just reality. It's essentially like 339 00:20:11,640 --> 00:20:17,120 Speaker 2: a techno version of creationism.
Essentially, like, if you replace 340 00:20:17,800 --> 00:20:21,240 Speaker 2: whoever came up with the code for the simulation with God, 341 00:20:21,640 --> 00:20:23,160 Speaker 2: it's essentially saying the same thing. 342 00:20:23,840 --> 00:20:25,000 Speaker 1: Yeah. 343 00:20:25,080 --> 00:20:27,160 Speaker 3: There's also, and this is sort of along those lines, 344 00:20:27,200 --> 00:20:31,760 Speaker 3: the experience machine idea. There's a philosopher named Robert Nozick 345 00:20:32,359 --> 00:20:35,720 Speaker 3: who, in a nineteen seventy four book, said, how about 346 00:20:35,760 --> 00:20:39,639 Speaker 3: this for a thought experiment: I don't think that people 347 00:20:39,720 --> 00:20:43,080 Speaker 3: are just basically hedonistic in life. What 348 00:20:43,119 --> 00:20:45,159 Speaker 3: would people choose if they could be attached to a 349 00:20:45,200 --> 00:20:49,480 Speaker 3: machine that can simulate any experience, like, as if it 350 00:20:49,480 --> 00:20:52,840 Speaker 3: were identical and real and you thought it was real? Like, 351 00:20:52,880 --> 00:20:53,840 Speaker 3: would it be hedonistic? 352 00:20:53,880 --> 00:20:55,359 Speaker 1: Would you choose falling in love? 353 00:20:55,440 --> 00:20:57,640 Speaker 3: Would you choose to, you know, create a great 354 00:20:57,640 --> 00:21:00,320 Speaker 3: piece of art or something like that? And then, 355 00:21:00,480 --> 00:21:02,399 Speaker 3: you know, there are different versions of that, like what 356 00:21:02,440 --> 00:21:04,560 Speaker 3: if it's for two years at a time, what if 357 00:21:04,560 --> 00:21:07,960 Speaker 3: it's your whole life? And the counter to that usually 358 00:21:08,040 --> 00:21:12,040 Speaker 3: is somebody saying, yeah, but it's not reality, and 359 00:21:12,080 --> 00:21:15,600 Speaker 3: people aren't engaged with reality.
Humans inherently want to 360 00:21:15,680 --> 00:21:16,840 Speaker 3: engage with reality. 361 00:21:17,680 --> 00:21:20,960 Speaker 2: Yeah, and to sweeten the pot, Nozick was like, 362 00:21:21,520 --> 00:21:24,000 Speaker 2: you will have nothing but pleasure for the rest of 363 00:21:24,040 --> 00:21:27,040 Speaker 2: your life, all the pleasure you want. You will 364 00:21:27,080 --> 00:21:29,919 Speaker 2: never be able to distinguish it from what life was 365 00:21:29,960 --> 00:21:32,719 Speaker 2: like before. You won't remember that there was a life before. Like, 366 00:21:32,840 --> 00:21:36,320 Speaker 2: it will be amazing. And something like seventy four to 367 00:21:36,359 --> 00:21:39,800 Speaker 2: eighty percent of people who are posed this thought experiment 368 00:21:39,920 --> 00:21:42,560 Speaker 2: say, nah, I don't want to do that. Even 369 00:21:42,640 --> 00:21:45,679 Speaker 2: though life is suffering in a lot of ways and 370 00:21:45,840 --> 00:21:48,800 Speaker 2: sucks and can be boring and is definitely not one 371 00:21:48,880 --> 00:21:53,800 Speaker 2: hundred percent pleasure all the time, most people still want 372 00:21:53,840 --> 00:21:58,000 Speaker 2: to be engaged in reality. And again, it's 373 00:21:58,040 --> 00:22:00,560 Speaker 2: not just a cool thought experiment. I use that to 374 00:22:00,680 --> 00:22:03,439 Speaker 2: argue against the idea that humans are, at bottom, just 375 00:22:03,480 --> 00:22:06,800 Speaker 2: nothing but hedonistic creatures who seek out nothing but to 376 00:22:06,920 --> 00:22:11,159 Speaker 2: increase their pleasure. Nozick really kind of demolished that with 377 00:22:11,240 --> 00:22:12,280 Speaker 2: that thought experiment. 378 00:22:12,800 --> 00:22:15,080 Speaker 3: Yeah. Wait, in The Matrix, wasn't Joey Pants 379 00:22:15,800 --> 00:22:17,400 Speaker 3: fully on board with the simulation?
380 00:22:18,640 --> 00:22:20,520 Speaker 2: I don't remember. I don't remember that. 381 00:22:20,880 --> 00:22:22,399 Speaker 3: I think he was, because I think he was, like, 382 00:22:22,560 --> 00:22:24,760 Speaker 3: eating the steak, and they were like, yeah, but the 383 00:22:24,760 --> 00:22:26,359 Speaker 3: steak's not real, and he's like, yeah, but, you know, 384 00:22:27,160 --> 00:22:30,200 Speaker 3: it tastes real, tastes good to me, something along 385 00:22:30,200 --> 00:22:31,960 Speaker 3: those lines. But yeah, that's pretty interesting. 386 00:22:33,119 --> 00:22:34,800 Speaker 2: Do you want to take our second break and come 387 00:22:34,800 --> 00:22:37,359 Speaker 2: back and talk about our favorite part of this, criticisms? 388 00:22:37,680 --> 00:22:53,760 Speaker 3: Yeah, let's journey into K three. 389 00:23:03,680 --> 00:23:04,080 Speaker 1: All right. 390 00:23:04,119 --> 00:23:07,800 Speaker 3: So when it comes to criticisms of solipsism, not from us, 391 00:23:07,880 --> 00:23:12,320 Speaker 3: there are some famous stories, one of which is a very famous 392 00:23:12,320 --> 00:23:16,480 Speaker 3: story in philosophy circles, at least; if you're not a philosopher, 393 00:23:16,480 --> 00:23:18,440 Speaker 3: you'd probably be like, what's he kicking the 394 00:23:18,520 --> 00:23:21,960 Speaker 3: rock for? But it was an eighteenth century story about 395 00:23:22,000 --> 00:23:27,000 Speaker 3: the writer Samuel Johnson, who was in a, I guess, debate 396 00:23:27,080 --> 00:23:31,760 Speaker 3: with a philosopher named George Berkeley, and Berkeley said, hey, 397 00:23:31,960 --> 00:23:38,560 Speaker 3: Descartes's mind-body dualism is faulty, and everything that appears 398 00:23:39,080 --> 00:23:42,240 Speaker 3: to have existence is just made up in your mind.
Well, 399 00:23:42,280 --> 00:23:44,680 Speaker 3: first Berkeley said it's impossible to refute this, and that's 400 00:23:44,680 --> 00:23:48,080 Speaker 3: when cheeky old Samuel Johnson came in and kicked a 401 00:23:48,160 --> 00:23:52,600 Speaker 3: very large rock and said, I refute this. In other words, hey, 402 00:23:52,720 --> 00:23:55,840 Speaker 3: this rock is here, and this is just an absurd 403 00:23:55,840 --> 00:23:57,880 Speaker 3: idea, because I can kick that rock and it hurts 404 00:23:57,880 --> 00:23:58,480 Speaker 3: my toe. 405 00:23:58,800 --> 00:24:02,480 Speaker 2: Right, exactly. Yeah. And so if you're a philosopher, you're 406 00:24:02,560 --> 00:24:06,280 Speaker 2: like, Samuel Johnson doesn't get it. And if you're not 407 00:24:06,359 --> 00:24:10,000 Speaker 2: a philosopher, you're like, Samuel Johnson gets it. Like, philosophers 408 00:24:10,000 --> 00:24:18,120 Speaker 2: are very famously, maybe overly, engaged in perfectly crafted, totally 409 00:24:18,200 --> 00:24:21,399 Speaker 2: airtight arguments, and the idea of just kicking a 410 00:24:21,480 --> 00:24:24,480 Speaker 2: rock and being like, see, it's real, doesn't really 411 00:24:24,480 --> 00:24:27,240 Speaker 2: hold water with them. But for everybody else it's like, yeah, 412 00:24:28,000 --> 00:24:31,719 Speaker 2: it kind of gets to what Wittgenstein... is that how 413 00:24:31,720 --> 00:24:36,400 Speaker 2: you would say it in German? Yeah, Ludwig Wittgenstein. I'm 414 00:24:36,400 --> 00:24:38,080 Speaker 2: gonna say his name again at least one more time, 415 00:24:38,119 --> 00:24:40,960 Speaker 2: because it's fun. He was a philosopher of the twentieth century.
416 00:24:41,080 --> 00:24:44,080 Speaker 2: He basically was like, man, philosophy, and this is not a 417 00:24:44,160 --> 00:24:47,040 Speaker 2: quote, I'm paraphrasing, has some, like, real hang-ups. Just 418 00:24:47,720 --> 00:24:51,440 Speaker 2: the fact that solipsism 419 00:24:51,920 --> 00:24:54,879 Speaker 2: exists, and that people feel the need to argue against it 420 00:24:54,920 --> 00:24:58,440 Speaker 2: sometimes, says all you need to know about how uptight 421 00:24:58,640 --> 00:25:03,640 Speaker 2: philosophers are about philosophy. And essentially, we just need to take 422 00:25:03,720 --> 00:25:06,280 Speaker 2: some things as fact, as granted, or else all we're 423 00:25:06,280 --> 00:25:09,600 Speaker 2: doing is spinning our wheels. But if you say, like, okay, 424 00:25:09,880 --> 00:25:12,879 Speaker 2: I believe that the world is material, that it exists 425 00:25:13,320 --> 00:25:16,719 Speaker 2: apart from human consciousness, that if there were no humans 426 00:25:16,760 --> 00:25:20,960 Speaker 2: around, no life to experience it, everything would 427 00:25:21,000 --> 00:25:24,880 Speaker 2: still be the same, then let's just take that as fact, 428 00:25:25,000 --> 00:25:27,399 Speaker 2: if that's what you believe, and just move on from there. 429 00:25:28,040 --> 00:25:31,520 Speaker 2: You need to have some sort of foundation where you 430 00:25:31,600 --> 00:25:34,800 Speaker 2: can say this is real, this exists, and then you 431 00:25:34,920 --> 00:25:38,680 Speaker 2: build off of that. And if you don't, then you're 432 00:25:38,800 --> 00:25:41,240 Speaker 2: just shooting yourself in the foot, essentially. That's what 433 00:25:41,320 --> 00:25:44,240 Speaker 2: Ludwig Wittgenstein was saying.
434 00:25:45,200 --> 00:25:51,040 Speaker 3: Yeah, I think it's unfortunately not "Vickenstein." Well, I 435 00:25:51,440 --> 00:25:53,679 Speaker 3: didn't see it, but I think the second 436 00:25:53,760 --> 00:25:56,640 Speaker 3: letter in the EI in German is the one that's favored. 437 00:25:57,520 --> 00:25:59,560 Speaker 2: Oh, so how would you pronounce it then? 438 00:26:00,160 --> 00:26:01,720 Speaker 1: I think it would be "Vitckenstein." 439 00:26:03,440 --> 00:26:06,000 Speaker 2: Well, I still like "Vickenstein," so I'm going to 440 00:26:06,040 --> 00:26:07,920 Speaker 1: stay with it. Like Frankenstein. 441 00:26:09,480 --> 00:26:13,280 Speaker 2: Oh, I had it backwards. So I thought that "stine" 442 00:26:13,480 --> 00:26:16,280 Speaker 2: was, like, the Anglicized version of it and "steen" was 443 00:26:16,320 --> 00:26:17,719 Speaker 2: the German version. 444 00:26:19,200 --> 00:26:21,560 Speaker 3: I'm pretty sure the second letter in German is the 445 00:26:21,600 --> 00:26:22,360 Speaker 3: one that's pronounced. 446 00:26:22,400 --> 00:26:23,679 Speaker 2: I know. I do, I believe you. 447 00:26:24,080 --> 00:26:26,680 Speaker 3: I'm not positive, but I do know that Frankenstein 448 00:26:26,840 --> 00:26:27,520 Speaker 3: was the doctor. 449 00:26:28,600 --> 00:26:31,159 Speaker 1: You mean Frankenstein, not the monster. 450 00:26:32,000 --> 00:26:33,000 Speaker 2: Yeah, that's true. 451 00:26:33,680 --> 00:26:35,399 Speaker 1: So let's talk about Stephen P. Thornton. 452 00:26:35,560 --> 00:26:39,160 Speaker 3: He's a philosopher at the University of Limerick, the most 453 00:26:39,200 --> 00:26:43,359 Speaker 3: singsongy university in all of Ireland.
He has an argument 454 00:26:43,440 --> 00:26:46,720 Speaker 3: that, hey guys, it's a big mistake to view these 455 00:26:46,760 --> 00:26:51,080 Speaker 3: mental states as just something we experience subjectively and then, 456 00:26:51,560 --> 00:26:53,680 Speaker 3: you know, relate to others, like, hey, I know how 457 00:26:53,680 --> 00:26:56,480 Speaker 3: it feels to get my toe stubbed because I've done it, 458 00:26:56,520 --> 00:26:57,919 Speaker 3: and I see that's happened to you, so I know 459 00:26:57,960 --> 00:27:01,840 Speaker 3: how that feels. He says we learn what these mental 460 00:27:01,880 --> 00:27:07,240 Speaker 3: states are in what he called an intersubjective world. Like, 461 00:27:07,320 --> 00:27:12,240 Speaker 3: a kid, after it's born, comes to understand what 462 00:27:12,359 --> 00:27:16,360 Speaker 3: being sad is by looking around at someone crying or 463 00:27:16,400 --> 00:27:18,560 Speaker 3: something like that. And that's how they know what sad 464 00:27:18,640 --> 00:27:21,800 Speaker 3: is: because of a behavior they witness and the context 465 00:27:21,800 --> 00:27:25,600 Speaker 3: they witness it in. So, and Olivia 466 00:27:25,760 --> 00:27:27,840 Speaker 3: used a great example, if you're grinding your teeth and 467 00:27:27,920 --> 00:27:31,080 Speaker 3: you're, you know, snapping at people in your life 468 00:27:31,080 --> 00:27:34,560 Speaker 3: and you can't sleep, then you probably know you're experiencing stress. 469 00:27:35,440 --> 00:27:38,399 Speaker 2: Right. This guy's argument to me is the one 470 00:27:38,440 --> 00:27:42,160 Speaker 2: that makes the most sense for refuting solipsism, which is like, yes, 471 00:27:42,480 --> 00:27:47,000 Speaker 2: you have internal feelings and thoughts, but the experience of 472 00:27:47,080 --> 00:27:51,400 Speaker 2: feeling sad is not the whole of sadness.
There's 473 00:27:51,480 --> 00:27:54,719 Speaker 2: other stuff, and all the rest of it essentially comes 474 00:27:54,720 --> 00:27:58,840 Speaker 2: from interacting with and learning from the external world. And so 475 00:27:58,960 --> 00:28:01,520 Speaker 2: the whole idea of solipsism is based on a faulty 476 00:28:01,520 --> 00:28:04,959 Speaker 2: premise, that the entire world could possibly just be in 477 00:28:05,000 --> 00:28:07,640 Speaker 2: your head, because how are you going to learn from 478 00:28:07,680 --> 00:28:11,919 Speaker 2: something that's not actually there in the first place? I 479 00:28:12,000 --> 00:28:16,119 Speaker 2: like Stephen P. Thornton. He's my new favorite philosopher. 480 00:28:16,200 --> 00:28:18,680 Speaker 2: There was one other guy, too, we have to bring 481 00:28:18,720 --> 00:28:23,560 Speaker 2: into the conversation: Bertrand Russell. He's a very famous philosopher, 482 00:28:23,560 --> 00:28:28,160 Speaker 2: a mathematician, I believe. His whole thing was, like, 483 00:28:28,840 --> 00:28:31,439 Speaker 2: what was it Zhuangzi 484 00:28:31,560 --> 00:28:35,720 Speaker 2: was saying? Like, how can I tell if I'm 485 00:28:35,800 --> 00:28:37,560 Speaker 2: a man dreaming of being a wildebeest or a 486 00:28:37,560 --> 00:28:40,840 Speaker 2: wildebeest dreaming of being a man? And Bertrand 487 00:28:40,880 --> 00:28:43,040 Speaker 2: Russell was like, if that were true... dreams are just 488 00:28:43,120 --> 00:28:46,720 Speaker 2: weird and freaky and anything goes, really. Waking life 489 00:28:46,800 --> 00:28:49,600 Speaker 2: is not like that. So if waking life were a dream, 490 00:28:50,040 --> 00:28:52,800 Speaker 2: there would be measurable ways that it veers off of, 491 00:28:52,840 --> 00:28:55,720 Speaker 2: like, physics or whatever, and we would notice that. And 492 00:28:55,760 --> 00:28:57,680 Speaker 2: these days it's called a glitch in the matrix.
You 493 00:28:57,680 --> 00:29:00,960 Speaker 2: would notice glitches in the Matrix. And there's actually a 494 00:29:01,040 --> 00:29:05,560 Speaker 2: really cool subreddit called Glitch in the Matrix, and it's 495 00:29:05,560 --> 00:29:10,760 Speaker 2: people's, like, stories about just weird, inexplicable, strange 496 00:29:10,800 --> 00:29:14,040 Speaker 2: small things that they've noticed here and there in life. 497 00:29:14,040 --> 00:29:16,160 Speaker 2: They'll post them, and every once in a while there'll 498 00:29:16,160 --> 00:29:18,480 Speaker 2: be, like, a picture too. It's kind of fun 499 00:29:18,480 --> 00:29:19,000 Speaker 2: to go through. 500 00:29:19,520 --> 00:29:21,520 Speaker 3: What would it be? Like, give me an example. 501 00:29:23,520 --> 00:29:26,320 Speaker 2: One that I saw a couple of times is 502 00:29:26,360 --> 00:29:29,719 Speaker 2: something like seeing somebody, like, go out a door and 503 00:29:29,760 --> 00:29:32,280 Speaker 2: then, like, thirty seconds later they come in a totally 504 00:29:32,280 --> 00:29:36,160 Speaker 2: opposite door that they physically couldn't have possibly gotten to. 505 00:29:36,360 --> 00:29:38,320 Speaker 2: So how do you explain that? Just stuff like that, 506 00:29:39,280 --> 00:29:42,160 Speaker 2: like how in the actual movie The Matrix things would 507 00:29:42,200 --> 00:29:44,080 Speaker 2: literally glitch, like you could kind of tell all of 508 00:29:44,120 --> 00:29:47,240 Speaker 2: a sudden they were, like, ones and zeros. This is 509 00:29:47,320 --> 00:29:49,760 Speaker 2: kind of like that, but it's like the program itself 510 00:29:49,840 --> 00:29:52,000 Speaker 2: is lazy or something like that. 511 00:29:52,440 --> 00:29:53,360 Speaker 1: I gotcha.
512 00:29:53,880 --> 00:29:57,200 Speaker 3: So, you know, the navel gazing and talking about solipsism 513 00:29:57,280 --> 00:30:00,080 Speaker 3: and debating it or whatever is one thing, but if 514 00:29:59,840 --> 00:30:03,400 Speaker 3: you have a mental illness, especially if you 515 00:30:03,440 --> 00:30:09,320 Speaker 3: have something like schizophrenia, this idea is terrifying. It's called derealization, 516 00:30:09,560 --> 00:30:11,640 Speaker 3: and it's, you know, something that can happen if you 517 00:30:12,160 --> 00:30:17,360 Speaker 3: suffer from paranoid schizophrenia. There are people who suffer from 518 00:30:17,360 --> 00:30:20,800 Speaker 3: that who talk about sort of exactly this: like the 519 00:30:20,840 --> 00:30:24,880 Speaker 3: people around them are extras or empty shells, and that 520 00:30:25,440 --> 00:30:29,440 Speaker 3: you and you alone are real and responsible for, like, 521 00:30:29,520 --> 00:30:33,120 Speaker 3: the world moving on as it is, and being alienated 522 00:30:33,160 --> 00:30:35,960 Speaker 3: from your own body and not having a sense of self. 523 00:30:36,200 --> 00:30:38,920 Speaker 3: Like, that's all real stuff and terrifying stuff. 524 00:30:39,640 --> 00:30:43,960 Speaker 2: Yeah, for sure. There's a psychologist named Clara S. 525 00:30:44,040 --> 00:30:48,840 Speaker 2: Humpston who kind of explains how somebody with schizophrenia might 526 00:30:48,880 --> 00:30:53,000 Speaker 2: actually retreat to a solipsistic state as a way to 527 00:30:53,080 --> 00:30:55,800 Speaker 2: kind of exercise control over a world that they feel 528 00:30:55,840 --> 00:30:59,720 Speaker 2: like they have zero control over.
Like, if you're like, nope, 529 00:30:59,760 --> 00:31:02,720 Speaker 2: all this is just in my mind and it's not real, 530 00:31:03,360 --> 00:31:06,000 Speaker 2: then in a weird sense, as lonely and 531 00:31:06,040 --> 00:31:09,840 Speaker 2: horrifying as that thought actually is, you can feel 532 00:31:09,920 --> 00:31:13,280 Speaker 2: like you can control those things then, too. And that 533 00:31:13,320 --> 00:31:16,080 Speaker 2: actually kind of ties into yet another argument or criticism 534 00:31:16,120 --> 00:31:19,160 Speaker 2: of solipsism: if all of this is just in your mind, 535 00:31:19,200 --> 00:31:21,760 Speaker 2: all of reality, how do you explain the fact that 536 00:31:21,800 --> 00:31:24,800 Speaker 2: you have no idea what's coming in the future, or 537 00:31:24,800 --> 00:31:28,280 Speaker 2: that you can be surprised or startled? None of 538 00:31:28,320 --> 00:31:34,480 Speaker 2: that makes sense either. I don't remember now how those two 539 00:31:34,520 --> 00:31:36,959 Speaker 2: things tied together, but 540 00:31:37,040 --> 00:31:39,480 Speaker 2: if I rewind thirty seconds, I'm sure I would find out that 541 00:31:39,520 --> 00:31:40,280 Speaker 2: they did. 542 00:31:41,000 --> 00:31:43,400 Speaker 3: Well, there are other disorders too that touch on other 543 00:31:43,440 --> 00:31:46,440 Speaker 3: parts of solipsism. Certainly the 544 00:31:46,520 --> 00:31:50,360 Speaker 3: ethical solipsism you were talking about could very closely tie into something 545 00:31:50,440 --> 00:31:57,080 Speaker 3: like narcissistic personality disorder or antisocial personality disorder: that 546 00:31:57,520 --> 00:32:01,720 Speaker 3: sort of lack of empathy, and only making choices based 547 00:32:01,760 --> 00:32:05,040 Speaker 3: on your own needs. That definitely, like, rings of 548 00:32:05,120 --> 00:32:06,160 Speaker 3: ethical solipsism.
549 00:32:06,560 --> 00:32:12,360 Speaker 2: Yes. So, yeah, I mean, that's pretty much solipsism. I 550 00:32:12,440 --> 00:32:14,880 Speaker 2: don't think we're going to do a part two eventually. 551 00:32:16,720 --> 00:32:20,160 Speaker 2: I think we've kind of put it to bed, which 552 00:32:20,200 --> 00:32:23,560 Speaker 2: feels good, Chuck. And since Chuck doesn't 553 00:32:23,280 --> 00:32:25,640 Speaker 1: have anything else, right? I got nothing else. 554 00:32:25,840 --> 00:32:27,800 Speaker 2: I got nothing else either. So then that means, of 555 00:32:27,840 --> 00:32:29,480 Speaker 2: course, that it brings up Listener Mail. 556 00:32:32,160 --> 00:32:38,160 Speaker 3: This is from Jan, spelled J-A-N, but pronounced "Yun" in German. Hey guys, 557 00:32:38,160 --> 00:32:40,800 Speaker 3: listened to the episode on Ludwig the Second, which I 558 00:32:40,880 --> 00:32:43,000 Speaker 3: enjoyed like all your episodes. I work in research and 559 00:32:43,040 --> 00:32:46,800 Speaker 3: development for wastewater technology, so I know how much work 560 00:32:46,840 --> 00:32:48,800 Speaker 3: it is to research a new topic and become familiar 561 00:32:48,880 --> 00:32:52,000 Speaker 3: enough with it to talk about it like you guys do, and 562 00:32:52,040 --> 00:32:54,560 Speaker 3: I mostly research stuff in my own field. 563 00:32:54,640 --> 00:32:55,360 Speaker 1: So well done. 564 00:32:56,080 --> 00:32:57,560 Speaker 3: I want to say thank you for being a steady 565 00:32:57,560 --> 00:33:02,320 Speaker 3: presence throughout my PhD on waterless toilets, fatherhood, the pandemic, 566 00:33:02,360 --> 00:33:04,120 Speaker 3: and my new job, which often takes me on long 567 00:33:04,160 --> 00:33:06,640 Speaker 3: road trips. I love learning, and your podcast allows me to 568 00:33:06,680 --> 00:33:11,600 Speaker 3: broaden my horizons way beyond my normal work.
Today, however, guys, 569 00:33:11,720 --> 00:33:14,080 Speaker 3: I have to write to assure you that filling a 570 00:33:14,080 --> 00:33:16,920 Speaker 3: hall of five hundred people in Germany would be, ein 571 00:33:17,040 --> 00:33:19,720 Speaker 3: Kinderspiel, child's play. 572 00:33:20,600 --> 00:33:21,040 Speaker 2: Oh wow. 573 00:33:21,120 --> 00:33:27,040 Speaker 3: Okay. Most people below the age of forty here speak 574 00:33:27,080 --> 00:33:28,760 Speaker 3: English to a decent degree, and I know plenty of 575 00:33:28,800 --> 00:33:31,040 Speaker 3: people that listen to your show, so please, please come 576 00:33:31,040 --> 00:33:33,200 Speaker 3: to Germany. If you do, I'll make it my mission 577 00:33:33,240 --> 00:33:35,640 Speaker 3: to get the event sold out. Oh wow. And let 578 00:33:35,680 --> 00:33:38,240 Speaker 3: me know if you want any recommendations for decent beers 579 00:33:38,240 --> 00:33:38,880 Speaker 3: while you're here. 580 00:33:39,280 --> 00:33:41,160 Speaker 1: And that is from Jan, man. 581 00:33:41,200 --> 00:33:44,280 Speaker 2: That's awesome. That was a great email, Jan. Yeah, I 582 00:33:44,280 --> 00:33:46,400 Speaker 2: think we should take you up on that. Finally, Chuck, 583 00:33:46,400 --> 00:33:47,560 Speaker 2: I want to go to Germany. 584 00:33:48,440 --> 00:33:50,360 Speaker 3: We've had enough people saying come to Germany. I think 585 00:33:50,360 --> 00:33:52,640 Speaker 3: we have to go to, like, Berlin and Munich, just 586 00:33:52,640 --> 00:33:53,640 Speaker 3: to see what the heck is 587 00:33:53,640 --> 00:33:54,360 Speaker 1: going on, too. 588 00:33:54,640 --> 00:33:55,480 Speaker 2: Huh. 589 00:33:55,560 --> 00:33:55,800 Speaker 1: Yeah. 590 00:33:55,840 --> 00:33:59,080 Speaker 3: We could do the big city style and then Bavarian 591 00:33:59,120 --> 00:33:59,720 Speaker 3: city style. 592 00:34:00,080 --> 00:34:04,520 Speaker 2: Okay, let's do it then. It is settled.
And 593 00:34:04,560 --> 00:34:07,680 Speaker 2: that was from Jan, J-A-N. Yeah, I'm glad 594 00:34:07,720 --> 00:34:10,279 Speaker 2: you said that, because for my whole life I've been saying, well, 595 00:34:10,320 --> 00:34:12,680 Speaker 2: first I said "Jan," then I grew up and I 596 00:34:12,880 --> 00:34:15,440 Speaker 2: thought "Jahn." I did not know it was "Yun." 597 00:34:16,120 --> 00:34:17,799 Speaker 1: Well, this is what I mean. 598 00:34:18,960 --> 00:34:22,319 Speaker 3: J-A-N in this letter said it's pronounced 599 00:34:22,480 --> 00:34:23,359 Speaker 3: Y-U-N. 600 00:34:24,000 --> 00:34:28,879 Speaker 2: Yeah, that's "Yun" for sure, that's "Yun." But anybody's name 601 00:34:29,040 --> 00:34:32,040 Speaker 2: pronounced Jan reminds me of a quote. Have you ever 602 00:34:32,080 --> 00:34:37,960 Speaker 2: seen Johnny Suede, the Brad Pitt movie? I never saw that. Yeah, 603 00:34:38,280 --> 00:34:41,399 Speaker 2: but there was a classic line in it where he's 604 00:34:41,440 --> 00:34:43,760 Speaker 2: at dinner at, like, his date's house, and the date's 605 00:34:43,880 --> 00:34:48,600 Speaker 2: dad says, you know, John, if we were in Sweden, 606 00:34:49,160 --> 00:34:53,439 Speaker 2: your name would be Yon Suede. And he says, no, sir, 607 00:34:53,880 --> 00:34:57,360 Speaker 2: it'd be John. Johnny Suede, always has been, always will be. 608 00:34:57,600 --> 00:34:59,400 Speaker 3: That's a pretty good, uh, Brad Pitt. 609 00:34:59,480 --> 00:35:03,440 Speaker 2: Yeah, yeah. Imagine Brad Pitt blankly saying this, but with 610 00:35:03,480 --> 00:35:04,440 Speaker 2: a huge pompadour. 611 00:35:04,480 --> 00:35:06,479 Speaker 1: It's pretty great. Not bad. 612 00:35:07,800 --> 00:35:10,680 Speaker 2: Well, I think that's it. Again, thanks, Jan, and we'll 613 00:35:10,680 --> 00:35:14,759 Speaker 2: see everybody, including you, in Germany. Eventually we'll figure it out.
614 00:35:15,400 --> 00:35:18,080 Speaker 2: And in the meantime, if anybody out there, from Germany 615 00:35:18,239 --> 00:35:20,279 Speaker 2: or otherwise, wants to get in touch with us, you 616 00:35:20,320 --> 00:35:23,520 Speaker 2: can send us an email to Stuff Podcast at iHeartRadio 617 00:35:23,600 --> 00:35:27,000 Speaker 2: dot com. 618 00:35:27,160 --> 00:35:29,480 Speaker 1: Stuff You Should Know is a production of iHeartRadio. 619 00:35:29,960 --> 00:35:33,200 Speaker 3: For more podcasts from iHeartRadio, visit the iHeartRadio app, 620 00:35:33,360 --> 00:35:36,280 Speaker 3: Apple Podcasts, or wherever you listen to your favorite shows.