Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and it's Saturday, time to go into the vault for a classic episode of the show. This one originally published in February, and it's about Occam's razor. Yeah, this one is a lot of fun. We get into scientific thinking and speculative thinking, we discuss The Name of the Rose a little bit, for obvious reasons. This one was fun: there's some history, some science, everything you want. Wonderful.

Welcome to Stuff to Blow Your Mind, a production of iHeartRadio's HowStuffWorks.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. And today we're going to discuss a problem-solving principle that many of you have probably heard of, and that we've definitely referenced on the show before, and that is Occam's razor. That's right. It's one of the classics, one of the hits of the skeptical toolkit, and I think it's a really good one to get into, because it's something that is widely known, but in different ways, and often, to whatever extent it actually does have value, it gets deployed in ways that do not actually make use of that value. Right, like an actual razor blade may be misused from time to time.

Now, one specific place that I know we've talked about it before is in the context of Carl Sagan's recommendations for the tools of skeptical thinking. He lays these out, and one of them is Occam's razor. He writes: "Occam's razor. This convenient rule of thumb urges us, when faced with two hypotheses that explain the data equally well, to choose the simpler."

Okay, now why do we end up talking about this today? We were in the studio the other day discussing upcoming episodes, and you said that Seth mentioned this, our producer Seth. Yeah, I was in here and Seth Nicholas Johnson was working on a crossword puzzle.
Was it the New York Times? He tells us it was the New York Times. And he asked me how to spell Occam, as in Occam's razor, and I took a guess at it, and I can't remember if I was correct. I was probably wrong, but I also probably hit one of the multiple acceptable spellings of Occam's razor. But anyway, we started talking about it, and I was like, oh yeah, we could do that as an episode, and so here we are.

I'm very glad we picked this, because I think one of my personal favorite genres of critical thinking is being skeptical about the tools of skepticism. You know, sometimes people who identify as skeptics can get a little cocky. They get a little too sure of themselves about the reasoning tools they use, and it's worth putting those tools to the test, giving them a closer look. Yeah, absolutely.

Now, I have to say that I definitely remember the first time I encountered the concept of Occam's razor, or at least the first time I encountered it and it on some level stuck with me, and that was when I viewed the film adaptation of Carl Sagan's novel Contact. The movie I can't watch without crying. Oh yeah? Why does it make you cry? Oh God, it's just pointed, especially the first part where, you know, it zooms out from the Earth and you're hearing the radio signals go back in time, and then it shows the young Ellie Arroway experimenting with the ham radio while her dad helps her, and I get so emotional, I don't know. Yeah, it's been a very long time; I haven't seen it since it initially came out. And in fact, the main thing I remember from it is this scene in which Jodie Foster's character, Eleanor Arroway, is having this conversation with Matthew McConaughey's character. How old was Matthew McConaughey at this point? I don't even know how old he is now. He's just like this ageless demon.
But anyway, he's playing a character named Palmer Joss. Uh huh. And in the scene in question, Foster's character brings up Occam's razor in a discussion on the nature of God. She says, well, which is ultimately the simpler hypothesis: that an all-powerful God exists, or that human beings made God up in order to feel better about things? And then this ultimately comes back around and is kind of flipped on her later in the film, regarding her character's encounter with an extraterrestrial intelligence. Right: is it more likely that she really had the experience she thinks she had with all these aliens, or that she hallucinated something that would give her emotional closure?

Yeah, and I think I was in high school at the time, so it was an interesting concept, especially in the context of atheism versus, you know, faith in a creator deity, to suddenly have this tool from the chest of skeptical thinking just thrown up on the table and seemingly used by both sides. Well, yeah, I think this is a great example, because it highlights some of the most common features of Occam's razor as it is actually used. Like, it's often invoked in a kind of fuzzy way, without an objective measure, just kind of invoked to back up your intuitions about the probability of something. Right. But another thing is that this example shows how it's not always easy to find a way to compare the simplicity of two different propositions. Like, is the existence of God a simple hypothesis or a complicated one? I think that really depends on how you feel about it; what kind of objective measure can you come up with to evaluate that question? Right, it's going to depend so much on your background, your culture, what you grew up with, and just how you've come to view the possibility of God's existence.
Is it just kind of the bedrock of your worldview, or is it this thing from the outside that you are contemplating? And also, how do you view the coherence of the idea? Do you view it as something that's full of all these little ad hoc accommodations, or as something that is a holistic, coherent, sort of like fact about nature? You know, I think this is a perfect example of when people use the idea of Occam's razor in a way that is not helpful and doesn't really get you any closer to figuring out what's true.

Now, if you're still questioning what the concept really means, don't worry. We will get to some, I think, very understandable examples of how it can be used properly and used improperly. But let's go ahead and start with the concept itself, the word Occam, and, you know, where this comes from. We'll get to the origins of Occam's razor. So Occam's razor is also known as the principle of parsimony, and parsimony means a tendency toward cheapness or frugality. So I like that: the principle of parsimony is like, you want to be cheap with your logic. Right, yeah. I don't need more than two steps of logic between me and the solution; you know, don't give me one with four or five.

And it was named after the medieval English philosopher William of Ockham. Of course, William of Ockham. So he lived in the thirteenth and fourteenth centuries, from 1285 to either 1347 or 1349; I've seen different death dates given for him. I've seen different birth dates as well, 1287 or 1288. That's what I was looking at. That's interesting. So he was a prolific scholar, a Franciscan friar. We'll get more into his ideas in a minute. You know, one thing I've always wondered is: where the heck is Ockham? I've never heard of that.
Well, yeah, because the word sounds like it has kind of a remoteness to it; it sounds alien in some ways. Ockham is very much a real place. It is a rural village in Surrey, England. You can look it up online; you can find the website for the church in Ockham, for example. And this area has been occupied since ancient times. It's about a day's ride southwest of London, and it was the birthplace of the individual who would come to be known as William of Ockham.

Now, beyond the fact that he was born here, we don't know a lot about William's life. We don't know what his social or family background was, or whether his native language was French or Middle English. As Paul Vincent Spade explains in The Cambridge Companion to Ockham, he was likely given over to the Franciscan Order as a young boy, before the age of fourteen, and here Latin would have quickly become his language not only of writing but also of conversation. Greyfriars convent in London was likely his home convent, but later he traveled: he visited Avignon, he visited Italy, and he lived the last two decades of his life in Germany.

Now, philosophically, William was a nominalist, and Spade writes that the two main themes of this for William were the rejection of universals and ontological reduction. And these two themes are not necessarily interconnected; you could believe in one but not the other, and vice versa. But basically, let's get into what these mean. So the first, the rejection of universals, is perhaps best considered (and this is very brief and broad; certainly you can find so much written and said on this topic) as a rejection of the Platonic idea of the realm of forms.
So that's the idea that all chairs that we might make, that we might design and carve and assemble, are an attempt to create the perfect chair, which doesn't reside in our world but only resides within this realm of forms. So all chairs that we create are like an aspiration toward the ideal chair. Another way I've thought about it, at least as I understood it, was that nominalism is kind of the idea that there is no such thing as a chair; there's only this chair and that chair and this chair over here. There is no "chair," right. This is kind of the situation one gets into with, say, the genre classifications of albums, artists, or movies that you care a great deal about, when someone tries to limit them to a classification and says, oh, well, that's classic rock, or that's alternative rock, and you're like, no, no, no, don't try and fit that; these categories do not apply. There is only, you know, whatever your band of choice happens to be. There is only Tool, there is only Primus, or whatever. Right, there are only things, not categories.

Now let's move on to the second theme here, ontological reduction. This is, as Britannica defines it, quote, "the metaphysical doctrine that entities of a certain kind are in reality collections or combinations of entities of a simpler or more basic kind." I think your classic example here is molecules and atoms. Yeah. So another example: Aristotle defined ten categories of objects that might be apprehended by a human mind, and these would have been (translations vary on how you want to define these) substance, quantity, quality, relation, place, time, attitude, condition, action, and affection. William cut these down to two: substance and quality. He's really getting in there. That's the razor; that's what a razor does. It slices away, it cuts off the fat and gets down to the meat. Spade writes, quote:
"Although these two strands of Ockham's thinking are independent, they are nevertheless often viewed as joint effects of a more fundamental concern, the principle of parsimony, known as Ockham's razor." Okay, so we're getting to the razor here. Yeah.

So William devoted a lot of energy to arguing against what Spade calls the bloated ontological inventories of his contemporaries, and he became well known to his peers for this. As such, either toward the end of his life or shortly after his death, a kind of greatest-hits album of his thoughts and ideas came out, titled On the Principles of Theology. Now, it wasn't actually by William of Ockham, but it featured his doctrine as well as verbatim quotes. There was no ascribed author either, so later generations would often just attribute it to him, along with the notion of Occam's razor. However, this specific phrase was apparently never actually used by him. He never said, "Ockham's in the house, I'm going to get the razor out and start carving on some ideas here." No, this is something that was attributed by others to his work. Yeah, Occam's razor is a name for this principle that is supposed to be kind of a summation of several different thoughts he articulated in different ways.

Yes, yeah, he summed it up in different manners. Spade includes a few examples of this in his work. For instance, here are some quotes from Ockham: "Beings are not to be multiplied beyond necessity," or "Plurality is not to be posited without necessity," or "What can happen through fewer principles happens in vain through more." And there are other examples of this as well, basically saying the same thing, but maybe coming off a little more flowery, at least in translation.
Yeah, I think the simple version you could get to, summarizing some of his views here, is: don't make assumptions you don't have to; don't pile on explanations that are not necessary. Yeah, and also just don't take more steps than are necessary to get from point A to point B in your reasoning and in your hypothesis. And the way this usually gets translated into modern thinking, as we've talked about before, is that when you've got competing explanations, it's better to tend toward the simpler one, the one that makes fewer assumptions, rather than the more complicated one that makes more assumptions.

Now here's another fun fact about William of Ockham: he is key to Umberto Eco's excellent novel The Name of the Rose. This was a novel that was published in 1980. Many of you may be familiar with, certainly, the film adaptation that starred Sean Connery, F. Murray Abraham, Christian Slater, and a host of wonderful character actors. And then there's a more recent miniseries adaptation with John Turturro that I have not seen, but I should probably see at some point or another. But anyway, the main character in Eco's novel is William of Baskerville, who is in many ways similar: he's a Franciscan friar, he's got a kind of empirical streak. Yeah, he's basically a mash-up of William of Ockham and Sherlock Holmes, thus the Baskerville, alluding to The Hound of the Baskervilles. Then the title itself, The Name of the Rose, has been interpreted as a reference to Ockham's nominalism: there is no one rose; there is only the name of the rose. But there are also other interpretations of it, and it's meant to be kind of cryptic.

Now, I was reading more about this, and it's been a little while since I've read The Name of the Rose; you've read it more recently than I have. Yes. Because we were misremembering. We were thinking: was it the case
in the book that William of Ockham was supposed to be this fictional main character's mentor? I somehow had that in my mind as well. No, instead it was another medieval scholastic thinker: it was Roger Bacon. So yes, Roger Bacon was William of Baskerville's mentor, as opposed to William of Ockham, who, I do not believe, is actually mentioned in the novel.

So I was reading a little bit more about this. There was a 2018 article that came out in Philosophy Now, by Carol Nicholson, titled "Occam's Rose," and she pointed out that Eco had apparently explored the possibility of simply using Ockham as his main character in this novel, but he ultimately, quote, "did not find him a very attractive person." And that makes sense, right? You can either lean on a historical figure, or you can do something a little more fun and do a mash-up of Ockham and the Great Detective. And ultimately, that's one of the fun things about the novel: you have these elements where it's Sherlock Holmes going up against Borges, that sort of thing.

She writes (and this is interesting as well, just to draw the parallel between William of Baskerville and William of Ockham), quote: "In 1327, the year in which The Name of the Rose is set, Ockham faced fifty-six charges of heresy and was excommunicated after escaping to the protection of Emperor Louis of Bavaria. This put an end to his academic career, and he spent the rest of his life as a political activist, advocating freedom of speech, the separation of church and state, and arguing against the infallibility of the pope." She also points out that Ockham, like the fictional William of Baskerville, likely died of the plague.

All right, on that note, we're going to take a quick break, but when we come back we will continue our discussion of Occam's razor.

All right, we're back.
So we've been talking about this principle known as Occam's razor, which we've described already as the idea that simpler hypotheses are better than more complex hypotheses. There are a number of ways you can formulate it, but it's a principle that's been referred back to since probably before William of Ockham. It is, I think, a principle that somewhat predates him in intellectual history. Right, right. He did not create something that was not already utilized by other thinkers of the day and thinkers before him.

One great example of somebody, not before William of Ockham but later, articulating similar ideas is Isaac Newton, in his great work the Principia Mathematica. Newton writes, quote: "We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances." So, a similar idea: there's no need to add extra explanations when you already have an explanation that is, number one, true, and, number two, explains everything you see.

Right. So an example of this might be: why do the planets orbit the Sun? This would be something that Newton would be concerned with. Newton would say, okay, we know of two things that explain what we see: gravity and inertia. Inertia is the tendency of an object in motion to stay in motion. Gravity is the mutually attracting force between two objects with mass. So, because of inertia, the planets flying through space want to keep traveling in a straight line at a constant speed, and because of gravity, instead of traveling in a straight line, their path bends around toward the Sun as they travel. And those two things are both true, and they explain everything we observed. Well, actually not quite everything, but they were good enough for Newton's time. You might also say, though, that maybe in addition to gravity and inertia, there are angels that guide the planets in their orbits, because those elliptical pathways are pleasing to the Lord.
But if somebody proposes that, you're kind of stuck, because there's no way to prove the angel hypothesis wrong. You can't say there aren't invisible angels guiding the planets. But pretty much everybody today, I think, even people who believe in angels in some sense, would not see any reason to believe that there are angels doing that, because there are other explanations which do all the explaining that needs to be done. Right. Yeah, I mean, once you drag angels into it, it opens up the door for just a never-ending list of reasons why the angels can't be detected, or why the planet seems to be behaving in accordance with these known laws rather than the machinations of a divine being. Right. And you don't need to appeal in any way to the additional plausibility of angels or not. The reason I said that even people who otherwise believe in angels don't say that they're guiding the motions of the planets is that you don't need them to explain that. You've just got basic laws of physics that explain what the planets are doing. There's no reason to add an angels explanation; it doesn't do any more work. Yeah, it doesn't even help angels out. I mean, there's just no point in it.

Now, of course, sticking with the theory of the motions of the planets for a minute: we would later have to come up with a more refined theory of gravity for those rare cases where Newton's theory of gravity would fail, and we would get that with Einstein and general relativity, which recharacterized gravity as the curvature of spacetime caused by deformation due to mass, rather than as a mutually attractive force between objects. Though in most cases, if you think of it as a force in the Newtonian sense, your predictions work out just fine.
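To make the gravity-plus-inertia picture concrete, here is a minimal numerical sketch (ours, not from the episode; the starting position, velocity, and GM value are arbitrary illustration choices) showing that those two ingredients alone are enough to bend a straight-line path into a closed orbit:

```python
# Minimal sketch: inertia plus an inverse-square pull produce an orbit.
# All numbers are arbitrary illustration values, not real solar-system data.

GM = 1.0           # gravitational parameter of the "Sun" (assumed units)
dt = 0.001         # integration time step

x, y = 1.0, 0.0    # planet starts here...
vx, vy = 0.0, 0.8  # ...already moving sideways (the inertia part)

for step in range(20001):
    r = (x * x + y * y) ** 0.5   # distance to the Sun at the origin
    ax = -GM * x / r**3          # inverse-square acceleration,
    ay = -GM * y / r**3          # pointing toward the Sun

    vx += ax * dt                # gravity bends the velocity...
    vy += ay * dt
    x += vx * dt                 # ...and inertia carries the planet onward
    y += vy * dt

    if step % 4000 == 0:
        print(f"t={step * dt:6.2f}  x={x:+.3f}  y={y:+.3f}")
```

Run it and the printed positions trace an ellipse-like loop around the origin: nothing beyond the two stated ingredients is needed to reproduce an orbit, which is exactly the economy Newton's rule is praising.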
But from an article that I want to refer to later, by a philosopher named Elliott Sober: he writes, quote, "Albert Einstein spoke for many when he said 'it can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.'" Which is, in a way, again articulating something like Occam's razor. It's saying you want the simplest possible explanation that explains everything.

And if we're sticking with Einstein for a minute, to go beyond positing something like angels: if you want to go into real scientific hypotheses in history, there are all kinds of things that you might argue were sort of done away with by an Occam's-razor-ish kind of process, though I think there are some historians and philosophers of science that might disagree there. But one example that comes to my mind is the luminiferous ether. You know, it was once believed by many scientists that there had to be a medium in space through which light propagates, the same way that sound needs a medium to propagate. There's no sound in space, right? You've got to have sound traveling through a medium, like air, or like water, or like a steel wire. There must be matter to transmit that energy. And so the idea was that space was filled with this stuff, this ether, that light waves propagated through. And eventually, due to Einstein and to other thinkers and experiments, it started to become clear that the ether was superfluous. You didn't need it to explain any of the properties of light.

Now, there's another example from history that often comes up when people talk about Occam's razor. It's often brought up as a great example of Occam's razor being applied, but we're going to get to an article later on that I think presents a pretty devastating case against this being true.
But just to set it up here: it is the comparison of the Ptolemaic universe versus the Copernican universe, an argument that was, obviously, brought to a very dramatic end in the life of Galileo. Right. Galileo got into big trouble with the Inquisition for, among other things (there were also politics involved), advocating the Copernican model over the Ptolemaic model. For simplicity's sake: the Copernican model of the Solar System was of course the one we know to be more basically correct, not totally correct, but more correct, because it was heliocentric. It put the Sun at the center of the Solar System and argued that the planets, including the Earth, all revolved around the Sun. This of course was not the orthodox astronomy of the day. The more favored models were the traditional Ptolemaic model, which had the Earth at the center and the planets all going around the Earth in these strange spirograph patterns, with these things called epicycles, where a planet would sort of stop and then do a circle, and another circle, loops within its traveling; and then some compromise models, like the model of Tycho Brahe.

Now, the traditional argument in favor of saying that Copernicus and Galileo were on the side of Occam's razor goes something like: well, the Ptolemaic system and the Tycho Brahe models have all this extra stuff you need to assume, all these weird extra assumptions like epicycles, where the planets are going around in loops and it's not explained exactly why they're doing that; you just have to insert the loops in order to make the model match our observations (a quick numerical sketch of what an epicycle looks like follows below). And therefore the Ptolemaic model was more complex. We'll come back to that later on, because I think now it's going to be important to get into some criticisms of Occam's razor.
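As an aside, here is what that deferent-plus-epicycle construction looks like in numbers; a minimal sketch with made-up radii and angular speeds, not real Ptolemaic parameters:

```python
import math

# The planet rides a small circle (the epicycle) whose center itself
# rides a big circle (the deferent) around the Earth at the origin.
# All radii and angular speeds below are invented for illustration.
R, omega = 1.0, 1.0   # deferent: radius and angular speed
r, Omega = 0.25, 7.0  # epicycle: radius and (faster) angular speed

for i in range(9):
    t = i * 0.2
    x = R * math.cos(omega * t) + r * math.cos(Omega * t)
    y = R * math.sin(omega * t) + r * math.sin(Omega * t)
    print(f"t={t:.1f}  x={x:+.3f}  y={y:+.3f}")
```

Plot enough of those points and you get the looping, spirograph-like track described above, including stretches where the planet appears to slow and double back; and any mismatch with observation can be patched by tuning the radii and speeds or stacking on another epicycle, which is precisely the kind of added assumption the razor argument objects to.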
You know, if you go into a lot of, especially, skeptic communities on the Internet, you might sometimes see people treating Occam's razor as if it is some kind of law of nature; referring to Occam's razor in the same way you might refer to proven theories about reality, such as, you know, the equations describing the action of gravity or something. And so I think, while Occam's razor is an interesting and sometimes useful skeptical lens to apply, it is not in fact a law of nature. And there are a couple of major branches of criticisms of ye olde razor. I think the first would be accusations that it is often misunderstood or misused, and then second, there would be actual attacks on the usefulness of the razor even in its supposedly true form.

Now, the first thing is pretty simple, and it's just the idea that Occam's razor is misunderstood, misquoted, misconstrued, misused. Actually, I came across a funny blog post that, of all things, pointed to a quote from a mystery writer named Harlan Coben. Mystery writers! Yeah. I'm not familiar with this writer, but I thought this was interesting; it was just an example of somebody saying, no, you're not using Occam's razor right. This writer wrote, quote: "Most people oversimplify Occam's razor to mean the simplest answer is usually correct. But the real meaning, what the Franciscan friar William of Ockham really wanted to emphasize, is that you shouldn't complicate, that you shouldn't stack a theory if a simpler explanation was at the ready. Pare it down. Prune the excess." And so I think, looking at it this way, this fits more with the version that we were talking about with Isaac Newton. Right, it's not necessarily a statement about simplicity as a general principle, but saying that you shouldn't stack things that explain the same outcomes on top of each other, because you get no extra usefulness out of that.
Another example that I was just thinking of, one that's come up on the show before, is the idea of aquatic ape theory. Oh yes, this is the idea that, among other things, humans are hairless because for a while our ancestors lived at least partially in the water. Yeah. The idea is, you look at a lot of our body features: relatively smooth skin, bipedalism, layers of subcutaneous fat, the abilities of our vocal cords, all kinds of things like that. The proponents of aquatic ape theory say, hey, we've got all these strange anatomical, morphological features that are not the same as other great apes. Why do we have those qualities? I think you could explain them all if humans once needed to be in the water: so you have smooth skin in order to be a streamlined swimmer, and we became bipedal so that we could wade around in the water. And you come up with a list of explanations along these lines that, they would argue, all point to an aquatic ancestry.

But there's a wrinkle there, because, of course, if that's all true, the question is: why did we retain all those features after leaving the water? You know, humans are not an aquatic species now. I mean, we can go into the water, but water is not our primary environmental niche. So how can we still have all those features? And the aquatic ape theorists might say, oh, well, once you came onto the land, it actually was useful to be bipedal for these other reasons, and it was useful to be hairless for these other reasons. Which means you could cut out the entire step of having to be in the water and stick with "these are useful for living on the land." Exactly. You'd apply Occam here and say, if those features turn out to be useful on land, why wouldn't they just evolve on land in the first place?
Right. So you've ended up redirecting to a hypothesis that is one enormous step shorter. Yeah, and so aquatic ape theory, I think, is one of those things that would be hard to completely disprove. I think that there is no physical evidence pointing toward it, but it would be hard to say it was impossible to have happened. There's just no reason to assume it. It just adds in an extra step of explanations that don't explain anything any better than other explanations could.

Yeah. I mean, it's kind of like if I come home from work and I have, say, beer and bread. Maybe I stopped at two places, got the beer at one place and the bread at the other, but I also probably could have just stopped at one store to get both of them. Both are possible; one is a shorter trip. I feel like you would also have to add in something kind of extravagant, like: you stopped on the way home and entered a raffle contest in which you won beer and bread, and then you also may have stopped at the store, you know, to get something else. Or: I stole beer and bread. When the simple explanation is probably that I just bought beer and bread. Or: the beer and bread were placed in my car by a mysterious stranger. These are all things that are possible and could conceivably be the reason that I have beer and bread in the car, but Occam's razor slices away the unnecessary steps, the less likely steps, in favor of the shorter trip between point A and point B (a rough sketch of that fewer-steps arithmetic follows below). Right. And I think in cases like that, you could say that Occam's razor doesn't necessarily prove a theory wrong, but it is kind of a useful heuristic. It might help you use your intellectual time wisely. Right. But, and that gets us to the next step, which is the more comprehensive criticism: the idea that Occam's razor is maybe in fact wrong, or not useful.
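One way to make that beer-and-bread intuition concrete (our gloss, not something worked through in the episode) is to treat each extra event a hypothesis requires as one more thing that has to go right, and multiply; all the probabilities below are invented purely for illustration:

```python
# Illustrative sketch: a hypothesis as a chain of independent steps.
# Every added step multiplies in another factor <= 1, so longer chains
# start out less probable. All numbers here are made up.

def chain_probability(steps):
    """Probability that every step in the chain actually happened."""
    p = 1.0
    for step in steps:
        p *= step
    return p

hypotheses = {
    "bought both at one store": [0.5],        # one ordinary stop
    "two stops, one item each": [0.5, 0.5],   # two ordinary stops
    "won both in a raffle":     [0.05, 0.1],  # rare stop, then the right prizes
}

for name, steps in hypotheses.items():
    print(f"{name}: {chain_probability(steps):.4f}")
```

Nothing about this proves the raffle story wrong; it just shows why, all else being equal, an explanation that needs fewer independent things to go right starts out ahead, which is the razor working as a heuristic rather than a law.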
I think in some cases this criticism is true, so maybe we should get into it a bit. The first article I wanted to look at is called "The Tyranny of Simple Explanations," and it was published in The Atlantic. It was written by the science writer Philip Ball, one of my favorite current science writers, who wrote the book Beyond Weird, a really fantastic book about quantum physics that I recommended last summer. This was one of your summer reading picks, I think. Yeah, it's really good. It's one of those books where you may think you've already read a quantum physics book, you know the basics, you know what the interpretations are and all that; I feel like this is one you can still be newly amazed by and learn a lot more from. And, true to form as a great science writer, Ball, I think, makes a fantastic case in this article against Occam's razor, or against, you know, a liberal use of it.

So he starts by saying, quote, "Occam's razor is often stated as an injunction not to make more assumptions than you absolutely need." And phrased that way, it's almost a truism, right? I mean, who would say, well, yeah, I want to make more assumptions than I need? Yeah. I mean, you can come back to a forensic example, right? Detective work, which even Carl Sagan discusses a lot, comparing science to the work of a detective. Like, how many hypotheses do you need for a murder? Right, there are going to be the obvious ones that, especially under Occam's razor, are going to be the primary candidates: that it was someone the victim knew, like a spouse or a friend, et cetera, rather than inventing wild scenarios with no evidence to base them on. Right. And that's before certainly getting into possible scenarios like: maybe it was the random work of a serial murderer.
Serial murderers exist, 582 00:31:51,520 --> 00:31:54,320 Speaker 1: this does happen from time to time, but is it 583 00:31:54,400 --> 00:31:57,240 Speaker 1: the most likely scenario? And that's not even getting 584 00:31:57,240 --> 00:32:01,000 Speaker 1: into wilder possibilities, like, well, perhaps it was an assassin, 585 00:32:01,080 --> 00:32:03,880 Speaker 1: a spy who mistook them for another person. Well, 586 00:32:03,960 --> 00:32:06,800 Speaker 1: that's possible too, but again, far more steps than 587 00:32:06,840 --> 00:32:11,200 Speaker 1: are necessary; the shorter trip is the more likely. Right. 588 00:32:11,320 --> 00:32:14,400 Speaker 1: And in terms of not making more assumptions than you need, 589 00:32:14,520 --> 00:32:16,880 Speaker 1: Ball writes that this is of course good advice. If 590 00:32:16,880 --> 00:32:19,160 Speaker 1: you're trying to come up with a good explanation for something, 591 00:32:19,440 --> 00:32:22,120 Speaker 1: you add nothing by writing in a bunch of extra 592 00:32:22,200 --> 00:32:26,600 Speaker 1: complications that don't help the explanation explain anything more than 593 00:32:26,640 --> 00:32:29,480 Speaker 1: it did when it was simpler. Explanations should 594 00:32:29,480 --> 00:32:32,000 Speaker 1: be as simple as they can be without losing power 595 00:32:32,040 --> 00:32:36,160 Speaker 1: to explain and predict. Quote: That's why most scientific theories 596 00:32:36,200 --> 00:32:41,000 Speaker 1: are intentional simplifications. They ignore some effects, not because they 597 00:32:41,000 --> 00:32:44,240 Speaker 1: don't happen, but because they're thought to have a negligible 598 00:32:44,240 --> 00:32:48,000 Speaker 1: effect on the outcome. Applied this way, simplicity is a 599 00:32:48,040 --> 00:32:52,560 Speaker 1: practical virtue, allowing a clearer view of what's most important 600 00:32:52,600 --> 00:32:56,200 Speaker 1: in a phenomenon. So again, he's saying there that 601 00:32:56,360 --> 00:32:59,840 Speaker 1: it's not necessarily that Occam's razor tells you what's true, 602 00:33:00,360 --> 00:33:05,080 Speaker 1: but that Occam's razor makes theories useful. Because then he goes 603 00:33:05,120 --> 00:33:08,000 Speaker 1: on to argue that Occam's razor is, quote, fetishized and 604 00:33:08,120 --> 00:33:12,640 Speaker 1: misapplied as a guiding beacon for scientific inquiry. So he thinks, 605 00:33:12,840 --> 00:33:15,040 Speaker 1: you know, what we were just saying: simplicity is a virtue 606 00:33:15,080 --> 00:33:19,200 Speaker 1: of theories and explanations because it makes theories clearer, easier 607 00:33:19,240 --> 00:33:22,640 Speaker 1: to use, but it's dangerous to jump from that to 608 00:33:22,720 --> 00:33:26,280 Speaker 1: the assumption that simplicity is actually a measure of truth. 609 00:33:26,680 --> 00:33:30,200 Speaker 1: Quote: here the implication is the simplest theory isn't 610 00:33:30,200 --> 00:33:34,440 Speaker 1: just more convenient, but gets closer to how nature really works. 611 00:33:34,840 --> 00:33:38,400 Speaker 1: In other words, it's more probably the correct one. Ball 612 00:33:38,440 --> 00:33:41,640 Speaker 1: says this is wrong: simplicity does not actually tell 613 00:33:41,680 --> 00:33:44,840 Speaker 1: you anything about which theories are right and which ones 614 00:33:44,840 --> 00:33:48,320 Speaker 1: are wrong.
He argues there's really no reason to believe 615 00:33:48,400 --> 00:33:52,440 Speaker 1: that simpler theories better describe nature than complicated ones, and 616 00:33:52,480 --> 00:33:55,080 Speaker 1: he gives a few examples. He talks about Francis Crick 617 00:33:55,240 --> 00:33:58,800 Speaker 1: warning against trying to apply Occam's razor as a critical 618 00:33:58,840 --> 00:34:02,840 Speaker 1: tool for theories in biology, because biology gets really messy, 619 00:34:02,880 --> 00:34:05,520 Speaker 1: and he cites examples where it kind of led us astray. 620 00:34:05,600 --> 00:34:09,320 Speaker 1: Like, he cites Alfred Kempe's eighteen seventy nine proof 621 00:34:09,400 --> 00:34:12,600 Speaker 1: of the four color theorem in mathematics, which was kind 622 00:34:12,600 --> 00:34:15,080 Speaker 1: of favored for a while because the proof was considered 623 00:34:15,239 --> 00:34:18,040 Speaker 1: very simple and very elegant, but it turned out to 624 00:34:18,040 --> 00:34:21,400 Speaker 1: be wrong. You know, very roughly here, it makes me 625 00:34:21,480 --> 00:34:23,520 Speaker 1: think of something we talked about before on the show 626 00:34:23,560 --> 00:34:28,320 Speaker 1: about how evolution is often kind of a miser; 627 00:34:28,400 --> 00:34:32,480 Speaker 1: it's often cheap. Uh, and so, as part of that, you 628 00:34:32,480 --> 00:34:35,600 Speaker 1: could apply the simplicity model to it and say, okay, 629 00:34:35,760 --> 00:34:38,560 Speaker 1: that means it tends to take the shortest route, 630 00:34:38,600 --> 00:34:42,440 Speaker 1: it tends to perhaps engage in simplicity. But at 631 00:34:42,480 --> 00:34:46,080 Speaker 1: the same time, uh, it's kind of lazy, and lazy 632 00:34:46,120 --> 00:34:49,399 Speaker 1: can create these sorts of messes. Yeah, 633 00:34:49,400 --> 00:34:53,000 Speaker 1: we're saying, like, some biological structure has evolved, you know, 634 00:34:53,040 --> 00:34:55,920 Speaker 1: for one thing, but it ends up getting partially abandoned 635 00:34:55,920 --> 00:34:58,120 Speaker 1: and reused for something else. And it can get 636 00:34:58,160 --> 00:35:01,000 Speaker 1: messy, it can get complicated. A million years 637 00:35:01,000 --> 00:35:05,080 Speaker 1: of shortcuts can turn into a quite circuitous route. Yeah. 638 00:35:05,160 --> 00:35:08,160 Speaker 1: And so Ball writes that, in his view, he has 639 00:35:08,200 --> 00:35:11,360 Speaker 1: not found a single case in the history of science 640 00:35:11,360 --> 00:35:15,440 Speaker 1: where Occam's razor was actually used to settle a debate 641 00:35:15,560 --> 00:35:19,080 Speaker 1: between rival theories. So I just want to make sure 642 00:35:19,120 --> 00:35:21,520 Speaker 1: that his distinction is coming through. He is saying it's 643 00:35:21,680 --> 00:35:25,520 Speaker 1: useful for trying to make theories easier to talk about, 644 00:35:25,560 --> 00:35:29,080 Speaker 1: easier to understand, easier to apply. But when it comes 645 00:35:29,239 --> 00:35:32,359 Speaker 1: to competing theories, trying to say which one is more 646 00:35:32,400 --> 00:35:36,040 Speaker 1: true, which one makes better predictions, he has not found 647 00:35:36,080 --> 00:35:40,080 Speaker 1: a single case where Occam's razor was the decisive factor.
648 00:35:40,520 --> 00:35:42,080 Speaker 1: And what's worse, he says, a lot of people have 649 00:35:42,120 --> 00:35:46,440 Speaker 1: tried to retroactively apply Occam's razor to historical scientific debates 650 00:35:46,440 --> 00:35:50,160 Speaker 1: where it was not in fact decisive in reality. Uh, 651 00:35:50,200 --> 00:35:52,360 Speaker 1: and he cites as an example a debate we've already 652 00:35:52,360 --> 00:35:56,239 Speaker 1: discussed: the geocentric versus the heliocentric solar system. And I 653 00:35:56,239 --> 00:35:58,479 Speaker 1: thought his take on this was really interesting, because I 654 00:35:58,480 --> 00:36:01,719 Speaker 1: had been taken in. I think I had previously thought, well, 655 00:36:01,760 --> 00:36:05,960 Speaker 1: maybe a really good case of Occam's razor is heliocentrism 656 00:36:06,040 --> 00:36:10,000 Speaker 1: winning over geocentrism, because with geocentrism you just had to 657 00:36:10,000 --> 00:36:12,680 Speaker 1: make all these weird assumptions about the movements of planets. 658 00:36:12,680 --> 00:36:15,399 Speaker 1: You had to do extra work to make it fit, right? 659 00:36:15,680 --> 00:36:18,319 Speaker 1: That's what I thought. But he actually digs into the 660 00:36:18,360 --> 00:36:21,480 Speaker 1: debate of the time. So, you know, we talked about one of the big 661 00:36:21,520 --> 00:36:22,919 Speaker 1: things being all these epicycles: in the Ptolemaic model, 662 00:36:22,960 --> 00:36:26,240 Speaker 1: the geocentric view, the planets go around the Earth, 663 00:36:26,400 --> 00:36:29,640 Speaker 1: but they don't just go around. They make all these 664 00:36:29,680 --> 00:36:31,279 Speaker 1: weird loops and stuff called epicycles. You had to build 665 00:36:31,280 --> 00:36:34,360 Speaker 1: that in in order to explain what astronomers saw in 666 00:36:34,400 --> 00:36:37,239 Speaker 1: the night sky, the planets appearing to regress; they'd go 667 00:36:37,280 --> 00:36:39,960 Speaker 1: back and forth and stuff. Um, so he says, 668 00:36:40,200 --> 00:36:43,560 Speaker 1: we've got all these epicycles. But Ball points out that 669 00:36:43,600 --> 00:36:46,480 Speaker 1: in reality, the Copernican model that was being argued about 670 00:36:46,520 --> 00:36:49,560 Speaker 1: in Galileo's day, that heliocentric model, was also full of epicycles. 671 00:36:49,560 --> 00:36:54,839 Speaker 1: And this was because Copernicus was not aware of what 672 00:36:54,920 --> 00:36:58,080 Speaker 1: Johannes Kepler would later discover about the orbits of planetary 673 00:36:58,160 --> 00:37:02,040 Speaker 1: bodies being elliptical rather than circular. So, because he lacked 674 00:37:02,080 --> 00:37:06,520 Speaker 1: that crucial assumption, that important part of the theory, 675 00:37:06,560 --> 00:37:09,600 Speaker 1: Copernicus also had to build weird little loops into his 676 00:37:09,840 --> 00:37:13,200 Speaker 1: heliocentric model of the Solar System. He got the heliocentrism right, 677 00:37:13,280 --> 00:37:17,200 Speaker 1: but he thought the planets were moving in perfect circles, 678 00:37:17,480 --> 00:37:20,239 Speaker 1: and that didn't match observations either. So, 679 00:37:20,360 --> 00:37:24,520 Speaker 1: like Ptolemy, he cheated.
680 00:37:24,560 --> 00:37:26,680 Speaker 1: He put all these loops in there to make the 681 00:37:26,719 --> 00:37:31,080 Speaker 1: model work out right, and it wasn't until heliocentrism was 682 00:37:31,160 --> 00:37:34,440 Speaker 1: combined with Kepler and elliptical orbits that the epicycles were 683 00:37:34,480 --> 00:37:37,799 Speaker 1: finally banished. And based on this, Ball argues that there 684 00:37:37,880 --> 00:37:39,920 Speaker 1: was really no way at the time to suggest that 685 00:37:39,960 --> 00:37:43,640 Speaker 1: the Copernican system was simpler. In fact, he points out 686 00:37:43,640 --> 00:37:48,120 Speaker 1: that Copernicus invokes a number of weird, non scientific assumptions 687 00:37:48,120 --> 00:37:51,480 Speaker 1: in support of his model. For example, quote, uh, in 688 00:37:51,560 --> 00:37:56,120 Speaker 1: his main work on the heliocentric theory, De Revolutionibus, I'm 689 00:37:56,120 --> 00:38:01,680 Speaker 1: gonna have trouble with this one, De Revolutionibus Orbium Coelestium, uh, 690 00:38:01,719 --> 00:38:03,880 Speaker 1: he argued that it was proper for the sun to 691 00:38:04,000 --> 00:38:06,759 Speaker 1: sit at the center, quote, as if resting on a 692 00:38:06,880 --> 00:38:11,560 Speaker 1: kingly throne, governing the stars like a wise ruler. That 693 00:38:11,600 --> 00:38:14,680 Speaker 1: doesn't sound like a very scientific criterion. No, I mean, 694 00:38:14,680 --> 00:38:17,080 Speaker 1: maybe he's kind of breaking it down for people, you know. 695 00:38:17,800 --> 00:38:19,560 Speaker 1: I mean, of course he did turn out to be right. 696 00:38:19,840 --> 00:38:24,440 Speaker 1: But that seems like an unjustified assumption based 697 00:38:24,480 --> 00:38:27,680 Speaker 1: on what he knew at the time. Uh, Ball also 698 00:38:27,760 --> 00:38:30,439 Speaker 1: points out that by the time Kepler comes around, we're 699 00:38:30,480 --> 00:38:33,480 Speaker 1: no longer in a situation of competing theories trying to 700 00:38:33,560 --> 00:38:40,040 Speaker 1: explain the same observations, because Kepler had access to better observations. Quote: 701 00:38:40,280 --> 00:38:42,560 Speaker 1: The point here is that as a tool for distinguishing 702 00:38:42,560 --> 00:38:46,480 Speaker 1: between rival theories, Occam's razor is only relevant if the 703 00:38:46,480 --> 00:38:50,719 Speaker 1: two theories predict identical results, but one is simpler than 704 00:38:50,719 --> 00:38:53,759 Speaker 1: the other, which is to say, it makes fewer assumptions. 705 00:38:54,200 --> 00:38:57,960 Speaker 1: This is a situation rarely, if ever, encountered in science. 706 00:38:58,400 --> 00:39:02,360 Speaker 1: Much more often, theories are distinguished not by making fewer assumptions, 707 00:39:02,400 --> 00:39:06,640 Speaker 1: but different ones. It's then not obvious how to weigh 708 00:39:06,640 --> 00:39:09,799 Speaker 1: them up. I think this is a fantastic point. Right, 709 00:39:09,840 --> 00:39:12,040 Speaker 1: I think, to come back to the aquatic ape theory, 710 00:39:12,040 --> 00:39:14,320 Speaker 1: that is one of these rare situations, I 711 00:39:14,360 --> 00:39:16,399 Speaker 1: think, where it seems to match up, right? It's making 712 00:39:16,440 --> 00:39:19,160 Speaker 1: additional assumptions, and it's like, oh yeah, we would have 713 00:39:19,200 --> 00:39:22,600 Speaker 1: to explain keeping those traits later anyway; we need explanations for that.
714 00:39:23,120 --> 00:39:26,120 Speaker 1: It just seems like it's making more assumptions. But that's 715 00:39:26,160 --> 00:39:28,880 Speaker 1: almost never how it goes. Usually it's just 716 00:39:29,000 --> 00:39:31,760 Speaker 1: different assumptions, and then how do you know which assumption 717 00:39:31,920 --> 00:39:34,879 Speaker 1: is simpler than the other one? Right, the whole 718 00:39:34,880 --> 00:39:40,840 Speaker 1: aquatic ape section of the presumed evolutionary path is 719 00:39:40,880 --> 00:39:44,000 Speaker 1: kind of its own epicycle. Yeah, exactly. Removed because there's 720 00:39:44,000 --> 00:39:47,200 Speaker 1: an epicycle in this theory but not in this one. Exactly. Yes, 721 00:39:47,440 --> 00:39:50,000 Speaker 1: I mean, if you're trying to look at, like, not 722 00:39:50,400 --> 00:39:54,640 Speaker 1: additional assumptions in the theory, but just different assumptions in 723 00:39:54,680 --> 00:39:57,880 Speaker 1: the theory, even in cases where to us it might seem 724 00:39:57,880 --> 00:40:00,239 Speaker 1: obvious one way or another which one seems simpler, 725 00:40:00,239 --> 00:40:02,879 Speaker 1: it's not always obvious to people at the time. Uh, 726 00:40:02,920 --> 00:40:06,840 Speaker 1: he brings up the question of Darwinian evolution: is 727 00:40:06,920 --> 00:40:11,719 Speaker 1: descent from a common ancestor more or less complicated than 728 00:40:11,760 --> 00:40:15,080 Speaker 1: the idea of a divinely created order? Common descent, I 729 00:40:15,360 --> 00:40:18,000 Speaker 1: think, would seem like a less complicated theory to 730 00:40:18,080 --> 00:40:21,160 Speaker 1: many of us today. But would it have seemed simpler 731 00:40:21,400 --> 00:40:23,680 Speaker 1: to the worldview of people who were debating common 732 00:40:23,719 --> 00:40:26,560 Speaker 1: descent in, like, the mid to late nineteenth century, when, you know, 733 00:40:26,600 --> 00:40:29,280 Speaker 1: you've already got a theistic worldview that's basically a built 734 00:40:29,280 --> 00:40:32,200 Speaker 1: in assumption? Right, right. Yeah. A lot of this 735 00:40:32,280 --> 00:40:34,919 Speaker 1: does come down, again, to what we spoke about 736 00:40:34,960 --> 00:40:37,959 Speaker 1: earlier regarding the basic religious argument. Like, if you're coming 737 00:40:38,000 --> 00:40:42,359 Speaker 1: from a really religious background where you've had, um, 738 00:40:42,920 --> 00:40:45,120 Speaker 1: you know, the idea of the reality of a God 739 00:40:45,200 --> 00:40:48,680 Speaker 1: hammered into you, and then you're presented with 740 00:40:48,719 --> 00:40:51,120 Speaker 1: the atheist argument, you know, you may say, well, no, 741 00:40:51,280 --> 00:40:54,839 Speaker 1: that requires far more assumptions. There are so 742 00:40:54,840 --> 00:40:59,319 Speaker 1: many epicycles in your atheism, where my 743 00:40:59,320 --> 00:41:02,480 Speaker 1: faith is just as clear and straightforward as a whistle. 744 00:41:02,520 --> 00:41:04,960 Speaker 1: I mean, people did actually argue that way. They'd say, 745 00:41:04,960 --> 00:41:07,160 Speaker 1: look at all this weird stuff you have to assume 746 00:41:07,200 --> 00:41:09,319 Speaker 1: about the history of life, and all I believe is 747 00:41:09,360 --> 00:41:12,239 Speaker 1: there's a divinely created order.
I mean, it's like 748 00:41:12,280 --> 00:41:15,960 Speaker 1: a bumper sticker thing, like, uh, God wrote it, 749 00:41:16,040 --> 00:41:20,040 Speaker 1: I believe it, end of story. Three steps, that theory. Yeah, 750 00:41:20,440 --> 00:41:22,799 Speaker 1: simplicity is often in the eye of 751 00:41:22,800 --> 00:41:25,680 Speaker 1: the beholder. I mean, there are 752 00:41:25,719 --> 00:41:27,920 Speaker 1: some people who would argue there are cases where you 753 00:41:27,920 --> 00:41:33,040 Speaker 1: can try to mathematically quantify, uh, complications or assumptions or simplicity, 754 00:41:33,160 --> 00:41:35,279 Speaker 1: but in general that's really hard to do. You don't 755 00:41:35,320 --> 00:41:38,680 Speaker 1: have an objective measure that you can apply from the outside. 756 00:41:38,960 --> 00:41:40,879 Speaker 1: A lot of times it's just going to be kind 757 00:41:40,880 --> 00:41:44,960 Speaker 1: of fuzzy qualitative judgments: what seems like less of 758 00:41:45,000 --> 00:41:48,120 Speaker 1: an assumption to you. Lacking an objective measure, people 759 00:41:48,160 --> 00:41:51,240 Speaker 1: go with their intuitions. Uh, and this does not seem 760 00:41:51,280 --> 00:41:54,840 Speaker 1: like a good recipe for sorting between theories. So, coming 761 00:41:54,840 --> 00:41:58,600 Speaker 1: back again to Ball's formulation of Occam's razor: 762 00:41:58,640 --> 00:42:01,319 Speaker 1: it's basically, like, if you have two theories that are 763 00:42:01,360 --> 00:42:04,920 Speaker 1: competing to explain the same things, they make all the 764 00:42:05,000 --> 00:42:08,520 Speaker 1: same predictions and explain it equally well. Yeah, they 765 00:42:08,600 --> 00:42:11,239 Speaker 1: make the same predictions and explain things equally well. 766 00:42:12,120 --> 00:42:14,759 Speaker 1: But one of them has more assumptions; you go with 767 00:42:14,800 --> 00:42:17,560 Speaker 1: the one with fewer assumptions. But Ball argues that you 768 00:42:17,600 --> 00:42:21,200 Speaker 1: almost never, in reality, get cases where the predictions of 769 00:42:21,239 --> 00:42:26,040 Speaker 1: two theories are exactly the same. Instead, quote, scientific models 770 00:42:26,080 --> 00:42:30,480 Speaker 1: that differ in their assumptions typically make slightly different predictions too. 771 00:42:30,840 --> 00:42:34,880 Speaker 1: It is these predictions, not the criteria of simplicity, that 772 00:42:34,960 --> 00:42:38,920 Speaker 1: are of the greatest use for evaluating rival theories. Again, 773 00:42:38,960 --> 00:42:41,600 Speaker 1: I think this is a good point. I mean, theories 774 00:42:41,640 --> 00:42:44,560 Speaker 1: almost never predict the exact same thing, so why not 775 00:42:44,640 --> 00:42:48,480 Speaker 1: just judge them on how good their predictions are? Uh. Finally, 776 00:42:48,480 --> 00:42:50,960 Speaker 1: he writes that he can only think of one real 777 00:42:51,040 --> 00:42:54,560 Speaker 1: instance in, uh, in science where there are rival theories 778 00:42:55,040 --> 00:42:58,799 Speaker 1: that make exactly the same predictions on the basis of, 779 00:42:58,920 --> 00:43:02,680 Speaker 1: quote, easily enumerable and comparable assumptions.
And this 780 00:43:02,960 --> 00:43:06,279 Speaker 1: one example he can think of is the different interpretations 781 00:43:06,320 --> 00:43:09,680 Speaker 1: of quantum mechanics, which I think is a fantastic example, 782 00:43:09,680 --> 00:43:11,239 Speaker 1: and it did not come to my mind, but I 783 00:43:11,280 --> 00:43:15,000 Speaker 1: think he's exactly right about this. So we've discussed interpretations 784 00:43:15,000 --> 00:43:17,239 Speaker 1: of quantum mechanics on the show before. We're not going 785 00:43:17,280 --> 00:43:19,640 Speaker 1: to go deep on that, but just for a very 786 00:43:19,640 --> 00:43:24,320 Speaker 1: short refresher: basically, we know that the mathematical fundamentals of 787 00:43:24,400 --> 00:43:28,120 Speaker 1: quantum theory are correct. They make extremely good predictions, like, 788 00:43:28,160 --> 00:43:31,799 Speaker 1: we know the theory is right. But there's a problem: they 789 00:43:31,840 --> 00:43:36,200 Speaker 1: predict a world of probabilities, not of certainties. So if 790 00:43:36,200 --> 00:43:38,719 Speaker 1: you have a theory that predicts an electron will be 791 00:43:38,920 --> 00:43:41,680 Speaker 1: fifty percent in one state and fifty percent in an 792 00:43:41,719 --> 00:43:46,200 Speaker 1: opposite state, but we only ever observe physical reality embodying 793 00:43:46,320 --> 00:43:48,839 Speaker 1: one state at a time, how do you resolve that? 794 00:43:48,960 --> 00:43:52,400 Speaker 1: It just does not match our experience of reality. So 795 00:43:52,440 --> 00:43:55,120 Speaker 1: that's where the interpretations of quantum mechanics come in. 796 00:43:55,160 --> 00:43:59,800 Speaker 1: They're trying to reconcile this difference, explaining why the indeterministic, 797 00:44:00,000 --> 00:44:05,240 Speaker 1: probabilistic quantum world somehow resolves into the solid, deterministic world 798 00:44:05,280 --> 00:44:08,880 Speaker 1: that we experience every day. And there are tons of interpretations. 799 00:44:08,920 --> 00:44:11,960 Speaker 1: You've got, like, the classic Copenhagen interpretation, which predicts that 800 00:44:12,040 --> 00:44:14,920 Speaker 1: objects exist in a kind of state of 801 00:44:14,920 --> 00:44:18,719 Speaker 1: superposition until something interacts with them and collapses the wave 802 00:44:18,719 --> 00:44:21,600 Speaker 1: function, makes them assume one state or the other. 803 00:44:22,040 --> 00:44:26,279 Speaker 1: You've got the now popular many worlds interpretation, originating with 804 00:44:26,320 --> 00:44:28,239 Speaker 1: the physicist Hugh Everett the Third in the late 805 00:44:28,280 --> 00:44:32,400 Speaker 1: nineteen fifties. This suggests that reality is constantly splitting into 806 00:44:32,480 --> 00:44:37,160 Speaker 1: infinite alternate timelines based on the different possible outcomes of 807 00:44:37,280 --> 00:44:41,040 Speaker 1: unresolved quantum states. And we only observe one outcome 808 00:44:41,120 --> 00:44:43,759 Speaker 1: because we are also splitting, and the current version of 809 00:44:43,880 --> 00:44:47,239 Speaker 1: us is only one of many, each of which experiences one 810 00:44:47,280 --> 00:44:49,239 Speaker 1: world at a time. And then you've got a bunch 811 00:44:49,280 --> 00:44:53,200 Speaker 1: of other theories too. Basically, these interpretations make exactly the 812 00:44:53,280 --> 00:44:57,040 Speaker 1: same physical predictions.
No matter which one of them is correct, 813 00:44:57,120 --> 00:45:00,800 Speaker 1: the outcomes of our experiments will be exactly the same, 814 00:45:00,800 --> 00:45:03,640 Speaker 1: so there's no way to test which one is right, though. 815 00:45:03,640 --> 00:45:06,080 Speaker 1: And in a funny turn, Ball points out that Occam's 816 00:45:06,160 --> 00:45:09,719 Speaker 1: razor has been invoked both for and against the many 817 00:45:09,760 --> 00:45:12,520 Speaker 1: worlds interpretation, again coming back to the fact that a 818 00:45:12,520 --> 00:45:16,160 Speaker 1: lot of times this just comes down to people's intuitive judgments. 819 00:45:16,160 --> 00:45:18,880 Speaker 1: Like, he quotes the quantum theorist Roland Omnès, quote: 820 00:45:19,080 --> 00:45:22,080 Speaker 1: as far as economy of thought is concerned, there never 821 00:45:22,280 --> 00:45:26,200 Speaker 1: was anything in the history of thought so bluntly contrary 822 00:45:26,239 --> 00:45:30,080 Speaker 1: to Ockham's rule than Everett's many worlds. On the 823 00:45:30,120 --> 00:45:33,120 Speaker 1: other hand, you've got a modern physicist like Sean Carroll 824 00:45:33,200 --> 00:45:37,320 Speaker 1: of Caltech, who advocates the many worlds interpretation specifically 825 00:45:37,360 --> 00:45:41,120 Speaker 1: because he argues it's the simplest interpretation of quantum theory. 826 00:45:41,520 --> 00:45:44,239 Speaker 1: He says it doesn't make any additional assumptions. It's the 827 00:45:44,280 --> 00:45:47,719 Speaker 1: simplest way you can map the theory onto reality. The 828 00:45:47,760 --> 00:45:50,480 Speaker 1: weird thing about this, too, is that I feel like, 829 00:45:50,560 --> 00:45:53,919 Speaker 1: at this point, if you consume enough science fiction, and 830 00:45:54,080 --> 00:45:56,760 Speaker 1: not even just science fiction but just general popular culture, 831 00:45:57,040 --> 00:46:00,239 Speaker 1: the many worlds interpretation has been invoked, at 832 00:46:00,320 --> 00:46:04,600 Speaker 1: least casually, so often that in a way it feels 833 00:46:04,600 --> 00:46:08,279 Speaker 1: slightly more plausible, just due to familiarity, which 834 00:46:08,320 --> 00:46:10,759 Speaker 1: I realize is not a scientific argument. Like, you could 835 00:46:10,760 --> 00:46:14,000 Speaker 1: not reasonably say, well, I lean towards the 836 00:46:14,000 --> 00:46:17,040 Speaker 1: many worlds interpretation because that's how The X-Men works, 837 00:46:17,160 --> 00:46:19,600 Speaker 1: my favorite TV show uses it, it's got to be real. 838 00:46:20,280 --> 00:46:22,719 Speaker 1: But on some level it still kind of 839 00:46:22,760 --> 00:46:25,040 Speaker 1: gets into you, it still affects you. I agree. I 840 00:46:25,040 --> 00:46:27,000 Speaker 1: mean, again, I think this is pointing out 841 00:46:27,000 --> 00:46:29,880 Speaker 1: some of the weaknesses in how Occam's razor is often applied. 842 00:46:29,960 --> 00:46:33,279 Speaker 1: It's like people think they're applying some kind of objective 843 00:46:33,320 --> 00:46:35,719 Speaker 1: criterion when really they're just kind of going with their 844 00:46:35,760 --> 00:46:40,040 Speaker 1: gut about, like, what feels more plausible. Uh.
And 845 00:46:40,040 --> 00:46:42,520 Speaker 1: that's something Ball kind of hammers home at the 846 00:46:42,600 --> 00:46:44,799 Speaker 1: end when he writes, quote: but this is all just 847 00:46:44,920 --> 00:46:48,840 Speaker 1: special pleading. Occam's razor was never meant for paring nature 848 00:46:48,880 --> 00:46:53,360 Speaker 1: down to some beautiful, parsimonious core of truth. Because science 849 00:46:53,440 --> 00:46:56,920 Speaker 1: is so difficult and messy, the allure of a philosophical 850 00:46:56,960 --> 00:47:00,239 Speaker 1: tool for clearing a path or pruning the thickets is obvious. Yes, 851 00:47:00,800 --> 00:47:04,080 Speaker 1: in the readiness to find spurious applications of Occam's razor 852 00:47:04,200 --> 00:47:06,920 Speaker 1: in the history of science, or to enlist, dismiss, or 853 00:47:06,960 --> 00:47:10,200 Speaker 1: reshape the razor at will to shore up their preferences, 854 00:47:10,640 --> 00:47:14,400 Speaker 1: scientists reveal their seduction by this vision, but they should 855 00:47:14,400 --> 00:47:17,800 Speaker 1: resist it. The value of keeping assumptions to a minimum 856 00:47:17,880 --> 00:47:23,080 Speaker 1: is cognitive, not ontological. It helps you think. A theory 857 00:47:23,200 --> 00:47:25,839 Speaker 1: is not better if it is simpler, but it might 858 00:47:25,920 --> 00:47:29,879 Speaker 1: well be more useful, and that counts for much more. Yeah, 859 00:47:29,960 --> 00:47:32,759 Speaker 1: that's well put. It helps us think, right, and 860 00:47:32,880 --> 00:47:35,600 Speaker 1: helps us explain the world. Right, there's no way to 861 00:47:35,760 --> 00:47:38,719 Speaker 1: show that... well, actually, we're about to get into 862 00:47:38,880 --> 00:47:41,040 Speaker 1: somebody who says that there may be cases where you 863 00:47:41,040 --> 00:47:45,000 Speaker 1: can show simpler theories are objectively more true. But 864 00:47:45,160 --> 00:47:47,120 Speaker 1: Ball argues that, at least most of the time in 865 00:47:47,239 --> 00:47:50,320 Speaker 1: science, in real competing theories in the history of science, 866 00:47:50,680 --> 00:47:54,160 Speaker 1: it's not that simpler theories are more true or explain 867 00:47:54,280 --> 00:47:57,680 Speaker 1: reality better. They're just easier to get your head around 868 00:47:57,760 --> 00:48:00,200 Speaker 1: and test. All right, on that note, we're gonna take 869 00:48:00,200 --> 00:48:02,400 Speaker 1: one more break, but we will be right back with 870 00:48:02,480 --> 00:48:09,440 Speaker 1: further discussion of the razor. All right, we're back. All right, 871 00:48:09,440 --> 00:48:11,840 Speaker 1: there's one more article about Occam's razor that I found 872 00:48:11,880 --> 00:48:15,520 Speaker 1: really interesting, very useful, and it is called Why Is 873 00:48:15,600 --> 00:48:19,440 Speaker 1: Simpler Better? This was published in Aeon by Elliott Sober, 874 00:48:19,480 --> 00:48:23,120 Speaker 1: who is a professor of philosophy at the University of Wisconsin, Madison, 875 00:48:23,560 --> 00:48:26,040 Speaker 1: and he's published a lot on the philosophy of science, 876 00:48:26,040 --> 00:48:29,719 Speaker 1: specifically as it applies to biology and natural selection, and 877 00:48:29,800 --> 00:48:32,920 Speaker 1: he wrote a book on the subject of Occam's razor. Uh.
878 00:48:33,000 --> 00:48:35,360 Speaker 1: So he starts off, I think this is kind of 879 00:48:35,400 --> 00:48:39,400 Speaker 1: interesting, talking about simplicity and complexity in art. Could you 880 00:48:39,400 --> 00:48:43,120 Speaker 1: possibly have a norm that one is always better than 881 00:48:43,160 --> 00:48:45,839 Speaker 1: the other? I mean, that seems kind of strange, right? 882 00:48:45,880 --> 00:48:48,480 Speaker 1: Like, we love simple art and we love complex art, 883 00:48:48,719 --> 00:48:50,879 Speaker 1: and it would be strange to find a person who 884 00:48:51,000 --> 00:48:54,120 Speaker 1: just wants one or the other. Yeah, I mean, this 885 00:48:54,200 --> 00:48:58,040 Speaker 1: makes me think of movie posters. I don't know, 886 00:48:58,200 --> 00:48:59,759 Speaker 1: you probably remember, it seems like it was a few 887 00:48:59,840 --> 00:49:04,160 Speaker 1: years back, the big craze for a while was 888 00:49:04,200 --> 00:49:07,400 Speaker 1: that designers would come up with a super simplistic 889 00:49:07,840 --> 00:49:11,120 Speaker 1: movie poster for a classic film or, you know, a 890 00:49:11,160 --> 00:49:14,480 Speaker 1: fan favorite film. And it was really fun for a while. 891 00:49:15,080 --> 00:49:17,560 Speaker 1: But then it kind of overstayed its welcome, 892 00:49:17,680 --> 00:49:19,960 Speaker 1: you know, and it just became, kind of, 893 00:49:20,040 --> 00:49:22,239 Speaker 1: at least to me anyway, kind of irritating 894 00:49:22,239 --> 00:49:24,200 Speaker 1: to even look at. You're like, no, I don't 895 00:49:24,200 --> 00:49:26,720 Speaker 1: want to see, like, this film reduced to this ultra 896 00:49:26,920 --> 00:49:30,000 Speaker 1: simplistic symbol. I know exactly what you're talking about. And 897 00:49:30,040 --> 00:49:32,080 Speaker 1: I think there was a counter reaction. Yeah, because then 898 00:49:32,120 --> 00:49:34,279 Speaker 1: you started to see a lot of graphic design for 899 00:49:34,760 --> 00:49:37,600 Speaker 1: redoing old movies with new posters in the kind of 900 00:49:37,640 --> 00:49:40,760 Speaker 1: Return of the Jedi style where there's a bunch of stuff, 901 00:49:40,760 --> 00:49:42,799 Speaker 1: there's, like, a bunch of people on the poster and 902 00:49:42,920 --> 00:49:45,440 Speaker 1: things happening. Yeah, or it's just kind of like 903 00:49:45,480 --> 00:49:49,319 Speaker 1: a geometric explosion of things, you know. Uh, so, yeah, 904 00:49:49,360 --> 00:49:52,759 Speaker 1: you saw the pendulum swing both ways. But in general, yeah, 905 00:49:52,880 --> 00:49:54,759 Speaker 1: I feel like it's that way in art. I mean, 906 00:49:54,840 --> 00:49:57,400 Speaker 1: I think we can all point to specific examples in 907 00:49:57,400 --> 00:49:59,399 Speaker 1: our own life where here's something we like that is 908 00:49:59,520 --> 00:50:02,759 Speaker 1: very tight and neat and minimalist. Maybe it's even 909 00:50:02,800 --> 00:50:08,040 Speaker 1: like a musical arrangement. Yeah, I love, like, minimalist ambient recordings, 910 00:50:08,080 --> 00:50:10,759 Speaker 1: but I'm also the type of person who enjoys, uh, 911 00:50:10,920 --> 00:50:16,080 Speaker 1: cacophonous recordings and complex recordings, and likewise with visual arts, 912 00:50:16,120 --> 00:50:19,920 Speaker 1: likewise with, you know, film, TV and other mediums. You 913 00:50:20,160 --> 00:50:23,799 Speaker 1: like hugely layered, like, mixed tracks and stuff. Yeah.
Yeah, 914 00:50:24,120 --> 00:50:27,520 Speaker 1: but then I also like, uh, you know... I don't 915 00:50:27,760 --> 00:50:29,279 Speaker 1: know, I don't know, it 916 00:50:29,280 --> 00:50:31,640 Speaker 1: gets kind of complicated, right? Because even something that is 917 00:50:31,719 --> 00:50:35,640 Speaker 1: very minimalist can be of course very complicated and layered. Uh. 918 00:50:36,120 --> 00:50:39,800 Speaker 1: But yeah, I think everybody's taste 919 00:50:39,800 --> 00:50:42,040 Speaker 1: pendulum is going to swing both ways there. But that's 920 00:50:42,040 --> 00:50:44,239 Speaker 1: the world of art, though, right? I mean, so that's 921 00:50:44,280 --> 00:50:47,640 Speaker 1: one thing. That's the world of human creation. Um. And 922 00:50:47,760 --> 00:50:52,160 Speaker 1: sometimes those creations are made, uh, to mimic nature, 923 00:50:52,160 --> 00:50:54,960 Speaker 1: but they are not necessarily nature itself. Right. Yes. And 924 00:50:55,120 --> 00:50:57,480 Speaker 1: I think you can apply something similar to science. So 925 00:50:57,840 --> 00:50:59,600 Speaker 1: some of what Sober is going to write in this 926 00:50:59,680 --> 00:51:02,239 Speaker 1: article mirrors what we were just talking about with Ball. 927 00:51:02,320 --> 00:51:05,000 Speaker 1: Like, he starts off by saying, okay, it's clear 928 00:51:05,120 --> 00:51:09,120 Speaker 1: that simpler theories have some qualities that are good. They're 929 00:51:09,160 --> 00:51:13,719 Speaker 1: easier to understand, they're easier to remember, they're easier to test. Uh, 930 00:51:13,760 --> 00:51:16,799 Speaker 1: and of course, in just an aesthetic sense, they can 931 00:51:16,840 --> 00:51:19,680 Speaker 1: be more beautiful. But he says that the real problem 932 00:51:19,719 --> 00:51:21,640 Speaker 1: comes in when you're trying to figure out how good 933 00:51:21,760 --> 00:51:24,799 Speaker 1: is a theory for telling you what's true, you know, 934 00:51:24,880 --> 00:51:27,800 Speaker 1: how well does it predict things that you will encounter 935 00:51:27,840 --> 00:51:31,200 Speaker 1: in the world. Some past scientific thinkers have tried to 936 00:51:31,200 --> 00:51:35,160 Speaker 1: come up with reasons why, yeah, simplicity is 937 00:51:35,200 --> 00:51:38,080 Speaker 1: actually better, it actually predicts the world better. And 938 00:51:38,120 --> 00:51:41,719 Speaker 1: a lot of these justifications were theological in nature. Uh, 939 00:51:41,920 --> 00:51:44,920 Speaker 1: like, for example, Newton, in talking about why he 940 00:51:44,960 --> 00:51:49,279 Speaker 1: prefers simpler theories, wrote, quote: to choose those constructions which, 941 00:51:49,320 --> 00:51:52,960 Speaker 1: without straining, reduce things to the greatest simplicity. Uh, the 942 00:51:53,000 --> 00:51:55,799 Speaker 1: reason of this is that truth is ever to be 943 00:51:55,840 --> 00:51:59,360 Speaker 1: found in simplicity, and not in the multiplicity and confusion 944 00:51:59,400 --> 00:52:02,080 Speaker 1: of things. It is the perfection of God's works that 945 00:52:02,160 --> 00:52:05,040 Speaker 1: they are all done with the greatest simplicity. He is 946 00:52:05,080 --> 00:52:08,120 Speaker 1: the God of order and not of confusion.
And therefore, 947 00:52:08,400 --> 00:52:10,920 Speaker 1: as they that would understand the frame of the world 948 00:52:11,040 --> 00:52:14,560 Speaker 1: must endeavor to reduce their knowledge to all possible simplicity, 949 00:52:14,920 --> 00:52:17,880 Speaker 1: so it must be in seeking to understand these visions. 950 00:52:18,239 --> 00:52:20,040 Speaker 1: So again, I mean, I would say that's fine to 951 00:52:20,080 --> 00:52:23,120 Speaker 1: believe, but that's not a scientific reason for believing 952 00:52:23,160 --> 00:52:25,680 Speaker 1: that simpler things are more likely to be true. Right, 953 00:52:25,800 --> 00:52:27,680 Speaker 1: he had to fall back on the idea that we have 954 00:52:27,800 --> 00:52:32,120 Speaker 1: a lawful good God as opposed to a chaotic good God. Right, 955 00:52:32,280 --> 00:52:34,200 Speaker 1: I mean, it would only be a bad God that 956 00:52:34,239 --> 00:52:37,560 Speaker 1: would allow more complex explanations to be correct. And 957 00:52:37,719 --> 00:52:41,360 Speaker 1: Sober actually says there are some cases today, uh, that 958 00:52:41,520 --> 00:52:45,320 Speaker 1: can help us know when a model is objectively more accurate. 959 00:52:45,400 --> 00:52:48,200 Speaker 1: Like, with modern statistical methods, there are some ways that you 960 00:52:48,239 --> 00:52:53,200 Speaker 1: can reduce theories to mathematical analysis, at least roughly, and 961 00:52:53,320 --> 00:52:56,080 Speaker 1: in these cases there are times where you 962 00:52:56,080 --> 00:52:59,960 Speaker 1: can show simpler is actually better. Uh, he argues there 963 00:53:00,040 --> 00:53:03,600 Speaker 1: are paradigms in which Occam's razor holds true. And so the 964 00:53:03,640 --> 00:53:08,760 Speaker 1: first one is that sometimes simpler theories actually have higher probabilities. 965 00:53:09,680 --> 00:53:13,960 Speaker 1: He invokes the medical adage here: don't chase zebras. 966 00:53:14,040 --> 00:53:16,080 Speaker 1: This comes from the idea of, you know, when 967 00:53:16,080 --> 00:53:19,600 Speaker 1: you hear hoofbeats, think horses, not zebras. I've also heard 968 00:53:19,600 --> 00:53:23,520 Speaker 1: that as unicorns. As another analogy, if you hear footsteps 969 00:53:23,520 --> 00:53:25,320 Speaker 1: coming down the hall, you can have a couple of 970 00:53:25,320 --> 00:53:28,479 Speaker 1: different hypotheses: it's a human walking down the hall, or 971 00:53:28,560 --> 00:53:31,480 Speaker 1: it's a RoboCop walking down the hall. Which one is 972 00:53:31,480 --> 00:53:33,640 Speaker 1: going to be correct more often? Well, it's going to 973 00:53:33,719 --> 00:53:37,400 Speaker 1: be a human. It could conceivably be somebody in 974 00:53:37,440 --> 00:53:41,319 Speaker 1: a RoboCop costume, but the chances of that are 975 00:53:41,320 --> 00:53:43,480 Speaker 1: pretty slim. I mean, unless you, like, are in a 976 00:53:43,560 --> 00:53:46,560 Speaker 1: RoboCop factory or something, it's going to be a human 977 00:53:46,600 --> 00:53:49,560 Speaker 1: way more often. And the same goes in diagnosing diseases.
978 00:53:49,600 --> 00:53:52,840 Speaker 1: If you observe a set of symptoms and patient history 979 00:53:52,960 --> 00:53:56,160 Speaker 1: that are equally likely to predict a common disease and 980 00:53:56,200 --> 00:53:59,080 Speaker 1: a rare disease, pick the common one; you're going to 981 00:53:59,160 --> 00:54:01,960 Speaker 1: be correct more often than if you always pick the 982 00:54:02,040 --> 00:54:04,560 Speaker 1: rare one. Right. Um. You know, this also brings me 983 00:54:04,600 --> 00:54:07,880 Speaker 1: back to the serial killer example. You know, like, what 984 00:54:07,880 --> 00:54:10,560 Speaker 1: is more likely, though: that it's someone that 985 00:54:10,520 --> 00:54:13,040 Speaker 1: the individual knew, or that it's a random killing by 986 00:54:13,040 --> 00:54:15,479 Speaker 1: a serial murderer? You know, unless there is a serial 987 00:54:15,560 --> 00:54:18,840 Speaker 1: murderer active in the area, which raises the 988 00:54:18,920 --> 00:54:22,360 Speaker 1: chances for that to be true. But by a considerable margin, uh, 989 00:54:22,400 --> 00:54:25,080 Speaker 1: it's going to remain a zebra. Not a unicorn, but 990 00:54:25,160 --> 00:54:28,719 Speaker 1: a zebra. Exactly. Unless you have independent evidence pointing to 991 00:54:28,760 --> 00:54:31,400 Speaker 1: that as a superior hypothesis, there's no reason to go 992 00:54:31,520 --> 00:54:36,120 Speaker 1: to a rare phenomenon that would explain things equally well. Yeah, 993 00:54:36,280 --> 00:54:38,000 Speaker 1: and I know there are enough podcasts 994 00:54:38,000 --> 00:54:40,960 Speaker 1: about serial murderers that it might seem like there are more 995 00:54:40,960 --> 00:54:42,840 Speaker 1: of them out there than there are. Well, there you 996 00:54:42,840 --> 00:54:46,840 Speaker 1: get into some cognitive biases. Yeah, the availability heuristic 997 00:54:46,920 --> 00:54:50,759 Speaker 1: kicks in. But of course, another question is, like, how 998 00:54:50,800 --> 00:54:53,440 Speaker 1: often does a thorough review actually put you in the 999 00:54:53,520 --> 00:54:57,120 Speaker 1: situation where two things explain what you see equally well, 1000 00:54:57,239 --> 00:55:01,839 Speaker 1: like truly equally well, one's rare and one's common? But 1001 00:55:01,840 --> 00:55:04,440 Speaker 1: so Sober says that you've got this concept he 1002 00:55:04,480 --> 00:55:08,400 Speaker 1: calls the razor of silence, and the basic explanation 1003 00:55:08,400 --> 00:55:11,160 Speaker 1: of this is that if you've got evidence that A 1004 00:55:11,560 --> 00:55:14,480 Speaker 1: is the cause of something and no evidence that B 1005 00:55:14,880 --> 00:55:18,760 Speaker 1: is the cause of something, then A alone is statistically 1006 00:55:18,800 --> 00:55:22,600 Speaker 1: a better explanation than A and B together. This goes 1007 00:55:22,640 --> 00:55:25,120 Speaker 1: back to the stacking of explanations that we were talking 1008 00:55:25,160 --> 00:55:27,960 Speaker 1: about earlier. Like, if you've got an explanation that already 1009 00:55:27,960 --> 00:55:33,440 Speaker 1: explains everything, there is no justification for adding additional explanations 1010 00:55:33,440 --> 00:55:35,439 Speaker 1: on top of it. You don't need to add 1011 00:55:35,480 --> 00:55:38,239 Speaker 1: the angels pushing the planets. Right. Well, let's come back 1012 00:55:38,280 --> 00:55:41,879 Speaker 1: to the murder scenario. How do we apply this forensically? Uh?
Well, 1013 00:55:42,120 --> 00:55:44,279 Speaker 1: Sober actually, I think, says something kind of 1014 00:55:44,280 --> 00:55:47,080 Speaker 1: like this: if you have clear evidence of 1015 00:55:47,120 --> 00:55:49,920 Speaker 1: one cause of death on somebody, you don't need to 1016 00:55:49,960 --> 00:55:53,440 Speaker 1: assume extra causes of death stacking on top of it 1017 00:55:53,880 --> 00:55:56,400 Speaker 1: without direct evidence of them as well. So if you 1018 00:55:56,440 --> 00:55:59,480 Speaker 1: find, like, a, you know, a body, I don't know, 1019 00:55:59,520 --> 00:56:02,120 Speaker 1: a body at the bottom of a cliff, and they're dead, 1020 00:56:02,239 --> 00:56:04,400 Speaker 1: you can assume that it was falling off the cliff 1021 00:56:04,440 --> 00:56:06,680 Speaker 1: that killed them. You don't need to also assume that 1022 00:56:06,719 --> 00:56:09,480 Speaker 1: they were poisoned or something. Unless, you know, you 1023 00:56:09,480 --> 00:56:12,240 Speaker 1: do a blood tox screen and then it comes back with poison; 1024 00:56:12,320 --> 00:56:14,640 Speaker 1: then you can assume it. But there's no reason to 1025 00:56:14,640 --> 00:56:19,040 Speaker 1: start stacking on additional assumptions. Now, there's another way that 1026 00:56:19,120 --> 00:56:23,120 Speaker 1: Sober says sometimes Occam's razor actually does hold true, that 1027 00:56:23,120 --> 00:56:26,640 Speaker 1: sometimes simpler explanations are better, and it's simply that 1028 00:56:26,760 --> 00:56:31,279 Speaker 1: sometimes simpler theories are better supported by observations. Uh, he 1029 00:56:31,320 --> 00:56:33,880 Speaker 1: gives this great example. Suppose all the lights on your 1030 00:56:33,880 --> 00:56:38,040 Speaker 1: street go out. You could have two competing hypotheses. First 1031 00:56:38,080 --> 00:56:42,000 Speaker 1: one: something happened at the power plant, and that influenced 1032 00:56:42,000 --> 00:56:44,200 Speaker 1: what happened to all the lights in the neighborhood, or 1033 00:56:44,200 --> 00:56:47,480 Speaker 1: maybe there's a downed power line, something like that. The 1034 00:56:47,560 --> 00:56:50,680 Speaker 1: other one: something happened to all of the light bulbs 1035 00:56:50,760 --> 00:56:55,560 Speaker 1: at the same time. Now, these would both explain the observations, right? 1036 00:56:56,400 --> 00:56:59,799 Speaker 1: Like, either all of the light bulbs suddenly went 1037 00:56:59,840 --> 00:57:02,799 Speaker 1: out on their own, independently, just coincidentally, all at the 1038 00:57:02,800 --> 00:57:05,760 Speaker 1: same time, or something happened with the power supply 1039 00:57:05,920 --> 00:57:09,000 Speaker 1: to the whole neighborhood. Sober argues, based on the work 1040 00:57:09,000 --> 00:57:12,239 Speaker 1: of the philosopher Hans Reichenbach, that in this case you 1041 00:57:12,280 --> 00:57:15,880 Speaker 1: can actually show mathematically that the evidence for the first, 1042 00:57:16,080 --> 00:57:19,680 Speaker 1: for the power plant hypothesis, is stronger, just based on 1043 00:57:19,720 --> 00:57:23,200 Speaker 1: the fact that it's simpler. Uh, and for a similar example 1044 00:57:23,240 --> 00:57:26,760 Speaker 1: in real science, look at common descent in biology.
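Before getting to that example, here is a back-of-the-envelope sketch of the two claims just made, the razor of silence and the Reichenbach-style common cause argument. The specific probability values are invented purely for illustration; they are not taken from Sober or Reichenbach.

```latex
% Razor of silence: a conjunction is never more probable than one of its conjuncts.
% For any hypotheses A and B:
\[
  P(A \wedge B) = P(A)\,P(B \mid A) \le P(A),
\]
% so "A alone" can never be less probable than "A and B together."

% Reichenbach-style common cause, with assumed illustrative numbers:
% each of N bulbs fails independently in a given hour with probability q,
% while a neighborhood outage occurs with probability p.
\[
  P(\text{all dark} \mid \text{independent failures}) = q^{N}, \qquad
  P(\text{all dark} \mid \text{common outage}) = p .
\]
% Even on a modest street, say q = 10^{-3}, p = 10^{-4}, and N = 20:
\[
  q^{N} = 10^{-60} \ll p = 10^{-4} .
\]
```

The simpler hypothesis wins here not because simplicity is magic, but because one event is vastly more probable than twenty independent coincidences.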
So 1045 00:57:26,920 --> 00:57:30,480 Speaker 1: based on the evidence of massive amounts of genetic code 1046 00:57:30,560 --> 00:57:34,680 Speaker 1: shared by all living things today, people usually say, okay, 1047 00:57:34,720 --> 00:57:37,440 Speaker 1: that's evidence of common descent. We all share a 1048 00:57:37,440 --> 00:57:42,000 Speaker 1: common ancestor; we all inherit some common genetic code. Now, 1049 00:57:42,000 --> 00:57:44,520 Speaker 1: you could also say, well, maybe all living things on 1050 00:57:44,560 --> 00:57:47,880 Speaker 1: Earth have different ancestors and they just happened, by coincidence, 1051 00:57:47,960 --> 00:57:52,040 Speaker 1: to have overlapping strings of genetic code. That would require 1052 00:57:52,080 --> 00:57:55,760 Speaker 1: a lot of strange coincidences. So the evidence actually favors 1053 00:57:55,800 --> 00:57:58,520 Speaker 1: common descent, just like it favors a power outage over 1054 00:57:58,600 --> 00:58:03,080 Speaker 1: hundreds of simultaneous light bulb failures. So a serial killer 1055 00:58:03,160 --> 00:58:06,760 Speaker 1: example of this might be... oh man, what's happening in 1056 00:58:06,800 --> 00:58:08,920 Speaker 1: the dark corners of your brain today, Rob? I don't know, 1057 00:58:08,920 --> 00:58:10,920 Speaker 1: I just keep coming back to it, I guess. But okay, 1058 00:58:11,000 --> 00:58:13,960 Speaker 1: so if, like, there are all these 1059 00:58:13,960 --> 00:58:16,680 Speaker 1: dead people and they all have, say, a death's head 1060 00:58:16,760 --> 00:58:21,000 Speaker 1: moth... um, or was it a caterpillar? Oh, yes, yes. 1061 00:58:21,120 --> 00:58:23,959 Speaker 1: Or was it a cocoon? I can't recall offhand. 1062 00:58:24,240 --> 00:58:26,080 Speaker 1: From Silence of the Lambs. Yeah, they've got, like, 1063 00:58:26,120 --> 00:58:29,000 Speaker 1: a moth cocoon in their mouth or something. So perhaps 1064 00:58:29,400 --> 00:58:32,240 Speaker 1: they just happened to each individually wind up with one 1065 00:58:32,240 --> 00:58:34,680 Speaker 1: in their mouth: like, somebody accidentally ate one at 1066 00:58:34,680 --> 00:58:36,680 Speaker 1: the salad bar; another one was, like, looking up and 1067 00:58:36,720 --> 00:58:38,760 Speaker 1: it fell out of a tree, because one had escaped 1068 00:58:38,800 --> 00:58:41,280 Speaker 1: from a private collection and was living in a tree. You 1069 00:58:41,280 --> 00:58:44,720 Speaker 1: could have sort of independent explanations for why each of 1070 00:58:44,720 --> 00:58:47,520 Speaker 1: these occurred. Or the other possibility is somebody's killing them 1071 00:58:47,520 --> 00:58:49,640 Speaker 1: and putting them in their throats. Right, the one common 1072 00:58:49,680 --> 00:58:53,720 Speaker 1: explanation actually explains observations better than assuming a whole bunch 1073 00:58:53,760 --> 00:58:56,320 Speaker 1: of strange coincidences. Yes. And then we've got the third 1074 00:58:56,320 --> 00:58:59,320 Speaker 1: paradigm Sober gets into, which is that, he says, sometimes 1075 00:58:59,400 --> 00:59:02,520 Speaker 1: the simplicity of a model is relevant to estimating its 1076 00:59:02,520 --> 00:59:06,000 Speaker 1: predictive accuracy. So what do good theories do well? They 1077 00:59:06,040 --> 00:59:08,680 Speaker 1: make accurate predictions about things we don't know yet. They 1078 00:59:08,680 --> 00:59:13,400 Speaker 1: accurately predict future measurements or outcomes or discoveries.
Does 1079 00:59:13,400 --> 00:59:16,439 Speaker 1: Occam's razor have anything to say here? Sober says yes: 1080 00:59:16,680 --> 00:59:21,000 Speaker 1: sometimes simplicity affects our best guesses about how accurate a 1081 00:59:21,040 --> 00:59:23,880 Speaker 1: new theory will be. And he cites the work of 1082 00:59:23,920 --> 00:59:28,520 Speaker 1: a Japanese statistician named Hirotugu Akaike, who did important work 1083 00:59:28,520 --> 00:59:31,800 Speaker 1: in a field called model selection theory. This means how 1084 00:59:31,840 --> 00:59:34,200 Speaker 1: to judge the strength of a new model or theory 1085 00:59:34,280 --> 00:59:37,040 Speaker 1: before it has had time to be tested in the field. 1086 00:59:37,840 --> 00:59:42,760 Speaker 1: And a model evaluation system called the Akaike information criterion 1087 00:59:43,120 --> 00:59:45,080 Speaker 1: says that you can predict how good a new model 1088 00:59:45,160 --> 00:59:47,800 Speaker 1: or theory will be by two measures: how well it 1089 00:59:47,840 --> 00:59:51,120 Speaker 1: fits old or existing data, and obviously better fits are better, 1090 00:59:51,480 --> 00:59:55,000 Speaker 1: and then how simple it is, and simpler models are better. Uh, 1091 00:59:55,040 --> 00:59:59,320 Speaker 1: simplicity is evaluated by, quote, the number of adjustable parameters, 1092 00:59:59,360 --> 01:00:02,280 Speaker 1: and having fewer is better. Now, Sober gives an 1093 01:00:02,280 --> 01:00:04,800 Speaker 1: analysis of why this is the case using an example 1094 01:00:04,880 --> 01:00:07,320 Speaker 1: of trying to estimate the height of plants in a 1095 01:00:07,360 --> 01:00:10,680 Speaker 1: corn field based on previous random samplings of the field. 1096 01:00:10,920 --> 01:00:12,960 Speaker 1: I'm not going to get down into all the details 1097 01:00:12,960 --> 01:00:14,760 Speaker 1: of this, but if you want a deeper understanding of 1098 01:00:14,760 --> 01:00:17,160 Speaker 1: this one, I'd recommend looking up the article. The 1099 01:00:17,320 --> 01:00:20,520 Speaker 1: short version is that in some situations, depending on a 1100 01:00:20,600 --> 01:00:23,280 Speaker 1: number of assumptions about what types of models and data 1101 01:00:23,320 --> 01:00:26,240 Speaker 1: you're dealing with, simplicity of a model is actually a 1102 01:00:26,240 --> 01:00:29,160 Speaker 1: good predictor of how well future data will conform to 1103 01:00:29,240 --> 01:00:32,280 Speaker 1: that model. And it's just a fact about statistics, a 1104 01:00:32,360 --> 01:00:35,680 Speaker 1: sort of law of averages, not a fact about individual cases 1105 01:00:35,720 --> 01:00:38,640 Speaker 1: on the ground. Now, he concludes by saying that these 1106 01:00:38,640 --> 01:00:42,800 Speaker 1: three paradigms have something, uh, in common, and, quote: whether 1107 01:00:42,840 --> 01:00:45,440 Speaker 1: a given problem fits into any of them depends on 1108 01:00:45,520 --> 01:00:50,280 Speaker 1: empirical assumptions about the problem. Those assumptions might be true 1109 01:00:50,280 --> 01:00:53,720 Speaker 1: of some problems but false of others. Although parsimony is 1110 01:00:53,760 --> 01:00:57,640 Speaker 1: demonstrably relevant in forming judgments about what the world is like, 1111 01:00:58,080 --> 01:01:02,120 Speaker 1: there is, in the end, no unconditional and presuppositionless 1112 01:01:02,200 --> 01:01:07,080 Speaker 1: justification for Ockham's razor.
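Since the Akaike information criterion is a concrete formula, AIC = 2k minus 2 ln(L max), here is a minimal sketch of how it trades goodness of fit against parameter count. The two log-likelihood values below are made-up numbers purely for illustration, not from Sober's corn field example.

```python
def aic(max_log_likelihood: float, k: int) -> float:
    """Akaike information criterion: AIC = 2k - 2 ln(L_max).

    k is the number of adjustable parameters; lower AIC is better.
    """
    return 2 * k - 2 * max_log_likelihood

# Hypothetical comparison: a 6-parameter polynomial fits the existing
# data slightly better than a 2-parameter line, but the AIC penalty
# for the extra parameters outweighs the small gain in fit.
simple_line = aic(max_log_likelihood=-52.0, k=2)    # 2*2 + 104.0 = 108.0
flexible_poly = aic(max_log_likelihood=-50.5, k=6)  # 2*6 + 101.0 = 113.0

print(f"line AIC = {simple_line}, polynomial AIC = {flexible_poly}")
# line AIC = 108.0, polynomial AIC = 113.0: the simpler model is
# estimated to predict new data better, despite the slightly worse fit.
```

On Sober's telling, this is the razor working as a guard against overfitting: the extra parameters buy a slightly better fit to the old data, but the criterion estimates they will generalize worse to new data.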
Uh, so that's tough, right? 1113 01:01:07,080 --> 01:01:09,680 Speaker 1: Like, Occam's razor is not a tool you can apply 1114 01:01:09,760 --> 01:01:12,760 Speaker 1: to every situation to get closer to the truth. It's 1115 01:01:12,760 --> 01:01:17,800 Speaker 1: a tool that is useful sometimes, for some types of judgment, 1116 01:01:18,200 --> 01:01:21,360 Speaker 1: and the real difficulty is recognizing when you're in one 1117 01:01:21,400 --> 01:01:24,480 Speaker 1: of those situations in which it's useful, or one of 1118 01:01:24,480 --> 01:01:27,680 Speaker 1: those situations where it's actually just a logical red herring. 1119 01:01:28,600 --> 01:01:31,000 Speaker 1: So really it kind of comes back to, uh, you know, 1120 01:01:31,080 --> 01:01:32,880 Speaker 1: we were talking about Sagan at the beginning of 1121 01:01:32,920 --> 01:01:34,640 Speaker 1: this and how he said this is one of the 1122 01:01:34,680 --> 01:01:38,080 Speaker 1: tools in your skeptic's tool chest. And the thing about 1123 01:01:38,080 --> 01:01:40,080 Speaker 1: a tool chest is that you have more than one 1124 01:01:40,080 --> 01:01:44,240 Speaker 1: tool in there, and the screwdriver cannot be used for everything. Right, 1125 01:01:44,320 --> 01:01:45,880 Speaker 1: I mean, you can try. It's useful for a lot 1126 01:01:45,920 --> 01:01:48,720 Speaker 1: of things, uh, and certainly very useful for screws, but 1127 01:01:48,720 --> 01:01:51,800 Speaker 1: there's gonna be a time when you're gonna have to 1128 01:01:51,800 --> 01:01:54,320 Speaker 1: pull out another tool to deal with the problem. And 1129 01:01:54,320 --> 01:01:56,640 Speaker 1: there are gonna be plenty of cases you will encounter 1130 01:01:56,680 --> 01:01:59,400 Speaker 1: where trying to use the skeptical tool of Occam's razor 1131 01:01:59,520 --> 01:02:01,760 Speaker 1: is like trying to clean out your electrical socket with 1132 01:02:01,760 --> 01:02:05,280 Speaker 1: the screwdriver. It's gonna steer you astray. And 1133 01:02:05,320 --> 01:02:07,440 Speaker 1: I'm very sorry that in the end here we don't 1134 01:02:07,520 --> 01:02:10,320 Speaker 1: have, like, a clean rule to just guide you, like, 1135 01:02:10,360 --> 01:02:12,040 Speaker 1: this is when you can use it, this is when 1136 01:02:12,080 --> 01:02:14,400 Speaker 1: you can't. I think it comes down to... I mean, 1137 01:02:14,560 --> 01:02:17,640 Speaker 1: Sober has some useful things to say there about, like, 1138 01:02:17,680 --> 01:02:21,440 Speaker 1: types of situations where it is helpful. But yeah, 1139 01:02:21,560 --> 01:02:24,000 Speaker 1: I'm sorry, there's not just, like, an easy rule 1140 01:02:24,000 --> 01:02:27,000 Speaker 1: of thumb for when the razor will be helpful. Yeah. 1141 01:02:27,000 --> 01:02:29,160 Speaker 1: I mean, ultimately, it is a tool that was not 1142 01:02:29,640 --> 01:02:31,919 Speaker 1: plucked out of the sky; it was plucked out 1143 01:02:31,960 --> 01:02:35,440 Speaker 1: of human reasoning and, uh, human problem solving. 1144 01:02:36,120 --> 01:02:38,600 Speaker 1: By the way, coming back to The Name of the Rose, 1145 01:02:39,040 --> 01:02:41,640 Speaker 1: I want to point out that there is apparently a 1146 01:02:42,120 --> 01:02:47,480 Speaker 1: highly regarded Spanish eight-bit computer game from nineteen eighty seven based on 1147 01:02:47,560 --> 01:02:50,520 Speaker 1: The Name of the Rose.
Yeah, it's titled 1148 01:02:50,560 --> 01:02:53,840 Speaker 1: The Abbey of the Crime. They conceived it as 1149 01:02:53,920 --> 01:02:56,080 Speaker 1: an adaptation of The Name of 1150 01:02:56,080 --> 01:02:58,480 Speaker 1: the Rose, but they were unable to secure permission to 1151 01:02:58,560 --> 01:03:01,320 Speaker 1: do so. In fact, I read that 1152 01:03:01,320 --> 01:03:03,160 Speaker 1: they didn't even hear back from Eco. They tried to 1153 01:03:03,200 --> 01:03:04,920 Speaker 1: get ahold of him and they couldn't get hold 1154 01:03:04,960 --> 01:03:07,560 Speaker 1: of him. Try to imagine the Umberto Eco essay 1155 01:03:07,600 --> 01:03:10,280 Speaker 1: about this video game when he tries to play it. 1156 01:03:11,720 --> 01:03:15,240 Speaker 1: That would be good. But basically The Abbey of 1157 01:03:15,280 --> 01:03:17,720 Speaker 1: the Crime, the title they went with, was apparently 1158 01:03:17,760 --> 01:03:20,280 Speaker 1: the working title for The Name of the Rose at 1159 01:03:20,320 --> 01:03:23,440 Speaker 1: one point. So they released it under that name, 1160 01:03:23,440 --> 01:03:26,280 Speaker 1: and instead of having the main character be William of Baskerville, 1161 01:03:26,400 --> 01:03:29,760 Speaker 1: the main character is William of Occam. And 1162 01:03:29,920 --> 01:03:32,160 Speaker 1: I thought that was pretty much the end of it. You know, 1163 01:03:32,200 --> 01:03:34,200 Speaker 1: you can look up the footage of the game and all. 1164 01:03:34,440 --> 01:03:37,360 Speaker 1: But then I just learned for the first time, and this 1165 01:03:37,440 --> 01:03:40,680 Speaker 1: may be more common knowledge for everyone else out there, 1166 01:03:41,600 --> 01:03:43,400 Speaker 1: there is a remake of it, like they did 1167 01:03:43,440 --> 01:03:47,880 Speaker 1: a revamped version of it with improved but nicely pixelated graphics: 1168 01:03:48,480 --> 01:03:51,760 Speaker 1: The Abbey of the Crime Extensum, which you can get 1169 01:03:51,800 --> 01:03:54,560 Speaker 1: on Steam, apparently. I don't really do Steam, so I 1170 01:03:54,560 --> 01:03:56,840 Speaker 1: don't really know how it works. But yeah, it's 1171 01:03:56,880 --> 01:04:00,320 Speaker 1: listed on there. It came out and it looks really cool. 1172 01:04:00,440 --> 01:04:03,120 Speaker 1: For instance, the updated sprites, the 1173 01:04:03,120 --> 01:04:07,000 Speaker 1: little characters in the game, they look so much like 1174 01:04:07,120 --> 01:04:10,480 Speaker 1: the actors in the original film adaptation of The Name 1175 01:04:10,480 --> 01:04:14,320 Speaker 1: of the Rose, like a little Sean Connery and Christian Slater. Yeah, 1176 01:04:14,320 --> 01:04:16,919 Speaker 1: I don't know if they got permission to use their likenesses. 1177 01:04:17,040 --> 01:04:19,040 Speaker 1: How close do they have to be in eight bits? 1178 01:04:19,280 --> 01:04:23,280 Speaker 1: I don't know. That's a great question. But 1179 01:04:23,480 --> 01:04:25,040 Speaker 1: my other question is, I would like to ask 1180 01:04:25,280 --> 01:04:28,400 Speaker 1: listeners out there: if you've played this, please let 1181 01:04:28,440 --> 01:04:30,480 Speaker 1: me know how it is.
I'm very curious. Not that 1182 01:04:30,520 --> 01:04:32,840 Speaker 1: I think I will actually play it for myself, but 1183 01:04:32,920 --> 01:04:36,640 Speaker 1: I'm just genuinely interested in what 1184 01:04:36,920 --> 01:04:39,680 Speaker 1: a video game adaptation of The Name of the Rose is like. 1185 01:04:39,840 --> 01:04:41,640 Speaker 1: If you know the solution at the end of the book, 1186 01:04:41,680 --> 01:04:44,600 Speaker 1: can you automatically beat the game immediately? Or 1187 01:04:44,640 --> 01:04:48,160 Speaker 1: are there different solutions? I don't know. You know, 1188 01:04:48,320 --> 01:04:50,400 Speaker 1: is it a different murder each time? That would be crazy. 1189 01:04:50,520 --> 01:04:53,880 Speaker 1: He arrives at the abbey, speaks to the abbot, immediately says, 1190 01:04:53,920 --> 01:04:57,200 Speaker 1: I got something to lay on you. Is Occam's razor 1191 01:04:57,440 --> 01:05:00,040 Speaker 1: an item that you can pick up, like a 1192 01:05:00,040 --> 01:05:02,960 Speaker 1: plus-one Occam's razor that can then be employed in combat? 1193 01:05:03,000 --> 01:05:05,520 Speaker 1: It's like the Master Sword. Yeah, surely there is not 1194 01:05:05,640 --> 01:05:08,600 Speaker 1: combat in this game. I should hope not. I should 1195 01:05:08,640 --> 01:05:14,120 Speaker 1: hope it's just a lot of talking, um, Catholics. Yeah, 1196 01:05:15,600 --> 01:05:18,720 Speaker 1: I cast the poverty of Christ on you. Well, in 1197 01:05:18,720 --> 01:05:21,080 Speaker 1: the screenshot I was looking at, it does look like the 1198 01:05:21,120 --> 01:05:25,080 Speaker 1: main character, Baskerville slash Occam, does have a pair of spectacles, 1199 01:05:25,360 --> 01:05:28,080 Speaker 1: but then there's like one, two, three... there are 1200 01:05:28,200 --> 01:05:31,280 Speaker 1: multiple empty slots there, so I guess he gets other stuff. 1201 01:05:31,320 --> 01:05:34,760 Speaker 1: I mean, I guess various books and whatnot, some 1202 01:05:34,800 --> 01:05:39,000 Speaker 1: lemon juice, and probably some cheese. He 1203 01:05:39,080 --> 01:05:41,720 Speaker 1: gets like some fried cheese at some point. Yeah, 1204 01:05:41,760 --> 01:05:44,840 Speaker 1: I think so, but mostly books, mostly books. All right, 1205 01:05:45,480 --> 01:05:47,720 Speaker 1: so there you have it: Occam's razor. Hopefully we were able 1206 01:05:47,760 --> 01:05:50,160 Speaker 1: to lay it out for you, you know, 1207 01:05:50,360 --> 01:05:53,320 Speaker 1: an explanation of what Occam's razor is, where it 1208 01:05:53,400 --> 01:05:57,520 Speaker 1: came from, some of the various opinions on its usefulness, 1209 01:05:58,480 --> 01:06:00,880 Speaker 1: so you can take the tool, put 1210 01:06:00,880 --> 01:06:02,640 Speaker 1: it back into the tool chest, and know a 1211 01:06:02,680 --> 01:06:04,880 Speaker 1: little bit more about it the next time you pull 1212 01:06:04,920 --> 01:06:07,600 Speaker 1: it out and go to use it. In the meantime, 1213 01:06:07,640 --> 01:06:09,320 Speaker 1: if you want to check out other episodes of Stuff 1214 01:06:09,360 --> 01:06:11,120 Speaker 1: to Blow Your Mind, go to Stuff to Blow Your 1215 01:06:11,200 --> 01:06:13,240 Speaker 1: Mind dot com. That will shoot you over to the 1216 01:06:13,280 --> 01:06:16,200 Speaker 1: iHeart listing for this podcast.
But ultimately you can 1217 01:06:16,240 --> 01:06:19,439 Speaker 1: find this podcast wherever you get your podcasts. We don't 1218 01:06:19,440 --> 01:06:21,680 Speaker 1: care where that is, wherever it happens to be. Just 1219 01:06:21,760 --> 01:06:24,280 Speaker 1: make sure that you subscribe, that you rate and review. 1220 01:06:24,600 --> 01:06:26,880 Speaker 1: These are the things that help us out. Huge thanks 1221 01:06:26,920 --> 01:06:30,360 Speaker 1: as always to our excellent audio producer, Seth Nicholas Johnson. 1222 01:06:30,680 --> 01:06:32,160 Speaker 1: If you would like to get in touch with us 1223 01:06:32,160 --> 01:06:34,600 Speaker 1: with feedback on this episode or any other, to suggest a 1224 01:06:34,600 --> 01:06:37,160 Speaker 1: topic for the future, or just to say hi, you can 1225 01:06:37,240 --> 01:06:40,120 Speaker 1: email us at contact at stuff to blow your mind 1226 01:06:40,280 --> 01:06:49,800 Speaker 1: dot com. Stuff to Blow Your Mind is a production 1227 01:06:49,840 --> 01:06:52,360 Speaker 1: of iHeartRadio's How Stuff Works. For more podcasts from 1228 01:06:52,360 --> 01:06:54,160 Speaker 1: iHeartRadio, visit the iHeartRadio app, 1229 01:06:54,320 --> 01:06:56,960 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.