Welcome to Stuff to Blow Your Mind from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. Today we're gonna be looking at the philosophical underpinnings of science. Don't run away, stick around, because we're gonna be talking about scientific reductionism. Now, we've done episodes before where we've talked about not just, you know, the fruits of scientific investigation, but the ideas that lie underneath what we do when we do science. We've talked before about that Daniel Dennett quote from Darwin's Dangerous Idea, where he says that, you know, scientists a lot of times think that philosophy is what those other navel-gazers do over there, and that science is really free from all the constraints of that navel-gazing, that they're immune to, quote, "the confusions that philosophers devote their lives to dissolving." But what Dennett says is there's no such thing as philosophy-free science; there's only science whose philosophical baggage is taken on board without examination. And I think that makes a lot of sense.
I think that's a maxim we should adhere to: to look at what's lying underneath science intellectually, and always ask ourselves the question, is what we're doing philosophically grounded? Does it make sense?

Yeah, I mean, it gets down to the idea of to what extent can we truly just sort of mathematically, passionlessly break down things into their fundamental parts in order to make sense of them.

Yeah, and that is the idea we're gonna be talking about today. It's the concept of scientific reductionism. Now, I want to start by clarifying the meaning of that term, because if you've heard that term used in conversation, there's a very good chance that you've heard it used in a way that's different than what we mean today. You'll see people on Facebook respond to, say, a news story about a scientific study or a theory and say, well, that sounds reductionist, you're just being a reductionist here. Yeah. So there's one version of the word reductive that means something sort of like oversimplified.
Like if I say that, you know, the only reason French cooking tastes good is because they use, like, a full stick of butter in every dish, some French chef might say, no, wait, there's a lot of technique, you're taking a very reductionist attitude, it's not that simple. Or another one you might hear: I think often these days, if you hear people talking about scientific reductionism, you're hearing it used not in a philosophical argument about the underpinnings of science, but more in an argument about the validity of worldviews. And it works as this kind of snarl word that means something like nihilism, or the belief that there is nothing of value, beauty, or goodness in the world. That is not what we mean. That's not what the term means in philosophy of science, and that's not the way we're going to be using it today.

Yeah, I think a lot of this gets back to the concepts that we tackled in the Wicked Problems episode, as well as the illusion of explanatory depth.
And that's the idea that simple, broad solutions to complex societal problems, complex problems in general, tend to be ineffective and spawn new problems. And you could say that's because they take a reductionist approach: robots are dangerous, so we ban all robots; humans are the cause of war, so exterminate all humans. Right.

Now, I wonder if that way of using reductionism fits more into the hater's definition or the definition we're going to look at today.

Well, the interesting thing, isn't it, is that the hater's definition is itself reductionist.

Right, and then maybe that definition is also reductionist, because to a certain extent, yes, there are some things that are complicated enough that you can still boil them down to sort of concrete solutions and concrete causes. And it depends on what you're studying. Sometimes, especially with societal issues, it's not always that cut and dried. There are just too many factors, and it's difficult to test out the solution, certainly in real time. Yeah, because by the time you've deployed the answer, you've created all these additional problems.
So that's the "it's only butter" explanation: somebody is trying to explain deliciousness in terms of butter only, when in reality it's much more complicated. Yes.

Okay, so what I mean by scientific reductionism, and what's usually discussed in philosophy of science, is that as a method, it means that any system or entity existing in reality will ultimately be best understood if it's broken down into its simpler constituent parts and the workings of those parts are understood. And we see this all the time in science, right? Science is constantly trying to reduce a complex phenomenon into its parts and find out how its parts work.

Yeah, it's kind of like, I mean, it's basically what HowStuffWorks is all about, right? It's about, how is this actually working? What are the properties? What are the physical laws involved? Even if it's something as simple...
Well, for instance, I wrote an article on how hula hoops work a while back, and part of that is the culture and the history: the history of the hula hoop, where it came from, how it gained popularity, how it's utilized in different ways. But you also get down to just the basic physics of what's going on when a hula hoop is swinging around a body in motion, and you can reduce it to physics. So in that sense, I think that is a perfectly valid way of using a reductionist approach: saying, what are the most basic laws and elements and explanatory systems that are in play when you see somebody hula hooping?

And man, I can see where there would be people, like real hula hoop enthusiasts, who would say, stop trying to explain the magic of hula hoops; when you explain the physics of hula hoops, you take all the fun out of it. That would be a crazy statement, because the fun is still there. We're just explaining how the fun works. Right.
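The "basic physics" in play here can be sketched with a toy calculation (an illustration added here, not from the episode or the article; the hoop's radius and spin rate are made-up numbers): a point on a spinning hoop undergoes centripetal acceleration a = ω²r, and that acceleration easily dwarfs gravity at ordinary hooping speeds.

```python
import math

def centripetal_acceleration(radius_m, revs_per_second):
    """Centripetal acceleration a = omega^2 * r of a point on a spinning hoop."""
    omega = 2 * math.pi * revs_per_second  # angular speed in radians per second
    return omega ** 2 * radius_m

# A hypothetical hoop: 0.5 m radius, spun twice per second around the waist.
a = centripetal_acceleration(0.5, 2.0)  # roughly 79 m/s^2, far more than 9.81
```

The reductionist's point is only that the hooping itself, whatever its cultural history, bottoms out in simple physical relations like this one.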
And the controversy we're talking about today (and we can probably come up with a better example in a minute) would not be whether it takes the fun out of it, but whether it misses something crucial. If you just describe a hula hoop in terms of the basic physics of how it goes around the body, are you missing something factually crucial and autonomously true about the phenomenon of hula hooping? Is something vital to its functionality being left out? Yeah, that's probably not a good example.

But one other way we can look at what scientific reductionism is, is that it's a hypothesis on the final nature of the relationship between science and reality. And so it can be interpreted to mean that, in effect, every correct explanation of the world can be reduced to the most fundamental, lowest theory of reality, and essentially everything is physics if you go deep enough.
Now, there are gonna be a lot of reductionists who will say: now, I understand that we need sciences like psychology and chemistry and political science and sociology to explain things that it would be ridiculous to think we could explain by looking at all the elementary particles. But in principle we should be able to explain all those things given just our understanding of elementary particles and forces; they're just too complex for us to understand right now.

Okay, so let's try to put this into a specific, concrete example. If you accept that everything in the universe is subject to the laws of physics, and I think, Robert, you and I can agree that as far as we know it is, then everything in the universe ultimately could be best explained by fundamental physics, or whatever we find lying underneath fundamental physics, whatever is the ultimate theory of everything, the ultimate underlying law of the universe. And so let's look at a higher-order phenomenon and try to say what it would mean to reduce it.
So I came up with this horrible example: an apparent case of psychogenic blindness. You are with your family on Thanksgiving, and they get to pick what movie you go out and see, and you are overruled, and you go to see the new Adam Sandler movie, which I imagine is going to be: Adam Sandler plays the fifteenth Dalai Lama and also plays the Dalai Lama's really loud, flatulent twin sister. Twenty-five minutes into the film, you go blind in both eyes. Now, you go to the doctor, and the doctors can find no evidence of injury or neurological dysfunction, so they classify this as a rare case of psychogenic blindness: blindness that's induced by psychological distress, rooted in a state of mind. And we've discussed this on the show before, especially in terms of certain almost supernatural occurrences.

Right, right, where, like, there's a mystical experience and it leads to a bodily manifestation.

Yeah, and so in psychology there might be some framework for explaining what happened to you, and that framework would be theory-like knowledge.
It would be explanatory, talking about causes in the mind, and perhaps solutions that take place in the mind. But of course, assuming that we had a complete understanding of the whole state of your body at the level of neuroscience, fully explaining all of your brain tissues and functions and how they were interacting, on the reductionist hypothesis we actually shouldn't need the psychological explanation, right? That's just a convenience. If we understood everything there was to understand about the physical nature of your brain, we wouldn't need psychology. We wouldn't need the psychologist to say what's happening with your blindness; we could just look at the cells in your brain. Now, of course, if we had a perfect understanding of your brain from the perspective of cell biology, explaining what all the cells in your body are doing and how, we technically wouldn't need the neuroscience explanation. We'd just say, okay, here's the cell theory; this is what explains everything that's going on. And then you could reduce it further.
You could say, well, of course, if we had a perfect understanding of every molecule in your body from the standpoint of fundamental chemistry, understanding what all the atoms and molecules are doing and how, we wouldn't need the biology or biochemistry explanation. And of course, if we had a perfect understanding of the whole state of your body from the standpoint of fundamental physics, you know, elementary particle physics, the quantum mechanical wave functions of all the particles and energy states in your body, we wouldn't need the chemistry explanation. So ultimately the hypothesis goes like this: if we were able to explain everything in the universe in terms of fundamental physics, we would, and that would be the best explanation. It's only our lack of understanding and our lack of knowledge and computational power that forces us to conceive of explanations of things that are more complex than fundamental physics, like chemistry, like biology, like psychology, like, you know, sociology or political science.
But at bottom, the best explanation for everything is: here are the particles and their energy states and vectors.

Now, it's almost impossible to think about something like this without thinking about comparisons to our modern computing experience, right? You know, thinking of the hardware, and then on top of the hardware the code, and, I'm simplifying here, but then on top of the code the sort of user interface, and then our interaction with that user interface. So it seems like, realistically, you would have problems that occur with the root of that problem at different levels in that depth, right? You might have a problem at the hardware level; you might have a problem at the user interface level.

But if we had a complete enough understanding, we could address any problem from the bottom. Exactly. So the physical reductionist would say, okay, if you had a perfect understanding, everything would be energy states and elementary particles, and if you understood what all of them were doing, you could fix any problem at any level.
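That layered picture can be sketched in a toy example (an illustration added here with made-up data, not something from the episode): the same state exists as raw bytes at the "hardware" level and as text at the "interface" level, and changing a single bit at the bottom layer is enough to change what the top layer shows.

```python
# "Hardware" layer: raw bytes. "Interface" layer: the text decoded from them.
low_level = bytearray(b"Hello, world")
high_level = low_level.decode("ascii")  # what the "user" sees: "Hello, world"

# Flip one bit at the bottom layer: 'H' (0x48) becomes 'h' (0x68)...
low_level[0] ^= 0x20

# ...and the top-level view changes, with no edit made "at" the top level.
new_high_level = low_level.decode("ascii")  # now "hello, world"
```

In the reductionist's spirit, every fact about the text layer is fixed by, and fixable through, the byte layer beneath it.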
Yes. So not even going down to the hardware level, but going down to the even more primal levels, going into the very basement of reality. Yeah.

And so the question today, looking at that perspective of the world, is: is that true? Is that a correct understanding of what science is? Or our higher-level sciences, more complex sciences like chemistry, biology, psychology, sociology, and so forth: do these sciences have unique insights that are not present at the lower levels of more simplicity, of, you know, simpler realities? Are they just the best we can do to understand complex phenomena like societies, minds, and organisms? Or are they intellectually autonomous? Do they have something original to offer?

So the counterargument here would be that physics and carpentry and engineering can explain the way that a stage is built, but they're not going to have any impact on, say, the play that the actors on the stage are reciting.

Sure, yeah, that's sort of an analogy that works, I guess.
One of the things that I do want to make sure we're clarifying is that I don't plan on (Robert, you can violate this if you want), but I don't plan on exploring arguments against scientific reductionism that are based in a belief in supernatural causation, because, as we mentioned in another recent podcast, I'm not even sure the concept of supernatural causation is coherent. I'm not sure that it's incoherent. But then again, try to picture it. What are you picturing? Usually you're just picturing natural causation with some kind of blurriness or some kind of detail obscured.

Right, it's like the hand of God analogy. If God is something outside of our universe, then for that hand to reach into our universe to do something, it has to adhere to the laws of physics. It has to wear the glove of our reality at least, and then it has to therefore be observable as a physical phenomenon. Yet to do something, it has to do something, right?
But we will instead look at a different concept, and that is the concept of emergentism: a philosophical distinction that says that there are large, complex systems that show genuinely novel properties, due to their complexity, that are not inherently predictable from or reducible to the combined effects of their simpler constituent parts, and ultimately not predictable from or reducible to fundamental physics.

So this is where I come back to my perhaps imperfect analogy of the stage and the actors on the stage. You can't have one without the other, but one seems to be operating in a way that the lower level cannot fully predict or control beyond the very basic levels.

Okay, yeah, I can see that. I think that's a good analogy then. I'm sorry if I was skeptical of it. No, no, no, a healthy dose of skepticism is important here.

Okay, so given our idea that we're gonna look at emergentism as a form of material understanding of the world, you know, an extension of science, not an expression of, like, vitalism or supernaturalism.
What are some examples of things in nature that we might assume are not able to be explained by fundamental physics?

Well, a big one is intelligence. So yeah, even playing, like, Dungeons and Dragons, where you have a definite intelligence score versus, like, wisdom or charisma, I often find myself in conversations with the people I'm playing with, like, would this be an intelligence check or a wisdom check? We really have a very ambiguous idea of what constitutes intelligence.

Yeah, but at the same time, I think, by the fact that you consider it this separate property, that you have this separate score, it's natural to think of intelligence as something that discretely emerges at higher levels of complexity and isn't reducible to simpler objects. So yeah, like you said, it's sometimes kind of difficult to define intelligence. What is it? My favorite definition that I've come across is that intelligence is the tendency of a system to accelerate the solution of problems. It leads to faster solving. So however we define it, we know it when we see it, right? Intelligence is highly useful, ubiquitous, undeniable.
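That "faster solving" definition can be made concrete with a toy comparison (a sketch for illustration, not from the episode): two strategies hunt for a number in the range 0 to n-1, and the one that exploits the structure of the problem gets there in far fewer steps.

```python
def linear_steps(n, target):
    """Steps taken by a naive strategy that checks 0, 1, 2, ... in order."""
    for steps, value in enumerate(range(n), start=1):
        if value == target:
            return steps
    return n  # target was not in the range; every cell was checked

def binary_steps(n, target):
    """Steps taken by a strategy that halves the remaining range each guess."""
    lo, hi, steps = 0, n - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if mid == target:
            return steps
        if mid < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps
```

On this definition, the "more intelligent" system is simply the one that tends to shrink the number of steps to a solution: binary_steps(1024, 1000) reaches the answer in 10 checks, where linear_steps(1024, 1000) needs 1001.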
It's part of our experience of the everyday world. But can intelligence be explained in terms of simpler, fundamental units? I don't know. After all, there is no indication that a single neuron possesses anything like intelligence. There's no analogy for intelligence below what things like brains or computers do, at least as far as I can tell.

Well, even then, it's far more complicated. I feel we've probably even covered on the show before the whole topic of what makes a genius. What does a genius's brain look like? And yes, you can look to the gray matter and line up various factors, but a lot of it is going to be beyond that. It's going to have to do with the experience and personality of the individual. Yeah, it's the whole Boys from Brazil scenario, where you're trying to clone someone.

Yeah. So, I mean, assuming that animal intelligence or computer intelligence is not magic (we're not believing it's magic), but that it's still possible that it can't be explained reductively by recourse to more fundamental sciences, that chemistry alone can't explain intelligence.
It's something that only happens to matter at a certain level of complexity and configuration, and is not predictable from lower levels of understanding. So what would it mean to understand intelligence at the level of single cells? I don't know. Maybe it's possible to do that, but it at least sounds like a very difficult project.

Yeah, I mean, so the basic idea here is that interactions among smaller entities lead to larger entities, and there's a self-organizing aspect to reality. It's what economist Jeffrey Goldstein called, quote, "the arising of novel and coherent structures, patterns, and properties during the process of self-organization in complex systems." So I think, like, a classic sort of science example I always go back to is just the accretion of cosmic dust into smaller bodies and clouds, building mass, exerting gravity, forming stars, planets, everything else gravitating around each other, held in gravitational enslavement to each other, and becoming the system.
A system that emerges initially from 325 00:19:00,200 --> 00:19:03,639 Speaker 1: particles floating around and bumping into each other, and thus 326 00:19:03,920 --> 00:19:06,640 Speaker 1: it does obey the laws of physics at 327 00:19:06,640 --> 00:19:09,760 Speaker 1: every level. But is it reducible to and predictable from 328 00:19:09,800 --> 00:19:14,879 Speaker 1: those laws? In the case of just the 329 00:19:14,920 --> 00:19:18,960 Speaker 1: basic assembly of a solar system, of a galaxy, I 330 00:19:19,040 --> 00:19:21,400 Speaker 1: would think, yes, I think this is definitely 331 00:19:21,480 --> 00:19:25,880 Speaker 1: a physics-based bringing together of properties. But 332 00:19:25,960 --> 00:19:28,320 Speaker 1: maybe it's because it's just such a grandiose thing 333 00:19:28,359 --> 00:19:33,880 Speaker 1: to imagine assembling from such minute pieces that it feels like an 334 00:19:33,880 --> 00:19:38,040 Speaker 1: appropriate metaphor for the emergence of, say, consciousness, the emergence 335 00:19:38,080 --> 00:19:43,159 Speaker 1: of intelligence. Because if dust can turn into the Milky 336 00:19:43,160 --> 00:19:46,960 Speaker 1: Way galaxy, and I'm simplifying, but if 337 00:19:47,000 --> 00:19:51,440 Speaker 1: something so vast and complex and energetic can 338 00:19:51,680 --> 00:19:54,159 Speaker 1: come together from such small pieces, then it 339 00:19:54,200 --> 00:19:58,000 Speaker 1: makes sense that something, at least on the individual level, 340 00:19:58,240 --> 00:20:02,080 Speaker 1: as complex and amazing as intelligence and consciousness, 341 00:20:02,200 --> 00:20:05,359 Speaker 1: that that too could emerge from just things bumping into 342 00:20:05,400 --> 00:20:07,760 Speaker 1: each other.
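[Editor's aside: the "particles bumping into each other until a system emerges" picture described above can be sketched in a few lines of Python. This is a toy illustration with made-up parameters (merge radius, drift size, unit masses), not a physics simulation: no rule mentions "planets," yet repeated local merging turns two hundred identical grains into a few large bodies.]

```python
import random

def accrete(n_particles=200, radius=0.05, steps=50, seed=1):
    """Toy accretion: point masses on a unit line drift randomly and
    merge whenever two come within `radius` of each other. Mass is
    conserved, but the population collapses into a few large bodies."""
    rng = random.Random(seed)
    bodies = [[rng.random(), 1.0] for _ in range(n_particles)]  # [position, mass]
    for _ in range(steps):
        # random jostling, damped by mass (heavy bodies drift less)
        for b in bodies:
            b[0] = min(1.0, max(0.0, b[0] + rng.uniform(-0.02, 0.02) / b[1]))
        # sweep left to right, merging neighbors closer than `radius`
        bodies.sort()
        merged = []
        for pos, mass in bodies:
            if merged and pos - merged[-1][0] < radius:
                p0, m0 = merged[-1]  # mass-weighted merge into the last body
                merged[-1] = [(p0 * m0 + pos * mass) / (m0 + mass), m0 + mass]
            else:
                merged.append([pos, mass])
        bodies = merged
    return bodies

final = accrete()
print(len(final), "bodies remain; largest mass:", max(m for _, m in final))
```

Nothing in the update rule names the large bodies; they are only visible at the level of the whole population, which is the sense of "emergence" the hosts are gesturing at.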
Well, yeah, and there you do say something 343 00:20:07,800 --> 00:20:10,159 Speaker 1: that I think should put us on guard against the 344 00:20:10,200 --> 00:20:13,520 Speaker 1: concept of emergentism, which is that it feels good for us, 345 00:20:13,920 --> 00:20:16,040 Speaker 1: you know, the idea that, no, no, 346 00:20:16,200 --> 00:20:20,400 Speaker 1: things like intelligence and higher-order concepts really are 347 00:20:20,760 --> 00:20:23,680 Speaker 1: somehow unique at their level of organization, and they're not 348 00:20:23,800 --> 00:20:28,000 Speaker 1: just reducible to elementary particles and energy states. Well, 349 00:20:28,280 --> 00:20:31,800 Speaker 1: that's something we like to feel, and so 350 00:20:31,840 --> 00:20:34,080 Speaker 1: I think we should be a little on guard 351 00:20:34,480 --> 00:20:37,320 Speaker 1: about that idea. Yeah, but to counter, I 352 00:20:37,320 --> 00:20:40,879 Speaker 1: would say that if the idea of consciousness, of 353 00:20:41,000 --> 00:20:44,400 Speaker 1: being conscious, of being alive, feels majestic, I think 354 00:20:44,400 --> 00:20:48,080 Speaker 1: we can look to just the universe itself and say, well, 355 00:20:48,680 --> 00:20:52,359 Speaker 1: the universe in all its majesty is based on things 356 00:20:52,400 --> 00:20:55,240 Speaker 1: bumping into each other, you know, randomly, and order 357 00:20:55,520 --> 00:20:58,040 Speaker 1: arising out of all of that. Then it's no great 358 00:20:58,040 --> 00:21:01,359 Speaker 1: stretch to say that the mind is the same. The 359 00:21:01,440 --> 00:21:05,240 Speaker 1: cosmos is a humiliating analogy. Yeah, well yeah, because you 360 00:21:05,280 --> 00:21:06,760 Speaker 1: can go either way.
You can say my mind is 361 00:21:06,760 --> 00:21:08,919 Speaker 1: like the universe, man, but you can also say, hey, 362 00:21:08,960 --> 00:21:10,840 Speaker 1: your mind is just like the universe. It's just stuff 363 00:21:10,880 --> 00:21:14,920 Speaker 1: bumping into each other until a system emerges. Okay, well, 364 00:21:14,960 --> 00:21:16,720 Speaker 1: maybe we should look at a few more examples of 365 00:21:16,720 --> 00:21:20,240 Speaker 1: supposed emergent behavior that you can see in natural systems, 366 00:21:20,320 --> 00:21:23,560 Speaker 1: where, you know, at a complex enough level, things seem 367 00:21:23,600 --> 00:21:26,720 Speaker 1: to happen that are not obviously predictable from the simpler 368 00:21:26,760 --> 00:21:30,399 Speaker 1: components acting alone. Well, evolution is a big one, of course. 369 00:21:30,800 --> 00:21:34,680 Speaker 1: I mean, that's the basic underlying principle, right, 370 00:21:34,960 --> 00:21:37,920 Speaker 1: in this constant race, the survival 371 00:21:37,960 --> 00:21:42,480 Speaker 1: of the fittest, natural selection: that you have all 372 00:21:42,480 --> 00:21:45,879 Speaker 1: these different forms, these various mutations that are kind of 373 00:21:45,920 --> 00:21:50,000 Speaker 1: throwing out different versions of the same product into 374 00:21:50,000 --> 00:21:55,040 Speaker 1: the open market of brutal survivalism, and then whatever sticks sticks. Yeah. 375 00:21:55,440 --> 00:21:58,560 Speaker 1: And so that's the thing: if you 376 00:21:58,600 --> 00:22:00,960 Speaker 1: were just in a universe where there was no life, 377 00:22:01,000 --> 00:22:04,240 Speaker 1: but there was just, say, organic chemistry, would 378 00:22:04,280 --> 00:22:07,080 Speaker 1: you be able to really predict that evolution was going 379 00:22:07,119 --> 00:22:11,440 Speaker 1: to occur? Maybe, I don't know. What about this?
380 00:22:11,440 --> 00:22:14,280 Speaker 1: Here's a product of evolution that's often been cited as 381 00:22:14,280 --> 00:22:18,120 Speaker 1: a really interesting version of emergent behavior: hive insect behavior. 382 00:22:18,240 --> 00:22:20,520 Speaker 1: Oh yes, yeah, this is always a cool 383 00:22:20,520 --> 00:22:26,760 Speaker 1: concept, the behavior of eusocial insects: bees, wasps, ants, termites, etcetera, 384 00:22:27,359 --> 00:22:31,800 Speaker 1: but especially bees and ants being like the prime 385 00:22:31,840 --> 00:22:36,160 Speaker 1: examples of this. They're essentially an emergent system. 386 00:22:37,200 --> 00:22:39,399 Speaker 1: After all, you know, how else is all of this 387 00:22:39,440 --> 00:22:43,520 Speaker 1: behavior going to get there? Nobody's programming the ants, nobody's 388 00:22:43,880 --> 00:22:46,919 Speaker 1: telling them, oh, you're the queen and you do this. 389 00:22:47,320 --> 00:22:51,840 Speaker 1: No one ant or one bee is able to display anything 390 00:22:51,960 --> 00:22:55,200 Speaker 1: like the hive behavior we see, and not even any 391 00:22:55,240 --> 00:22:58,040 Speaker 1: small group of them shows even rudimentary signs of it. 392 00:22:58,040 --> 00:22:59,879 Speaker 1: It's only when you get enough of them 393 00:23:00,040 --> 00:23:04,280 Speaker 1: interacting that hive behavior emerges. Yeah, and out of all 394 00:23:04,320 --> 00:23:07,320 Speaker 1: of these little interactions, these roles, it all adds up 395 00:23:07,359 --> 00:23:10,959 Speaker 1: to a kind of, and it's important to say this, you know, 396 00:23:11,080 --> 00:23:14,240 Speaker 1: in the non-science-fictional sense, a 397 00:23:14,359 --> 00:23:17,679 Speaker 1: hive mind.
You know, they don't, like, share a conscious experience, right? 398 00:23:17,720 --> 00:23:20,040 Speaker 1: They don't have their brains all 399 00:23:20,080 --> 00:23:22,760 Speaker 1: hooked up with tubes into a, you know, floating mega 400 00:23:22,800 --> 00:23:25,240 Speaker 1: brain or something. But in a very real, non-sci-fi 401 00:23:25,240 --> 00:23:27,840 Speaker 1: sense, there is this hive mind, this hive-think, 402 00:23:28,280 --> 00:23:31,720 Speaker 1: that emerges, and they're able to do many things 403 00:23:31,760 --> 00:23:34,679 Speaker 1: as a group, solve problems as a group, that the 404 00:23:34,720 --> 00:23:38,520 Speaker 1: individual just cannot. I mean, it's almost cheating to 405 00:23:38,520 --> 00:23:41,399 Speaker 1: say they're incapable of it, because they are capable of 406 00:23:41,440 --> 00:23:43,959 Speaker 1: it as this sort of meta-organism that they've become, 407 00:23:44,600 --> 00:23:48,760 Speaker 1: just not on the individual level. And this approach has 408 00:23:48,800 --> 00:23:52,000 Speaker 1: proven very useful in artificial intelligence and robotics. I'm sure 409 00:23:52,000 --> 00:23:54,480 Speaker 1: you've covered this in the past on the Forward Thinking 410 00:23:54,840 --> 00:23:59,200 Speaker 1: podcast: the study of eusocial insects and 411 00:23:59,320 --> 00:24:02,760 Speaker 1: figuring out how it aids robotics, and engineers facing challenges 412 00:24:02,760 --> 00:24:06,719 Speaker 1: in the interactions of simple machines and machine learning. So you 413 00:24:06,920 --> 00:24:10,239 Speaker 1: essentially have the creation of, like, little robots that are 414 00:24:10,280 --> 00:24:14,760 Speaker 1: behaving like ants. Yeah. Here's another big one: consciousness.
415 00:24:14,800 --> 00:24:17,440 Speaker 1: This is probably the thing that is most often discussed 416 00:24:17,520 --> 00:24:22,560 Speaker 1: as a potentially irreducible phenomenon in nature. So 417 00:24:22,720 --> 00:24:25,000 Speaker 1: you have a mind. You don't just have a brain, 418 00:24:25,560 --> 00:24:27,840 Speaker 1: but you have a mind. Assuming you do have a mind. 419 00:24:27,880 --> 00:24:29,520 Speaker 1: You know, it's impossible for me to know 420 00:24:29,600 --> 00:24:31,840 Speaker 1: if anybody else in the world has a mind. I assume 421 00:24:31,920 --> 00:24:35,120 Speaker 1: you do. You seem like you do; you claim to, usually. 422 00:24:35,920 --> 00:24:38,480 Speaker 1: But yeah, you've got a mind, a conscious experience. And 423 00:24:38,560 --> 00:24:41,480 Speaker 1: there's no analogy that we can find, or that we 424 00:24:41,560 --> 00:24:45,000 Speaker 1: have good evidence of, at lower levels. Right, there's no 425 00:24:45,119 --> 00:24:48,439 Speaker 1: evidence that single neurons are conscious. There's no evidence that 426 00:24:48,880 --> 00:24:51,720 Speaker 1: atoms or molecules or anything like that have any kind 427 00:24:51,720 --> 00:24:54,879 Speaker 1: of rudimentary consciousness. Some people assume this. There's a 428 00:24:55,280 --> 00:24:59,199 Speaker 1: philosophical position known as panpsychism, which is the idea 429 00:24:59,320 --> 00:25:02,920 Speaker 1: that in some kind of way consciousness 430 00:25:02,960 --> 00:25:05,520 Speaker 1: exists all over the universe, in all matter. Is this 431 00:25:05,640 --> 00:25:10,080 Speaker 1: the idea that any sufficiently complex system may manifest 432 00:25:10,240 --> 00:25:12,840 Speaker 1: consciousness? No.
No, this is the idea that 433 00:25:13,000 --> 00:25:15,280 Speaker 1: all matter is in some sense, that a rock is 434 00:25:15,320 --> 00:25:18,520 Speaker 1: in some sense, conscious, not by virtue of being, like, 435 00:25:18,640 --> 00:25:23,320 Speaker 1: a complex system. Okay, yeah, because even a rock 436 00:25:22,920 --> 00:25:25,720 Speaker 1: is complex when you start breaking it down and you 437 00:25:25,760 --> 00:25:29,160 Speaker 1: start really diving in, Powers of Ten style. Now, 438 00:25:29,520 --> 00:25:32,280 Speaker 1: I think that's an interesting speculation, but I don't see 439 00:25:32,320 --> 00:25:34,920 Speaker 1: any evidence for that. It would be hard to know 440 00:25:35,000 --> 00:25:38,199 Speaker 1: what evidence for that would be. Yeah, 441 00:25:38,280 --> 00:25:41,040 Speaker 1: I like it too. I think it lines up 442 00:25:41,160 --> 00:25:46,520 Speaker 1: nicely with various supernatural interpretations of reality, but 443 00:25:46,560 --> 00:25:48,399 Speaker 1: I'm not sure how much I'm willing to invest in 444 00:25:48,440 --> 00:25:51,280 Speaker 1: it at this point. So yeah, if 445 00:25:51,359 --> 00:25:55,040 Speaker 1: the rock speaks, when will it speak? And what would 446 00:25:55,080 --> 00:25:58,359 Speaker 1: it say? That's the thing. I mean, the 447 00:25:58,480 --> 00:26:00,760 Speaker 1: rock, it's seen a lot of stuff, but 448 00:26:00,960 --> 00:26:04,640 Speaker 1: it hasn't really been up to much. You got mud 449 00:26:04,680 --> 00:26:09,480 Speaker 1: on your face, big disgrace. Oh, never mind. Oh yes, 450 00:26:09,560 --> 00:26:13,359 Speaker 1: We Will Rock You. Good reference. You can shame 451 00:26:13,400 --> 00:26:16,080 Speaker 1: me later. Okay. One more thing I would think of 452 00:26:16,240 --> 00:26:20,280 Speaker 1: is the human equivalent of hive insect behavior.
But 453 00:26:20,440 --> 00:26:26,600 Speaker 1: what about the social sciences: sociology, political science, anthropology, the study 454 00:26:26,640 --> 00:26:28,720 Speaker 1: of what humans do in large groups? There seem to 455 00:26:28,760 --> 00:26:33,280 Speaker 1: be phenomena there that are not strictly predictable from 456 00:26:33,359 --> 00:26:36,639 Speaker 1: just an understanding of, say, psychology. Could you take a 457 00:26:36,760 --> 00:26:39,919 Speaker 1: really, really good understanding of psychology and say, this is 458 00:26:39,920 --> 00:26:44,160 Speaker 1: how societies will work? I don't know. I mean, there 459 00:26:44,160 --> 00:26:48,720 Speaker 1: are those who extrapolate meaning from psychological concepts, or those 460 00:26:48,760 --> 00:26:52,280 Speaker 1: who attempt to. But yeah, it becomes increasingly complicated. 461 00:26:52,520 --> 00:26:57,360 Speaker 1: Talking about this difference between, you know, reducible 462 00:26:57,359 --> 00:27:01,160 Speaker 1: phenomena and emergent phenomena, one of my favorite often misattributed 463 00:27:01,240 --> 00:27:06,720 Speaker 1: quotes: Joseph Stalin is alleged, falsely alleged, 464 00:27:06,760 --> 00:27:09,920 Speaker 1: to have said, quantity has a quality all its own. 465 00:27:10,960 --> 00:27:13,440 Speaker 1: I've always liked that quote. I can't find any evidence 466 00:27:13,480 --> 00:27:15,159 Speaker 1: he ever said this, but it does sort of echo 467 00:27:15,200 --> 00:27:18,000 Speaker 1: a sentiment explored by Marx and Engels in their writings 468 00:27:18,040 --> 00:27:22,800 Speaker 1: about economics and their adaptation of Hegelian dialectical philosophy: 469 00:27:22,840 --> 00:27:25,840 Speaker 1: that quantitative differences over time, which is what we'd 470 00:27:25,840 --> 00:27:30,200 Speaker 1: sort of be looking at for a reductive philosophy of science.
471 00:27:30,440 --> 00:27:36,399 Speaker 1: Just quantitative changes actually do become qualitative differences. More is 472 00:27:36,440 --> 00:27:40,640 Speaker 1: not just more. In many cases, more is different. Yes, 473 00:27:40,680 --> 00:27:42,320 Speaker 1: and I think this plays in nicely with a 474 00:27:42,320 --> 00:27:45,240 Speaker 1: couple of papers we'll look at later in the episode. 475 00:27:45,280 --> 00:27:47,320 Speaker 1: As long as we're throwing out quotes, one I've always 476 00:27:47,480 --> 00:27:50,199 Speaker 1: enjoyed on this sort of topic of 477 00:27:50,240 --> 00:27:53,760 Speaker 1: emergence is one from the poet Wallace Stevens, his poem 478 00:27:54,000 --> 00:27:57,240 Speaker 1: Connoisseur of Chaos: A: A violent order is disorder, and 479 00:27:57,280 --> 00:28:00,040 Speaker 1: B: a great disorder is an order. These two 480 00:28:00,160 --> 00:28:03,240 Speaker 1: things are one. Pages of illustrations. Oh man, that's a 481 00:28:03,280 --> 00:28:05,560 Speaker 1: good one. Stevens has a lot of great quotes that 482 00:28:05,600 --> 00:28:08,159 Speaker 1: I think somehow apply to science. You know, 483 00:28:08,200 --> 00:28:11,520 Speaker 1: often I get these feelings about what must 484 00:28:11,560 --> 00:28:14,040 Speaker 1: be true in science, but then I often hear in 485 00:28:14,080 --> 00:28:16,160 Speaker 1: the back of my mind that line from The Emperor 486 00:28:16,160 --> 00:28:22,160 Speaker 1: of Ice-Cream: Let be be finale of seem. He's 487 00:28:22,160 --> 00:28:23,760 Speaker 1: a good one. If anyone out there is looking 488 00:28:23,760 --> 00:28:26,080 Speaker 1: to pick up some thought-provoking poetry, get yourself a 489 00:28:26,080 --> 00:28:30,879 Speaker 1: book of Wallace Stevens and flip around in there.
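[Editor's aside: the "quantity has a quality all its own" idea the hosts just discussed has a crisp mathematical analogue in random-graph percolation. Sliding a single number, the average number of links per node, past a threshold makes a qualitatively new object appear: a giant connected cluster spanning a finite fraction of the network. A minimal sketch, with arbitrary sizes and degrees chosen for illustration:]

```python
import random

def giant_fraction(n=2000, avg_degree=1.5, seed=0):
    """Erdos-Renyi-style sketch: scatter n*avg_degree/2 random edges among
    n nodes and return the fraction of nodes in the largest connected
    component, tracked with a simple union-find structure."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(i):
        # path-halving union-find lookup
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for _ in range(int(n * avg_degree / 2)):
        a, b = rng.randrange(n), rng.randrange(n)
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    sizes = {}
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n

for d in (0.5, 1.5):
    print(f"avg degree {d}: largest cluster holds {giant_fraction(avg_degree=d):.0%} of nodes")
```

Below an average degree of one, the largest cluster stays a vanishing sliver; above it, a giant component abruptly exists. More links is not just more links; past the threshold, more is different.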
Okay, well, 490 00:28:30,880 --> 00:28:32,520 Speaker 1: I think we should take a quick break, and then 491 00:28:32,920 --> 00:28:35,119 Speaker 1: when we come back we will look at one of 492 00:28:35,160 --> 00:28:38,600 Speaker 1: our main resources in this episode, a classic paper from the 493 00:28:38,640 --> 00:28:48,720 Speaker 1: history of science from the nineteen seventies called More Is Different. Alright, 494 00:28:48,720 --> 00:28:52,480 Speaker 1: we're back. So tell me, Joe, is more different? That's 495 00:28:52,480 --> 00:28:56,200 Speaker 1: a good question, Robert. That's sort of the central question 496 00:28:56,240 --> 00:28:58,800 Speaker 1: of this episode. As you reach higher levels of complexity, 497 00:28:58,840 --> 00:29:02,600 Speaker 1: as you get more things together interacting, do different, 498 00:29:02,800 --> 00:29:07,400 Speaker 1: uniquely new properties emerge, or is it just more and more? Well, 499 00:29:07,480 --> 00:29:11,200 Speaker 1: in nineteen seventy two, the Nobel Prize-winning American physicist 500 00:29:11,200 --> 00:29:16,120 Speaker 1: Philip W. Anderson published this massively influential, highly cited paper 501 00:29:16,160 --> 00:29:19,560 Speaker 1: in the journal Science, and the title of the 502 00:29:19,680 --> 00:29:22,959 Speaker 1: essay was More Is Different, and as you can 503 00:29:23,000 --> 00:29:26,200 Speaker 1: probably guess based on the title what position he took 504 00:29:26,240 --> 00:29:30,360 Speaker 1: in the emergentist debate. So Anderson writes that at 505 00:29:30,360 --> 00:29:32,640 Speaker 1: the time he was writing, which was nineteen seventy two, 506 00:29:32,720 --> 00:29:38,240 Speaker 1: philosophers might still debate scientific reductionism, but scientists don't. 507 00:29:38,760 --> 00:29:44,120 Speaker 1: Scientists just take scientific reductionism for granted.
And his 508 00:29:44,240 --> 00:29:48,680 Speaker 1: formulation of the reductionist hypothesis went like this, quote: The 509 00:29:48,720 --> 00:29:51,440 Speaker 1: workings of our minds and bodies, and of all the 510 00:29:51,480 --> 00:29:55,480 Speaker 1: animate or inanimate matter of which we have any detailed knowledge, 511 00:29:55,800 --> 00:29:58,920 Speaker 1: are assumed to be controlled by the same set of 512 00:29:58,960 --> 00:30:03,360 Speaker 1: fundamental laws, which, except under certain extreme conditions, we feel 513 00:30:03,400 --> 00:30:06,640 Speaker 1: we know pretty well. In other words, he's saying, when 514 00:30:06,640 --> 00:30:09,920 Speaker 1: you chase causal explanations deep enough, it all boils down 515 00:30:09,920 --> 00:30:12,800 Speaker 1: to the bottom. It all goes straight down to fundamental physics. 516 00:30:13,040 --> 00:30:15,920 Speaker 1: And that is as it should be, right? I mean, 517 00:30:15,960 --> 00:30:19,680 Speaker 1: that's why we established all of these basic fundamental physical 518 00:30:19,760 --> 00:30:24,600 Speaker 1: laws and interactions, because we wanted an idea of how 519 00:30:25,160 --> 00:30:29,000 Speaker 1: the universe works, and so everything should boil down 520 00:30:29,000 --> 00:30:31,840 Speaker 1: to those laws. If it doesn't, that would indicate there's 521 00:30:31,880 --> 00:30:34,720 Speaker 1: some sort of problem with our laws, with our physics. Exactly. Yeah, 522 00:30:34,920 --> 00:30:39,040 Speaker 1: what good is physics if it's not actually fundamental? So, 523 00:30:39,520 --> 00:30:42,120 Speaker 1: Anderson says, you know, if this is true, many 524 00:30:42,120 --> 00:30:46,080 Speaker 1: people assume that it entails the idea that very few 525 00:30:46,120 --> 00:30:49,360 Speaker 1: people in the sciences are actually working on anything fundamental, 526 00:30:49,480 --> 00:30:54,200 Speaker 1: anything autonomous, anything original.
And to illustrate this frame of mind, 527 00:30:54,440 --> 00:30:57,040 Speaker 1: Anderson quotes this passage from 528 00:30:57,040 --> 00:31:02,120 Speaker 1: the theoretical physicist Victor F. Weisskopf, which sorts all 529 00:31:02,240 --> 00:31:07,920 Speaker 1: science into two categories, which Weisskopf calls intensive and extensive. 530 00:31:08,600 --> 00:31:14,600 Speaker 1: So intensive research tries to discover fundamental laws. Extensive research 531 00:31:14,720 --> 00:31:19,680 Speaker 1: tries to explain phenomena with the use of known fundamental laws. 532 00:31:20,440 --> 00:31:23,920 Speaker 1: So at any given time, a very 533 00:31:23,960 --> 00:31:27,200 Speaker 1: small minority of scientists, generally in fields like particle physics, 534 00:31:27,520 --> 00:31:31,760 Speaker 1: are working on describing fundamental laws that govern reality. 535 00:31:31,800 --> 00:31:36,080 Speaker 1: They're doing the intensive science. And meanwhile, the vast majority 536 00:31:36,120 --> 00:31:40,120 Speaker 1: of scientists are just taking the models of fundamental laws 537 00:31:40,160 --> 00:31:44,400 Speaker 1: and applying them as an explanation for anything. For why 538 00:31:44,480 --> 00:31:47,920 Speaker 1: the rain in Nashville smells like hot dog water? 539 00:31:48,240 --> 00:31:53,000 Speaker 1: Sometimes. I'm just kidding. You know, sometimes rain anywhere 540 00:31:53,000 --> 00:31:56,520 Speaker 1: smells like that. I haven't spent enough time in Nashville. You know, anything: 541 00:31:56,560 --> 00:32:00,479 Speaker 1: why your eyes won't stop bleeding. So the 542 00:32:00,560 --> 00:32:04,760 Speaker 1: extension of this distinction, some presume, is that once we 543 00:32:04,840 --> 00:32:07,520 Speaker 1: have a fundamental theory of physics at the base of 544 00:32:07,520 --> 00:32:11,600 Speaker 1: all science, there's no intensive science left to do.
Does 545 00:32:11,600 --> 00:32:14,200 Speaker 1: that make sense? Like, you could still apply theories up 546 00:32:14,240 --> 00:32:18,040 Speaker 1: the chain, but there's nothing original left to discover. It's 547 00:32:18,160 --> 00:32:23,200 Speaker 1: just continually the application of what we know to different phenomena. 548 00:32:24,400 --> 00:32:27,760 Speaker 1: But Anderson throws down a flag here. He says, hold on. 549 00:32:28,400 --> 00:32:32,160 Speaker 1: Let's say we accept the reductionist hypothesis, that we can 550 00:32:32,200 --> 00:32:38,600 Speaker 1: reduce explanations of complex phenomena to simpler, more fundamental physical laws. 551 00:32:38,600 --> 00:32:42,040 Speaker 1: That doesn't necessarily imply the converse, which he calls the, 552 00:32:42,120 --> 00:32:46,440 Speaker 1: quote, constructionist hypothesis. In his words, it 553 00:32:46,520 --> 00:32:50,280 Speaker 1: does not imply the ability to start from those laws 554 00:32:50,320 --> 00:32:55,040 Speaker 1: and reconstruct the universe. So what is science supposed to do? 555 00:32:55,120 --> 00:32:57,480 Speaker 1: It's supposed to be able to predict, right? If you 556 00:32:57,520 --> 00:33:00,080 Speaker 1: have a correct scientific theory, you should be able to 557 00:33:00,120 --> 00:33:05,480 Speaker 1: make accurate predictions about the future. But if you can't 558 00:33:06,240 --> 00:33:10,240 Speaker 1: make accurate predictions about the future from the fundamental laws 559 00:33:10,280 --> 00:33:13,760 Speaker 1: of physics, then do the fundamental laws of physics really 560 00:33:13,840 --> 00:33:18,840 Speaker 1: describe everything? So, in Anderson's view, why does 561 00:33:18,880 --> 00:33:21,440 Speaker 1: it not imply that we can start from the fundamental 562 00:33:21,520 --> 00:33:24,200 Speaker 1: laws and predict everything?
You know, shouldn't we be 563 00:33:24,280 --> 00:33:26,880 Speaker 1: able to do that in principle? Well, according 564 00:33:26,880 --> 00:33:30,280 Speaker 1: to Anderson, the answer is no. And Anderson says there 565 00:33:30,320 --> 00:33:34,440 Speaker 1: are two main problems with the constructionist hypothesis. One is 566 00:33:34,520 --> 00:33:38,680 Speaker 1: scale and the other is complexity. And I just want 567 00:33:38,680 --> 00:33:41,160 Speaker 1: to read a quote from him. Anderson writes, quote, the 568 00:33:41,240 --> 00:33:45,560 Speaker 1: behavior of large and complex aggregations of elementary particles. So 569 00:33:45,640 --> 00:33:49,400 Speaker 1: that would be anything: a football or, to return to 570 00:33:49,560 --> 00:33:54,240 Speaker 1: it, a hot dog, a jar of pickles. Yes, I guess 571 00:33:54,280 --> 00:33:58,600 Speaker 1: I'm thinking of all cylindrical foods. I'm not sure why. 572 00:33:58,720 --> 00:34:03,000 Speaker 1: The behavior of large and complex aggregations of 573 00:34:03,000 --> 00:34:06,760 Speaker 1: elementary particles, it turns out, is not to be understood 574 00:34:06,800 --> 00:34:09,360 Speaker 1: in terms of a simple extrapolation of the properties of 575 00:34:09,400 --> 00:34:14,680 Speaker 1: a few particles. Instead, at each level of complexity, entirely 576 00:34:14,719 --> 00:34:18,400 Speaker 1: new properties appear, and the understanding of the new behaviors 577 00:34:18,400 --> 00:34:21,839 Speaker 1: requires research which I think is as fundamental in its 578 00:34:21,920 --> 00:34:25,480 Speaker 1: nature as any other. So he's throwing in with 579 00:34:25,560 --> 00:34:29,520 Speaker 1: a certain version of the emergentist hypothesis.
Studying what happens 580 00:34:29,560 --> 00:34:32,480 Speaker 1: to more complex bodies, like studying what happens to a 581 00:34:32,560 --> 00:34:37,280 Speaker 1: jar of pickles, is doing original research that is actually 582 00:34:37,440 --> 00:34:42,399 Speaker 1: yielding hypotheses and theories that are not predictable from just 583 00:34:42,680 --> 00:34:47,160 Speaker 1: understanding the particles that make up that jar of pickles. Well, 584 00:34:47,160 --> 00:34:50,839 Speaker 1: this just reminds me again of societal examples and 585 00:34:50,880 --> 00:34:53,200 Speaker 1: the idea of wicked problems: like rolling out a 586 00:34:53,239 --> 00:34:58,239 Speaker 1: solution into the complex system that is society and not 587 00:34:58,320 --> 00:35:02,240 Speaker 1: realizing that the solution is going to spin off additional problems. 588 00:35:02,239 --> 00:35:04,719 Speaker 1: It's going to create additional complexity. There are going to be 589 00:35:04,920 --> 00:35:09,239 Speaker 1: emergent problems out of your solution. Right, yeah, 590 00:35:09,280 --> 00:35:12,399 Speaker 1: there are things we can't predict from simpler principles, even 591 00:35:12,480 --> 00:35:17,000 Speaker 1: if those simpler principles are correct. So, 592 00:35:17,080 --> 00:35:19,879 Speaker 1: just to clarify, Anderson accepts that the sciences of more 593 00:35:19,920 --> 00:35:25,920 Speaker 1: complex phenomena are explanatorily dependent on the sciences of simpler phenomena. Right, 594 00:35:26,080 --> 00:35:29,760 Speaker 1: psychology is in a sense dependent on biology, we couldn't 595 00:35:29,760 --> 00:35:33,160 Speaker 1: have it without it, you know, which is dependent on chemistry, 596 00:35:33,200 --> 00:35:36,480 Speaker 1: which is dependent on physics.
But explicitly he rejects the 597 00:35:36,520 --> 00:35:41,239 Speaker 1: idea that this means psychology is just applied biology, or 598 00:35:41,280 --> 00:35:44,719 Speaker 1: that biology is just applied chemistry. At each of these 599 00:35:44,719 --> 00:35:49,399 Speaker 1: new levels of complexity, genuinely novel properties emerge which are 600 00:35:49,440 --> 00:35:53,279 Speaker 1: not necessarily predictable from a complete understanding of the more 601 00:35:53,360 --> 00:35:57,000 Speaker 1: fundamental science. And he grounds this in an example 602 00:35:57,040 --> 00:36:00,000 Speaker 1: from his own field, because he works in many-body physics. 603 00:36:01,400 --> 00:36:04,160 Speaker 1: He grounds it in this concept that 604 00:36:04,280 --> 00:36:08,040 Speaker 1: is known as symmetry breaking. So what does that mean? Well, 605 00:36:08,480 --> 00:36:12,319 Speaker 1: for Anderson, the study of fundamental physics is almost synonymous 606 00:36:12,360 --> 00:36:15,600 Speaker 1: with the study of symmetry. In other words, fundamental physics 607 00:36:15,680 --> 00:36:18,319 Speaker 1: is the search for the laws of reality that are 608 00:36:18,880 --> 00:36:22,600 Speaker 1: homogeneous and isotropic. What does that mean? It means they're 609 00:36:22,640 --> 00:36:26,360 Speaker 1: the same everywhere and they apply to everything, no matter 610 00:36:26,480 --> 00:36:29,520 Speaker 1: from what vantage point you look. That sounds like a 611 00:36:29,560 --> 00:36:32,040 Speaker 1: good description to me of what the fundamental laws should be. 612 00:36:32,080 --> 00:36:36,520 Speaker 1: In other words, they're fundamentally symmetrical. They're the same everywhere, right? 613 00:36:36,560 --> 00:36:38,479 Speaker 1: It works in the city, works in the country, works 614 00:36:38,480 --> 00:36:41,439 Speaker 1: on Earth, works in Alpha Centauri.
Right, and that's 615 00:36:41,440 --> 00:36:45,360 Speaker 1: what physics should be. But while all matter obeys basic 616 00:36:45,400 --> 00:36:49,080 Speaker 1: electrodynamics and quantum theory, many objects in the universe, and 617 00:36:49,160 --> 00:36:53,040 Speaker 1: not just minds and societies, but Anderson uses examples of 618 00:36:53,080 --> 00:36:57,720 Speaker 1: tiny, basic physical structures, many of these objects display novel 619 00:36:58,000 --> 00:37:01,799 Speaker 1: or asymmetrical properties which, he says, are not strictly predictable 620 00:37:02,280 --> 00:37:05,520 Speaker 1: from the symmetrical laws that govern them. So 621 00:37:05,719 --> 00:37:09,400 Speaker 1: these asymmetries include, he gives examples like, the inversion of 622 00:37:09,440 --> 00:37:14,320 Speaker 1: the ammonia molecule; the shapes of atomic nuclei, like sometimes 623 00:37:14,320 --> 00:37:17,880 Speaker 1: an atomic nucleus, you can work out mathematically, is in 624 00:37:17,920 --> 00:37:21,759 Speaker 1: a sense shaped like a football or shaped like a plate; 625 00:37:21,800 --> 00:37:24,520 Speaker 1: and he talks about the structures of crystals. These are, 626 00:37:24,680 --> 00:37:28,760 Speaker 1: you know, based on symmetrical laws, 627 00:37:28,800 --> 00:37:34,000 Speaker 1: but the symmetrical laws end up generating asymmetries in reality. 628 00:37:34,600 --> 00:37:37,840 Speaker 1: So in Anderson's view, the question is why are large 629 00:37:37,920 --> 00:37:42,280 Speaker 1: systems not just bigger than elementary particles, but fundamentally different 630 00:37:42,400 --> 00:37:45,800 Speaker 1: from them, with unique properties to study. And here I 631 00:37:45,800 --> 00:37:47,760 Speaker 1: want to read a quote from Anderson.
He says, quote, 632 00:37:48,040 --> 00:37:51,839 Speaker 1: the essential idea is that in the so-called 633 00:37:51,960 --> 00:37:55,759 Speaker 1: N-goes-to-infinity limit of large systems, on 634 00:37:55,800 --> 00:37:59,120 Speaker 1: our own macroscopic scale, it is not only convenient but 635 00:37:59,440 --> 00:38:04,360 Speaker 1: essential to realize that matter will undergo mathematically sharp, singular 636 00:38:04,760 --> 00:38:09,560 Speaker 1: phase transitions to states in which the microscopic symmetries, 637 00:38:10,400 --> 00:38:13,400 Speaker 1: and even the microscopic equations of motion, are in a 638 00:38:13,560 --> 00:38:17,960 Speaker 1: sense violated. The symmetry leaves behind as its expression 639 00:38:18,080 --> 00:38:23,600 Speaker 1: only certain characteristic behaviors, for example, long-wavelength vibrations, of 640 00:38:23,640 --> 00:38:27,120 Speaker 1: which the familiar example is sound waves, or the unusual 641 00:38:27,200 --> 00:38:31,759 Speaker 1: macroscopic conduction phenomena of the superconductor, or, in a very 642 00:38:31,800 --> 00:38:36,560 Speaker 1: deep analogy, the very rigidity of crystal lattices, and thus 643 00:38:36,600 --> 00:38:40,120 Speaker 1: of most solid matter. There is, of course, no question 644 00:38:40,239 --> 00:38:44,040 Speaker 1: of the systems really violating, as opposed to breaking, the 645 00:38:44,120 --> 00:38:47,919 Speaker 1: symmetry of space and time. But because its parts find 646 00:38:47,960 --> 00:38:52,440 Speaker 1: it energetically more favorable to maintain certain fixed relationships with 647 00:38:52,520 --> 00:38:55,719 Speaker 1: each other, the symmetry allows only the body as a 648 00:38:55,800 --> 00:39:00,120 Speaker 1: whole to respond to external forces.
So again, he's 649 00:39:00,120 --> 00:39:03,000 Speaker 1: not saying that a big macroscopic system, like 650 00:39:03,040 --> 00:39:06,440 Speaker 1: a jar of pickles, violates the laws of physics, but 651 00:39:06,480 --> 00:39:09,759 Speaker 1: he's saying, at certain levels of complexity, large objects make 652 00:39:09,880 --> 00:39:12,959 Speaker 1: more sense understood as a whole than at the level 653 00:39:13,000 --> 00:39:16,759 Speaker 1: of their constituent parts. And the whole has a 654 00:39:16,840 --> 00:39:20,799 Speaker 1: novel scheme of behavior that's not easily predictable from the 655 00:39:20,880 --> 00:39:24,640 Speaker 1: nature of its elementary particles. And then of course 656 00:39:24,680 --> 00:39:26,600 Speaker 1: he's like, well, now we've just talked about, you know, 657 00:39:26,640 --> 00:39:28,719 Speaker 1: crystals and stuff like that, but he of course says, 658 00:39:28,760 --> 00:39:30,720 Speaker 1: you know, this applies to DNA and stuff 659 00:39:30,760 --> 00:39:33,440 Speaker 1: like that too. Of course, once you get much more complex, 660 00:39:33,480 --> 00:39:37,040 Speaker 1: the problem is magnified all the more. Things just 661 00:39:37,440 --> 00:39:42,359 Speaker 1: become seemingly impossible to reduce to, or predict from, 662 00:39:42,680 --> 00:39:46,080 Speaker 1: the underlying laws of elementary particles. There are these quantum 663 00:39:46,160 --> 00:39:50,359 Speaker 1: leaps where it appears that quantity has a quality all 664 00:39:50,360 --> 00:39:53,000 Speaker 1: its own suddenly. And of course, at the end of 665 00:39:53,040 --> 00:39:55,960 Speaker 1: his paper he paraphrases Marx in saying quantity 666 00:39:55,960 --> 00:39:58,600 Speaker 1: has a quality all its own, and then I love this. 667 00:39:58,680 --> 00:40:02,719 Speaker 1: He also quotes a supposed conversation between F.
Scott Fitzgerald 668 00:40:02,760 --> 00:40:06,719 Speaker 1: and Ernest Hemingway, where of course Fitzgerald says the rich 669 00:40:06,719 --> 00:40:10,719 Speaker 1: are different from us, and Hemingway replies, yes, they have 670 00:40:10,840 --> 00:40:14,560 Speaker 1: more money. Now this is interesting because it immediately brings 671 00:40:14,560 --> 00:40:18,960 Speaker 1: to mind some reductionist criticisms I've seen thrown out 672 00:40:19,239 --> 00:40:23,480 Speaker 1: before about, say, human beings: to say, oh, 673 00:40:23,520 --> 00:40:26,360 Speaker 1: well, you can dissect a human being. You can, 674 00:40:26,600 --> 00:40:29,120 Speaker 1: you can hold a human heart in your hand, but 675 00:40:29,200 --> 00:40:31,080 Speaker 1: you're not going to get a sense of who that 676 00:40:31,120 --> 00:40:34,479 Speaker 1: person was based on that experience. Yeah, and I would say, 677 00:40:34,480 --> 00:40:39,239 Speaker 1: actually, that Anderson is not the final word 678 00:40:39,239 --> 00:40:43,080 Speaker 1: on this, obviously, like people disagree with him, but this 679 00:40:43,120 --> 00:40:45,600 Speaker 1: has been a really interesting and influential paper. And it's 680 00:40:45,600 --> 00:40:48,480 Speaker 1: also not to rule out the idea that redundant sciences 681 00:40:48,480 --> 00:40:52,239 Speaker 1: do exist somewhere. For example, there might be fields of 682 00:40:52,280 --> 00:40:55,279 Speaker 1: science that really do reduce to nothing more 683 00:40:55,360 --> 00:40:58,520 Speaker 1: than the application of principles of a more fundamental field 684 00:40:58,560 --> 00:41:01,279 Speaker 1: of science. But it just looks like this is not 685 00:41:01,360 --> 00:41:04,920 Speaker 1: the case for most, if not all, mature scientific fields.
686 00:41:05,640 --> 00:41:08,799 Speaker 1: But somebody out there in a lab right now... it 687 00:41:08,880 --> 00:41:11,640 Speaker 1: could happen to you. You could be reduced to a 688 00:41:11,680 --> 00:41:15,359 Speaker 1: simpler field of study. Okay. But maybe we should look 689 00:41:15,360 --> 00:41:19,280 Speaker 1: at a counterpoint, because, as I mentioned, not everybody 690 00:41:19,320 --> 00:41:23,520 Speaker 1: agrees with Anderson. Well, so what if maybe 691 00:41:23,560 --> 00:41:26,560 Speaker 1: it's not as different as you think? More might seem 692 00:41:26,640 --> 00:41:30,560 Speaker 1: different or feel different, and more might be useful to 693 00:41:30,719 --> 00:41:34,640 Speaker 1: treat as different given our limitations. But maybe it's not 694 00:41:34,960 --> 00:41:38,799 Speaker 1: really different. There's nothing actually unique going on at higher 695 00:41:38,880 --> 00:41:42,520 Speaker 1: levels of complexity. It's just convenient for us to treat 696 00:41:42,560 --> 00:41:45,439 Speaker 1: it that way. And here I want to come to 697 00:41:45,520 --> 00:41:49,920 Speaker 1: another Nobel Prize winning American physicist, Steven Weinberg, who offers 698 00:41:49,960 --> 00:41:54,880 Speaker 1: a really interesting counter-analysis in his book Dreams 699 00:41:54,880 --> 00:41:57,320 Speaker 1: of a Final Theory. Have you ever read anything by Weinberg? 700 00:41:59,120 --> 00:42:02,239 Speaker 1: I am not sure that I have. He's a really 701 00:42:02,239 --> 00:42:05,960 Speaker 1: good writer.
The first chapter of this book is just 702 00:42:06,000 --> 00:42:11,000 Speaker 1: this brilliant, rollicking, fun adventure through science, through chemistry and 703 00:42:11,040 --> 00:42:14,319 Speaker 1: particle physics, if you can believe that, where he 704 00:42:14,360 --> 00:42:17,439 Speaker 1: talks about a piece of chalk, and he's like, let's 705 00:42:17,440 --> 00:42:20,279 Speaker 1: apply the reductionist hypothesis to a piece of chalk, and 706 00:42:20,480 --> 00:42:23,799 Speaker 1: in every way possible look at its properties 707 00:42:24,120 --> 00:42:27,560 Speaker 1: and ask why. And every time you ask why, why 708 00:42:27,680 --> 00:42:30,440 Speaker 1: is the chalk white? Why is the chalk shaped like 709 00:42:30,480 --> 00:42:33,439 Speaker 1: it is? You know, any question like that, 710 00:42:33,520 --> 00:42:36,960 Speaker 1: what you're doing, essentially, is playing the reductionist game, right? 711 00:42:37,040 --> 00:42:39,239 Speaker 1: You're going one level down. I have to do 712 00:42:39,280 --> 00:42:41,959 Speaker 1: that all the time as a father. Yeah, 713 00:42:42,160 --> 00:42:45,759 Speaker 1: I'm constantly asked questions. I mean, he hasn't asked me 714 00:42:45,760 --> 00:42:47,799 Speaker 1: about chalk, but I can easily imagine him asking me 715 00:42:47,840 --> 00:42:50,400 Speaker 1: those very questions. Why is it white? Why does it 716 00:42:50,440 --> 00:42:53,080 Speaker 1: do this? Why is it that? I will get very reductionist 717 00:42:53,200 --> 00:42:57,000 Speaker 1: questions about virtually everything. And when you do that, you're 718 00:42:57,040 --> 00:43:01,120 Speaker 1: putting reductionism into practice.
You're saying, okay, well, I can 719 00:43:01,160 --> 00:43:05,240 Speaker 1: explain these higher properties in terms of lower properties, 720 00:43:05,640 --> 00:43:08,840 Speaker 1: simpler properties that cause an effect we see at a 721 00:43:08,880 --> 00:43:11,680 Speaker 1: large scale. Yeah, I often don't see it quite 722 00:43:11,760 --> 00:43:14,120 Speaker 1: that beautifully. Generally, it's like, oh jeez, I'm just trying 723 00:43:14,160 --> 00:43:16,160 Speaker 1: to drive you to school, and now I've 724 00:43:16,200 --> 00:43:20,280 Speaker 1: got to explain gravity, you know, because you were asking 725 00:43:20,320 --> 00:43:22,800 Speaker 1: about, you know, a bird or a football or something. 726 00:43:23,480 --> 00:43:26,520 Speaker 1: What, like, why did that bird fly into the car window? 727 00:43:26,840 --> 00:43:29,680 Speaker 1: That's, uh, no, no, unfortunately not. 728 00:43:29,920 --> 00:43:32,400 Speaker 1: But it will just be, you know, random, 729 00:43:32,440 --> 00:43:35,279 Speaker 1: wonderful questions about just how the universe works, and it 730 00:43:35,280 --> 00:43:38,120 Speaker 1: will generally start with a particular detail, but it 731 00:43:38,200 --> 00:43:42,840 Speaker 1: quickly spirals out into these very complex, you know, 732 00:43:42,960 --> 00:43:47,799 Speaker 1: notions of reality. Yeah. So Weinberg is 733 00:43:47,840 --> 00:43:51,200 Speaker 1: a fan of reductionism. He's looking for a final theory. 734 00:43:51,280 --> 00:43:54,040 Speaker 1: He wants to find a final theory of physics. And 735 00:43:54,200 --> 00:43:57,080 Speaker 1: ultimately he says, yeah, maybe in practice we can't 736 00:43:57,120 --> 00:43:59,680 Speaker 1: actually reduce everything to physics, like it might just be 737 00:43:59,680 --> 00:44:02,840 Speaker 1: beyond our capabilities.
But in theory, everything should be 738 00:44:02,880 --> 00:44:06,000 Speaker 1: reducible to fundamental physics; there should be no higher 739 00:44:06,120 --> 00:44:10,120 Speaker 1: order insights, really. It's all there in the physics. 740 00:44:10,160 --> 00:44:13,879 Speaker 1: So in this opening chapter he's discussing problems with 741 00:44:14,000 --> 00:44:17,480 Speaker 1: putting the reductionist hypothesis into practice, and he acknowledges there 742 00:44:17,480 --> 00:44:20,640 Speaker 1: are plenty of problems. He's not cavalier about that. 743 00:44:20,640 --> 00:44:23,239 Speaker 1: And one of the problems with reducing things like biology 744 00:44:23,360 --> 00:44:26,360 Speaker 1: to fundamental physics, he points out, is that biology is 745 00:44:26,400 --> 00:44:30,120 Speaker 1: not just a product of fundamental laws; biology also 746 00:44:30,160 --> 00:44:35,120 Speaker 1: incorporates stuff that happened in the past. Like, it is 747 00:44:35,160 --> 00:44:38,719 Speaker 1: the product of both the fundamental laws of physics and 748 00:44:39,040 --> 00:44:43,640 Speaker 1: some accidents of history. In Weinberg's view, biology wouldn't be 749 00:44:43,760 --> 00:44:47,240 Speaker 1: the way it is if some different things had happened 750 00:44:47,239 --> 00:44:51,080 Speaker 1: in the past. So I think that's kind of interesting. 751 00:44:51,120 --> 00:44:53,920 Speaker 1: And in this sense, you see, in sciences like biology, 752 00:44:54,080 --> 00:44:58,480 Speaker 1: the past becomes calcified into structures that all life on 753 00:44:58,520 --> 00:45:04,360 Speaker 1: Earth uses. And so physics appears to be timeless and universal, 754 00:45:04,640 --> 00:45:08,120 Speaker 1: but biology is a contingent science. It's a result of 755 00:45:08,239 --> 00:45:12,000 Speaker 1: something that happened at one point.
Now, you could 756 00:45:12,000 --> 00:45:13,960 Speaker 1: maybe go to a higher level and say that even 757 00:45:13,960 --> 00:45:17,120 Speaker 1: physics could be that way. You know, maybe 758 00:45:17,120 --> 00:45:20,120 Speaker 1: there's a multiverse. Maybe the laws of physics in this 759 00:45:20,200 --> 00:45:23,000 Speaker 1: local universe are in fact contingent. They didn't have to 760 00:45:23,040 --> 00:45:26,319 Speaker 1: be that way. Different universes could have different laws of physics. 761 00:45:26,360 --> 00:45:29,000 Speaker 1: That's possible, but they at least appear to be universal 762 00:45:29,080 --> 00:45:31,640 Speaker 1: in this universe. Okay. So yeah, when we look at 763 00:45:31,640 --> 00:45:35,840 Speaker 1: a complex system, we're also looking at a process. Yeah. 764 00:45:35,880 --> 00:45:39,360 Speaker 1: But then Weinberg also deals with the concept of 765 00:45:39,400 --> 00:45:42,600 Speaker 1: emergence, and he's respectful toward it, 766 00:45:42,680 --> 00:45:44,880 Speaker 1: but he tries to show that he thinks it doesn't 767 00:45:45,080 --> 00:45:49,719 Speaker 1: undercut the reductionist hypothesis. So he cites Anderson's essay More 768 00:45:49,840 --> 00:45:53,640 Speaker 1: Is Different, and Weinberg stresses, like Anderson, that while the most 769 00:45:53,680 --> 00:45:56,960 Speaker 1: obvious examples of potential emergence are in the biological and 770 00:45:57,000 --> 00:46:00,480 Speaker 1: social sciences, if emergence exists, it appears to be in 771 00:46:00,920 --> 00:46:05,680 Speaker 1: physics as well. And he gives this prime example: thermodynamics, 772 00:46:06,400 --> 00:46:09,799 Speaker 1: the study of heat. Now you might be thinking, like, well, 773 00:46:09,840 --> 00:46:13,520 Speaker 1: how could heat be all that complex? Heat is mega complex.
774 00:46:13,680 --> 00:46:15,880 Speaker 1: If you ask somebody who's been trying to do, you know, 775 00:46:15,960 --> 00:46:21,239 Speaker 1: calculations in thermodynamics, it's really complicated. And Weinberg points out 776 00:46:21,280 --> 00:46:25,840 Speaker 1: that in the nineteenth century, thermodynamics was a fundamentally different 777 00:46:25,960 --> 00:46:31,400 Speaker 1: and distinct science. It was considered logically autonomous and kept 778 00:46:31,480 --> 00:46:35,439 Speaker 1: separate from general mechanics. So you might have your Newtonians 779 00:46:35,480 --> 00:46:39,480 Speaker 1: over here, you know, doing their mechanics work, and then 780 00:46:39,600 --> 00:46:44,080 Speaker 1: you've got your thermodynamicists. And so while physics relied on 781 00:46:44,200 --> 00:46:48,000 Speaker 1: concepts like particles and forces, he says, thermodynamics relied on 782 00:46:48,080 --> 00:46:51,759 Speaker 1: concepts like temperature and entropy, which just did not have 783 00:46:51,880 --> 00:46:56,080 Speaker 1: counterparts in general mechanics. And he says that 784 00:46:56,160 --> 00:46:59,360 Speaker 1: the only real bridge was the first law of thermodynamics, 785 00:46:59,400 --> 00:47:02,759 Speaker 1: the conservation of energy, that linked thermodynamics with 786 00:47:02,800 --> 00:47:05,359 Speaker 1: the rest of physics. But he writes that the main 787 00:47:05,440 --> 00:47:08,680 Speaker 1: idea in thermodynamics was the second law, which says that 788 00:47:08,760 --> 00:47:13,360 Speaker 1: in any closed system, there's this magical quantity called entropy 789 00:47:13,400 --> 00:47:16,920 Speaker 1: which tends to increase over time until the system reaches 790 00:47:17,000 --> 00:47:20,480 Speaker 1: a state of equilibrium, until everything just sort of equals 791 00:47:20,520 --> 00:47:25,080 Speaker 1: out and becomes very mellow.
But then, he writes, 792 00:47:25,120 --> 00:47:27,840 Speaker 1: in the nineteenth century, physicists took the second law of 793 00:47:27,880 --> 00:47:31,360 Speaker 1: thermodynamics as an axiom. They believed in it 794 00:47:31,400 --> 00:47:34,920 Speaker 1: basically on the basis of induction. And you could, 795 00:47:35,080 --> 00:47:38,800 Speaker 1: and still can, see examples of thermodynamics all over nature. 796 00:47:39,080 --> 00:47:41,520 Speaker 1: You can look at the behavior of steam billowing up 797 00:47:41,560 --> 00:47:45,080 Speaker 1: from a pot and see thermodynamics. You can see 798 00:47:45,120 --> 00:47:48,200 Speaker 1: freezing and boiling liquids, and then you can even see 799 00:47:48,640 --> 00:47:52,840 Speaker 1: versions of what look like thermodynamics in globular clusters in space. 800 00:47:53,480 --> 00:47:57,879 Speaker 1: But if you see thermodynamic principles play out over 801 00:47:57,960 --> 00:48:00,520 Speaker 1: all scales of the universe, from, like, molecules of 802 00:48:00,640 --> 00:48:03,759 Speaker 1: H2O in your kitchen to clouds and clouds of 803 00:48:03,800 --> 00:48:08,960 Speaker 1: stars and galaxies, then surely thermodynamics is logically independent from 804 00:48:08,960 --> 00:48:14,320 Speaker 1: fundamental physics, right? But Weinberg says no.
He writes that eventually, 805 00:48:14,560 --> 00:48:18,320 Speaker 1: the work of theoretical physicists like Maxwell, Boltzmann and Gibbs 806 00:48:19,680 --> 00:48:23,600 Speaker 1: showed that, quote, the principles of thermodynamics could in fact 807 00:48:23,680 --> 00:48:28,200 Speaker 1: be deduced mathematically by an analysis of the probabilities 808 00:48:28,239 --> 00:48:32,720 Speaker 1: of different configurations of certain kinds of systems, those systems 809 00:48:32,760 --> 00:48:36,520 Speaker 1: whose energy is shared among a very large number of subsystems, 810 00:48:36,920 --> 00:48:39,880 Speaker 1: as for instance a gas, whose energy is shared among 811 00:48:39,920 --> 00:48:43,520 Speaker 1: the molecules of which it is composed. So, in other words, 812 00:48:43,719 --> 00:48:46,440 Speaker 1: they came up with the interpretive bridge to show how 813 00:48:46,480 --> 00:48:53,200 Speaker 1: thermodynamics reduces to underlying mechanics, statistical mechanics. And this amazing, 814 00:48:53,360 --> 00:48:57,280 Speaker 1: weird property known as heat really just is the combined 815 00:48:57,440 --> 00:48:59,880 Speaker 1: kinetic energy of all the particles in the system. And 816 00:49:00,000 --> 00:49:02,239 Speaker 1: that's what we're taught in school now: you learn heat 817 00:49:02,320 --> 00:49:06,040 Speaker 1: is the kinetic energy of vibrating particles. And entropy 818 00:49:06,160 --> 00:49:10,560 Speaker 1: is actually just a measure of how disordered the system is; the second law 819 00:49:10,640 --> 00:49:12,560 Speaker 1: just means the amount of order in a closed 820 00:49:12,560 --> 00:49:16,520 Speaker 1: system decreases over time. So thermodynamics, he says, has been 821 00:49:16,560 --> 00:49:21,520 Speaker 1: reduced to underlying theories of particles and forces.
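[Editor's aside for readers following along at home: the Maxwell-Boltzmann-Gibbs bridge described here can be sketched numerically. In this toy Python example, the gas, its mass, and its speeds are all made up for illustration and are not from Weinberg's book; it recovers the macroscopic quantity "temperature" from nothing but microscopic particle data, using the monatomic ideal gas relation that mean kinetic energy per particle is (3/2) k_B T.]

```python
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature_from_particles(mass, velocities):
    """Statistical-mechanics bridge in miniature: for a monatomic
    ideal gas, mean kinetic energy per particle is (3/2) k_B T,
    so 'temperature' reduces to an average over particle motions."""
    mean_ke = sum(0.5 * mass * (vx * vx + vy * vy + vz * vz)
                  for vx, vy, vz in velocities) / len(velocities)
    return (2.0 / 3.0) * mean_ke / K_B

# Hypothetical helium-like gas prepared at ~300 K: each velocity
# component is Gaussian with variance k_B * T / m, which is the
# Maxwell-Boltzmann form for that temperature.
random.seed(0)
mass = 6.64e-27  # kg, roughly one helium atom
sigma = (K_B * 300.0 / mass) ** 0.5
velocities = [(random.gauss(0, sigma), random.gauss(0, sigma),
               random.gauss(0, sigma)) for _ in range(100_000)]

print(f"{temperature_from_particles(mass, velocities):.1f} K")
```

[With 100,000 particles the recovered temperature lands within a few kelvin of 300, illustrating the point that the thermodynamic quantity is deducible from, not additional to, the underlying mechanics.]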
And yet, 822 00:49:22,000 --> 00:49:25,680 Speaker 1: Weinberg writes, you know, these higher order, complex and seemingly 823 00:49:25,719 --> 00:49:30,080 Speaker 1: emergent properties like temperature and entropy, which have no counterpart 824 00:49:30,239 --> 00:49:32,799 Speaker 1: at the scale of individual particles, they're just not there 825 00:49:32,880 --> 00:49:36,120 Speaker 1: down low, are still useful for lots of kinds of explanations. 826 00:49:36,160 --> 00:49:39,839 Speaker 1: So he's not saying higher order sciences aren't useful; they're 827 00:49:39,880 --> 00:49:44,520 Speaker 1: just not actually fundamental. They're not describing anything novel, necessarily. 828 00:49:45,320 --> 00:49:49,200 Speaker 1: So he concludes this discussion 829 00:49:49,600 --> 00:49:53,160 Speaker 1: by saying, quote, thermodynamics is more like a mode of 830 00:49:53,320 --> 00:49:58,200 Speaker 1: reasoning than a body of universal physical law. Wherever it applies, 831 00:49:58,480 --> 00:50:01,160 Speaker 1: it always allows us to justify the use of 832 00:50:01,200 --> 00:50:05,520 Speaker 1: the same principles. But the explanation of why thermodynamics 833 00:50:05,600 --> 00:50:09,040 Speaker 1: does apply to any particular system takes the form of 834 00:50:09,080 --> 00:50:13,600 Speaker 1: a deduction, using the methods of statistical mechanics, from the 835 00:50:13,640 --> 00:50:17,719 Speaker 1: details of what the system contains, and this inevitably leads 836 00:50:17,800 --> 00:50:20,879 Speaker 1: us down to the level of elementary particles. So he's 837 00:50:20,880 --> 00:50:24,759 Speaker 1: saying it's useful, but what's 838 00:50:24,880 --> 00:50:28,920 Speaker 1: driving it there is really fundamental physics.
So my interpretation of 839 00:50:28,960 --> 00:50:33,040 Speaker 1: Weinberg here is that he's using the example of thermodynamics 840 00:50:33,080 --> 00:50:36,440 Speaker 1: to show that while these higher order sciences dealing with 841 00:50:36,480 --> 00:50:40,719 Speaker 1: complex phenomena might always remain explanatorily useful, they're just never 842 00:50:40,880 --> 00:50:45,239 Speaker 1: logically autonomous, never fundamental, never independent. And they might be 843 00:50:45,320 --> 00:50:49,200 Speaker 1: good to retain for purposes of communication and understanding, but 844 00:50:49,280 --> 00:50:53,239 Speaker 1: they don't describe fundamental truths. For that, you need reduction 845 00:50:53,280 --> 00:50:57,680 Speaker 1: to fundamental physics, paired with an acknowledgment of accidents of history, 846 00:50:57,960 --> 00:51:02,439 Speaker 1: and ultimately a theory of everything. So basically, any 847 00:51:02,480 --> 00:51:04,279 Speaker 1: of these different fields is ultimately going to be a 848 00:51:04,320 --> 00:51:07,680 Speaker 1: subset of another field. All right, we're gonna take a 849 00:51:07,760 --> 00:51:09,880 Speaker 1: quick break, and when we come back, we're going to 850 00:51:09,960 --> 00:51:13,359 Speaker 1: discuss this in terms of some more human elements. 851 00:51:13,600 --> 00:51:17,319 Speaker 1: So if your mind is exploding with all 852 00:51:17,320 --> 00:51:20,440 Speaker 1: of the thermodynamics, bear with us, because things are going 853 00:51:20,480 --> 00:51:28,520 Speaker 1: to get a little more human. Okay. So we just 854 00:51:28,600 --> 00:51:34,040 Speaker 1: looked at some difficult examples of where emergent properties may 855 00:51:34,280 --> 00:51:38,960 Speaker 1: appear to exist in things like crystals or thermodynamics. They 856 00:51:39,040 --> 00:51:41,920 Speaker 1: might really exist and be fundamental.
Or they might just be 857 00:51:41,960 --> 00:51:45,239 Speaker 1: an illusion, not actually fundamental. But one of 858 00:51:45,280 --> 00:51:48,000 Speaker 1: the places where people have a really hard time not 859 00:51:48,160 --> 00:51:52,640 Speaker 1: seeing something unique and original at higher levels of complexity 860 00:51:52,680 --> 00:51:56,320 Speaker 1: is the human sciences, in things like psychology and anthropology. 861 00:51:56,360 --> 00:51:58,600 Speaker 1: So maybe we should look at a couple of examples 862 00:51:59,040 --> 00:52:03,000 Speaker 1: of papers taking the idea of emergentism and applying 863 00:52:03,040 --> 00:52:06,920 Speaker 1: it to these higher complexity sciences. Yeah, and you know, 864 00:52:06,960 --> 00:52:09,360 Speaker 1: a lot of this boils down to, like, what's the 865 00:52:09,400 --> 00:52:14,440 Speaker 1: saying, three's company, four's a crowd? I mean, 866 00:52:14,480 --> 00:52:17,200 Speaker 1: in our own experience, we know that as more 867 00:52:17,239 --> 00:52:24,960 Speaker 1: people gather together, certain realities come online, certain 868 00:52:25,040 --> 00:52:30,560 Speaker 1: social responsibilities come online. Like, for instance, yoga classes. If 869 00:52:30,560 --> 00:52:32,279 Speaker 1: anyone out there has ever been in a yoga class 870 00:52:32,280 --> 00:52:34,560 Speaker 1: or an exercise class, if there are just two people in it, 871 00:52:34,600 --> 00:52:36,840 Speaker 1: if there's just a teacher and a student, one of 872 00:52:36,920 --> 00:52:39,919 Speaker 1: the realities, not to be crude, is that there 873 00:52:40,080 --> 00:52:45,680 Speaker 1: is no plausible deniability of flatulence. If one person 874 00:52:45,800 --> 00:52:49,600 Speaker 1: passes gas and it's audible, or, you know, not audible 875 00:52:49,600 --> 00:52:53,319 Speaker 1: but noticeable, then there's no questioning who did it.
876 00:52:54,160 --> 00:52:57,080 Speaker 1: But if there are three, then there's plausible deniability. Then 877 00:52:57,080 --> 00:53:00,239 Speaker 1: you've got a society, you've got suspicion, you've got 878 00:53:00,280 --> 00:53:03,200 Speaker 1: bluffing. Exactly. And I mean, that's just a very simple example. 879 00:53:03,239 --> 00:53:08,480 Speaker 1: But this takes place the more you expand the social dynamic. 880 00:53:09,160 --> 00:53:11,200 Speaker 1: And there have been studies that 881 00:53:11,280 --> 00:53:13,920 Speaker 1: have looked into this, in, you know, broader or 882 00:53:14,000 --> 00:53:17,920 Speaker 1: less crude terms, of course. One paper in particular, 883 00:53:18,040 --> 00:53:20,600 Speaker 1: and this is one that you found for 884 00:53:20,680 --> 00:53:25,520 Speaker 1: us here, is from Robert L. Carneiro: The Transition from 885 00:53:25,640 --> 00:53:29,160 Speaker 1: Quantity to Quality: A Neglected Causal Mechanism in Accounting for 886 00:53:29,239 --> 00:53:31,840 Speaker 1: Social Evolution. I was interested in this one because it 887 00:53:31,840 --> 00:53:35,000 Speaker 1: plays on the idea of quantity becoming quality. So yeah, 888 00:53:35,000 --> 00:53:38,000 Speaker 1: the basic nugget here is that when the quantitative increase 889 00:53:38,040 --> 00:53:41,400 Speaker 1: in some entity reaches a certain threshold, the situation gives 890 00:53:41,480 --> 00:53:45,840 Speaker 1: rise to a qualitative change. So more is different, right? Exactly. 891 00:53:45,840 --> 00:53:48,719 Speaker 1: It's the same process, but the idea is that it 892 00:53:48,760 --> 00:53:52,560 Speaker 1: would break down, you know, beyond mere biological and 893 00:53:52,600 --> 00:53:54,759 Speaker 1: chemical examples.
We've touched on some of them already, but 894 00:53:54,880 --> 00:53:57,560 Speaker 1: a couple more that the author brings up here: 895 00:53:57,600 --> 00:54:00,400 Speaker 1: like the critical mass of uranium, or the quantitative 896 00:54:00,440 --> 00:54:03,399 Speaker 1: difference in the wavelength of the light received by our 897 00:54:03,400 --> 00:54:07,279 Speaker 1: retina and the effect that has on color perception. So 898 00:54:07,280 --> 00:54:08,400 Speaker 1: I guess you can think of it in terms of, you know, 899 00:54:08,400 --> 00:54:12,600 Speaker 1: a tipping point where quantity 900 00:54:13,520 --> 00:54:16,799 Speaker 1: becomes quality. Oh yeah, I never thought about that. 901 00:54:16,880 --> 00:54:21,320 Speaker 1: Wavelengths of light, so increasing the wavelength, suddenly we just perceive 902 00:54:21,400 --> 00:54:24,480 Speaker 1: a different color. Right, that's the basic idea. 903 00:54:24,680 --> 00:54:27,560 Speaker 1: But the author here focuses on the notion that quantitative 904 00:54:27,560 --> 00:54:30,520 Speaker 1: increases in the form of population give rise to a 905 00:54:30,600 --> 00:54:32,600 Speaker 1: change in the structure of a society. So it's that 906 00:54:32,680 --> 00:54:37,880 Speaker 1: yoga example, it's the three's company, four's a crowd, except 907 00:54:38,520 --> 00:54:41,319 Speaker 1: he explores it through various other examples here.
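[Editor's aside: the wavelength example can be made concrete with a toy lookup, where perceived color is a discrete quality that flips as a continuous quantity, wavelength, crosses a threshold. The band boundaries below are rough textbook values for human vision chosen purely for illustration; they are not from Carneiro's paper.]

```python
def perceived_color(wavelength_nm):
    """Map a continuous quantity (wavelength in nanometers) to a
    discrete quality (a color name). Band boundaries are approximate
    textbook values, for illustration only."""
    if not 380 <= wavelength_nm <= 750:
        return "invisible"
    # Each entry is the approximate upper edge of a band, in nm.
    bands = [(450, "violet"), (495, "blue"), (570, "green"),
             (590, "yellow"), (620, "orange"), (750, "red")]
    for upper, name in bands:
        if wavelength_nm < upper or name == "red":
            return name

# One nanometer of change in the quantity flips the quality:
print(perceived_color(494))  # blue
print(perceived_color(496))  # green
```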
908 00:54:41,920 --> 00:54:45,960 Speaker 1: So on a basic level, let's say we have a 909 00:54:46,040 --> 00:54:49,200 Speaker 1: village of humans and it reaches a large enough size 910 00:54:49,719 --> 00:54:52,520 Speaker 1: that, you know what, you end up having factions emerge, 911 00:54:52,640 --> 00:54:55,200 Speaker 1: clans emerge. This is a classic 912 00:54:55,239 --> 00:54:59,600 Speaker 1: trope of various fictional scenarios in which you have outsiders 913 00:54:59,719 --> 00:55:04,000 Speaker 1: in a survival situation, you know: Stephen King's The Mist, Lord 914 00:55:04,000 --> 00:55:07,600 Speaker 1: of the Flies, Lost. You're gonna have factions emerge, right, 915 00:55:07,640 --> 00:55:10,120 Speaker 1: and this is reality television. What did you mean, The 916 00:55:10,160 --> 00:55:13,280 Speaker 1: Mist or The Stand? Well, both, I guess, because 917 00:55:13,320 --> 00:55:15,400 Speaker 1: The Mist is like a simplified version of that. They're all 918 00:55:15,440 --> 00:55:18,919 Speaker 1: trapped in the supermarket, and then immediately there are 919 00:55:18,960 --> 00:55:21,920 Speaker 1: like two different factions. It's been 920 00:55:21,920 --> 00:55:23,399 Speaker 1: a while since I've read it, but I remember 921 00:55:23,440 --> 00:55:27,560 Speaker 1: one faction is a little more 922 00:55:27,640 --> 00:55:30,960 Speaker 1: apocalyptic than the other. Who's to say which one is 923 00:55:31,000 --> 00:55:36,040 Speaker 1: correct, given that apocalyptic scenario? But so we 924 00:55:36,080 --> 00:55:39,600 Speaker 1: see splintering in groups. We see splintering in countries and organizations, 925 00:55:39,640 --> 00:55:42,360 Speaker 1: both real and fictional.
I mean, who can forget the 926 00:55:42,400 --> 00:55:45,080 Speaker 1: People's Front of Judea and the Judean People's Front, 927 00:55:45,200 --> 00:55:47,000 Speaker 1: right, from Life of Brian? Yeah, well, you have 928 00:55:47,040 --> 00:55:50,839 Speaker 1: the two different resistance organizations that splintered from the one, 929 00:55:50,880 --> 00:55:54,359 Speaker 1: and additional satellite organizations have splintered off as well. 930 00:55:54,480 --> 00:55:58,640 Speaker 1: I think also they're, I guess, satirizing the narcissism 931 00:55:58,640 --> 00:56:02,040 Speaker 1: of small differences. Now, when it breaks down to 932 00:56:02,080 --> 00:56:06,120 Speaker 1: actual villages, the Kayapo villages typically hit six hundred or 933 00:56:06,160 --> 00:56:09,960 Speaker 1: eight hundred persons. Okay, that's like their upper limit. 934 00:56:10,440 --> 00:56:12,839 Speaker 1: The Yanomamo, however, tend to max out at 935 00:56:12,880 --> 00:56:15,759 Speaker 1: two hundred, or even a little below, and then 936 00:56:15,800 --> 00:56:18,200 Speaker 1: they splinter. So the difference here is that the Kayapo 937 00:56:18,280 --> 00:56:24,000 Speaker 1: boast a complex social segmentation consisting of clans, while the 938 00:56:24,120 --> 00:56:26,719 Speaker 1: Yanomamo have only a few different lineages. So the 939 00:56:26,800 --> 00:56:31,239 Speaker 1: takeaway here is that larger population aggregates can 940 00:56:31,360 --> 00:56:36,919 Speaker 1: bring about an abrupt elaboration in social structure. So it's 941 00:56:36,960 --> 00:56:40,560 Speaker 1: interesting, because the larger group, the group 942 00:56:40,600 --> 00:56:43,960 Speaker 1: that is able to maintain the larger village, does so 943 00:56:44,719 --> 00:56:49,560 Speaker 1: through this complexity.
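[Editor's aside: the pattern being described, population as the quantity and social structure as the quality that appears past a threshold, can be caricatured in a few lines of Python. The thresholds and labels here are invented for illustration, loosely echoing the village sizes just mentioned; they are not ethnographic data.]

```python
def village_outcome(population, max_without_clans=200, max_with_clans=800):
    """Toy 'quantity into quality' model after Carneiro's examples.
    Below one threshold, a simple lineage structure suffices; past it,
    a qualitatively new structure (clan segmentation) is needed to hold
    the village together; past a second threshold, the village splits.
    Thresholds are illustrative stand-ins, not real data."""
    if population <= max_without_clans:
        return "one village, a few lineages"
    if population <= max_with_clans:
        return "one village, held together by clan segmentation"
    return "village fissions into smaller villages"

# A Yanomamo-scale village vs. Kayapo-scale villages:
print(village_outcome(150))   # one village, a few lineages
print(village_outcome(500))   # one village, held together by clan segmentation
print(village_outcome(1000))  # village fissions into smaller villages
```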
It's almost 944 00:56:49,680 --> 00:56:52,120 Speaker 1: like, if you were to look at it from an engineering standpoint, 945 00:56:52,200 --> 00:56:55,920 Speaker 1: to create a large domed building is going 946 00:56:55,960 --> 00:56:57,719 Speaker 1: to be more of an engineering feat and require a 947 00:56:57,719 --> 00:57:02,160 Speaker 1: little more finesse than, like, a small igloo-type 948 00:57:02,239 --> 00:57:04,520 Speaker 1: hut. Oh, it's kind of like how we've 949 00:57:04,520 --> 00:57:08,000 Speaker 1: talked about the difference between building a house and 950 00:57:08,040 --> 00:57:11,839 Speaker 1: building a skyscraper. A skyscraper is not just bigger, it's a 951 00:57:11,840 --> 00:57:14,640 Speaker 1: different thing. It's a different project. You have to 952 00:57:14,680 --> 00:57:17,360 Speaker 1: approach it with a different mentality. Exactly. You can't just 953 00:57:17,400 --> 00:57:20,240 Speaker 1: have a larger elephant. You have to have a different 954 00:57:20,400 --> 00:57:24,400 Speaker 1: organism that may resemble, in some way, the elephant. 955 00:57:24,600 --> 00:57:26,560 Speaker 1: And so we see that reflected here. He 956 00:57:26,640 --> 00:57:30,600 Speaker 1: also points out that the North American Plains Indians 957 00:57:31,080 --> 00:57:34,280 Speaker 1: displayed a tendency to rely on simple social organizations for 958 00:57:34,320 --> 00:57:38,760 Speaker 1: small bands. Okay, so they existed in small groups, 959 00:57:39,240 --> 00:57:41,440 Speaker 1: and they'll have a leader of those groups, but the 960 00:57:41,560 --> 00:57:44,440 Speaker 1: leader doesn't exert a tremendous amount of power.
But then 961 00:57:44,480 --> 00:57:47,560 Speaker 1: they will periodically come together for, say, some 962 00:57:47,600 --> 00:57:50,720 Speaker 1: sort of a large hunt or tribal exercise, and then 963 00:57:50,760 --> 00:57:54,080 Speaker 1: they'll organize under a tribal chief who exerts far 964 00:57:54,200 --> 00:57:57,520 Speaker 1: greater power than a regional leader. So 965 00:57:57,840 --> 00:58:01,200 Speaker 1: it's not even necessarily a proportional change 966 00:58:01,200 --> 00:58:03,600 Speaker 1: in power. It's a significant change in power. Like, the 967 00:58:03,640 --> 00:58:06,880 Speaker 1: complexity really takes off, and then when they have to 968 00:58:06,920 --> 00:58:10,080 Speaker 1: splinter again, it all just kind of goes away. 969 00:58:10,120 --> 00:58:13,800 Speaker 1: But it seems very emergent in its form, 970 00:58:14,280 --> 00:58:16,240 Speaker 1: in that it's not like, oh, there are more of us now, 971 00:58:16,280 --> 00:58:17,840 Speaker 1: this is the way we do things. It's more like, 972 00:58:17,880 --> 00:58:19,560 Speaker 1: this is the way we do things when we come 973 00:58:19,600 --> 00:58:25,400 Speaker 1: together by necessity. We're making the bigger elephant here. Interesting. 974 00:58:26,840 --> 00:58:29,680 Speaker 1: And he points to some other sources 975 00:58:29,680 --> 00:58:32,960 Speaker 1: on this point. There's a quote included here from anthropologist 976 00:58:33,000 --> 00:58:36,880 Speaker 1: Michael J. 
Harner, who observed that, quote, population pressure is 977 00:58:36,920 --> 00:58:40,400 Speaker 1: a major determinant of social evolution, and that we 978 00:58:40,440 --> 00:58:45,360 Speaker 1: see this in all of humanity's greatest transformations: agriculture, industry, science, 979 00:58:45,360 --> 00:58:50,280 Speaker 1: etcetera. Greater land and subsistence resource scarcity comes with consequently intensified competition 980 00:58:50,320 --> 00:58:52,760 Speaker 1: for its control. This leads to the spread of war, 981 00:58:52,880 --> 00:58:57,560 Speaker 1: the development of states, and all the human complexity 982 00:58:57,560 --> 00:58:59,880 Speaker 1: that spreads out from that. Yeah, it's interesting to think 983 00:58:59,880 --> 00:59:04,400 Speaker 1: about how, in a sense, large societies are, 984 00:59:04,760 --> 00:59:07,800 Speaker 1: I guess they're emphasizing here, just not predictable from small 985 00:59:07,880 --> 00:59:11,840 Speaker 1: groups of humans, that they transform into this 986 00:59:11,920 --> 00:59:18,560 Speaker 1: fundamentally different thing with different functions. And yeah, 987 00:59:18,600 --> 00:59:20,800 Speaker 1: I mean, I can definitely see this even at 988 00:59:20,800 --> 00:59:22,640 Speaker 1: a small scale, like you were talking about with 989 00:59:22,640 --> 00:59:27,080 Speaker 1: the yoga class. You know, this is getting 990 00:59:27,200 --> 00:59:29,840 Speaker 1: very colloquial with the idea of emergentism, but you know, 991 00:59:29,880 --> 00:59:34,000 Speaker 1: a gathering of five friends is not 992 00:59:34,160 --> 00:59:37,480 Speaker 1: just larger than a gathering of two friends. It's very, 993 00:59:37,640 --> 00:59:42,040 Speaker 1: very different. 
And to come back to the apocalyptic 994 00:59:42,040 --> 00:59:43,720 Speaker 1: examples from fiction we were drawing on, I think that's one 995 00:59:43,760 --> 00:59:46,200 Speaker 1: of the appeals of stuff like The Walking Dead or 996 00:59:46,240 --> 00:59:49,520 Speaker 1: The Stand or The Mist, in that these examples reduce 997 00:59:49,520 --> 00:59:52,520 Speaker 1: the human population to a much smaller and at least 998 00:59:52,800 --> 00:59:55,920 Speaker 1: seemingly manageable number. And in 999 00:59:56,000 --> 00:59:59,240 Speaker 1: a sense, we're trying to reduce societal problems to fundamental 1000 00:59:59,640 --> 01:00:03,040 Speaker 1: properties, like everything goes wrong because of this character, 1001 01:00:03,520 --> 01:00:06,200 Speaker 1: how he or she is behaving. You can be familiar 1002 01:00:06,360 --> 01:00:09,560 Speaker 1: with all of the agents that matter, right, and this 1003 01:00:09,680 --> 01:00:13,480 Speaker 1: is not true of society today. There are tons of 1004 01:00:13,520 --> 01:00:16,080 Speaker 1: agents acting upon your life, and you don't even know 1005 01:00:16,120 --> 01:00:18,760 Speaker 1: who they are, what their names are. Yeah, or it's 1006 01:00:18,760 --> 01:00:21,400 Speaker 1: not necessarily, oh, the villainous character. It's more like, oh, 1007 01:00:21,480 --> 01:00:27,800 Speaker 1: it's the villainy that emerges when this 1008 01:00:27,880 --> 01:00:31,160 Speaker 1: group of people get together with these ideals in mind, 1009 01:00:31,440 --> 01:00:34,120 Speaker 1: and these ideals are actually really positive, but then there's 1010 01:00:34,120 --> 01:00:38,240 Speaker 1: this negative manifestation of it. Yeah, 1011 01:00:38,720 --> 01:00:43,520 Speaker 1: complexity emerges fairly quickly. 
And then there's another 1012 01:00:43,560 --> 01:00:46,040 Speaker 1: study looked at here, and this is R. Keith 1013 01:00:46,080 --> 01:00:48,640 Speaker 1: Sawyer's "Emergence in Psychology: Lessons from the History of 1014 01:00:48,680 --> 01:00:51,400 Speaker 1: Nonreductionist Science," and the basic nugget in this one 1015 01:00:51,480 --> 01:00:54,400 Speaker 1: was that while we often look to psychology for a 1016 01:00:54,400 --> 01:00:57,760 Speaker 1: reductionist view, there's a lot of potential in an emergent 1017 01:00:57,800 --> 01:01:00,360 Speaker 1: view of psychology. The mind is not merely a 1018 01:01:00,440 --> 01:01:03,560 Speaker 1: shadow cast by a functioning brain, which is kind of 1019 01:01:03,560 --> 01:01:05,840 Speaker 1: an analogy often fallen back on, but a 1020 01:01:06,320 --> 01:01:09,400 Speaker 1: higher-level emergent system forming the shadow puppet on the 1021 01:01:09,440 --> 01:01:12,840 Speaker 1: wall and continually revising its form. So, like, even if 1022 01:01:12,880 --> 01:01:16,040 Speaker 1: you don't take a substance dualist point of view, even 1023 01:01:16,120 --> 01:01:18,440 Speaker 1: if you don't think that the mind is supernatural in 1024 01:01:18,520 --> 01:01:22,160 Speaker 1: some sense, you could still find some merit in 1025 01:01:22,200 --> 01:01:25,160 Speaker 1: the idea that the mind is not fully explicable from 1026 01:01:25,240 --> 01:01:28,760 Speaker 1: the standpoint of neuroscience. Yes, that's my takeaway 1027 01:01:28,800 --> 01:01:31,400 Speaker 1: from the paper anyway. Yeah, you can't just look 1028 01:01:31,440 --> 01:01:33,440 Speaker 1: at all the tissue in the brain and say this 1029 01:01:33,520 --> 01:01:36,680 Speaker 1: is the kind of mind it would generate. 
Okay, well, 1030 01:01:36,720 --> 01:01:38,640 Speaker 1: one more thing I wanted to look at before we 1031 01:01:38,680 --> 01:01:41,520 Speaker 1: wrap things up is, we've heard from the reductionist view 1032 01:01:41,520 --> 01:01:44,960 Speaker 1: of Weinberg, and then we've heard from emergentists like Anderson. 1033 01:01:45,160 --> 01:01:49,240 Speaker 1: But Anderson accepts one interpretation of reductionism; he just rejects 1034 01:01:49,280 --> 01:01:52,560 Speaker 1: another interpretation of it. What about people who are way 1035 01:01:52,640 --> 01:01:57,240 Speaker 1: far out there in fully rejecting explanatory reductionism in all 1036 01:01:57,280 --> 01:02:02,360 Speaker 1: its forms? Obviously, the debate is still going on among some thinkers, 1037 01:02:02,360 --> 01:02:05,240 Speaker 1: and I found a good short essay by the 1038 01:02:05,360 --> 01:02:10,680 Speaker 1: biologist and philosopher of science Massimo Pigliucci about this ongoing debate, 1039 01:02:10,720 --> 01:02:13,040 Speaker 1: and he discusses the work of a few philosophers like 1040 01:02:13,120 --> 01:02:17,320 Speaker 1: John Dupré, Jerry Fodor, and Nancy Cartwright who have argued 1041 01:02:17,360 --> 01:02:21,920 Speaker 1: against the fundamental unity of the sciences and against the reductionist hypothesis. 1042 01:02:22,320 --> 01:02:24,960 Speaker 1: And I think he makes a few interesting points. One, 1043 01:02:25,040 --> 01:02:30,320 Speaker 1: he talks about Jerry Fodor making a distinction about 1044 01:02:30,320 --> 01:02:33,720 Speaker 1: what it means for one science to reduce to another. 1045 01:02:33,760 --> 01:02:36,960 Speaker 1: So you could be talking about ontological reduction, which just 1046 01:02:37,040 --> 01:02:41,280 Speaker 1: means that the more complex phenomena are literally 1047 01:02:41,320 --> 01:02:43,720 Speaker 1: made out of the simpler phenomena. 
You know, the mind 1048 01:02:43,880 --> 01:02:46,480 Speaker 1: literally is dependent upon the brain. You can agree with that; 1049 01:02:47,560 --> 01:02:50,960 Speaker 1: this part might be pretty obviously true to you. 1050 01:02:51,120 --> 01:02:53,600 Speaker 1: Molecules are made out of atoms, organisms are made out 1051 01:02:53,640 --> 01:02:56,640 Speaker 1: of cells, populations are made out of individual organisms. But 1052 01:02:57,400 --> 01:03:00,600 Speaker 1: when it comes to theoretical reduction, which you might also 1053 01:03:00,640 --> 01:03:05,480 Speaker 1: call explanatory reduction, the same does not necessarily 1054 01:03:05,560 --> 01:03:10,120 Speaker 1: hold true. While complex phenomena are made out of simpler phenomena, 1055 01:03:10,200 --> 01:03:15,080 Speaker 1: our theories explaining complex phenomena are different from the things themselves. 1056 01:03:15,120 --> 01:03:18,280 Speaker 1: They exist in our minds, not in physical space. And 1057 01:03:18,320 --> 01:03:21,680 Speaker 1: just because the thing reduces does not necessarily mean that 1058 01:03:21,760 --> 01:03:24,800 Speaker 1: the proper explanation for it reduces. I know that's kind 1059 01:03:24,800 --> 01:03:28,160 Speaker 1: of a strange philosophical point, but I think there 1060 01:03:28,200 --> 01:03:32,040 Speaker 1: might be a grain of truth there. Another thing, though, 1061 01:03:32,120 --> 01:03:36,760 Speaker 1: is that reductionism is not supported by an inductive 1062 01:03:36,800 --> 01:03:39,600 Speaker 1: survey of the progress of science. This is kind of interesting, 1063 01:03:39,600 --> 01:03:41,800 Speaker 1: and I think I mostly agree with him on this one. 1064 01:03:42,240 --> 01:03:46,480 Speaker 1: Instead of more complex theories collapsing into simpler ones, what 1065 01:03:46,560 --> 01:03:48,880 Speaker 1: have we seen in the history of science? 
We've seen 1066 01:03:48,920 --> 01:03:53,080 Speaker 1: exactly the opposite. Instead, we see the proliferation of more 1067 01:03:53,160 --> 01:03:58,000 Speaker 1: and more specialized theories. We don't see the specific sciences 1068 01:03:58,040 --> 01:04:01,800 Speaker 1: collapsing into the general. We see the general branching off 1069 01:04:01,840 --> 01:04:04,920 Speaker 1: into the specific. Now, maybe this just means our 1070 01:04:04,960 --> 01:04:07,600 Speaker 1: study of science isn't mature enough yet, you know, like 1071 01:04:07,640 --> 01:04:11,160 Speaker 1: we haven't done enough work reducing complex sciences into 1072 01:04:11,200 --> 01:04:14,919 Speaker 1: simpler ones. That's possible, but if you were just to look 1073 01:04:14,960 --> 01:04:19,560 Speaker 1: at it inductively, science is not reducing. That's not happening 1074 01:04:19,560 --> 01:04:23,360 Speaker 1: at all. One more thing is that Fodor says, 1075 01:04:23,400 --> 01:04:26,040 Speaker 1: you know, the reductionist assumption is not, as far as 1076 01:04:26,040 --> 01:04:30,200 Speaker 1: we know, actually guided by a principle. It might be intuitive, 1077 01:04:30,360 --> 01:04:33,560 Speaker 1: especially to scientists 1078 01:04:34,520 --> 01:04:38,680 Speaker 1: who have seen some other phenomena successfully reduced to simpler principles. 1079 01:04:38,680 --> 01:04:42,640 Speaker 1: Think of Weinberg talking about thermodynamics. But what reason do 1080 01:04:42,760 --> 01:04:46,160 Speaker 1: we actually have to assume that biology can be fully 1081 01:04:46,240 --> 01:04:49,960 Speaker 1: explained by physics? I don't know. My intuition certainly 1082 01:04:49,960 --> 01:04:52,560 Speaker 1: tells me it can be. But my intuition, of course, 1083 01:04:52,920 --> 01:04:56,360 Speaker 1: is not worth a sack of split peas in science. 
1084 01:04:56,440 --> 01:04:58,760 Speaker 1: And then one last idea I wanted to end on, 1085 01:04:58,800 --> 01:05:01,000 Speaker 1: because I thought this was really weird, but also very 1086 01:05:01,040 --> 01:05:07,040 Speaker 1: interesting, is the anti-realism of Nancy Cartwright, the philosopher 1087 01:05:07,080 --> 01:05:10,000 Speaker 1: of science Nancy Cartwright, not the voice actress who plays 1088 01:05:10,040 --> 01:05:15,680 Speaker 1: Bart Simpson. So she offers a positive rationale for believing 1089 01:05:15,720 --> 01:05:19,320 Speaker 1: that theories for complex phenomena might not be expected to 1090 01:05:19,360 --> 01:05:23,160 Speaker 1: reduce to theories for simpler ones, and she advocates what's 1091 01:05:23,240 --> 01:05:26,680 Speaker 1: known as an anti-realist position. And in her case, 1092 01:05:26,720 --> 01:05:29,960 Speaker 1: what this means is she rejects the idea that there 1093 01:05:30,160 --> 01:05:33,479 Speaker 1: is such a thing as fundamental laws of nature. Now 1094 01:05:33,640 --> 01:05:36,160 Speaker 1: you might be thinking, how on earth could you do that? Well, 1095 01:05:36,760 --> 01:05:38,840 Speaker 1: it sounds kind of weird, but go 1096 01:05:38,880 --> 01:05:41,240 Speaker 1: with her for a second. I think it's actually kind 1097 01:05:41,240 --> 01:05:44,680 Speaker 1: of interesting. One thing: we can't deny that science works, 1098 01:05:44,840 --> 01:05:48,960 Speaker 1: right? We know it works practically, pragmatically; it just works. 1099 01:05:49,000 --> 01:05:52,080 Speaker 1: It generates theories that make predictions which are accurate enough 1100 01:05:52,640 --> 01:05:56,360 Speaker 1: for us to make technology and make civilization out of them. 1101 01:05:56,480 --> 01:06:00,520 Speaker 1: But what if they're not, in fact, truly universal and fundamental, 1102 01:06:00,600 --> 01:06:03,920 Speaker 1: but rather, as I said a minute ago, accurate enough? 
1103 01:06:05,040 --> 01:06:07,640 Speaker 1: And there's really a precedent for this in the 1104 01:06:07,760 --> 01:06:10,960 Speaker 1: history of the pursuit of physics already, because for a 1105 01:06:11,000 --> 01:06:12,880 Speaker 1: long time, what did we have in physics? We had 1106 01:06:12,920 --> 01:06:16,840 Speaker 1: the mechanics of Isaac Newton, and they were accurate enough 1107 01:06:17,160 --> 01:06:20,000 Speaker 1: that we could use them to predict the motions of baseballs, 1108 01:06:20,080 --> 01:06:22,000 Speaker 1: or a jar of pickles if I throw it at your face, 1109 01:06:22,920 --> 01:06:25,280 Speaker 1: or even the motions of planets. This could 1110 01:06:25,280 --> 01:06:30,160 Speaker 1: pretty much all be explained accurately by Newtonian mechanics, 1111 01:06:30,200 --> 01:06:32,360 Speaker 1: and we could make a technology out 1112 01:06:32,360 --> 01:06:34,840 Speaker 1: of them. We could fire cannonballs, all that stuff. But 1113 01:06:34,920 --> 01:06:38,520 Speaker 1: we now know that, strictly speaking, Newton was wrong. His 1114 01:06:38,680 --> 01:06:42,600 Speaker 1: laws were not able to generate very accurate predictions for 1115 01:06:42,680 --> 01:06:46,560 Speaker 1: things beyond the medium scales of matter and energy, and 1116 01:06:46,640 --> 01:06:49,360 Speaker 1: for those things they've now been replaced with things like 1117 01:06:49,680 --> 01:06:53,160 Speaker 1: general relativity and quantum mechanics, which can give us even 1118 01:06:53,200 --> 01:06:56,480 Speaker 1: more accurate predictions to explain those weird few cases where 1119 01:06:56,720 --> 01:07:01,840 Speaker 1: Newtonian mechanics break down in our experience. So where does 1120 01:07:01,920 --> 01:07:04,440 Speaker 1: Nancy Cartwright go with this? 
She says, well, what if, 1121 01:07:04,480 --> 01:07:09,360 Speaker 1: in fact, all possible fundamental theories are like that: accurate 1122 01:07:09,520 --> 01:07:13,280 Speaker 1: enough to make predictions, but not actually descriptive of 1123 01:07:13,400 --> 01:07:18,760 Speaker 1: inviolable universal laws? So this could maybe explain why, or 1124 01:07:18,760 --> 01:07:22,800 Speaker 1: at least be the ultimate reason why, it proves so hard 1125 01:07:22,880 --> 01:07:26,840 Speaker 1: to reduce all science to physics, because we have, essentially, 1126 01:07:26,960 --> 01:07:31,280 Speaker 1: an imperfect system that merely lines up with most things. Yeah, 1127 01:07:31,280 --> 01:07:33,400 Speaker 1: I mean, the idea would be, yeah, that the 1128 01:07:33,840 --> 01:07:37,160 Speaker 1: physics will always be imperfect, that there is no universal 1129 01:07:37,240 --> 01:07:42,440 Speaker 1: physics at bottom; there's only predictive enough. And in Cartwright's terminology, 1130 01:07:42,520 --> 01:07:45,720 Speaker 1: this would mean that all scientific laws are, quote, phenomenological: 1131 01:07:46,200 --> 01:07:48,880 Speaker 1: good enough to reckon our experience of the world at 1132 01:07:48,880 --> 01:07:53,080 Speaker 1: the level of their appropriate application, but not necessarily truly 1133 01:07:53,240 --> 01:07:57,720 Speaker 1: universal and fundamental. And if that's the case, that 1134 01:07:57,720 --> 01:08:00,560 Speaker 1: could essentially apply all down the line. You know, 1135 01:08:00,680 --> 01:08:05,840 Speaker 1: because there is this inherent indeterminacy or, you know, this 1136 01:08:06,040 --> 01:08:10,640 Speaker 1: inherent imprecision at the basis of all matter and energy, 1137 01:08:10,920 --> 01:08:14,520 Speaker 1: you can understand why higher, more complex levels of science 1138 01:08:14,560 --> 01:08:17,559 Speaker 1: would not be reducible to lower ones. 
So it's like 1139 01:08:17,600 --> 01:08:20,280 Speaker 1: saying there's no United States; there's actually just all these 1140 01:08:20,320 --> 01:08:23,519 Speaker 1: different states. There's no European Union; there's just 1141 01:08:23,560 --> 01:08:26,360 Speaker 1: all these different countries. Or, to go back 1142 01:08:26,400 --> 01:08:28,600 Speaker 1: to the states analogy, there are just these counties that 1143 01:08:28,640 --> 01:08:32,800 Speaker 1: are assembled into this order. On an individual level, 1144 01:08:33,080 --> 01:08:37,280 Speaker 1: there can be a truth, but not an overarching system. Well, 1145 01:08:37,320 --> 01:08:39,200 Speaker 1: I mean, I think she would be saying that at 1146 01:08:39,200 --> 01:08:42,479 Speaker 1: the bottom there is no universal truth, so that 1147 01:08:42,640 --> 01:08:45,880 Speaker 1: maybe, from what you're saying, there's no 1148 01:08:46,040 --> 01:08:50,200 Speaker 1: fundamental basis of political organization. Like, 1149 01:08:50,439 --> 01:08:56,120 Speaker 1: you know, you can use political organization to reckon countries, states, counties, 1150 01:08:56,160 --> 01:08:59,040 Speaker 1: and stuff, and it all works well enough at those levels, 1151 01:08:59,040 --> 01:09:03,120 Speaker 1: but there is no bottom of political organization. There's no 1152 01:09:03,200 --> 01:09:08,040 Speaker 1: fundamental unit of it that is perfectly real. Yeah, all right, 1153 01:09:08,120 --> 01:09:11,680 Speaker 1: I'm not saying I'd buy her take on it, 1154 01:09:11,680 --> 01:09:13,880 Speaker 1: but I can see how 1155 01:09:13,920 --> 01:09:16,599 Speaker 1: it lines up. Yeah, and I do think it's interesting. 1156 01:09:16,760 --> 01:09:18,880 Speaker 1: I'm not saying I'm convinced by her point of view. 1157 01:09:18,960 --> 01:09:20,880 Speaker 1: I just think it's an interesting idea. Yeah. 
And to 1158 01:09:21,000 --> 01:09:22,880 Speaker 1: your point that we laid out at the beginning, 1159 01:09:22,920 --> 01:09:26,240 Speaker 1: it's a non-magical version of this. Like, 1160 01:09:26,280 --> 01:09:29,559 Speaker 1: certainly we can look to various examples where 1161 01:09:29,560 --> 01:09:33,360 Speaker 1: someone isn't buying into it for supernatural reasons, but 1162 01:09:33,400 --> 01:09:38,280 Speaker 1: she has a scientific theory here. Yeah, and so, well, 1163 01:09:38,320 --> 01:09:41,519 Speaker 1: I don't know, but let's say it's at least a 1164 01:09:41,600 --> 01:09:45,160 Speaker 1: non-supernatural thing, and it proceeds inductively, because 1165 01:09:45,160 --> 01:09:47,479 Speaker 1: it looks at, like, well, this has been the case 1166 01:09:47,720 --> 01:09:49,760 Speaker 1: in some of our study of science: we keep 1167 01:09:49,800 --> 01:09:52,960 Speaker 1: finding out that stuff that we think accurately describes the 1168 01:09:52,960 --> 01:09:58,559 Speaker 1: world is not really perfectly accurate; it's just accurate enough. Anyway, 1169 01:09:59,360 --> 01:10:01,800 Speaker 1: that's what I got. So, Robert, are you convinced? 1170 01:10:01,800 --> 01:10:04,640 Speaker 1: What do you think? Are you a reductionist, an emergentist, somewhere in 1171 01:10:04,640 --> 01:10:08,080 Speaker 1: between, one of those qualified middle grounds? Oh, I guess 1172 01:10:08,120 --> 01:10:10,000 Speaker 1: I've got to fall back on the sort of, 1173 01:10:10,520 --> 01:10:12,680 Speaker 1: you know, lens-based view of it. You know, I 1174 01:10:12,720 --> 01:10:17,520 Speaker 1: can put the lens of reductionism and the lens of 1175 01:10:17,520 --> 01:10:20,639 Speaker 1: emergence on as needed and certainly see how they 1176 01:10:20,800 --> 01:10:25,000 Speaker 1: line up with reality. 
But yeah, I mean, I 1177 01:10:24,720 --> 01:10:29,920 Speaker 1: certainly think emergence carries a lot of weight. Yeah, 1178 01:10:29,960 --> 01:10:33,320 Speaker 1: I certainly intuitively feel that sense of emergence. But then again, 1179 01:10:33,360 --> 01:10:36,599 Speaker 1: when I get thinking in the reductionist mindset, 1180 01:10:36,680 --> 01:10:38,360 Speaker 1: that can make sense to me too. I guess I'm 1181 01:10:38,400 --> 01:10:41,880 Speaker 1: just very impressionable. I don't know what to think about this. 1182 01:10:41,960 --> 01:10:44,200 Speaker 1: I do think it's a really interesting subject, though, and 1183 01:10:44,240 --> 01:10:47,040 Speaker 1: I do think it's important always to come back 1184 01:10:47,080 --> 01:10:48,760 Speaker 1: to the kind of stuff we're doing here, where we 1185 01:10:49,240 --> 01:10:51,439 Speaker 1: pay attention not just to how science is done, but 1186 01:10:51,520 --> 01:10:57,120 Speaker 1: to the assumptions underpinning it. Yeah. Indeed. All right, well, hey, 1187 01:10:57,200 --> 01:10:58,920 Speaker 1: if you want to find out more 1188 01:10:58,920 --> 01:11:01,400 Speaker 1: about this topic and other related topics to do with sort 1189 01:11:01,400 --> 01:11:04,919 Speaker 1: of the nature of science and the nature of scientific inquiry, 1190 01:11:05,040 --> 01:11:06,760 Speaker 1: head over to stuff to Blow your Mind dot com. 1191 01:11:06,760 --> 01:11:09,439 Speaker 1: That's where you'll find all the podcast episodes, videos, blog 1192 01:11:09,479 --> 01:11:13,439 Speaker 1: posts, and links out to our various social media accounts: Facebook, Twitter, Tumblr, etcetera. 1193 01:11:13,840 --> 01:11:15,920 Speaker 1: And hey, there's even an old-fashioned way to get 1194 01:11:15,880 --> 01:11:18,160 Speaker 1: in touch with us as well. 
Right, you can email 1195 01:11:18,280 --> 01:11:20,519 Speaker 1: us, as always, at blow the mind at how stuff 1196 01:11:20,560 --> 01:11:32,960 Speaker 1: works dot com. For more on this and thousands of 1197 01:11:32,960 --> 01:11:58,080 Speaker 1: other topics, visit how stuff works dot com