1 00:00:00,600 --> 00:00:04,040 Speaker 1: Welcome to Stuff You Should Know from How Stuff Works 2 00:00:04,080 --> 00:00:13,160 Speaker 1: dot com. Hey, and welcome to the podcast. I'm Josh Clark, 3 00:00:13,240 --> 00:00:17,119 Speaker 1: there's Charles W. Chuck Bryant, there's Jerry. Stuff You Should Know. 4 00:00:17,360 --> 00:00:20,680 Speaker 1: Why you grinning? It's been a while, man. I know, it's funny, 5 00:00:20,720 --> 00:00:22,880 Speaker 1: like those words come pouring out of my mouth and 6 00:00:22,960 --> 00:00:25,119 Speaker 1: then it's cool. You wake up in the middle of the 7 00:00:25,200 --> 00:00:27,000 Speaker 1: night saying that, and Umi like slugs you in 8 00:00:27,040 --> 00:00:32,120 Speaker 1: the face. She's asleep. She has to dry my brow. Yes, 9 00:00:32,159 --> 00:00:36,239 Speaker 1: we prerecorded some for December, as we like to do, 10 00:00:36,320 --> 00:00:37,560 Speaker 1: to take a little time off at the end of 11 00:00:37,600 --> 00:00:40,960 Speaker 1: the year and not explain things for a few weeks 12 00:00:40,960 --> 00:00:44,400 Speaker 1: in our real lives. Like people ask me things like, 13 00:00:44,920 --> 00:00:47,080 Speaker 1: what happened to that stick of butter? Yeah, I don't know. 14 00:00:47,240 --> 00:00:49,640 Speaker 1: Don't ask, don't even ask me. I could tell you, 15 00:00:50,240 --> 00:00:52,280 Speaker 1: but I'm not gonna. Exactly. That's how it goes in 16 00:00:52,360 --> 00:00:56,760 Speaker 1: my house. Find your own butter. December was find your 17 00:00:56,800 --> 00:01:00,240 Speaker 1: own butter month. That's a good... that's a good one. 18 00:01:00,400 --> 00:01:02,280 Speaker 1: That should be a T-shirt. Stuff You Should Know: 19 00:01:02,360 --> 00:01:05,360 Speaker 1: find your own butter. Or: December is find your own butter. 20 00:01:05,480 --> 00:01:07,680 Speaker 1: Mm. Yeah, that's right. Maybe a stick of butter and 21 00:01:07,680 --> 00:01:11,520 Speaker 1: some garland on it. Yeah, I like that.
So it's 22 00:01:11,520 --> 00:01:13,479 Speaker 1: good to see you again, man. Good to be back in here. Yeah, 23 00:01:13,560 --> 00:01:15,440 Speaker 1: it is nice to be back. As much as the 24 00:01:15,440 --> 00:01:18,120 Speaker 1: break was great, I'm happy to be explaining things again. 25 00:01:18,160 --> 00:01:20,440 Speaker 1: Well, that's good, because if we got in here and 26 00:01:20,480 --> 00:01:22,240 Speaker 1: you're like, I can't do this, I can't do it again, 27 00:01:22,280 --> 00:01:24,800 Speaker 1: we'd be in trouble. Yeah. So I'm glad we're all 28 00:01:24,840 --> 00:01:27,880 Speaker 1: feeling good. Jerry, you're feeling good. Jerry's got two thumbs 29 00:01:27,920 --> 00:01:31,240 Speaker 1: up and a big goofy smile. Two of her three thumbs. 30 00:01:31,760 --> 00:01:35,000 Speaker 1: She looks like Bob from that male enhancement pill ad. 31 00:01:36,040 --> 00:01:38,639 Speaker 1: Oh, see the guy, the old man that's like super buff? 32 00:01:39,600 --> 00:01:41,640 Speaker 1: I wouldn't call him old. He was middle aged. He 33 00:01:41,720 --> 00:01:44,679 Speaker 1: looked like kind of a Bob Dobbs type dude. I 34 00:01:44,680 --> 00:01:46,520 Speaker 1: think that's kind of who he was modeled after. You see 35 00:01:46,520 --> 00:01:49,600 Speaker 1: the guy that's super muscly? Now I'm thinking of someone different, 36 00:01:49,640 --> 00:01:52,360 Speaker 1: I think. Are you thinking of Jack LaLanne? No? No, no, no, 37 00:01:52,480 --> 00:01:54,520 Speaker 1: just... there's some ad. There's some old man that looks 38 00:01:54,520 --> 00:01:57,760 Speaker 1: like really creepy, because from the neck down he's super buff, looks 39 00:01:57,760 --> 00:02:00,200 Speaker 1: like he's twenty five years old. No, remember, that was 40 00:02:00,240 --> 00:02:03,000 Speaker 1: like a male enhancement pill, and I'm making air quotes 41 00:02:03,040 --> 00:02:06,400 Speaker 1: here for erectile dysfunction. Well, there go the air quotes.
42 00:02:06,920 --> 00:02:10,760 Speaker 1: But yes. And it was like in the early two thousands. 43 00:02:10,880 --> 00:02:14,560 Speaker 1: I think maybe late nineties, but I think early two thousands, 44 00:02:14,639 --> 00:02:17,440 Speaker 1: and these ads were everywhere, and there was Bob, and 45 00:02:17,520 --> 00:02:19,880 Speaker 1: like all these great things happened to him because he 46 00:02:19,960 --> 00:02:22,160 Speaker 1: started taking this pill. I can't remember the name of 47 00:02:22,200 --> 00:02:25,040 Speaker 1: the pill. But the company like got into a lot 48 00:02:25,080 --> 00:02:28,040 Speaker 1: of trouble because it was basically like a subscription service, 49 00:02:28,720 --> 00:02:30,320 Speaker 1: and like you gave them your credit card and you 50 00:02:30,400 --> 00:02:32,920 Speaker 1: got this free trial, but then they started sending it 51 00:02:32,960 --> 00:02:35,200 Speaker 1: to you and it was like next to impossible to 52 00:02:35,280 --> 00:02:38,680 Speaker 1: cut off service. Interesting. They're like, no, we want your 53 00:02:38,720 --> 00:02:43,440 Speaker 1: maleness to be enhanced. So you... you've seen these ads. 54 00:02:43,600 --> 00:02:47,000 Speaker 1: I was gonna start asking questions, but why bother? I 55 00:02:47,000 --> 00:02:49,760 Speaker 1: will... I will YouTube. I will find it on YouTube. 56 00:02:49,800 --> 00:02:52,840 Speaker 1: I'll be like, oh, Bob. Yeah, you will. You'll go, oh, 57 00:02:53,240 --> 00:02:55,200 Speaker 1: I want to come back in and record an insert: 58 00:02:55,600 --> 00:02:57,040 Speaker 1: that's the guy that's on the back of all these pill 59 00:02:57,040 --> 00:03:02,520 Speaker 1: bottles in my bathroom. Oh, Chuck, I don't even remember 60 00:03:02,560 --> 00:03:05,120 Speaker 1: how we got here. Oh yeah, Jerry did that. That was 61 00:03:05,200 --> 00:03:10,280 Speaker 1: Jerry's fault. But um, you remember we did the Enlightenment episode.
Okay, 62 00:03:10,639 --> 00:03:13,799 Speaker 1: we talked a lot about how there's this kind of 63 00:03:13,919 --> 00:03:21,720 Speaker 1: um, tug of war over the human psyche between rationalism 64 00:03:21,760 --> 00:03:25,400 Speaker 1: and mysticism, I guess you could put it. Well, 65 00:03:25,440 --> 00:03:28,040 Speaker 1: I feel like we're talking today about the scientific method. 66 00:03:28,400 --> 00:03:30,959 Speaker 1: Great idea, by the way. Thank you very much. It's 67 00:03:30,960 --> 00:03:34,280 Speaker 1: been a long time coming. Um, because I realized, like, 68 00:03:34,560 --> 00:03:38,480 Speaker 1: I don't understand it as fully as I'd like. I don't understand science. 69 00:03:38,520 --> 00:03:40,840 Speaker 1: I understand the scientific method, because it's pretty cut and 70 00:03:40,920 --> 00:03:43,680 Speaker 1: dry and it's beautiful and elegant and simple. But then 71 00:03:44,080 --> 00:03:47,360 Speaker 1: you just take this thing, and it came out of 72 00:03:47,360 --> 00:03:49,840 Speaker 1: the birth of rationalism, and when you place it into 73 00:03:49,840 --> 00:03:53,920 Speaker 1: the world and make it function, there's a lot of implications. 74 00:03:54,400 --> 00:03:57,280 Speaker 1: Is it being used properly? Is it being used responsibly? 75 00:03:57,360 --> 00:04:00,680 Speaker 1: Like, what constitutes evidence, even, to that? 76 00:04:00,800 --> 00:04:02,960 Speaker 1: You know, like it just raises all this other stuff. 77 00:04:03,200 --> 00:04:06,240 Speaker 1: And it made me realize, like, I don't understand science 78 00:04:06,280 --> 00:04:08,480 Speaker 1: as much as I want to. So researching this, it 79 00:04:08,520 --> 00:04:11,280 Speaker 1: was awesome.
Yeah, and this is a cool episode, I 80 00:04:11,320 --> 00:04:13,240 Speaker 1: think, because not only are we going to talk about 81 00:04:13,280 --> 00:04:16,360 Speaker 1: the scientific method, but we're going to talk about just science, 82 00:04:16,400 --> 00:04:19,440 Speaker 1: like, what is science in general? And some of the 83 00:04:19,560 --> 00:04:23,240 Speaker 1: rock stars along the way who really, you know, laid 84 00:04:23,279 --> 00:04:28,040 Speaker 1: out the path remarkably, like, many many years ago, 85 00:04:28,560 --> 00:04:31,200 Speaker 1: like coming up with these amazing discoveries that still, like, 86 00:04:31,320 --> 00:04:34,200 Speaker 1: hold up. You know, you can like hold their feet to 87 00:04:34,240 --> 00:04:36,200 Speaker 1: the fire for a lot of this stuff. Yeah, because 88 00:04:36,240 --> 00:04:40,919 Speaker 1: if you come upon a universal truth, you know, it 89 00:04:41,040 --> 00:04:42,640 Speaker 1: is what it is. Like, you got to be the 90 00:04:42,680 --> 00:04:45,479 Speaker 1: person who discovered it, because, you know, you saw it, 91 00:04:45,760 --> 00:04:48,760 Speaker 1: you realized it a certain way, but ultimately it was 92 00:04:48,800 --> 00:04:51,080 Speaker 1: there already. Yeah, like Newton. I mean, we'll talk about 93 00:04:51,120 --> 00:04:52,640 Speaker 1: all this stuff, but it's not like now we're like, 94 00:04:52,839 --> 00:04:55,359 Speaker 1: oh, Newton, most of what he said was wrong, but 95 00:04:55,440 --> 00:04:58,160 Speaker 1: that's understandable because it was a long time ago. Like, 96 00:04:58,279 --> 00:05:01,400 Speaker 1: his stuff holds up really well. I was wondering if 97 00:05:01,440 --> 00:05:04,040 Speaker 1: he on his deathbed was just like, oh, man, I 98 00:05:04,080 --> 00:05:08,680 Speaker 1: contributed so much to humanity it's mind boggling, but I 99 00:05:08,720 --> 00:05:12,560 Speaker 1: couldn't enhance my malehood.
Well, Bob hadn't come along yet. 100 00:05:13,279 --> 00:05:16,920 Speaker 1: So, Chuck, let's just quit stalling and talk about science. 101 00:05:17,520 --> 00:05:20,880 Speaker 1: Like, what is science? Well, I hate the old elementary 102 00:05:20,920 --> 00:05:24,680 Speaker 1: school, uh, "defined as," but it's a pretty good place 103 00:05:24,720 --> 00:05:27,960 Speaker 1: to start here to get a base definition of science. Yeah, 104 00:05:27,960 --> 00:05:30,320 Speaker 1: old William Harris did a great job with this. Yes, 105 00:05:30,440 --> 00:05:33,800 Speaker 1: William Harris did a great job. Uh, science: the intellectual 106 00:05:33,839 --> 00:05:36,960 Speaker 1: and practical activity encompassing the systematic study of the structure and behavior of the 107 00:05:36,960 --> 00:05:43,440 Speaker 1: physical and natural world through observation and experimentation. Uh, so 108 00:05:43,560 --> 00:05:47,279 Speaker 1: the first part of that is science is practical, and 109 00:05:47,560 --> 00:05:50,000 Speaker 1: it is, you know... they make a good... he makes... 110 00:05:50,200 --> 00:05:52,200 Speaker 1: Bill Harris makes a great point in here. It's not 111 00:05:52,240 --> 00:05:54,840 Speaker 1: just stuff you do in a lab, and it's not 112 00:05:54,920 --> 00:05:58,760 Speaker 1: just for scientists. It is all about being hands on 113 00:05:58,839 --> 00:06:02,840 Speaker 1: and active, and it's about discovery and asking questions. 114 00:06:03,440 --> 00:06:06,680 Speaker 1: I mean, that's how everything is ultimately solved, is by 115 00:06:06,720 --> 00:06:09,440 Speaker 1: someone looking at something and having a question about it. Exactly. 116 00:06:09,600 --> 00:06:12,000 Speaker 1: And then the scientific method comes in when you say, 117 00:06:12,080 --> 00:06:16,799 Speaker 1: and this is how you properly get to that answer. Exactly. Um, 118 00:06:16,880 --> 00:06:20,240 Speaker 1: and he makes another good point too.
The idea 119 00:06:20,279 --> 00:06:23,520 Speaker 1: that there's a method, a scientific method, makes it seem 120 00:06:23,600 --> 00:06:29,680 Speaker 1: like it's secreted away among the fraternity of scientists. 121 00:06:29,760 --> 00:06:32,480 Speaker 1: And like you said, anybody can use it. It's 122 00:06:32,480 --> 00:06:34,400 Speaker 1: just kind of part of being a curious human. 123 00:06:34,520 --> 00:06:37,240 Speaker 1: It's not even that anyone can use it, everyone does use it, 124 00:06:37,720 --> 00:06:39,720 Speaker 1: you just might not even know that you're using it. 125 00:06:40,279 --> 00:06:42,120 Speaker 1: Like, I mean, one of the examples that 126 00:06:42,200 --> 00:06:44,880 Speaker 1: we use later is, if like your car overheats, when you 127 00:06:45,200 --> 00:06:49,000 Speaker 1: figure out why and fix it, that's the scientific method, right, 128 00:06:49,160 --> 00:06:53,880 Speaker 1: playing out. Exactly, based on reasoning. Yeah, okay, and deduction 129 00:06:53,960 --> 00:06:57,479 Speaker 1: and induction. Man, there's so much to talk about. Okay, 130 00:06:57,520 --> 00:06:59,919 Speaker 1: so let's talk about that definition that you had. 131 00:07:00,800 --> 00:07:04,000 Speaker 1: So the first part is that science is... it's a 132 00:07:04,080 --> 00:07:09,880 Speaker 1: practical activity. So science is practical, right? It's... um, 133 00:07:10,040 --> 00:07:13,400 Speaker 1: the basis of the whole thing is discovery. Right? You 134 00:07:13,480 --> 00:07:16,560 Speaker 1: see something. You see birds in flight, and you say, 135 00:07:16,560 --> 00:07:19,840 Speaker 1: where are those birds going? And if you just went 136 00:07:19,880 --> 00:07:21,840 Speaker 1: and laid down on the ground and went to sleep 137 00:07:21,920 --> 00:07:24,920 Speaker 1: after that, then you're not carrying out science.
138 00:07:24,960 --> 00:07:26,800 Speaker 1: But if you went, I want to find out where 139 00:07:26,800 --> 00:07:29,160 Speaker 1: those birds are going, and you follow them and you 140 00:07:29,240 --> 00:07:32,520 Speaker 1: start taking notes, that is the basis of science: 141 00:07:32,600 --> 00:07:36,760 Speaker 1: discovery. Yeah, and that's the observational part as well. Um, 142 00:07:36,960 --> 00:07:39,960 Speaker 1: sometimes you're using a microscope or a telescope, sometimes you're 143 00:07:40,000 --> 00:07:44,120 Speaker 1: using your eyeballs. But no matter what your tool is, uh, 144 00:07:44,160 --> 00:07:48,800 Speaker 1: you're gonna be watching something and recording what's called data 145 00:07:48,960 --> 00:07:51,960 Speaker 1: or data, depending on, I don't know, what kind of 146 00:07:51,960 --> 00:07:55,480 Speaker 1: person you are. What do you say? I think... I 147 00:07:55,520 --> 00:07:59,960 Speaker 1: say both. I think data. Yeah, I don't think... I 148 00:08:00,000 --> 00:08:03,600 Speaker 1: don't think I say data. Data, I say data. Data. Yeah, 149 00:08:03,640 --> 00:08:06,440 Speaker 1: all right, we'll go with data. You say both. I 150 00:08:06,480 --> 00:08:08,000 Speaker 1: feel like it just comes out of my mouth one 151 00:08:08,040 --> 00:08:09,640 Speaker 1: way or the other, and I don't really think about it. 152 00:08:09,720 --> 00:08:12,520 Speaker 1: I think that's like being ambidextrous. Yeah. Yeah, I'm a 153 00:08:12,600 --> 00:08:18,360 Speaker 1: data-data. Yeah. Uh, so once you are observing this data... well, 154 00:08:18,360 --> 00:08:22,080 Speaker 1: there are a couple of kinds. There's quantitative data, which 155 00:08:22,080 --> 00:08:26,880 Speaker 1: are numbers, like, you know, your body temperature is ninety eight point six, 156 00:08:26,920 --> 00:08:30,640 Speaker 1: although I think that's changed slightly now, didn't it? Yeah, yeah. There...
157 00:08:30,680 --> 00:08:32,680 Speaker 1: You used to be like, if you were a human being, 158 00:08:33,040 --> 00:08:35,520 Speaker 1: your body temperature is ninety eight point six, and they... it's like, no, 159 00:08:35,600 --> 00:08:38,120 Speaker 1: there's a little more variation than that. But any kind 160 00:08:38,160 --> 00:08:45,360 Speaker 1: of just numerical representation is quantitative, whereas qualitative is behavioral, 161 00:08:45,800 --> 00:08:49,640 Speaker 1: like, I'm gonna watch that bird, um, eat and poop 162 00:08:49,760 --> 00:08:53,360 Speaker 1: for the next week, right? Or, what happens if I... 163 00:08:53,440 --> 00:08:55,319 Speaker 1: what will the slug do if I put a bunch 164 00:08:55,360 --> 00:08:58,199 Speaker 1: of salt on it? You know, I don't do that. No, 165 00:08:58,360 --> 00:09:00,640 Speaker 1: you really should not do that. No, it's awful. But 166 00:09:00,760 --> 00:09:04,560 Speaker 1: the reaction of the slug is gathering qualitative data. And 167 00:09:04,559 --> 00:09:08,280 Speaker 1: depending on who you talk to, there isn't qualitative data 168 00:09:08,320 --> 00:09:12,120 Speaker 1: in science, that it should all just be quantitative, because, yeah, 169 00:09:12,240 --> 00:09:20,000 Speaker 1: because quantitative data is reproducible. Qualitative data is... it's not 170 00:09:20,080 --> 00:09:23,959 Speaker 1: necessarily reproducible. You can observe the same phenomenon, but you're 171 00:09:24,000 --> 00:09:28,000 Speaker 1: not necessarily controlling it. Okay, well, I guess I get that. 172 00:09:28,040 --> 00:09:30,360 Speaker 1: But I agree with Bill here in that they are both... 173 00:09:30,640 --> 00:09:33,160 Speaker 1: they go hand in hand, and neither one is more 174 00:09:33,160 --> 00:09:34,679 Speaker 1: important than the other. You need to have both.
Well, 175 00:09:34,679 --> 00:09:36,600 Speaker 1: a lot of people do, and we'll talk more about 176 00:09:36,600 --> 00:09:40,160 Speaker 1: it later, because without the idea that qualitative data is 177 00:09:40,360 --> 00:09:44,559 Speaker 1: acceptable and scientific, you don't have the social sciences, 178 00:09:44,559 --> 00:09:47,920 Speaker 1: like, they don't exist. Yeah, that's a good point. But yes, 179 00:09:48,520 --> 00:09:51,720 Speaker 1: we have quantitative data and qualitative data. I agree with you, 180 00:09:51,760 --> 00:09:57,640 Speaker 1: they're both useful. Uh, it is an intellectual pursuit, um, 181 00:09:57,720 --> 00:10:00,360 Speaker 1: so you can make observations on data all day long, 182 00:10:00,960 --> 00:10:04,040 Speaker 1: but until you bring reason, in this case inductive reasoning, 183 00:10:04,040 --> 00:10:09,480 Speaker 1: which is deriving a generalization based on your observations, then 184 00:10:10,040 --> 00:10:12,280 Speaker 1: it's just data sitting there on a piece of paper. 185 00:10:12,480 --> 00:10:15,839 Speaker 1: Like, it's supposed to lead you somewhere. Right, exactly. And 186 00:10:15,960 --> 00:10:18,679 Speaker 1: so we should talk about inductive and deductive reasoning, and 187 00:10:18,800 --> 00:10:21,000 Speaker 1: it, depending... Again, it's really weird, but one of the 188 00:10:21,040 --> 00:10:25,239 Speaker 1: things that I came across is that there's not a universal 189 00:10:25,280 --> 00:10:29,600 Speaker 1: agreement on how science is carried out. Like, I saw 190 00:10:29,640 --> 00:10:32,120 Speaker 1: some places where it's like, there's no place for inductive 191 00:10:32,120 --> 00:10:35,800 Speaker 1: reasoning in science. Then other places they're saying, well, you 192 00:10:35,840 --> 00:10:39,439 Speaker 1: have to have science using inductive reasoning.
Everybody seems to 193 00:10:39,480 --> 00:10:43,960 Speaker 1: agree that deductive reasoning is the basis of science, but 194 00:10:44,040 --> 00:10:46,599 Speaker 1: that you also have to have inductive. So deductive is 195 00:10:46,640 --> 00:10:52,120 Speaker 1: basically taking a big, broad generalization and saying that it 196 00:10:52,200 --> 00:10:57,720 Speaker 1: applies to something specific. More specific. Yes. Uh, inductive is 197 00:10:57,760 --> 00:11:02,160 Speaker 1: the opposite, where you say, I've noticed these different data points, 198 00:11:02,840 --> 00:11:08,199 Speaker 1: and, uh, that means that this broad generalization is true. 199 00:11:08,640 --> 00:11:13,280 Speaker 1: So you go from specific, small observations to a broad generalization. 200 00:11:13,440 --> 00:11:15,360 Speaker 1: And the reason that a lot of people say, well, 201 00:11:15,400 --> 00:11:18,920 Speaker 1: inductive reasoning doesn't have any place in science is because 202 00:11:20,080 --> 00:11:25,439 Speaker 1: you're saying, those birds over there are all brown, therefore 203 00:11:25,520 --> 00:11:27,520 Speaker 1: all birds of that type are brown, even though I 204 00:11:27,520 --> 00:11:30,000 Speaker 1: haven't seen every single bird of that type in the world. 205 00:11:30,440 --> 00:11:33,120 Speaker 1: I'm saying that all those birds are brown. And a 206 00:11:33,160 --> 00:11:36,080 Speaker 1: lot of people say there's no place for that in science. Well, 207 00:11:36,160 --> 00:11:37,920 Speaker 1: if you want to go out and prove that, then 208 00:11:37,960 --> 00:11:40,720 Speaker 1: that's your business, you know. You can't just say that 209 00:11:40,760 --> 00:11:43,680 Speaker 1: and be like, and I'm done. Right, exactly. I guess 210 00:11:43,679 --> 00:11:46,880 Speaker 1: you could, but you wouldn't be much of a scientist, right. But the... 211 00:11:47,400 --> 00:11:50,800 Speaker 1: you can use it to formulate hypotheses.
Right, so 212 00:11:50,840 --> 00:11:53,559 Speaker 1: you can say, I've generated all these data points, I'm 213 00:11:53,559 --> 00:11:56,800 Speaker 1: gonna put them together and see if this broad generalization 214 00:11:56,920 --> 00:11:59,880 Speaker 1: is right. Okay, so there is a place for inductive 215 00:12:00,040 --> 00:12:03,880 Speaker 1: reasoning in science. But everybody says deductive reasoning is the basis 216 00:12:03,920 --> 00:12:08,640 Speaker 1: of science. Well, Bill Harris does. Uh, he offers a 217 00:12:08,720 --> 00:12:12,839 Speaker 1: great example for inductive reasoning with Edwin Hubble of the 218 00:12:12,880 --> 00:12:17,320 Speaker 1: Hubble Telescope. Uh, he was looking through the Hooker telescope, 219 00:12:17,320 --> 00:12:21,400 Speaker 1: which at the time was at California's Mount Wilson. Is that 220 00:12:21,520 --> 00:12:26,640 Speaker 1: the one from Rebel? No, that's, um, Griffith Park Observatory, 221 00:12:27,840 --> 00:12:32,160 Speaker 1: which has been redesigned, and, uh, it is really cool now. Yeah, 222 00:12:32,200 --> 00:12:33,560 Speaker 1: I mean, it was kind of cool before, but it 223 00:12:33,600 --> 00:12:37,640 Speaker 1: was definitely like, uh, sort of the place that 224 00:12:37,679 --> 00:12:40,280 Speaker 1: time forgot. Oh really? So they've updated it. I'll 225 00:12:40,280 --> 00:12:42,320 Speaker 1: bet that was cool though, in its own way. Yeah, 226 00:12:42,320 --> 00:12:43,720 Speaker 1: it was neat. I used to live near there, so 227 00:12:43,880 --> 00:12:46,000 Speaker 1: it was kind of... But that's like the famous one, 228 00:12:46,000 --> 00:12:47,920 Speaker 1: at least in the movies. Yeah, that's where they have 229 00:12:48,000 --> 00:12:50,439 Speaker 1: the big knife fight. Yeah, and there's this James Dean 230 00:12:50,520 --> 00:12:55,760 Speaker 1: statue there too. Isn't it like a bust? Um. So, yes.
231 00:12:55,920 --> 00:12:58,440 Speaker 1: Edwin Hubble, he's at Mount Wilson and he's looking through 232 00:12:58,440 --> 00:13:01,480 Speaker 1: the Hooker telescope, which was the biggest one. And at 233 00:13:01,480 --> 00:13:03,920 Speaker 1: the time, everyone said, the Milky Way galaxy is it. 234 00:13:04,280 --> 00:13:07,760 Speaker 1: That's what we've got going on. Did you know this? Uh, yeah, 235 00:13:07,760 --> 00:13:11,600 Speaker 1: I knew that because we were talking about it, yeah, not that long ago. 236 00:13:11,720 --> 00:13:14,400 Speaker 1: I did not realize this. And he started looking through this 237 00:13:14,440 --> 00:13:17,280 Speaker 1: telescope and said, you know what, these nebulae that everyone 238 00:13:17,320 --> 00:13:19,640 Speaker 1: says are part of our galaxy look to me like 239 00:13:19,679 --> 00:13:23,400 Speaker 1: they're beyond our galaxy. Not only that, they look like 240 00:13:23,400 --> 00:13:27,480 Speaker 1: they're moving away from us. So he, uh, 241 00:13:27,480 --> 00:13:30,480 Speaker 1: through inductive reasoning, made this observation that, you know what, 242 00:13:30,520 --> 00:13:34,040 Speaker 1: I think there are many many galaxies out there, and 243 00:13:34,120 --> 00:13:37,280 Speaker 1: not only that, I think they are expanding. And, uh, 244 00:13:37,600 --> 00:13:42,080 Speaker 1: through technological advancement with telescopes over the years, scientists, you know, 245 00:13:42,200 --> 00:13:45,400 Speaker 1: proved it to be true. Yeah, pretty cool. So this 246 00:13:45,440 --> 00:13:48,080 Speaker 1: is a really good example of him saying, like, I've 247 00:13:48,080 --> 00:13:51,640 Speaker 1: made some observations, and now I'm going to say this 248 00:13:51,760 --> 00:13:56,760 Speaker 1: broad generalization. Right? So, these galaxies appear to be 249 00:13:56,840 --> 00:13:59,640 Speaker 1: moving away from one another.
So the whole universe 250 00:14:00,240 --> 00:14:05,800 Speaker 1: is expanding. Right, that's inductive reasoning. It's a pretty brave thing, uh, 251 00:14:05,920 --> 00:14:09,199 Speaker 1: especially back then, because you're really putting your reputation at stake, 252 00:14:10,080 --> 00:14:13,040 Speaker 1: you know. So what Hubble... what Hubble did was 253 00:14:13,720 --> 00:14:17,040 Speaker 1: what we've come to see as science. He made some observations, 254 00:14:17,200 --> 00:14:20,960 Speaker 1: he came up with a hypothesis, um, and then it 255 00:14:21,600 --> 00:14:25,640 Speaker 1: was tested later on. It's not... you don't necessarily, as 256 00:14:25,640 --> 00:14:30,120 Speaker 1: a scientist... you're a part of a larger collective of scientists, right? 257 00:14:30,440 --> 00:14:33,920 Speaker 1: And every scientist needs one another. It's why there's journals 258 00:14:33,920 --> 00:14:37,840 Speaker 1: and, um, conferences and things like that, to share information, right? 259 00:14:37,960 --> 00:14:41,200 Speaker 1: And Hubble came 260 00:14:41,280 --> 00:14:45,080 Speaker 1: up with his own observations. And rather than just experimenting, experimenting, 261 00:14:45,120 --> 00:14:48,880 Speaker 1: experimenting himself, which I'm sure he continued to do, he 262 00:14:49,000 --> 00:14:52,080 Speaker 1: created this basis of work that he probably realized is 263 00:14:52,120 --> 00:14:57,120 Speaker 1: going to survive him. Right. And then later on, scientists 264 00:14:57,120 --> 00:15:00,560 Speaker 1: came down the road and they tested his hypothesis and 265 00:15:00,600 --> 00:15:04,400 Speaker 1: they found it was correct, and so his hypothesis became 266 00:15:04,440 --> 00:15:06,720 Speaker 1: a theory.
It eventually became part of the basis of 267 00:15:06,720 --> 00:15:09,480 Speaker 1: the Big Bang theory, that the universe started as a 268 00:15:09,560 --> 00:15:15,000 Speaker 1: huge explosion and it's expanding still because... because it 269 00:15:15,080 --> 00:15:18,600 Speaker 1: exploded at one point, right. And they did that by 270 00:15:18,640 --> 00:15:24,160 Speaker 1: carrying out other tests or experiments. Exactly. So this is 271 00:15:24,200 --> 00:15:26,840 Speaker 1: how science works. Like, some guy back in nineteen 272 00:15:26,920 --> 00:15:32,200 Speaker 1: twenty nine makes some observations in California, he proposes this big, 273 00:15:32,240 --> 00:15:36,480 Speaker 1: broad generalization, and over the next, like, ensuing half 274 00:15:36,480 --> 00:15:40,160 Speaker 1: a century, more and more scientists all around the world 275 00:15:40,480 --> 00:15:43,680 Speaker 1: start testing his hypothesis and find it to be true, 276 00:15:43,680 --> 00:15:47,160 Speaker 1: so it becomes a theory. Yeah. Well, let's finish 277 00:15:47,240 --> 00:15:50,280 Speaker 1: up here with science. The last part of the definition 278 00:15:50,880 --> 00:15:54,760 Speaker 1: is that it's systematic and it's methodical, and it requires 279 00:15:54,880 --> 00:15:59,960 Speaker 1: testing and experiments, and it requires those experiments and tests, 280 00:16:00,240 --> 00:16:04,560 Speaker 1: uh, to be repeated and verified, and it is... 281 00:16:04,680 --> 00:16:07,840 Speaker 1: it's a system. It's a way of working things out. 282 00:16:08,000 --> 00:16:11,560 Speaker 1: That is the scientific method, basically. Yeah, 283 00:16:11,920 --> 00:16:16,200 Speaker 1: you have your idea, you pose a question, you theorize... 284 00:16:16,520 --> 00:16:19,080 Speaker 1: you put a hypothesis out there, and then you 285 00:16:19,240 --> 00:16:22,320 Speaker 1: go about trying to either prove it or disprove it. Yeah, exactly.
286 00:16:22,400 --> 00:16:24,280 Speaker 1: And then the way that you go about proving or 287 00:16:24,320 --> 00:16:27,880 Speaker 1: disproving it, that's the scientific method. Everything else is just 288 00:16:28,000 --> 00:16:32,080 Speaker 1: scientific inquiry. The way you go about... the standardized way 289 00:16:32,120 --> 00:16:35,440 Speaker 1: of going about scientific inquiry is the scientific method. And, 290 00:16:35,520 --> 00:16:39,160 Speaker 1: friend, we'll talk about the scientific method right after this. 291 00:16:52,640 --> 00:16:54,680 Speaker 1: All right. You brought up a point. I think we 292 00:16:54,720 --> 00:16:57,200 Speaker 1: should go ahead and just get right to it, my friend. 293 00:16:57,280 --> 00:17:02,480 Speaker 1: Let's do hypotheses and theories. Want to say it together? No. 294 00:17:03,480 --> 00:17:08,400 Speaker 1: One thing that really chaps my hide is, uh, when 295 00:17:08,400 --> 00:17:11,959 Speaker 1: you hear poo-pooers of whatever scientific theory say, well, 296 00:17:12,000 --> 00:17:14,800 Speaker 1: it's just a theory. And you... where was this thing 297 00:17:14,840 --> 00:17:18,320 Speaker 1: that you found that poo-pooed that? Do you remember 298 00:17:18,440 --> 00:17:23,320 Speaker 1: what website that was? No. No, although I do want 299 00:17:23,359 --> 00:17:24,760 Speaker 1: to give a shout out, now that you mentioned it, 300 00:17:24,840 --> 00:17:29,840 Speaker 1: to Explorable. It's like an online university, basically, of free courses, 301 00:17:30,520 --> 00:17:34,439 Speaker 1: and, uh, there is one on scientific reasoning that is 302 00:17:34,520 --> 00:17:36,959 Speaker 1: just amazing. It's like a huge rabbit hole. You go 303 00:17:37,040 --> 00:17:39,879 Speaker 1: down and you start clicking on the embedded links and 304 00:17:39,920 --> 00:17:42,959 Speaker 1: you end up like understanding all sorts of stuff.
So 305 00:17:43,119 --> 00:17:45,679 Speaker 1: go check that one out if you like understanding stuff. 306 00:17:46,920 --> 00:17:48,479 Speaker 1: So that's one of the things that bugs me, if 307 00:17:48,480 --> 00:17:50,600 Speaker 1: someone says it's just a theory, and this does a 308 00:17:50,680 --> 00:17:54,800 Speaker 1: great job of kind of throwing that out the window. Um, 309 00:17:54,840 --> 00:17:58,680 Speaker 1: because it's basically mixing up the two definitions of theory. Yeah, 310 00:17:58,720 --> 00:18:01,920 Speaker 1: there's like a colloquial definition that people use every day 311 00:18:02,000 --> 00:18:04,000 Speaker 1: that doesn't really have much to do with the 312 00:18:04,040 --> 00:18:08,080 Speaker 1: scientific use. Like, I got a theory that Jerry, in her 313 00:18:08,080 --> 00:18:11,960 Speaker 1: one hour bathroom breaks every day, is really playing, uh, 314 00:18:12,119 --> 00:18:15,960 Speaker 1: Words with Friends in the lobby. I think your theory 315 00:18:16,040 --> 00:18:20,320 Speaker 1: is correct. So that's a theory in the colloquial meaning. 316 00:18:21,080 --> 00:18:24,000 Speaker 1: As far as science goes, a theory is not just 317 00:18:24,200 --> 00:18:27,119 Speaker 1: something you postulate, say, that may or may not 318 00:18:27,160 --> 00:18:30,680 Speaker 1: be true. A theory is beyond the hypothesis, and it's 319 00:18:30,720 --> 00:18:35,439 Speaker 1: something that is strongly supported in many different ways, and 320 00:18:35,520 --> 00:18:38,399 Speaker 1: there's all kinds of evidence to support something that 321 00:18:38,440 --> 00:18:42,000 Speaker 1: eventually becomes a theory. Right. So, um, your 322 00:18:42,119 --> 00:18:45,520 Speaker 1: theory about Jerry's bathroom breaks, in the scientific world, would 323 00:18:45,560 --> 00:18:50,199 Speaker 1: be a hypothesis.
In fact, yeah, it could eventually become a 324 00:18:50,200 --> 00:18:54,480 Speaker 1: scientific law, but ultimately it would begin as a hypothesis, 325 00:18:54,480 --> 00:19:00,360 Speaker 1: a hunch based on intuition, based on the data you've collected, observations, 326 00:19:00,400 --> 00:19:03,399 Speaker 1: that kind of stuff. Where, like, you know, you've seen 327 00:19:03,680 --> 00:19:06,280 Speaker 1: that Jerry goes to the bathroom for like an hour 328 00:19:06,359 --> 00:19:10,760 Speaker 1: at a stretch frequently. When she comes back, she's, um, finishing 329 00:19:10,880 --> 00:19:14,920 Speaker 1: up a game of Words with Friends. You've heard that 330 00:19:15,000 --> 00:19:18,000 Speaker 1: she's been spotted in the lobby during these times. So 331 00:19:18,119 --> 00:19:22,000 Speaker 1: your hypothesis is that while she is gone for these 332 00:19:22,040 --> 00:19:24,440 Speaker 1: hour-long bathroom breaks, she's actually down in the lobby playing 333 00:19:24,440 --> 00:19:28,800 Speaker 1: Words with Friends. Right, based on knowledge, observation, and logic. Right. 334 00:19:28,840 --> 00:19:31,440 Speaker 1: So let's say that you decided to set up an experiment, 335 00:19:31,480 --> 00:19:34,159 Speaker 1: and you experimented, and you went and you found Jerry 336 00:19:34,200 --> 00:19:38,240 Speaker 1: playing Words with Friends five different times, and you told 337 00:19:38,400 --> 00:19:40,280 Speaker 1: me about it, and I was like, I'm going to 338 00:19:40,400 --> 00:19:43,560 Speaker 1: run that same experiment exactly the way you did. Right, 339 00:19:44,080 --> 00:19:47,040 Speaker 1: I would test that same hypothesis.
If I found the 340 00:19:47,040 --> 00:19:49,600 Speaker 1: same results to be true, then what you would have 341 00:19:49,680 --> 00:19:52,760 Speaker 1: come up with, your hypothesis, would move to basically a 342 00:19:52,840 --> 00:19:57,320 Speaker 1: theory, that is, this widely accepted thing, this explanation that 343 00:19:57,920 --> 00:20:01,720 Speaker 1: Jerry is not actually in the bathroom, she's downstairs 344 00:20:01,720 --> 00:20:04,760 Speaker 1: playing Words with Friends. It would be the Jerry bathroom break theory, 345 00:20:05,400 --> 00:20:09,240 Speaker 1: that's right. And then if it turns out that you 346 00:20:09,280 --> 00:20:13,239 Speaker 1: find that Jerry's spending an hour a day pretending to 347 00:20:13,280 --> 00:20:17,520 Speaker 1: be in the bathroom but actually being downstairs playing Words 348 00:20:17,560 --> 00:20:22,640 Speaker 1: with Friends, if the universe couldn't exist without her doing 349 00:20:22,680 --> 00:20:28,639 Speaker 1: that every day, you would have a scientific law. That's right. Yeah, 350 00:20:28,880 --> 00:20:30,600 Speaker 1: I think that was a good example you came up 351 00:20:30,600 --> 00:20:34,280 Speaker 1: with. It was a great example, as it turns out. Uh, 352 00:20:34,400 --> 00:20:36,520 Speaker 1: I guess the point here is when you hear someone 353 00:20:36,600 --> 00:20:39,800 Speaker 1: say in an argument, well, that's just a theory, just 354 00:20:40,320 --> 00:20:42,520 Speaker 1: punch him in the head and then tell them what 355 00:20:42,600 --> 00:20:45,400 Speaker 1: we just said about the bathroom breaks, and they'll say, 356 00:20:45,400 --> 00:20:49,120 Speaker 1: who's Jerry? Or just cue up that whole bit 357 00:20:49,359 --> 00:20:52,640 Speaker 1: and stand outside of their window wearing a trench coat 358 00:20:52,640 --> 00:20:54,679 Speaker 1: and holding a boom box over your head with a 359 00:20:54,840 --> 00:20:58,880 Speaker 1: smug look on your face.
Uh, all right. So should 360 00:20:58,920 --> 00:21:01,159 Speaker 1: we go back in the old way back machine a 361 00:21:01,200 --> 00:21:03,000 Speaker 1: little bit and just talk a little bit about how 362 00:21:03,000 --> 00:21:14,840 Speaker 1: the scientific method came to be? Yes. Man, this thing, 363 00:21:15,600 --> 00:21:19,240 Speaker 1: what are you running this on these days? Straight kerosene? 364 00:21:19,920 --> 00:21:23,480 Speaker 1: The fumes in here are killing me. Sorry about that, trying 365 00:21:23,480 --> 00:21:27,879 Speaker 1: to go green. You know, kerosene is not green. Diesel, 366 00:21:27,920 --> 00:21:32,159 Speaker 1: maybe? I'm choking. Biodiesel, how about that? Okay, the 367 00:21:32,160 --> 00:21:34,600 Speaker 1: way back machine will run on French fry grease. That would 368 00:21:34,600 --> 00:21:37,080 Speaker 1: be fine. I'll get to work on that. I can 369 00:21:37,119 --> 00:21:40,560 Speaker 1: handle this. So you teased us with the Renaissance, 370 00:21:40,600 --> 00:21:43,719 Speaker 1: and the reason the Renaissance was so awesome and necessary 371 00:21:43,760 --> 00:21:46,440 Speaker 1: was because of something else we talked about, which 372 00:21:46,520 --> 00:21:50,520 Speaker 1: was the Dark Ages, uh, which, you remember, is 373 00:21:50,520 --> 00:21:56,800 Speaker 1: the rationalists' disparaging term for this era. That's right, but 374 00:21:56,880 --> 00:22:00,480 Speaker 1: I think sort of rightfully so, because from right before the 375 00:22:00,520 --> 00:22:03,239 Speaker 1: Dark Ages until about a century after, there was not 376 00:22:03,440 --> 00:22:07,320 Speaker 1: much advancement at all in the realm of science. 377 00:22:08,320 --> 00:22:11,040 Speaker 1: Uh, no, it's true.
That's hard to argue with, 378 00:22:11,160 --> 00:22:14,120 Speaker 1: and the reason why is, again, science wasn't really 379 00:22:14,160 --> 00:22:17,960 Speaker 1: born yet, and there was a huge struggle between rationalism 380 00:22:17,960 --> 00:22:21,359 Speaker 1: and mysticism, and ultimately we're living in the age of 381 00:22:21,480 --> 00:22:24,359 Speaker 1: rationalism now. Yeah, and we should point out too that 382 00:22:24,440 --> 00:22:28,280 Speaker 1: this was mainly in Europe. Over in the Islamic world, as 383 00:22:28,480 --> 00:22:30,560 Speaker 1: I think we had a listener mail point out, there 384 00:22:30,560 --> 00:22:33,480 Speaker 1: were a lot of advancements being made, uh, just sort 385 00:22:33,520 --> 00:22:36,960 Speaker 1: of flying under the European radar at the time, because 386 00:22:37,680 --> 00:22:40,880 Speaker 1: some say the Catholic Church kind of kept science under 387 00:22:40,920 --> 00:22:44,160 Speaker 1: its thumb for a while, and saw it as a pretty big threat. 388 00:22:45,440 --> 00:22:47,000 Speaker 1: It said, you know, you can't do this stuff, you 389 00:22:47,040 --> 00:22:51,040 Speaker 1: can't experiment like this, and don't ask these questions, because 390 00:22:51,080 --> 00:22:54,920 Speaker 1: here are your answers. But eventually the Renaissance came about 391 00:22:54,920 --> 00:22:58,320 Speaker 1: in the twelfth century, and people woke up and 392 00:22:58,400 --> 00:23:00,480 Speaker 1: saw some of the work in the Islamic world and said, 393 00:23:00,480 --> 00:23:03,320 Speaker 1: you know what, maybe let's start reading up on Aristotle 394 00:23:03,359 --> 00:23:06,199 Speaker 1: and Ptolemy and Euclid once again. Yeah, they're like, 395 00:23:06,240 --> 00:23:09,080 Speaker 1: we forgot about these guys. I mean, it literally kind 396 00:23:09,080 --> 00:23:11,920 Speaker 1: of vanished for a while. It did, from the West.
Yes, 397 00:23:12,080 --> 00:23:15,040 Speaker 1: fortunately it was still around, you know, in its 398 00:23:15,080 --> 00:23:18,080 Speaker 1: home places. But yes, in the West they were lost. 399 00:23:18,160 --> 00:23:21,480 Speaker 1: The Roman stuff was almost entirely lost because it was 400 00:23:21,520 --> 00:23:24,159 Speaker 1: being suppressed by the locals, and I think the Greek 401 00:23:24,240 --> 00:23:28,400 Speaker 1: knowledge was completely banished. Yes. Somehow, somehow there 402 00:23:28,480 --> 00:23:31,640 Speaker 1: was some, um... We got another listener mail after the Enlightenment 403 00:23:31,680 --> 00:23:35,200 Speaker 1: one. They said that it was an Islamic scholar who 404 00:23:35,280 --> 00:23:41,160 Speaker 1: was the one who translated Aristotle into Latin, or something 405 00:23:41,240 --> 00:23:44,080 Speaker 1: like that, and that without this guy, like, the West 406 00:23:44,119 --> 00:23:47,440 Speaker 1: wouldn't have had much to start with, because that's where 407 00:23:47,520 --> 00:23:50,720 Speaker 1: that birth of rationalism came from: this rediscovery 408 00:23:50,760 --> 00:23:55,240 Speaker 1: of Greek and Roman classical thought. And this was the 409 00:23:55,280 --> 00:23:59,200 Speaker 1: basis of scientific inquiry, of rationalism, of saying, like, okay, 410 00:23:59,200 --> 00:24:01,399 Speaker 1: there are set rules to things, and we need to 411 00:24:01,440 --> 00:24:04,600 Speaker 1: discover these rules, the principles of how the 412 00:24:04,680 --> 00:24:07,040 Speaker 1: universe works, like, there have to be principles, and we 413 00:24:07,080 --> 00:24:10,000 Speaker 1: need to find them in a rational, methodical way. And 414 00:24:10,200 --> 00:24:13,720 Speaker 1: right out of the gate Europe said, oh, okay, well, 415 00:24:13,760 --> 00:24:16,520 Speaker 1: whatever you say is right then, Aristotle.
We're used to 416 00:24:16,560 --> 00:24:20,560 Speaker 1: just believing everything without questioning it. And luckily Albert Magnus, 417 00:24:20,560 --> 00:24:24,480 Speaker 1: I think is who it was. Um, Albertus. It was Albertus Magnus. 418 00:24:24,720 --> 00:24:27,960 Speaker 1: And Roger Bacon, who said no. It is Bacon, Roger Bacon, 419 00:24:28,040 --> 00:24:32,639 Speaker 1: who just has this great name, Roger Bacon. The Bacon brothers. Yeah. 420 00:24:32,800 --> 00:24:36,560 Speaker 1: They weren't brothers though. But were they related 421 00:24:36,600 --> 00:24:38,800 Speaker 1: at all? You know, I looked that up and I 422 00:24:38,840 --> 00:24:41,800 Speaker 1: don't think people know either way. I don't think there's 423 00:24:41,840 --> 00:24:44,439 Speaker 1: any proof, but a lot of people think, because of 424 00:24:44,480 --> 00:24:46,760 Speaker 1: their names and the way things went back then, that 425 00:24:46,880 --> 00:24:49,280 Speaker 1: they may very well have been. And I mean they 426 00:24:49,280 --> 00:24:52,919 Speaker 1: were separated by three or so years. Although Roger was 427 00:24:52,960 --> 00:24:55,960 Speaker 1: a monk, so he would not have had children. 428 00:24:56,520 --> 00:24:59,080 Speaker 1: So if they were, and that's an excellent point, it wasn't 429 00:24:59,119 --> 00:25:02,719 Speaker 1: necessarily through his line, you know. Yeah, it could have been 430 00:25:02,800 --> 00:25:06,040 Speaker 1: a nephew or something, or his brother Kevin might have 431 00:25:06,119 --> 00:25:09,880 Speaker 1: had the line that matched. So Roger was the one 432 00:25:09,880 --> 00:25:13,800 Speaker 1: who said, everybody, stop. Just because Aristotle wrote something doesn't 433 00:25:13,800 --> 00:25:18,400 Speaker 1: mean it's fact, especially when we find contradictions to it.
434 00:25:18,920 --> 00:25:21,440 Speaker 1: It doesn't, at least not automatically, right. And this is a 435 00:25:21,520 --> 00:25:25,639 Speaker 1: huge advancement. Yeah. And Albertus Magnus was the one, I 436 00:25:25,680 --> 00:25:30,000 Speaker 1: believe, who said, you know, this thing called revealed truth, 437 00:25:30,160 --> 00:25:34,639 Speaker 1: which is basically "God says this" instead of a truth 438 00:25:34,680 --> 00:25:39,359 Speaker 1: found by experimenting, maybe we should experiment instead and 439 00:25:39,440 --> 00:25:42,199 Speaker 1: not take this revealed truth as the truth. Right. And 440 00:25:42,280 --> 00:25:46,320 Speaker 1: we mentioned in that Enlightenment episode as well scholasticism, 441 00:25:46,320 --> 00:25:52,399 Speaker 1: about using scientific inquiry to explain theology, which was, you know, 442 00:25:52,640 --> 00:25:55,840 Speaker 1: you're still working from a theological standpoint, right, but you're 443 00:25:55,840 --> 00:26:00,560 Speaker 1: starting to use scientific inquiry, and the idea that 444 00:26:00,640 --> 00:26:03,399 Speaker 1: you shouldn't just accept things as truth, that was again 445 00:26:03,440 --> 00:26:07,840 Speaker 1: a huge, huge breakthrough. Yeah. Uh, Francis Bacon, the other 446 00:26:07,880 --> 00:26:10,760 Speaker 1: Bacon brother, he's one of the heroes of this story. Yeah. 447 00:26:10,800 --> 00:26:16,080 Speaker 1: He was an attorney and philosopher and possibly Shakespeare. Oh really? 448 00:26:16,400 --> 00:26:21,640 Speaker 1: I never heard that. Interesting. So what do you mean, 449 00:26:21,680 --> 00:26:25,600 Speaker 1: like, wrote those under a pseudonym? Huh. And then the 450 00:26:25,640 --> 00:26:27,800 Speaker 1: Shakespeare's sister was the other theory too, right? It was 451 00:26:27,840 --> 00:26:30,840 Speaker 1: a woman.
I've heard that. And she couldn't, like, you know, 452 00:26:30,920 --> 00:26:34,760 Speaker 1: women couldn't be playwrights, so it was her dumb 453 00:26:34,760 --> 00:26:38,280 Speaker 1: brother William? That's good. Was it a brother? I think 454 00:26:38,320 --> 00:26:40,680 Speaker 1: that was one of the theories. This is a good 455 00:26:41,119 --> 00:26:44,960 Speaker 1: Smiths song too. Uh, "Shakespeare's Sister," was that the name 456 00:26:44,960 --> 00:26:48,360 Speaker 1: of it? Wasn't that a band too? I think 457 00:26:48,400 --> 00:26:52,280 Speaker 1: it was. What was it? Maybe. Uh, so anyway, he 458 00:26:52,400 --> 00:26:56,679 Speaker 1: was a philosopher and a lawyer, and he said, you 459 00:26:56,720 --> 00:27:01,879 Speaker 1: know what... The Baconian method basically became the scientific method. 460 00:27:02,280 --> 00:27:05,040 Speaker 1: He was the first dude who really said, these are 461 00:27:05,320 --> 00:27:09,480 Speaker 1: the steps that you should take to, uh, investigate science. 462 00:27:10,359 --> 00:27:12,080 Speaker 1: There has to be a framework.
And the whole point 463 00:27:12,119 --> 00:27:15,200 Speaker 1: of this, that we take so for granted now 464 00:27:15,280 --> 00:27:22,120 Speaker 1: because it's so intuitive and on its face right as 465 00:27:22,119 --> 00:27:24,560 Speaker 1: far as scientific inquiry goes, but this was an 466 00:27:24,720 --> 00:27:30,000 Speaker 1: enormous breakthrough: to say, follow these steps, this framework, 467 00:27:30,920 --> 00:27:35,160 Speaker 1: and if everybody who carries out science follows the same framework, 468 00:27:35,480 --> 00:27:39,600 Speaker 1: then science will be universal and interchangeable, and anyone in 469 00:27:39,600 --> 00:27:43,119 Speaker 1: the world, and not just now but anytime, will be 470 00:27:43,200 --> 00:27:47,399 Speaker 1: able to carry out the same experiment and will be 471 00:27:47,480 --> 00:27:52,600 Speaker 1: able to verify or disprove it. And that is amazing, 472 00:27:52,800 --> 00:27:55,440 Speaker 1: that that happened. That's why Francis Bacon is one of 473 00:27:55,480 --> 00:27:57,159 Speaker 1: the heroes of the story. And he didn't come up 474 00:27:57,200 --> 00:27:59,639 Speaker 1: with this entirely on his own, but he was the 475 00:27:59,640 --> 00:28:02,399 Speaker 1: one who said, this is what we're gonna do. I'm 476 00:28:02,440 --> 00:28:04,760 Speaker 1: gonna give it a name. I'm going to spell it out, 477 00:28:04,880 --> 00:28:08,159 Speaker 1: and from now on you can call me the dad 478 00:28:08,680 --> 00:28:11,639 Speaker 1: of the scientific method. Yeah, and that's why Newton was 479 00:28:11,680 --> 00:28:14,800 Speaker 1: such a rock star, because he so rigorously stuck to 480 00:28:14,920 --> 00:28:19,359 Speaker 1: the scientific method that all these centuries later, his, uh, 481 00:28:19,400 --> 00:28:22,040 Speaker 1: you know, his systems of laws have stood 482 00:28:22,040 --> 00:28:24,480 Speaker 1: the test of time.
And, uh, I think it's a 483 00:28:24,480 --> 00:28:28,159 Speaker 1: good point to bring up, too, that the collaboration of 484 00:28:28,200 --> 00:28:34,800 Speaker 1: scientists is really the hallmark of advancement and moving forward. 485 00:28:35,359 --> 00:28:38,400 Speaker 1: It's not working in a vacuum. It's sharing your ideas 486 00:28:38,440 --> 00:28:41,680 Speaker 1: and working with one another. And the whole, uh, little 487 00:28:41,720 --> 00:28:44,000 Speaker 1: sidebar here on cell theory I thought was pretty cool, 488 00:28:44,880 --> 00:28:48,360 Speaker 1: which was when science quit, well, not quit, but started 489 00:28:48,400 --> 00:28:51,680 Speaker 1: looking at small things instead of looking at the universe 490 00:28:51,720 --> 00:28:56,000 Speaker 1: around them and at the stars. And, uh, basically, 491 00:28:56,440 --> 00:29:00,160 Speaker 1: you know, through the advancement of lens grinding, Antonie van 492 00:29:00,880 --> 00:29:05,440 Speaker 1: Leeuwenhoek specifically, a Dutch tradesman, was pretty good at making 493 00:29:05,480 --> 00:29:09,040 Speaker 1: simple microscopes, and all of a sudden, contemporaries like Robert 494 00:29:09,040 --> 00:29:11,400 Speaker 1: Hooke said, you know what, let's start looking at tiny 495 00:29:11,440 --> 00:29:14,600 Speaker 1: things, because therein might lie the answer to many, 496 00:29:14,640 --> 00:29:18,440 Speaker 1: many things. Yeah, and they were right. Robert Hooke found cork, 497 00:29:19,160 --> 00:29:23,240 Speaker 1: or, he discovered cells by looking at cork through an 498 00:29:23,280 --> 00:29:27,560 Speaker 1: early microscope. So in this story, science is hastened by 499 00:29:27,680 --> 00:29:32,920 Speaker 1: technological advancement, lens grinding to make microscopes, and then this 500 00:29:32,960 --> 00:29:36,000 Speaker 1: new technology is used to further science. Right.
Yeah, it's 501 00:29:36,000 --> 00:29:41,000 Speaker 1: like mutual inspiration between Leeuwenhoek and Hooke. 502 00:29:41,360 --> 00:29:46,720 Speaker 1: It was neat because Hooke heard about Leeuwenhoek's microscopes, 503 00:29:47,080 --> 00:29:50,120 Speaker 1: got his hands on one, or a microscope, looked at 504 00:29:50,240 --> 00:29:52,800 Speaker 1: things like cork and said, oh, there's such a thing 505 00:29:52,880 --> 00:29:56,320 Speaker 1: as cells. Leeuwenhoek said, oh, that's pretty neat. Let 506 00:29:56,320 --> 00:29:58,480 Speaker 1: me try. And he said, oh, there's such a thing 507 00:29:58,480 --> 00:30:03,280 Speaker 1: as, quote, little animals, which we call protozoa and bacteria. And 508 00:30:03,720 --> 00:30:07,200 Speaker 1: the Royal Society, after Leeuwenhoek presented his findings, 509 00:30:07,480 --> 00:30:10,360 Speaker 1: turned back to Hooke and said, hey, Hooke, we know 510 00:30:10,400 --> 00:30:13,600 Speaker 1: you're pretty handy with the microscope. Can you confirm Leeuwenhoek's 511 00:30:13,880 --> 00:30:17,840 Speaker 1: findings? Are there little animals? Hooke said, there are, indeed. 512 00:30:18,000 --> 00:30:20,320 Speaker 1: I can see them with my microscope. That's right. And 513 00:30:20,360 --> 00:30:25,840 Speaker 1: that inspired a German botanist named Matthias Schleiden, ah, to 514 00:30:26,240 --> 00:30:29,160 Speaker 1: look at a lot of plants, and he was the 515 00:30:29,200 --> 00:30:31,320 Speaker 1: first guy to say, you know what, plants are composed 516 00:30:31,320 --> 00:30:34,000 Speaker 1: of cells. And he's having dinner one night with his 517 00:30:34,120 --> 00:30:37,520 Speaker 1: zoologist buddy. Yeah, and this is about a hundred years later. Yeah, 518 00:30:37,560 --> 00:30:42,160 Speaker 1: Theodor Schwann. And he said, you know what, dude? Order 519 00:30:42,240 --> 00:30:45,920 Speaker 1: the wine and order the steak.
Trust me, because this 520 00:30:45,960 --> 00:30:51,000 Speaker 1: place is fantastic. And, uh, also, plants are made of cells. 521 00:30:51,200 --> 00:30:53,520 Speaker 1: Don't tell anyone. And he went, you know what, dude, 522 00:30:53,680 --> 00:30:57,280 Speaker 1: I have been investigating animals with microscopes and they're made 523 00:30:57,280 --> 00:31:00,800 Speaker 1: of cells too. And so they figured out at this dinner, yeah, 524 00:31:00,840 --> 00:31:03,239 Speaker 1: that everything is made of cells. All living things are 525 00:31:03,280 --> 00:31:06,200 Speaker 1: made of cells. Boom. Okay, so this is huge. This 526 00:31:06,280 --> 00:31:09,520 Speaker 1: is a big advancement, right, that we're hitting upon right now. 527 00:31:10,320 --> 00:31:14,520 Speaker 1: But it laid the further foundation, right? So initial scientific 528 00:31:14,560 --> 00:31:19,360 Speaker 1: inquiry led to further scientific inquiry and further scientific conclusions 529 00:31:20,120 --> 00:31:23,800 Speaker 1: and generalizations. All living things are made of cells, and 530 00:31:23,800 --> 00:31:27,400 Speaker 1: then it was extrapolated elsewhere. Right. Yeah, like twenty years later, 531 00:31:27,520 --> 00:31:32,240 Speaker 1: Rudolf Virchow said, you know what, not only is everything 532 00:31:32,240 --> 00:31:35,520 Speaker 1: made of living cells, but they all come from pre- 533 00:31:35,520 --> 00:31:38,480 Speaker 1: existing cells, which was a huge deal at the time 534 00:31:38,560 --> 00:31:42,840 Speaker 1: because people believed in spontaneous generation back then. Like, 535 00:31:42,880 --> 00:31:46,800 Speaker 1: if you left some, um, wheat seed in a sweaty shirt, 536 00:31:46,880 --> 00:31:49,800 Speaker 1: it would spawn mice. I think that was one of them. 537 00:31:50,600 --> 00:31:53,760 Speaker 1: There's a lot of weird ones. Press basil between some 538 00:31:53,840 --> 00:31:57,200 Speaker 1: bricks and you'll get a scorpion.
That was one. Like, they 539 00:31:57,240 --> 00:31:59,880 Speaker 1: were really out there. Yeah, well, the one that is, 540 00:32:00,240 --> 00:32:02,240 Speaker 1: well, not true, but the one that you could actually 541 00:32:02,280 --> 00:32:07,160 Speaker 1: see, was rotten meat would eventually, uh, spawn maggots. How 542 00:32:07,200 --> 00:32:10,520 Speaker 1: did they possibly get there? Yeah, spontaneous generation. But that's 543 00:32:10,560 --> 00:32:13,160 Speaker 1: the obvious explanation. And if you think about it, they were 544 00:32:13,200 --> 00:32:16,720 Speaker 1: working from Occam's razor, and Occam's razor says the simplest 545 00:32:16,760 --> 00:32:21,760 Speaker 1: explanation is usually the right one, all other things being equal. Well, 546 00:32:21,760 --> 00:32:24,760 Speaker 1: the thing is, spontaneous generation has never been shown 547 00:32:24,800 --> 00:32:28,480 Speaker 1: to be possible. And we've got this cell thing over here, 548 00:32:28,560 --> 00:32:34,360 Speaker 1: so let's investigate that. So this, uh, what was the guy's name? Virchow? Yes, 549 00:32:34,520 --> 00:32:36,760 Speaker 1: he's saying, okay, well, wait a minute, I've got this 550 00:32:37,320 --> 00:32:39,560 Speaker 1: cell theory I'm working on that's been around for a 551 00:32:39,600 --> 00:32:42,960 Speaker 1: couple of decades. Hypothesis, probably. The cell hypothesis at the 552 00:32:43,600 --> 00:32:46,080 Speaker 1: time, nice catch. Don't feel bad though, because this article that 553 00:32:46,120 --> 00:32:50,160 Speaker 1: you sent said that scientists today, like, still, like, confuse 554 00:32:50,240 --> 00:32:54,600 Speaker 1: those terms. And the HowStuffWorks article 555 00:32:54,640 --> 00:32:56,760 Speaker 1: makes a good point in saying that science and everything 556 00:32:57,000 --> 00:32:59,040 Speaker 1: that has to do with it, the scientific 557 00:32:59,080 --> 00:33:04,640 Speaker 1: method included, is very fluid and open to new interpretation and experimentation.
Obviously. 558 00:33:05,000 --> 00:33:08,840 Speaker 1: But so he says, um, okay, this cell hypothesis, this 559 00:33:08,920 --> 00:33:12,880 Speaker 1: is a pretty good alternative explanation to what we call 560 00:33:13,000 --> 00:33:16,200 Speaker 1: spontaneous generation. He didn't do anything about it, he just 561 00:33:16,320 --> 00:33:20,000 Speaker 1: put it out there. Yeah, and then along comes Louis Pasteur, 562 00:33:20,320 --> 00:33:22,680 Speaker 1: who does do something about it. He figures out a 563 00:33:22,720 --> 00:33:28,360 Speaker 1: great experiment to try to disprove spontaneous generation. Yeah, it's 564 00:33:28,360 --> 00:33:33,120 Speaker 1: pretty simple too. Um, he basically took a broth, uh, 565 00:33:33,280 --> 00:33:36,600 Speaker 1: put equal amounts in two different beakers. One had a 566 00:33:36,640 --> 00:33:39,440 Speaker 1: straight neck and one had an S-shaped neck. He 567 00:33:39,760 --> 00:33:42,080 Speaker 1: boiled it just to make sure everything in it was killed, 568 00:33:42,640 --> 00:33:44,960 Speaker 1: and then just let it sit there in the same conditions, 569 00:33:45,080 --> 00:33:48,560 Speaker 1: open to the world, or open to 570 00:33:48,600 --> 00:33:51,960 Speaker 1: the room, like. It wasn't corked, in other words. And 571 00:33:52,360 --> 00:33:55,160 Speaker 1: he noticed that the one with the straight neck eventually 572 00:33:55,200 --> 00:33:58,640 Speaker 1: became cloudy and discolored, uh, meaning there was some junk 573 00:33:58,680 --> 00:34:01,400 Speaker 1: growing in there, and the one with the S-shaped neck 574 00:34:01,560 --> 00:34:05,920 Speaker 1: did not do anything. It remained the same.
So he thought 575 00:34:06,080 --> 00:34:09,239 Speaker 1: what? Well, he thought that germs, that there were such 576 00:34:09,280 --> 00:34:12,640 Speaker 1: things as germs, which Leeuwenhoek and Hooke had already 577 00:34:12,640 --> 00:34:17,000 Speaker 1: shown, um, and that in the S-shaped 578 00:34:17,040 --> 00:34:20,440 Speaker 1: flask they had gotten trapped in the neck, while in 579 00:34:20,920 --> 00:34:23,799 Speaker 1: the open neck they had been able to just enter 580 00:34:23,960 --> 00:34:28,440 Speaker 1: unobstructed and had generated there. The reason that the S- 581 00:34:28,440 --> 00:34:31,880 Speaker 1: shaped flask was still sterile was because there is no 582 00:34:31,960 --> 00:34:35,080 Speaker 1: such thing as spontaneous generation. If there were, then no 583 00:34:35,560 --> 00:34:39,480 Speaker 1: S-shaped neck would impede anything like that, and boom, 584 00:34:39,719 --> 00:34:44,719 Speaker 1: there you have it. So he disproved that spontaneous generation 585 00:34:45,000 --> 00:34:50,040 Speaker 1: is a thing, right, through the scientific method. Exactly. Here's 586 00:34:50,080 --> 00:34:52,759 Speaker 1: the leap that a lot of people make, scientists included, 587 00:34:53,080 --> 00:34:56,320 Speaker 1: that really is a great disservice to science. He didn't 588 00:34:56,719 --> 00:35:00,560 Speaker 1: prove cell theory, right. What he did is take that 589 00:35:00,640 --> 00:35:08,239 Speaker 1: cell hypothesis and present some really persuasive evidence that it's 590 00:35:08,320 --> 00:35:12,040 Speaker 1: probably right. Yeah. But like this article you sent points out, 591 00:35:12,080 --> 00:35:15,040 Speaker 1: disproving something is just as important as proving something.
So 592 00:35:15,080 --> 00:35:17,279 Speaker 1: here's the thing: that's the most you can hope for 593 00:35:17,320 --> 00:35:20,839 Speaker 1: in science, is disproving. Sure. With science, unless you're talking 594 00:35:20,840 --> 00:35:23,680 Speaker 1: about math, with science, there's no such thing as proof. 595 00:35:24,239 --> 00:35:27,600 Speaker 1: A theory, even a law, a universal law, still has the 596 00:35:27,640 --> 00:35:32,640 Speaker 1: potential for being undermined by one single experiment, one single observation, 597 00:35:33,080 --> 00:35:37,480 Speaker 1: and therefore there is no real ultimate proof in science. 598 00:35:37,719 --> 00:35:41,960 Speaker 1: There's just theories and support for theories, and then ultimately 599 00:35:42,440 --> 00:35:46,840 Speaker 1: laws, and further and further support for laws, right, but 600 00:35:47,320 --> 00:35:52,880 Speaker 1: they're not proven. What science does ultimately is disprove things 601 00:35:53,000 --> 00:35:57,440 Speaker 1: or lend support for existing theories, or existing interpretations of 602 00:35:57,440 --> 00:36:00,640 Speaker 1: why things happen the way they do. And that's what 603 00:36:00,800 --> 00:36:03,160 Speaker 1: Pasteur did. So if you look at the experiment, he 604 00:36:03,280 --> 00:36:08,239 Speaker 1: disproved spontaneous generation, but he lent support to the cell theory, 605 00:36:08,239 --> 00:36:10,640 Speaker 1: and probably with his experiment it went from the cell 606 00:36:10,760 --> 00:36:14,959 Speaker 1: hypothesis to the cell theory, because it was just so persuasive. 607 00:36:15,520 --> 00:36:17,360 Speaker 1: And that's what a theory is. It means that a 608 00:36:17,400 --> 00:36:21,359 Speaker 1: lot of people out there who are reasonable say this 609 00:36:21,440 --> 00:36:24,520 Speaker 1: explanation is probably the right one. Yeah, it's predictive.
If 610 00:36:24,520 --> 00:36:26,120 Speaker 1: you do it over and over, you're probably going to 611 00:36:26,160 --> 00:36:29,480 Speaker 1: get the same result. But that's not to say that 612 00:36:29,800 --> 00:36:32,640 Speaker 1: Pasteur showed that if you do this a million and 613 00:36:32,760 --> 00:36:37,759 Speaker 1: one times, the S-shaped flask won't turn cloudy. 614 00:36:38,080 --> 00:36:40,719 Speaker 1: He didn't prove that. You can't prove that. Which is, 615 00:36:40,760 --> 00:36:45,279 Speaker 1: again, science can disprove and lend support; it can't prove. A 616 00:36:45,440 --> 00:36:48,440 Speaker 1: very good point. So right after this message break, we're 617 00:36:48,440 --> 00:37:04,080 Speaker 1: going to get into the actual steps of the scientific method. Alright, dude, 618 00:37:04,239 --> 00:37:08,160 Speaker 1: I guess at long last we're there. Like you mentioned before, 619 00:37:08,719 --> 00:37:12,160 Speaker 1: the scientific method is fluid, and it's not like when 620 00:37:12,160 --> 00:37:15,080 Speaker 1: you get your science degree they hand you a little 621 00:37:15,160 --> 00:37:19,959 Speaker 1: laminated card, like the Miranda rights card that cops carry, that, 622 00:37:20,160 --> 00:37:22,080 Speaker 1: you know, lists out all the different steps you have 623 00:37:22,160 --> 00:37:27,200 Speaker 1: to take. Um, but generally... Maybe we should 624 00:37:27,200 --> 00:37:29,240 Speaker 1: carry those around. All right, we should make little wallet 625 00:37:29,239 --> 00:37:32,239 Speaker 1: cards of the scientific method to carry, with a Stuff You 626 00:37:32,239 --> 00:37:34,000 Speaker 1: Should Know logo on it. Oh yeah, I'll make 627 00:37:34,000 --> 00:37:39,279 Speaker 1: a million buttons and sell them. Uh, generally speaking, though, 628 00:37:39,640 --> 00:37:42,080 Speaker 1: it follows these steps.
The first thing you do, like 629 00:37:42,120 --> 00:37:45,600 Speaker 1: we mentioned earlier, is you observe something. You ask a question. 630 00:37:46,280 --> 00:37:49,640 Speaker 1: Uh, next... Like, Darwin was known, I think when we 631 00:37:49,680 --> 00:37:52,000 Speaker 1: did our podcast on him too, he would spend like 632 00:37:52,000 --> 00:37:56,759 Speaker 1: a week on three square feet of ground. It was 633 00:37:56,800 --> 00:37:59,279 Speaker 1: like even longer than that. Remember, remember, wasn't it he 634 00:37:59,320 --> 00:38:01,520 Speaker 1: said that he wasn't gonna mow his lawn for 635 00:38:01,560 --> 00:38:04,919 Speaker 1: like three years because he wanted to see what happened? Yeah, 636 00:38:04,920 --> 00:38:08,520 Speaker 1: so he's the ultimate in qualitative data, of just observing, 637 00:38:08,960 --> 00:38:12,360 Speaker 1: writing things down and asking questions. And the reason you 638 00:38:12,400 --> 00:38:16,160 Speaker 1: ask your question is so you can narrow something down, 639 00:38:17,000 --> 00:38:18,680 Speaker 1: like that. I think the example they use in here 640 00:38:18,760 --> 00:38:22,759 Speaker 1: is on the Galapagos, like the beaks of... what bird 641 00:38:22,840 --> 00:38:25,120 Speaker 1: was it? Yeah, the finch. He noticed a bunch 642 00:38:25,120 --> 00:38:29,000 Speaker 1: of different beaks, so he finally posed a question, like, um, 643 00:38:29,040 --> 00:38:30,880 Speaker 1: you know, I think these beaks are different for a 644 00:38:30,960 --> 00:38:34,520 Speaker 1: very specific reason, and I aim to find out why. Yes, 645 00:38:34,920 --> 00:38:38,239 Speaker 1: he said, what caused the diversification of finches on the Galapagos? 646 00:38:39,080 --> 00:38:43,439 Speaker 1: Ooh, you should have done that with an accent. Well, yeah, 647 00:38:43,480 --> 00:38:46,120 Speaker 1: he would have had a British accent.
Huh, yeah, unless 648 00:38:46,120 --> 00:38:48,480 Speaker 1: he was pretending to be someone else. I think of 649 00:38:48,560 --> 00:38:52,200 Speaker 1: him as, like, um, sounding like Hemingway or something. Oh yeah, 650 00:38:52,280 --> 00:38:55,719 Speaker 1: just drunken, violent, kind of. But he wasn't. He was 651 00:38:55,760 --> 00:38:58,200 Speaker 1: like the opposite of that. Yeah. Well, I saw 652 00:38:58,280 --> 00:39:02,440 Speaker 1: the movie, so I picture his voice as the dude 653 00:39:03,480 --> 00:39:07,000 Speaker 1: that played him, who I can't remember right now. Ed Norton? No. 654 00:39:09,400 --> 00:39:12,560 Speaker 1: I finally saw Birdman, though. Did you see that? Yeah. 655 00:39:13,080 --> 00:39:18,720 Speaker 1: Great movie. Um, I disagree. You didn't like it? What? Wow, 656 00:39:19,160 --> 00:39:24,160 Speaker 1: that surprises me. Um, we'll get into that off air. 657 00:39:25,320 --> 00:39:30,279 Speaker 1: So, uh, sorry, you just threw me with that. And 658 00:39:30,280 --> 00:39:32,520 Speaker 1: he's on the Galapagos and he's like, what the heck with 659 00:39:32,560 --> 00:39:35,359 Speaker 1: all these different finches on one small island? Why would there 660 00:39:35,360 --> 00:39:39,239 Speaker 1: be different species of finch? So, and why are they 661 00:39:39,239 --> 00:39:43,840 Speaker 1: all seeming to survive and coexist so well? What's, what 662 00:39:44,080 --> 00:39:47,280 Speaker 1: makes... Yeah. That leads to the question, what's making 663 00:39:47,320 --> 00:39:50,880 Speaker 1: all of these species of finches so diverse? Right. Or, 664 00:39:50,920 --> 00:39:54,480 Speaker 1: Bill Harris uses a pretty good example, something everyone 665 00:39:54,520 --> 00:39:58,359 Speaker 1: can understand, like: what car body shape is the best 666 00:39:58,400 --> 00:40:01,600 Speaker 1: for air resistance?
Like, one shaped like a box, 667 00:40:01,719 --> 00:40:04,400 Speaker 1: or one shaped aerodynamically, like a bird. And 668 00:40:04,440 --> 00:40:07,480 Speaker 1: he carries that out. In the next step, you formulate 669 00:40:07,560 --> 00:40:12,839 Speaker 1: your hypothesis based on your, you know, prior knowledge and 670 00:40:13,360 --> 00:40:16,799 Speaker 1: maybe observations. Like, so, you know what, I think that 671 00:40:16,840 --> 00:40:19,319 Speaker 1: a car shaped like a bird is probably more aerodynamic 672 00:40:19,320 --> 00:40:21,240 Speaker 1: than one shaped like a box. Yeah, if you're thinking... 673 00:40:21,239 --> 00:40:23,400 Speaker 1: if you're the type of person who's sitting around asking 674 00:40:23,480 --> 00:40:26,880 Speaker 1: questions about aerodynamics, you probably already have some sort of 675 00:40:26,920 --> 00:40:30,120 Speaker 1: sense that a box is less aerodynamic than a bird. 676 00:40:30,680 --> 00:40:33,960 Speaker 1: Boxes rarely fly unless they're carried by one of those 677 00:40:33,960 --> 00:40:39,200 Speaker 1: delightful Amazon delivery drones. They don't have those yet, right? 678 00:40:39,239 --> 00:40:42,200 Speaker 1: They're not gonna do that, are they? There's like a 679 00:40:42,239 --> 00:40:46,279 Speaker 1: pizza delivery drone service, man, I think, where you can have 680 00:40:47,120 --> 00:40:50,200 Speaker 1: pizza or grilled cheese in New York, and you go 681 00:40:50,320 --> 00:40:52,800 Speaker 1: stand on an X after you order and it like comes 682 00:40:52,840 --> 00:40:54,840 Speaker 1: and drops it. That is the dumbest thing I've heard of. 683 00:40:55,920 --> 00:40:58,000 Speaker 1: And I can't wait to do it. I bet they're making 684 00:40:58,040 --> 00:41:01,320 Speaker 1: a lot of money. That's pretty funny. Um, yet we 685 00:41:01,360 --> 00:41:04,080 Speaker 1: can't get food to the homeless somehow. Exactly, but we can 686 00:41:04,160 --> 00:41:06,600 Speaker 1: drop a grilled cheese on someone's head.
They're like, hey, 687 00:41:06,640 --> 00:41:11,239 Speaker 1: homeless guy, get off that X. Exactly. Um, alright. So 688 00:41:11,480 --> 00:41:14,440 Speaker 1: your hypothesis, I don't think we ever mentioned, is typically 689 00:41:14,440 --> 00:41:18,080 Speaker 1: represented as an if-then statement. Yeah, if you're doing 690 00:41:18,080 --> 00:41:25,359 Speaker 1: good science. Yeah, like, if the car's profile... uh, well, 691 00:41:25,440 --> 00:41:28,120 Speaker 1: the example he uses is: if the body's profile is related 692 00:41:28,160 --> 00:41:31,120 Speaker 1: to the amount of air resistance it produces, which is the 693 00:41:31,160 --> 00:41:34,120 Speaker 1: more general statement. Yeah, that's like based on a theory. Yeah, 694 00:41:34,200 --> 00:41:36,600 Speaker 1: and it's gonna get more specific: then a car designed 695 00:41:36,640 --> 00:41:39,800 Speaker 1: like the body of a bird will be more aerodynamic 696 00:41:39,840 --> 00:41:42,920 Speaker 1: than one like a box. So that's deductive reasoning, starting 697 00:41:42,920 --> 00:41:45,920 Speaker 1: with a broad statement and going to something narrow, and 698 00:41:46,000 --> 00:41:49,200 Speaker 1: it's if-then at the same time. Yeah, and now 699 00:41:49,239 --> 00:41:51,600 Speaker 1: you have a test. You have a question that can 700 00:41:51,640 --> 00:41:54,080 Speaker 1: be answered; you can figure out a way to answer it. Yeah, 701 00:41:54,120 --> 00:41:57,680 Speaker 1: and he points out too, this is pretty important, that, uh, 702 00:41:57,719 --> 00:42:01,240 Speaker 1: your hypothesis, if it's formulated correctly, means it is testable 703 00:42:01,520 --> 00:42:05,239 Speaker 1: and it's falsifiable, which are often one and the same, 704 00:42:06,000 --> 00:42:08,719 Speaker 1: you know. Yeah. And that's, again, we go to the 705 00:42:08,960 --> 00:42:14,040 Speaker 1: people who say that the soft sciences aren't real science.
706 00:42:14,080 --> 00:42:17,680 Speaker 1: They're pseudoscience, because a lot of the data that they 707 00:42:17,719 --> 00:42:19,640 Speaker 1: come up with, a lot of the hypotheses they come 708 00:42:19,719 --> 00:42:24,600 Speaker 1: up with, aren't falsifiable, they're not testable. It's a, it's 709 00:42:24,640 --> 00:42:27,759 Speaker 1: a thing, it's an issue. It's a thing. So next 710 00:42:27,840 --> 00:42:32,840 Speaker 1: up in the steps, you're gonna experiment. And when you experiment, 711 00:42:33,120 --> 00:42:34,799 Speaker 1: you can't just go in there willy-nilly and do 712 00:42:34,840 --> 00:42:37,480 Speaker 1: whatever you want. Um, you have to set up specific 713 00:42:37,520 --> 00:42:41,560 Speaker 1: conditions, and they must be controlled, and 714 00:42:41,920 --> 00:42:45,000 Speaker 1: everything that's supposed to be identical needs to be identical. 715 00:42:45,120 --> 00:42:48,560 Speaker 1: So basically, you have two variables, at least: you have 716 00:42:48,600 --> 00:42:52,160 Speaker 1: an independent variable and you have a dependent variable. And 717 00:42:52,239 --> 00:42:56,560 Speaker 1: if you're talking about car shape, that is the independent 718 00:42:56,640 --> 00:43:00,520 Speaker 1: variable in this study; that's the one that's manipulated. Exactly, 719 00:43:00,719 --> 00:43:03,360 Speaker 1: it's the one you're controlling. The independent variable is the 720 00:43:03,400 --> 00:43:06,680 Speaker 1: one you, the researcher, are controlling. So in this case, 721 00:43:06,960 --> 00:43:10,239 Speaker 1: you're controlling the shape of the car. You have yourself 722 00:43:10,320 --> 00:43:13,120 Speaker 1: a bird shaped car and you have yourself a box 723 00:43:13,120 --> 00:43:16,280 Speaker 1: shaped car. So the shape of the car changed because 724 00:43:16,360 --> 00:43:19,080 Speaker 1: you made it change.
Now, when you blast a bunch 725 00:43:19,080 --> 00:43:22,280 Speaker 1: of air over it during your experiment, what you're measuring 726 00:43:22,640 --> 00:43:27,720 Speaker 1: is the dependent variable. So you're measuring what happens based 727 00:43:27,760 --> 00:43:30,719 Speaker 1: on the change that you made. That's right, and you 728 00:43:30,760 --> 00:43:33,000 Speaker 1: want to study one single variable 729 00:43:33,040 --> 00:43:36,160 Speaker 1: at a time, basically. Yeah, don't get fancy, just 730 00:43:36,239 --> 00:43:39,560 Speaker 1: do good science, step by step, methodical. You also have 731 00:43:39,640 --> 00:43:42,600 Speaker 1: to have your control group in any experiment, uh, and 732 00:43:42,800 --> 00:43:47,360 Speaker 1: an experimental group, and the control group is what's gonna 733 00:43:47,440 --> 00:43:52,160 Speaker 1: allow you to compare the test results to that baseline measurement. Yeah, 734 00:43:52,239 --> 00:43:55,239 Speaker 1: and you need that baseline measurement so it's not just 735 00:43:55,320 --> 00:43:59,160 Speaker 1: like chance, basically. Exactly. Like, if Pasteur had just done 736 00:43:59,200 --> 00:44:02,279 Speaker 1: the S-shaped flask and nothing happened, he wouldn't have 737 00:44:02,320 --> 00:44:05,719 Speaker 1: necessarily been able to say that he was right, even 738 00:44:05,760 --> 00:44:08,920 Speaker 1: though he was right. He needed that control, which was 739 00:44:08,960 --> 00:44:11,920 Speaker 1: the open flask, right.
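(A quick aside for anyone following along at home: the setup the hosts just described, an independent variable you manipulate, a dependent variable you measure, and a control to compare against, can be sketched as a tiny toy simulation. This is only a hedged sketch: the shapes, the drag numbers, the noise level, and every function name below are invented for illustration, not real aerodynamic data.)

```python
import random

random.seed(0)  # fixed seed so the "experiment" is reproducible

# Hypothetical true drag per body shape -- invented numbers, not real data.
TRUE_DRAG = {"box": 0.80, "bird": 0.30}

def wind_tunnel_run(shape: str) -> float:
    """One trial. The shape is the independent variable (we choose it);
    the measured drag is the dependent variable (it responds), plus
    some instrument noise. Everything else is held identical."""
    return TRUE_DRAG[shape] + random.gauss(0, 0.02)

def run_experiment(trials: int = 30) -> dict:
    """Same number of trials per shape, same conditions for each."""
    return {s: [wind_tunnel_run(s) for _ in range(trials)] for s in TRUE_DRAG}

def mean(values):
    return sum(values) / len(values)

data = run_experiment()
# The if-then hypothesis: IF profile affects drag, THEN bird < box.
supported = mean(data["bird"]) < mean(data["box"])
print("hypothesis supported:", supported)
```

A supporting result only lends support, as the hosts keep stressing; running it a million and one times still wouldn't prove the hypothesis.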
Or with the cars, you need 740 00:44:11,960 --> 00:44:14,799 Speaker 1: two cars, like you said, one bird shaped and one 741 00:44:14,840 --> 00:44:17,600 Speaker 1: box shaped. Right. Or then maybe in this case, since 742 00:44:17,680 --> 00:44:19,480 Speaker 1: the bird shape and the box shape both show up 743 00:44:19,480 --> 00:44:22,440 Speaker 1: in the hypothesis, you need a third, like an egg shaped 744 00:44:22,520 --> 00:44:24,880 Speaker 1: one or something like that. I bet that would be 745 00:44:24,880 --> 00:44:30,120 Speaker 1: pretty streamlined. Yeah. Yeah. But the key, though, is all 746 00:44:30,160 --> 00:44:32,440 Speaker 1: of those variables have to be, um... all the other 747 00:44:32,480 --> 00:44:34,920 Speaker 1: variables have to be the same. Like, they have to 748 00:44:34,960 --> 00:44:37,000 Speaker 1: be the same weight, they 749 00:44:37,000 --> 00:44:40,920 Speaker 1: have to be painted the same, the tires, everything, the windows. 750 00:44:41,560 --> 00:44:43,919 Speaker 1: One can't have an antenna and the other not. They've 751 00:44:43,920 --> 00:44:46,880 Speaker 1: got to be identical other than the one variable, right, 752 00:44:46,960 --> 00:44:50,440 Speaker 1: the independent variable. That's the one you want different. 753 00:44:50,480 --> 00:44:53,160 Speaker 1: Everything else you want the same, or else it's possible that, oh, 754 00:44:53,200 --> 00:44:56,759 Speaker 1: well, this one had bigger tires, so that actually made 755 00:44:56,760 --> 00:44:59,319 Speaker 1: it more aerodynamic. Yeah, and you're just doing yourself a 756 00:44:59,360 --> 00:45:01,960 Speaker 1: favor by doing all that stuff. You know, you want 757 00:45:02,000 --> 00:45:06,480 Speaker 1: to rule out everything else but that one variable. After that, 758 00:45:06,520 --> 00:45:10,319 Speaker 1: you want to analyze your data so you can draw 759 00:45:10,360 --> 00:45:15,120 Speaker 1: your conclusion.
And sometimes it's kind of straightforward and easy. 760 00:45:15,200 --> 00:45:17,600 Speaker 1: Sometimes it takes a lot of work and a lot of 761 00:45:18,280 --> 00:45:23,520 Speaker 1: various tools. Let's say you're just blasting a car in 762 00:45:23,600 --> 00:45:27,640 Speaker 1: a wind tunnel. You're measuring the wind resistance using certain 763 00:45:27,760 --> 00:45:30,800 Speaker 1: awesome instruments and that kind of stuff, and you're taking 764 00:45:30,800 --> 00:45:33,800 Speaker 1: that data, and then afterwards you're gonna analyze. You're gonna 765 00:45:33,840 --> 00:45:37,280 Speaker 1: compare the data that you gathered from the bird shaped 766 00:45:37,320 --> 00:45:40,319 Speaker 1: car, the box shaped car, and then the control, the 767 00:45:40,320 --> 00:45:43,120 Speaker 1: egg shaped car. You're gonna compare them and you're gonna say, well, 768 00:45:43,440 --> 00:45:47,320 Speaker 1: the wind resistance was less for the bird shaped car 769 00:45:47,920 --> 00:45:52,040 Speaker 1: than the box shaped car, which means that my hypothesis 770 00:45:52,480 --> 00:45:55,479 Speaker 1: was correct, and here are all the data points. Whereas Louis 771 00:45:55,520 --> 00:45:58,719 Speaker 1: Pasteur could just say, look at the beakers. Exactly. 772 00:45:59,200 --> 00:46:02,439 Speaker 1: Don't be an idiot, I'm a scientist. That one's 773 00:46:02,440 --> 00:46:06,400 Speaker 1: got gross stuff in it. You can see it, right? But the 774 00:46:06,440 --> 00:46:10,399 Speaker 1: other thing about science, too, Chuck, ideally, is... let's say 775 00:46:10,440 --> 00:46:14,200 Speaker 1: that egg shaped one, the control group, turned 776 00:46:14,200 --> 00:46:17,200 Speaker 1: out to have better wind resistance than anything.
Well, just 777 00:46:17,280 --> 00:46:20,319 Speaker 1: by virtue of carrying out this experiment correctly, you would 778 00:46:20,320 --> 00:46:24,319 Speaker 1: have stumbled upon an even better aerodynamic design, and you 779 00:46:24,320 --> 00:46:27,520 Speaker 1: would have come up with that little egg shaped Mercedes 780 00:46:27,600 --> 00:46:30,680 Speaker 1: SUV that was so huge, like, ten years ago. The 781 00:46:30,760 --> 00:46:35,360 Speaker 1: Mercedes Egg, coming to a store near you. So, um, 782 00:46:35,400 --> 00:46:38,120 Speaker 1: that's a big, big part of the scientific method: 783 00:46:39,000 --> 00:46:46,319 Speaker 1: carrying out an experiment, controlling the variables, analyzing 784 00:46:46,360 --> 00:46:49,400 Speaker 1: the data. And then there's a step that he missed, 785 00:46:49,440 --> 00:46:53,440 Speaker 1: that is very rarely part of a scientific method list, 786 00:46:54,120 --> 00:46:57,520 Speaker 1: and that is to share your data. And this is a 787 00:46:57,600 --> 00:47:00,759 Speaker 1: huge problem with science right now. The article you sent 788 00:47:00,760 --> 00:47:03,760 Speaker 1: was really eye opening: "Scientific research has changed 789 00:47:03,800 --> 00:47:05,920 Speaker 1: the world. Now it needs to change itself." It's an 790 00:47:05,920 --> 00:47:08,640 Speaker 1: Economist article; it's up on the internet. Yeah, it was 791 00:47:08,719 --> 00:47:12,960 Speaker 1: kind of scary. I mean, here's some of 792 00:47:12,960 --> 00:47:15,760 Speaker 1: the data he points out: one rule of thumb 793 00:47:15,800 --> 00:47:21,839 Speaker 1: among biotech venture capitalists is that about half of published research can't 794 00:47:21,840 --> 00:47:25,200 Speaker 1: even be replicated. And the biotech firm Amgen 795 00:47:25,280 --> 00:47:28,360 Speaker 1: found that they could reproduce only six of their fifty 796 00:47:28,400 --> 00:47:32,400 Speaker 1: three landmark studies in cancer research.
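(The replication step the hosts are getting at is mechanical enough to sketch: rerun the entire experiment from scratch and see how often the same conclusion comes out. The numbers below are made up for illustration; a real replication would be another lab, not a loop.)

```python
import random

random.seed(42)  # reproducible runs, which is rather the point

def noisy_drag(true_value: float) -> float:
    """One hypothetical wind-tunnel measurement (invented noise level)."""
    return true_value + random.gauss(0, 0.05)

def one_experiment(trials: int = 20) -> bool:
    """A full experiment from scratch: did the bird-shaped car
    measure lower average drag than the box-shaped car?"""
    bird = sum(noisy_drag(0.30) for _ in range(trials)) / trials
    box = sum(noisy_drag(0.80) for _ in range(trials)) / trials
    return bird < box

# Replicate the entire experiment 100 times over.
replications = [one_experiment() for _ in range(100)]
print(sum(replications), "of 100 replications reached the same conclusion")
```

With a large real effect like this toy one, nearly every rerun agrees; the worry in the Economist piece is findings that vanish when anyone actually reruns them.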
So you can't repeat 797 00:47:32,440 --> 00:47:36,360 Speaker 1: these things. It's like everyone's fighting for dollars and fame, 798 00:47:37,040 --> 00:47:39,400 Speaker 1: and maybe not fame, but some sort of career 799 00:47:39,400 --> 00:47:43,240 Speaker 1: advancement, such that they're kind of not doing that final 800 00:47:43,239 --> 00:47:47,000 Speaker 1: step any longer. No, and it's not necessarily just them; 801 00:47:47,160 --> 00:47:50,480 Speaker 1: it's that other scientists aren't going back and saying, well, 802 00:47:50,600 --> 00:47:54,560 Speaker 1: let me see if your results are reproducible. People are 803 00:47:54,640 --> 00:47:57,960 Speaker 1: just taking it on faith. We need another Roger Bacon 804 00:47:58,040 --> 00:47:59,960 Speaker 1: to come along and be like, dude, we can't just 805 00:48:00,120 --> 00:48:03,360 Speaker 1: blindly accept that one person carried out this one study 806 00:48:03,880 --> 00:48:06,520 Speaker 1: and then just go do clinical trials on it without 807 00:48:06,560 --> 00:48:10,279 Speaker 1: anybody reproducing it to see if the results can be 808 00:48:10,400 --> 00:48:13,600 Speaker 1: verified independently. Yeah. Because, uh, and this is a good 809 00:48:13,600 --> 00:48:16,400 Speaker 1: time to mention bias. There is such a thing as bias, 810 00:48:16,600 --> 00:48:22,200 Speaker 1: and it still happens. Um, a scientist is usually out 811 00:48:22,280 --> 00:48:26,279 Speaker 1: to prove something or disprove something; they want a 812 00:48:26,360 --> 00:48:30,600 Speaker 1: specific result.
Like, even if you're super open minded, you're 813 00:48:30,600 --> 00:48:34,360 Speaker 1: probably hoping to disprove or prove something one way or 814 00:48:34,400 --> 00:48:38,520 Speaker 1: the other, and your confirmation bias might, you know, even 815 00:48:38,560 --> 00:48:41,040 Speaker 1: if you don't think you're doing it, you might nudge 816 00:48:41,040 --> 00:48:44,799 Speaker 1: out some results that don't support your hypothesis, or else 817 00:48:44,880 --> 00:48:49,799 Speaker 1: you won't make it into that awesome journal. Um, which... 818 00:48:50,080 --> 00:48:53,359 Speaker 1: this author points out that journals need to start, uh, 819 00:48:54,000 --> 00:48:58,920 Speaker 1: publishing what he calls uninteresting results and experiments, right, 820 00:48:59,120 --> 00:49:02,120 Speaker 1: even if it's not sexy. Right, or studies 821 00:49:02,160 --> 00:49:06,360 Speaker 1: that failed to show that their hypothesis was correct. Stuff 822 00:49:06,440 --> 00:49:10,399 Speaker 1: that's disproved... those things still need to, well, not even disproved. Well, yeah, 823 00:49:10,440 --> 00:49:12,359 Speaker 1: I guess it is disproved. But yes, like, the guy 824 00:49:12,400 --> 00:49:17,319 Speaker 1: set out to show, like, the red balloon uses less 825 00:49:17,360 --> 00:49:20,440 Speaker 1: helium than a silver balloon, and it turns out that 826 00:49:20,520 --> 00:49:23,120 Speaker 1: no, they use the same amount of helium. Well, if 827 00:49:23,160 --> 00:49:26,400 Speaker 1: that study gets published and put out there into the 828 00:49:26,560 --> 00:49:30,160 Speaker 1: scientific literature on helium and balloons, then it's going to 829 00:49:30,239 --> 00:49:33,480 Speaker 1: prevent some other scientists down the road from wasting time, money, 830 00:49:33,560 --> 00:49:39,560 Speaker 1: and helium, which, as you'll remember, is an increasingly needed commodity. Um.
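(The cost of burying those uninteresting results can be simulated directly. In this sketch the true effect of some imaginary intervention is exactly zero, but if only studies that happen to land well above zero get published, the published literature tells a very different story. The threshold, sample sizes, and study counts are arbitrary, picked just to make the point.)

```python
import random

random.seed(1)  # fixed seed so the simulation is repeatable

def small_study(true_effect: float = 0.0, n: int = 10) -> float:
    """Observed effect in one small, noisy study (the true effect is zero)."""
    return true_effect + random.gauss(0, 1) / n ** 0.5

results = [small_study() for _ in range(1000)]   # every study that was run
published = [r for r in results if r > 0.5]      # only the "sexy" positives

mean_all = sum(results) / len(results)
mean_published = sum(published) / len(published)
print("average effect, all studies run:", round(mean_all, 2))       # near zero
print("average effect, published only:", round(mean_published, 2))  # well above zero
```

The studies themselves were all done honestly here; the distortion comes entirely from which ones made it into print.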
831 00:49:39,600 --> 00:49:43,800 Speaker 1: By carrying out the same experiment, whether the results 832 00:49:43,840 --> 00:49:47,040 Speaker 1: are positive or negative or whatnot, the study is meant 833 00:49:47,040 --> 00:49:49,600 Speaker 1: to be shared. And that's the point of the scientific method: 834 00:49:49,640 --> 00:49:52,360 Speaker 1: it's to reduce bias. And if you follow it 835 00:49:52,400 --> 00:49:54,960 Speaker 1: all the way through, ideally, and do all of the steps, 836 00:49:54,960 --> 00:49:59,319 Speaker 1: including sharing your research, whether it's happy or sad, then 837 00:50:00,080 --> 00:50:03,359 Speaker 1: science benefits, the world benefits. And by not doing that, 838 00:50:03,480 --> 00:50:06,480 Speaker 1: the world does not benefit. Yeah. He points out that 839 00:50:06,960 --> 00:50:11,680 Speaker 1: these days only four percent of published papers are quote 840 00:50:11,760 --> 00:50:15,560 Speaker 1: unquote negative results, and it used to be way 841 00:50:15,600 --> 00:50:18,560 Speaker 1: more. Um. And he says a lot of 842 00:50:18,560 --> 00:50:22,160 Speaker 1: it has to do with this sort of, you know, 843 00:50:23,200 --> 00:50:25,719 Speaker 1: getting in these journals and being the rockstar scientist, and 844 00:50:25,719 --> 00:50:28,239 Speaker 1: this study is super sexy. Like, if they kind of 845 00:50:28,320 --> 00:50:30,359 Speaker 1: quit going that route and made it what it should be, 846 00:50:30,800 --> 00:50:35,440 Speaker 1: then research dollars would be better spent. And, people could... 847 00:50:35,560 --> 00:50:37,839 Speaker 1: you know, he said the peer review thing isn't even 848 00:50:37,840 --> 00:50:40,720 Speaker 1: all it's cracked up to be, you know.
He mentioned a study 849 00:50:40,920 --> 00:50:43,600 Speaker 1: from a medical journal that gave a bunch of peer 850 00:50:43,600 --> 00:50:48,960 Speaker 1: reviewers some papers with deliberate errors inserted into the research, 851 00:50:49,000 --> 00:50:51,719 Speaker 1: into the studies, and even when they were told that 852 00:50:51,760 --> 00:50:54,360 Speaker 1: they were being tested to find this, they still missed 853 00:50:54,360 --> 00:50:57,480 Speaker 1: a lot of it. Yeah. So, yeah, science needs 854 00:50:57,520 --> 00:50:59,480 Speaker 1: to kind of re-evaluate the way it's carrying out 855 00:50:59,520 --> 00:51:04,560 Speaker 1: science. It's not science; the problem isn't science itself, the problem 856 00:51:04,600 --> 00:51:06,759 Speaker 1: isn't the scientific method. It's the way that it's being 857 00:51:06,840 --> 00:51:08,920 Speaker 1: used or not followed through. And a lot of it 858 00:51:08,920 --> 00:51:11,919 Speaker 1: has to do with academia and the people funding science. Yeah, 859 00:51:11,920 --> 00:51:14,120 Speaker 1: and he said, you know, these days there are seven 860 00:51:14,120 --> 00:51:17,560 Speaker 1: million researchers, and back in the day, even in, like, 861 00:51:17,640 --> 00:51:21,040 Speaker 1: the nineteen fifties, there were like a few thousand, maybe. Right, 862 00:51:21,080 --> 00:51:23,879 Speaker 1: so there's just a lot of career competition; he calls 863 00:51:23,920 --> 00:51:27,680 Speaker 1: it careerism. And so you fake a result or two, 864 00:51:27,760 --> 00:51:30,480 Speaker 1: or you just nudge out some results that don't support 865 00:51:30,520 --> 00:51:34,439 Speaker 1: your hypothesis. You want the bigger paycheck or the fame 866 00:51:34,520 --> 00:51:38,600 Speaker 1: or notoriety, and all of a sudden, science is not science. Yeah, 867 00:51:39,120 --> 00:51:45,239 Speaker 1: you know, it's pseudoscience exactly.
And speaking of pseudoscience, I 868 00:51:45,239 --> 00:51:47,640 Speaker 1: think we've reached the point where, um, we should talk 869 00:51:47,680 --> 00:51:50,759 Speaker 1: about the limitations of the scientific method, because it does 870 00:51:50,840 --> 00:51:54,280 Speaker 1: have its limits, right? Like, the way that the scientific 871 00:51:54,360 --> 00:51:57,040 Speaker 1: method is set up, especially if you go through, um, 872 00:51:57,080 --> 00:52:01,200 Speaker 1: if you include falsification, which most scientists now say 873 00:52:01,440 --> 00:52:05,360 Speaker 1: is a thing. Like, falsifiability of your hypothesis means that 874 00:52:05,440 --> 00:52:09,239 Speaker 1: you have a real scientific hypothesis there: if it can 875 00:52:09,280 --> 00:52:12,880 Speaker 1: be disproven by some observation or some measurement or whatever, 876 00:52:13,080 --> 00:52:16,919 Speaker 1: then it's falsifiable. And if it's not falsifiable, then it's 877 00:52:16,960 --> 00:52:21,000 Speaker 1: not really science. So the thing is, for something to 878 00:52:21,040 --> 00:52:23,440 Speaker 1: be falsifiable... and it was actually a philosopher that came 879 00:52:23,520 --> 00:52:26,160 Speaker 1: up with the concept of falsification, a guy named Karl 880 00:52:26,200 --> 00:52:29,399 Speaker 1: Popper in the nineteen thirties, and he was the one 881 00:52:29,440 --> 00:52:31,960 Speaker 1: that said, like, you have to be able to 882 00:52:32,040 --> 00:52:36,200 Speaker 1: falsify something for it to be disproven or supported, and 883 00:52:36,239 --> 00:52:39,680 Speaker 1: if not, then it's pseudoscience. Well, part and parcel of 884 00:52:39,760 --> 00:52:42,279 Speaker 1: that is that what you're saying has to be able 885 00:52:42,280 --> 00:52:46,359 Speaker 1: to be detected empirically. There's some way that 886 00:52:46,400 --> 00:52:49,440 Speaker 1: the presence of it has to be measured or inferred.
887 00:52:50,160 --> 00:52:53,480 Speaker 1: And so a lot of people say, well, then 888 00:52:53,600 --> 00:52:57,200 Speaker 1: the scientific method reaches the limits of its 889 00:52:57,239 --> 00:53:00,279 Speaker 1: current usefulness when it tries to explain the supernatural. 890 00:53:01,000 --> 00:53:06,359 Speaker 1: When somebody says, like, ghosts are real... Exactly, you can't prove that. Well, 891 00:53:06,400 --> 00:53:10,719 Speaker 1: you also can't disprove it either, right. And so if 892 00:53:10,800 --> 00:53:15,440 Speaker 1: you are a scientist who says, uh, because the scientific 893 00:53:15,520 --> 00:53:20,000 Speaker 1: method can't prove or disprove the existence of ghosts or God, 894 00:53:20,840 --> 00:53:23,640 Speaker 1: there is no such thing as ghosts or God, you're 895 00:53:23,680 --> 00:53:27,000 Speaker 1: making a leap of faith just as much as the 896 00:53:27,040 --> 00:53:30,239 Speaker 1: person who says science can't prove or disprove the 897 00:53:30,239 --> 00:53:33,600 Speaker 1: existence of ghosts or God, therefore ghosts and God are real. 898 00:53:34,719 --> 00:53:37,359 Speaker 1: They're both leaps of faith. And really, the most 899 00:53:37,400 --> 00:53:41,239 Speaker 1: scientific approach to the existence of the supernatural, whether it 900 00:53:41,320 --> 00:53:44,680 Speaker 1: is a ghost or God, is that we simply don't know, 901 00:53:45,120 --> 00:53:48,680 Speaker 1: and that we cannot know scientifically. But that doesn't 902 00:53:48,680 --> 00:53:51,680 Speaker 1: mean that it does exist or doesn't exist. And 903 00:53:51,680 --> 00:53:54,840 Speaker 1: saying that science shows that it does or doesn't exist 904 00:53:54,960 --> 00:53:58,040 Speaker 1: is, by definition, the opposite of what science shows. 905 00:53:58,520 --> 00:54:03,640 Speaker 1: Science shows neither.
It's not capable of showing 906 00:54:03,680 --> 00:54:06,960 Speaker 1: that something does or doesn't exist. That's a good point. Uh, the 907 00:54:07,000 --> 00:54:10,440 Speaker 1: other place where science can get corrupted is when it 908 00:54:10,600 --> 00:54:14,319 Speaker 1: blurs the lines, or when people blur the lines between, uh, 909 00:54:14,600 --> 00:54:19,920 Speaker 1: moral judgments, value judgments, and science. Like, you can study 910 00:54:19,960 --> 00:54:22,440 Speaker 1: global warming, you can study cause and effect, you can 911 00:54:22,440 --> 00:54:27,080 Speaker 1: report data. But when you make that second leap to say, 912 00:54:27,440 --> 00:54:29,440 Speaker 1: and this is as a scientist... I mean, someone can come 913 00:54:29,440 --> 00:54:32,000 Speaker 1: along and say global warming is bad, you shouldn't drive your SUV. 914 00:54:32,400 --> 00:54:35,920 Speaker 1: That's fine. But a scientist can't do a study and 915 00:54:36,000 --> 00:54:40,440 Speaker 1: say that, because that's, uh, that's a value judgment. And 916 00:54:40,520 --> 00:54:44,640 Speaker 1: that's where science can get corrupted, pretty much. Right. You can 917 00:54:44,760 --> 00:54:47,520 Speaker 1: study global warming and report results until the cows 918 00:54:47,600 --> 00:54:51,160 Speaker 1: come home, but you can't assert that if you use 919 00:54:51,239 --> 00:54:54,440 Speaker 1: this lightbulb, you're a bad person. Right, or, um, ocean 920 00:54:54,440 --> 00:54:58,200 Speaker 1: acidification is bad. It's not good for humans, but 921 00:54:58,320 --> 00:55:01,600 Speaker 1: if you're a jellyfish it's awesome, you know. So, yes, 922 00:55:01,880 --> 00:55:04,719 Speaker 1: and again, you made a great point: it's not science, 923 00:55:05,000 --> 00:55:10,520 Speaker 1: it's people using science to make value judgments.
So ultimately, 924 00:55:10,760 --> 00:55:14,239 Speaker 1: the scientific method, although it does have its limitations, in 925 00:55:14,280 --> 00:55:18,920 Speaker 1: that it needs empirical data, uh, to support or disprove something, 926 00:55:19,800 --> 00:55:23,160 Speaker 1: it's not flawed. That's not a flaw, 927 00:55:23,239 --> 00:55:27,640 Speaker 1: that's a limitation. And it's when it's misused that 928 00:55:27,680 --> 00:55:31,360 Speaker 1: its results become flawed or skewed, and that's the 929 00:55:31,400 --> 00:55:34,360 Speaker 1: people doing it, man, not science. That's right. It's pretty 930 00:55:34,400 --> 00:55:36,960 Speaker 1: interesting stuff. Yeah, man, this is a good one. I 931 00:55:37,000 --> 00:55:39,279 Speaker 1: thought so too, man. We started out with a 932 00:55:39,320 --> 00:55:43,640 Speaker 1: bang. Boom, all downhill from here. If you want to 933 00:55:43,640 --> 00:55:47,000 Speaker 1: know more about the scientific method, check out that article in 934 00:55:47,040 --> 00:55:50,640 Speaker 1: The Economist, check out Explorable, uh, and then of course 935 00:55:50,719 --> 00:55:53,239 Speaker 1: type the scientific method in the search bar at 936 00:55:53,239 --> 00:55:55,600 Speaker 1: how stuff works dot com. And since I said that, 937 00:55:55,719 --> 00:56:01,439 Speaker 1: it's time for listener mail. That's right, but quickly, before 938 00:56:01,480 --> 00:56:06,160 Speaker 1: listener mail: uh, we get asked by listeners all the time, 939 00:56:06,239 --> 00:56:08,640 Speaker 1: what can we do? Since you have a free podcast, 940 00:56:08,680 --> 00:56:10,440 Speaker 1: we can't pay for it. What can we do to 941 00:56:10,480 --> 00:56:13,400 Speaker 1: help you guys? And one thing you can do that 942 00:56:13,440 --> 00:56:17,200 Speaker 1: we would appreciate is, uh, go to iTunes and leave 943 00:56:17,320 --> 00:56:20,919 Speaker 1: a rating and a review for us.
That makes a big, 944 00:56:20,960 --> 00:56:23,360 Speaker 1: big difference in keeping us up there in the rankings, 945 00:56:23,719 --> 00:56:26,600 Speaker 1: which means more people find Stuff You Should Know after 946 00:56:26,640 --> 00:56:29,680 Speaker 1: they listen to Serial. They'll just say, well, jeez, there's 947 00:56:29,680 --> 00:56:34,080 Speaker 1: other podcasts in the world; what is this pod? So 948 00:56:34,239 --> 00:56:36,760 Speaker 1: ratings and reviews really help us out, and it doesn't 949 00:56:36,800 --> 00:56:39,480 Speaker 1: cost you anything but a few minutes. Um, be honest. 950 00:56:39,520 --> 00:56:41,759 Speaker 1: We're not saying go leave us some great review... but 951 00:56:42,080 --> 00:56:46,239 Speaker 1: go leave us a great review. You said it. Uh, 952 00:56:46,280 --> 00:56:48,360 Speaker 1: and tell one person about Stuff You Should Know; 953 00:56:48,360 --> 00:56:50,720 Speaker 1: we would appreciate that too. Turn somebody onto the show, 954 00:56:51,360 --> 00:56:56,400 Speaker 1: and, um, that's it. That's our version of a pledge drive. Wow, 955 00:56:56,760 --> 00:56:59,800 Speaker 1: we do it, what, once every three years? Not very 956 00:57:00,000 --> 00:57:03,439 Speaker 1: obnoxious, and it didn't last long. Alright. So on to listener mail: 957 00:57:03,520 --> 00:57:06,319 Speaker 1: this is from my sister-in-law, actually. Oh yeah, 958 00:57:06,320 --> 00:57:11,480 Speaker 1: there's some nepotism. Yeah, Jenny, Jenny Bryant. She, as I mentioned in 959 00:57:11,520 --> 00:57:14,919 Speaker 1: the homeschool episode, homeschooled her kids for a little while, 960 00:57:15,400 --> 00:57:19,240 Speaker 1: and she sort of corrected me: Love the homeschooling episode, guys. 961 00:57:19,280 --> 00:57:22,120 Speaker 1: One very big trend these days in the homeschooling 962 00:57:22,120 --> 00:57:27,560 Speaker 1: community is what Abby, my niece, does, which is hybrid homeschooling.
963 00:57:28,120 --> 00:57:30,040 Speaker 1: So two to three days a week she is at 964 00:57:30,080 --> 00:57:32,720 Speaker 1: school, and then the rest of the time she does 965 00:57:32,760 --> 00:57:36,520 Speaker 1: a plant. She's not a plant. Uh, the rest of the 966 00:57:36,520 --> 00:57:38,280 Speaker 1: time she's at home. So she says it's a great 967 00:57:38,320 --> 00:57:41,640 Speaker 1: option, with curriculum provided and new topics taught at school 968 00:57:41,720 --> 00:57:44,880 Speaker 1: and then worked on at home. Many of these schools 969 00:57:44,880 --> 00:57:49,000 Speaker 1: are accredited, making getting into college, including Ivy League schools, 970 00:57:49,000 --> 00:57:53,520 Speaker 1: hassle-free. And Abby's school has sports teams, homecoming. Abby 971 00:57:53,600 --> 00:57:57,640 Speaker 1: is actually an excellent volleyball player. Beta Club, newspaper staff, 972 00:57:57,680 --> 00:58:00,960 Speaker 1: all the good stuff. The flexibility is great for families, 973 00:58:01,000 --> 00:58:03,440 Speaker 1: and we are huge fans of how the hybrid approach 974 00:58:03,480 --> 00:58:06,600 Speaker 1: prepares students for college by allowing them time outside of 975 00:58:06,640 --> 00:58:10,200 Speaker 1: class to manage their work and life schedules. So that's 976 00:58:10,200 --> 00:58:15,240 Speaker 1: from Jenny. Nice. Actually via text, the first listener mail 977 00:58:15,360 --> 00:58:17,800 Speaker 1: via text. How did you print that out? Did you 978 00:58:17,800 --> 00:58:21,480 Speaker 1: retype it and print it? No. Dude, are you serious? 979 00:58:21,960 --> 00:58:25,520 Speaker 1: You can print from texts? No, you just copy-paste it 980 00:58:25,560 --> 00:58:28,480 Speaker 1: to an email. Oh yeah, yeah, I forgot about that. Man, then 981 00:58:28,640 --> 00:58:31,440 Speaker 1: how in the world did you do that? 982 00:58:31,480 --> 00:58:35,320 Speaker 1: With your thoughts?
I have a niece who is excellent 983 00:58:35,360 --> 00:58:37,840 Speaker 1: at volleyball too. Yeah, we should get them together. 984 00:58:38,560 --> 00:58:40,880 Speaker 1: She's, I don't know, ten or eleven, okay, something 985 00:58:40,920 --> 00:58:43,520 Speaker 1: like that. Abby just turned thirteen, so there... oh, maybe 986 00:58:43,560 --> 00:58:47,000 Speaker 1: they'll face off against each other. Yeah. Is she in Atlanta? Yes, 987 00:58:47,080 --> 00:58:51,400 Speaker 1: she's in Canton. You never know. Where's Evie? She's in Roswell. 988 00:58:51,480 --> 00:58:53,480 Speaker 1: But I think with volleyball they kind of have 989 00:58:53,880 --> 00:58:56,919 Speaker 1: played all over the state. It'd be bizarre if they played each other. Yeah, 990 00:58:56,920 --> 00:58:58,680 Speaker 1: we'll just see each other at a match one day, 991 00:58:59,120 --> 00:59:01,680 Speaker 1: on opposite sides of the court with our arms folded. Yeah. 992 00:59:03,400 --> 00:59:06,479 Speaker 1: What else? I got nothing else. Well, like Chuck said, 993 00:59:06,520 --> 00:59:09,440 Speaker 1: go leave us a review, and if you want to 994 00:59:09,440 --> 00:59:11,240 Speaker 1: get in touch with us, you can tweet to us 995 00:59:11,280 --> 00:59:13,960 Speaker 1: at s y s k podcast. You can join us 996 00:59:13,960 --> 00:59:17,200 Speaker 1: on Facebook dot com slash Stuff You Should Know. You 997 00:59:17,360 --> 00:59:21,080 Speaker 1: can email us. Can we still do that? Yep. You 998 00:59:21,120 --> 00:59:24,720 Speaker 1: can't text me. Uh, stuff podcast at how stuff 999 00:59:24,720 --> 00:59:26,840 Speaker 1: works dot com. And as always, join us at our 1000 00:59:26,880 --> 00:59:34,280 Speaker 1: home on the web, Stuff You Should Know dot com. 1001 00:59:34,280 --> 00:59:36,720 Speaker 1: For more on this and thousands of other topics, 1002 00:59:36,760 --> 00:59:45,440 Speaker 1: visit how stuff works dot com.