Hi, everyone, Happy Saturday. This is Charles W. "Chuck" Bryant here. I hope you slept well. Hope you're feeling good, because you're about to listen to How the Scientific Method Works. It's from January of two thousand fifteen, and boy, this is a good one. I really loved it, because we love science around here, and we love the scientific method and proving stuff out. So check it out right now.

Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks.

Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant, there's Jerry. Stuff You Should Know. Grinning. It's been a while. I know. It's funny, like, those words come pouring out of my mouth and then it's cool. You wake up in the middle of the night saying that, and she, like, slugs you in the face. In your sleep? She has to dry my brow. Yes, we prerecorded some for December, as we like to do, to take a little time off at the end of the year and not explain things for a few weeks in our real lives. Like, people ask me things like, what happened to that stick of butter? Yeah, I don't know, don't ask. Don't even ask me. I could tell you, but I'm not gonna. Exactly. That's how it goes in my house. Find your own butter. December was Find Your Own Butter Month. That's a good, that's a good one. That should be a T-shirt: a Stuff You Should Know "Find your own butter," or "December is Find Your Own Butter Month." Yeah, that's right. Maybe a stick of butter and some garland on it. Yeah, I like that. So it's good to see you again, man. Good to be back in here. Yeah, it is nice to be back, because as much as the break was great, I'm happy to be explaining things again. Well, that's good, because if we got in here and you're like, I can't do this, I can't do it again, we'd be in trouble. Yeah. Yeah. So I'm glad we're all feeling good. Jerry, you're feeling good. Jerry's got two thumbs up and a big goofy smile. Two of her three thumbs.
She looks like Bob from that male enhancement pill ad. Oh, I see, the guy, the old man that's, like, super buff? I wouldn't call him old. He was middle-aged. He looked like kind of a Bob Dobbs type dude. I think that's kind of who he was modeled on. And I see, the guy that's super muscled? Now I'm thinking of someone different. I think you're thinking of Jack LaLanne. No, no, no, no, there's just some ad. There's some old man that looks really creepy, because from the neck down he's super buff but he looks really old. No. Remember, there was like a male enhancement pill, and I'm making air quotes here, for erectile dysfunction. Well, there go the air quotes. But yes. And it was like in the early two thousands, I think, maybe late nineties, but I think early two thousands, and these ads were everywhere, and there was Bob, and like all these great things happened to him because he started taking this pill. I can't remember the name of the pill. But the company, like, got into a lot of trouble, because it was basically like a subscription service, and like you gave them your credit card and you got this free trial, but then they started sending it to you, and it was like next to impossible to cut off service. Interesting. They're like, no, we want your maleness to be enhanced. So you, you've seen these ads. I was gonna start asking questions, but why, why bother? I will, I will YouTube. I will find it on YouTube. I'll be like, oh, Bob. Yeah, you will. You'll go, oh. I won't have to come back in and record an insert. He's the guy that's on the back of all those pill bottles in my bathroom. So, Chuck, I don't even remember how we got here. Oh yeah, Jerry did that. That was Jerry's fault. But, um, you remember we did the Enlightenment episode. Okay. We talked a lot about how there's this kind of, um, tug of war over the human psyche between rationalism and mysticism,
I guess is how you could put it. Well, I feel like we're talking today about the scientific method. Yeah, great idea, by the way. Thank you very much. It's been a long time coming. Um, because I realized, like, I don't understand it as fully as I thought. I don't understand science. I understand the scientific method, because it's pretty cut and dry, and it's beautiful and elegant and simple. But then you just take this thing, and it came out of the birth of rationalism, and when you place it into the world and make it function, there's a lot of implications. Is it being used properly? Is it being used responsibly? Like, are we putting what constitutes faith into that? You know, like, it just raises all this other stuff. And it made me realize, like, I don't understand science as much as I want to. So researching this, it was awesome. Yeah, and this is a cool episode, I think, because not only are we going to talk about the scientific method, but we're going to talk about just science, like, what is science in general? And some of the rock stars along the way who really, you know, laid out the path remarkably, and like many, many years ago, like, coming up with these amazing discoveries that still, like, hold. You know, you can, like, hold their feet to the fire for a lot of this stuff. Yeah, because if you come upon a universal truth, you know, it is what it is. Like, you got to be the person who discovered it because, you know, you saw it, you realized it a certain way, but ultimately it was there already. Yeah, like Newton. I mean, we'll talk about all this stuff, but it's not like now we're like, oh, Newton, most of what he said was wrong, but that's understandable because it was a long time ago. Like, his stuff holds up really, really well. I was wondering if he, on his deathbed, was just like, oh man, I contributed so much to humanity, it's mind-boggling, but I couldn't enhance my malehood.
Well, Bob hadn't come along yet. So Chuck, let's just quit stalling and talk about science. Like, what is science? Well, I hate the old elementary-school "defined as," but it's a pretty good place to start here, to get a base definition of science. Yeah, old William Harris did a great job with this. Yes, William Harris did a great job. Uh, science: the intellectual and practical activity encompassing the structure and behavior of the physical and natural world through observation and experimentation. Boom, end of podcast. Uh, so the first part of that is science is practical, and it is, you know... he makes, Bill Harris makes a great point in here. It's not just stuff you do in a lab, and it's not just for scientists. It is all about being hands-on and active, and it's all about discovery and asking questions. I mean, that's how everything is ultimately solved, by someone looking at something and having a question about it. Exactly. And then the scientific method comes in when you say, and this is how you properly get to that answer. Exactly. Um, and he makes another good point, too, that the idea that there's a method, a scientific method, makes it seem like it's secreted away among the fraternity of scientists. And like you said, anybody can use it. It's just kind of part of being a curious human. It's not even that anyone can use it; everyone does use it. You just might not even know that you're using it. Like... I mean, one of the examples they use later is, if, like, your car overheats, when you figure out why and fix it, that's the scientific method, right, playing out. Exactly, based on reasoning. Yeah. Okay, on deduction and induction. Man, there's so much to talk about. Okay. So let's talk about that definition that you had. So the first part is that science is a practical activity. So science is practical, right. It's, it's this, um... the basis of the whole thing is discovery. Right.
You see something, you see birds in flight, and you say, where are those birds going? And if you just went and laid down on the ground and went to sleep after that, then you're not carrying out science. But if you went, I want to find out where those birds are going, and you follow them and you start taking notes, that, that is the basis of science, is discovery. Yeah, and that's the observational part as well. Um, sometimes you're using a microscope or a telescope. Sometimes you're using your eyeballs. But no matter what your tool is, uh, you're gonna be watching something and recording what's called data, or data, depending on, I don't know, what kind of person you are. Yeah, what do you say? I think I say both. I think data. Yeah, I don't think, I don't think I say data. Data, I say data. Data. Yeah, all right, we'll go with data. You say both. I feel like it just comes out of my mouth one way or the other, and I don't really think about it. I think that's like being ambidextrous. Yeah. Yeah, I'm a data-data. Yeah. Uh, so once you are observing this data, well, there are a couple of kinds. There's quantitative data, which are numbers, like, you know, your body temperature is 98.6, although I think that's changed slightly now, didn't it? Yeah, yeah. It used to be like, if you were a human being, your body temperature is 98.6, and they're like, no, there's a little more variation than that. But any kind of just numerical representation is quantitative, whereas qualitative is behavioral, like, I'm gonna watch that bird, um, eat and poop for the next week, right? Or, what happens if I... what will the slug do if I put a bunch of salt on it? You know. Don't do that. No, you really should not do that. No, that's awful.
But the reaction of the slug is gathering qualitative data, and depending on who you talk to, there isn't qualitative data in science, that it should all just be quantitative, because, yeah, because quantitative data is reproducible. Qualitative data, it's not necessarily reproducible. You can observe the same phenomenon, but you're not necessarily controlling it. Okay, well, I guess I get that, but I agree with Bill here in that they are both, they go hand in hand, and neither one is more important than the other. You need to have both. Well, a lot of people do, and we'll talk more about it later, because without the idea that qualitative data is acceptable and scientific, you don't have the social sciences. Like, they don't exist. Yeah, that's a good point. But yes, we have quantitative data and qualitative data. I agree with you, they're both useful. Okay. Uh, it is an intellectual pursuit, um. So you can make observations on data all day long, but until you bring reason, in this case inductive reasoning, which is deriving a generalization based on your observations, then it's just data sitting there on a piece of paper. Like, it's supposed to lead you somewhere, right? Exactly. And so we should talk about inductive and deductive reasoning. And, again, it's really weird, one of the things that I came across is that there's not a universal agreement on how science is carried out. Like, no, I saw some places where there's, like, there's no place for inductive reasoning in science. Then other places they're saying, well, you have to have science using inductive reasoning. Everybody seems to agree that deductive reasoning is the basis of science, but that you also have to have inductive. So deductive is basically taking a big, broad generalization and saying that it applies to something more specific. Yes. Uh, inductive is the opposite, where you say, I've noticed these different data points, and, uh, that means that this broad generalization is true.
So you go from specific, small observations to a broad generalization. And the reason that a lot of people say, well, inductive reasoning doesn't have any place in science, is because you're saying, those birds over there are all brown, therefore all birds of that type are brown. Even though I haven't seen every single bird of that type in the world, I'm saying that all those birds are brown. And a lot of people say there's no place for that in science. Well, if you want to go out and prove that, then that's your business, you know. You can't just say that and be like, and I'm done. Right, exactly. I guess you could, but you wouldn't be much of a scientist, right? But the, the... you can use it to formulate hypotheses, right? So you can say, I've generated all these data points, I'm gonna put them together and see if this broad generalization is right. Okay, so there is a place for inductive reasoning in science, but everybody says deductive reasoning is the basis of science. Well, Bill Harris does. He offers a great example for inductive reasoning with Edwin Hubble, of the Hubble Telescope. Uh, he was looking through the Hooker telescope, which at the time was at California's Mount Wilson. Is it the one from Rebel Without a Cause? No, that's, um, Griffith Park Observatory, which has been redesigned and, uh, is really cool now. Yeah, I mean, it was kind of cool before, but it was definitely, like, uh, sort of the museum that time forgot. Oh really? So they've updated it. I'll bet that was cool, though, in its own way. Yeah, it was neat. I used to live near there, so it was kind of... but that's like the famous one, at least in the movies. Yeah, that's where they have the big knife fight, and there's this James Dean statue there too. Oh, I didn't know. Like a bust? Um, so yes, Edwin Hubble, he's at Mount Wilson and he's looking through the Hooker telescope, which was the biggest one. And at the time, everyone said the Milky Way galaxy is it. That's what we've got going on.

Did you know this? Uh, yeah, I knew that, because we're talking... yeah, people not that long ago did not realize this. And he started looking through this telescope and said, you know what, these nebulae that everyone says are part of our galaxy look to me like they're beyond our galaxy. Not only that, they look like they're moving away from us. So he, uh, through inductive reasoning, made this observation that, you know what, I think there are many, many galaxies out there, and not only that, I think they are expanding. And, uh, through technological advancement with telescopes over the years, scientists, you know, proved it to be true. Yeah. Pretty cool. So this is a really good example of him saying, like, I've made some observations, and now I'm going to say this broad generalization, right? These galaxies appear to be moving away from one another, so the whole universe is expanding. Right. That's inductive reasoning. It's a pretty brave thing, uh, especially back then, because you're really putting your reputation at stake. It really is, you know. So what Hubble did was what we've come to see as science. He made some observations, he came up with a hypothesis, um, and then it was tested later on. It's not... you don't necessarily, as a scientist... You're a part of a larger collective of scientists, right, and every scientist needs one another. It's why there's journals and, um, conferences and things like that to share information, right, from party to party. And Hubble came up with his own observations, and rather than just experimenting, experimenting, experimenting himself, which I'm sure he continued to do, he created this basis of work that he probably realized was going to survive him. Right. And then later on, scientists came down the road and they tested his hypothesis, and they found it was correct, and so his hypothesis became a theory.

It eventually became part of the basis of the Big Bang theory, that the universe started as a huge explosion and it's expanding still because it exploded at one point. Right. And they did that by carrying out other tests or experiments. Exactly. So this is how science works. Like, some guy back in the nineteen twenties makes some observations in California, he proposes this big, broad generalization, and over the ensuing half a century, more and more scientists all around the world start testing his hypothesis and find it to be true, so it becomes a theory. Yeah. Well, let's finish up here with science. The last part of the definition is that it's systematic and it's methodical, and it requires testing and experiments, and it requires those experiments and tests to be repeated and verified. And it is, it's a system, it's a way of working things out. It's a way of working, and that is the scientific method, basically. Yeah, you have your idea, you pose a question, you theorize, you put a hypothesis out there, and then you go about trying to either prove it or disprove it. Yeah, exactly. And then the way that you go about proving or disproving it, that's the scientific method. Everything else is just scientific inquiry. The standardized way of going about scientific inquiry is the scientific method. And we, friend, will talk about the scientific method right after this.

All right. You brought up a point. I think we should go ahead and just get right to it, my friend. Let's do hypotheses and theories, two things that go together. One thing that really chaps my hide is, uh, when you hear poo-pooers of whatever scientific theory say, well, it's just a theory. And... where was this thing that you found that poo-poos that? Do you remember what website that was? No. No. Also, I do want to give a shout-out, now that you mentioned it, to Explorable.
It's like an online university, basically, of free courses, and, uh, there is one on scientific reasoning that is just amazing. It's like a huge rabbit hole you go down, and you start clicking on the embedded links and you end up, like, understanding all sorts of stuff. So go check that one out if you like understanding stuff. So that's one of the things that bugs me, if someone says it's just a theory, and this does a great job of kind of throwing that out the window, um, because it's basically mixing up the two definitions of theory. Yeah, there's like a colloquial definition that people use every day that doesn't really have much to do with the scientific use. Like, I got a theory that Jerry, in her one-hour bathroom breaks every day, is really playing Words with Friends in the lobby. I think your theory is correct. So that's a theory in the colloquial meaning. Whereas, as far as science goes, the theory is not just something you postulate, something that may or may not be true. The theory is based on the hypothesis, and it's something that is strongly supported in many different ways, and there's all kinds of evidence to support something that eventually becomes a theory. Right. So, um, your theory about Jerry's bathroom breaks, in the scientific world, would be a hypothesis. What? In fact, yeah, it would be a scientific law, but ultimately it would begin as a hypothesis, a hunch based on intuition, based on the data you've collected, observations, that kind of stuff. Where, like, you know, you've seen that Jerry goes to the bathroom for like an hour at a stretch frequently. When she comes back, she's, um, finishing up a game of Words with Friends. You've heard that she's been spotted in the lobby during these times. So your hypothesis is that while she is gone for these hour-long bathroom breaks, she's actually down in the lobby playing Words with Friends. Right, based on knowledge, observation, and logic.
Right. So let's say that you decided to set up an experiment, and you experimented, and you went and you found Jerry playing Words with Friends five different times, and you told me about it, and I was like, I'm going to run that same experiment exactly the way you did. Right. I would test that same hypothesis. If I found the same results to be true, then what you would have come up with, your hypothesis, would move to basically a theory, that is, this widely accepted thing. This explanation, that Jerry is not actually in the bathroom, she's downstairs playing Words with Friends, would be the Jerry Bathroom Break Theory. Right. And then if it turns out that you find that Jerry spending an hour a day pretending to be in the bathroom but actually being downstairs playing Words with Friends, if the universe couldn't exist without her doing that every day, you would have a scientific law. That's right. Yeah, I think that was a good example you came up with. Well, it's a great example, as it turns out. Uh, I guess the point here is, when you hear someone say in an argument, well, that's just a theory, just punch him in the head and then tell him what we just said about the bathroom breaks, and they'll say, who's Jerry? Or just cue up that whole bit and stand outside of their window wearing a trench coat and holding a boom box over your head with a smug look on your face. Uh, all right. So should we go back in the old Wayback Machine a little bit and just talk a little bit about how the scientific method came to be? Yes. Man, this, this thing... what are you running this on these days? Straight kerosene? The fumes in here are killing me. Sorry about that. Trying to go green, you know. Kerosene is not green. Diesel, maybe? I'm choking. Biodiesel, how about that? Okay. The Wayback Machine will run on french fry grease. That would be fine. I'll get to work on that. I could handle this for you.
So you start us off with the Renaissance, and the reason the Renaissance was so awesome and necessary was because of something else we've talked about, which was the Dark Ages. Which, remember, is the rationalists' disparaging term for this era. That's right. Uh, but I think sort of rightfully so, because from right before the Dark Ages until about a century after, there was not much advancement at all in the realm of science. Uh, no, it's true. That's hard to argue with. And the reason why is, again, science wasn't really born yet, and there was a huge struggle between rationalism and mysticism, and ultimately we're living in the age of rationalism now. Yeah. And we should point out, too, that this was mainly in Europe. Over in the Islamic world, as I think we had a listener mail point out, there were a lot of advancements being made, uh, just sort of flying under the European radar at the time, because some say the Catholic Church kind of kept science under its thumb for a while, since it was a pretty big threat, and said, you know, you can't do this stuff. You can't experiment like this, and don't ask these questions, because here are your answers. Yeah. But eventually the Renaissance came about in the twelfth century, and people woke up and saw some of the work in the Islamic world and said, you know what, maybe let's start reading up on Aristotle and Ptolemy and Euclid once again. Yeah, they're like, we forgot about these guys. Yeah, I mean, it literally kind of vanished for a while. It did, from the West. Yes, fortunately it was still around, you know, in its home places. But yes, in the West they were lost. The Roman stuff was almost entirely lost because it was being suppressed by the locals, and I think the Greek knowledge had completely vanished. Yes. Somehow, somehow they got... there was some, um...
We got another listener mail after the Enlightenment episode where they said that it was an Islamic scholar who was the one who translated Aristotle into Latin, or something like that, and that without this guy, like, the West wouldn't have had much to start with. Because that's where that birth of rationalism came from, was this rediscovery of Greek and Roman classical thought. And this was the basis of scientific inquiry, of rationalism, of saying, like, okay, there's set rules to things, and we need to discover these rules and the principles of how the universe works, like, there have to be principles, and we need to find this in a rational, methodical way. And right out of the gate, Europe said, okay, well, whatever you say is right, then, Aristotle. We're used to just believing everything without questioning it. And luckily Albertus Magnus, I think is who it was... um, was it Albertus Magnus or Roger Bacon who said no? It is Bacon, Roger Bacon, who just has this great name. Roger Bacon. The Bacon brothers. Yeah. He said... Roger. They weren't brothers, though. But were they related at all? You know, I looked that up, and I don't think people know either way. I don't think there's any proof, but a lot of people think, because of their names and the way things went back then, that they may very well have been related. And I mean, they were separated by three hundred or so years. Although Roger was a monk, so he would not have had children. So if they were, and it's an excellent point, it wasn't necessarily through his line, you know. Yeah, it could have been a nephew or something. Yeah, or his brother Kevin might have had the line that matched. So Roger was the one who said, everybody stop. Just because Aristotle wrote something doesn't mean it's fact, especially when we find contradictions to it. Yeah, it doesn't. Aristotle's not automatically right. And this is a huge advancement.
Yeah. And Albertus Magnus was the one, I believe, who said, you know, this thing called revealed truth, which is basically "God says this" instead of a truth found by experimenting, maybe we should experiment instead and not take this revealed truth as the truth. Right. And we mentioned in the Enlightenment episode as well about scholasticism, about using scientific inquiry to explain theology, which was, you know, you're still working from a theological standpoint, right, but you're starting to use scientific inquiry and the idea that you shouldn't just accept things as truth. That was, again, a huge, huge breakthrough. Uh, Francis Bacon, the other Bacon brother, he's one of the heroes of this story. Yeah, he was an attorney and philosopher, and is possibly Shakespeare. Oh really? I never heard that. Interesting. So what do you mean, like, wrote those under a pseudonym? Huh. And then the Shakespeare's sister was the other theory too, right? That it was a woman. I've heard that, and she couldn't, like, you know, women couldn't write plays, right, so it's her dumb brother William. That's good. Was it her brother? I think that was one of the theories. This was a good Smiths song too. Uh, "Shakespeare's Sister," was that the name of it? Wasn't it a band too? I think it was. Was it? Maybe.

Uh, so anyway, he was a philosopher and a lawyer, and he said, you know what... the Baconian method basically became the scientific method. He was the first dude who really said, these are the steps that you should take to, uh, investigate science. There has to be a framework. And the whole point of this, that we take so for granted now because it's so intuitive and on its face right, yeah, as far as scientific inquiry goes. But this is an enormous breakthrough, to say, you follow these steps, this framework, and if everybody who carries out science follows the same framework, then science will be universal and interchangeable, and anyone in the world, and not just now but anytime, will be able to carry out the same experiment and will be able to verify or disprove it. Yeah. And that is amazing, that that happened. That's why Francis Bacon is one of the heroes of the story. And he didn't come up with this entirely on his own, but he was the one who said, this is what we're gonna do. I'm going to give it a name. I'm going to spell it out. And from now on you can call me the dad of the scientific method. Yeah. And that's why Newton was such a rock star, because he so rigorously stuck to the scientific method that all these centuries later, his, uh, you know, his systems of laws have stood the test of time. And, uh, I think it's a good point to bring up, too, that the collaboration of scientists is really the hallmark of advancement and moving forward. It's not working in a vacuum. It's sharing your ideas and working with one another. And the whole, uh, little sidebar here on cell theory I thought was pretty cool, which was when science quit, well, not quit, but started looking at small things instead of looking at the universe around them and at the stars. And, uh, basically, you know, through the advancement of lens grinding, Antonie van Leeuwenhoek specifically, a Dutch tradesman, was pretty good at making simple microscopes, and all of a sudden, contemporaries like Robert Hooke said, you know what, let's start looking at tiny things, because therein might lie the answer to many, many things. Yeah, and they were right. Robert Hooke, with cork, he discovered cells by looking at cork through an early microscope.
So in this story, science is hastened by technological advancement, lens grinding to make microscopes, and then this new technology is used to further science, right? Yeah, it's like mutual inspiration between Leeuwenhoek and Hooke. Leeuwenhoek, yeah. It was neat, because Hooke heard about Leeuwenhoek's microscopes, got his hands on one, or a microscope, looked at things like cork, and said, oh, there's such a thing as cells. Leeuwenhoek said, oh, that's pretty neat, let me try. And he said, oh, there's such a thing as, quote, little animals, which we call protozoa and bacteria. And one of the Royal Societies, after Leeuwenhoek presented his findings, turned back to Hooke and said, hey, Hooke, we know you're pretty handy with the microscope. Can you confirm Leeuwenhoek's findings? Are there little animals? Hooke said, there are indeed, I can see them with my microscope. That's right. And that inspired a German botanist named Matthias Schleiden, uh, to look at a lot of plants, and he was the first guy to say, you know what, plants are composed of cells. And he's having dinner one night with his zoologist buddy. Yeah, this is about a hundred years later. Yeah, Theodor Schwann. And said, you know what, dude, uh, order the wine and order the steak, trust me, because this place is fantastic. And, uh, also, plants are made of cells. Don't tell anyone. And he went, you know what, dude, I have been investigating animals with microscopes, and they're made of cells too. And so they figured out at this dinner that everything is made of cells. All living things are made of cells. Boom. Okay, so this is huge. This is a big advancement, right, that we're hitting upon right now. Huge. But it laid the further foundation, right? So initial scientific inquiry led to further scientific inquiry and further scientific conclusions and generalizations. All living things are made of cells, and then it was extrapolated elsewhere. Right.
Yeah. Like, twenty years later, Rudolf Virchow said, you know what, not only is everything made of living cells, but they all come from pre-existing cells, which was a huge deal at the time, because people believed in spontaneous generation at the time. Like, if you left some, um, wheat seed in a sweaty shirt, it would spawn mice, I think was one of them. There's a lot of weird ones. Press basil between some bricks and you'll get a scorpion was one. Like, they were really out there. Yeah. Well, the one that is, well, not true, but the one that you could actually see, was rotten meat would eventually, uh, spawn maggots. How did they possibly get there? Yeah, spontaneous generation. But that's the obvious explanation, and if you think about it, they're working from Occam's razor, and Occam's razor says the simplest explanation is usually the right one, all other things given. Well, the thing is, spontaneous generation has never been shown to be possible. If we've got the cell thing over here, let's investigate that. So this, uh... what was the guy's name? Virchow? Yes. He's saying, okay, well, wait a minute, I've got this cell theory I'm working on that's been around for a couple of decades. Hypothesis, probably. The cell hypothesis at the time. Nice catch. Don't feel bad, though, because this article that you sent said that scientists today, like, still confuse those terms. And the HowStuffWorks article makes a good point in saying that science and everything that has to do with it, and the scientific method, is very fluid and open to interpretation and experimentation, obviously. But so he says, um, okay, this cell hypothesis, this is a pretty good explanation for what we now call spontaneous generation. He didn't do anything about it. He just put it out there. Yeah. And then along comes Louis Pasteur, who does do something about it. He figures out a great experiment to try to disprove spontaneous generation. Yeah, it's pretty cool, too, um.
He basically took a broth, uh, put 576 00:34:04,720 --> 00:34:08,040 Speaker 1: equal amounts in two different beakers. One had a straight 577 00:34:08,080 --> 00:34:11,360 Speaker 1: neck and one had an S-shaped neck. He boiled 578 00:34:11,360 --> 00:34:13,240 Speaker 1: it just to make sure everything in it was killed, 579 00:34:13,800 --> 00:34:16,120 Speaker 1: and then just let it sit there in the same conditions, 580 00:34:16,239 --> 00:34:19,719 Speaker 1: open to the, to the world, or open to 581 00:34:19,719 --> 00:34:23,560 Speaker 1: the room, like it wasn't corked. In other words, he 582 00:34:23,719 --> 00:34:26,640 Speaker 1: noticed that the one with the straight neck eventually became 583 00:34:26,680 --> 00:34:30,120 Speaker 1: cloudy and discolored, uh, meaning there was some junk growing 584 00:34:30,160 --> 00:34:32,560 Speaker 1: in there, and the one with the S-shaped neck 585 00:34:32,719 --> 00:34:36,520 Speaker 1: did not do anything. It remained the same. So that led 586 00:34:36,600 --> 00:34:39,840 Speaker 1: him to think what? Well, he thought that germs, that 587 00:34:39,880 --> 00:34:42,839 Speaker 1: there were such things as germs, which, um, Leeuwenhoek 588 00:34:42,920 --> 00:34:47,040 Speaker 1: and Hooke had already shown, um, and that, that 589 00:34:47,280 --> 00:34:50,120 Speaker 1: in the S-shaped flask they had gotten trapped in 590 00:34:50,160 --> 00:34:54,120 Speaker 1: the neck, and in the, the open neck, they had been 591 00:34:54,160 --> 00:34:58,440 Speaker 1: able to just enter unobstructed and had generated there. The 592 00:34:58,520 --> 00:35:02,200 Speaker 1: reason that the S-shaped flask was still sterile was 593 00:35:02,280 --> 00:35:05,160 Speaker 1: because there is no such thing as spontaneous generation. If 594 00:35:05,160 --> 00:35:08,920 Speaker 1: there were, then no S-shaped neck would impede anything 595 00:35:08,960 --> 00:35:12,040 Speaker 1: like that, and boom, there you have it. So he 596 00:35:12,280 --> 00:35:17,759 Speaker 1: disproved that spontaneous generation is a thing, right, that's right, 597 00:35:17,920 --> 00:35:21,920 Speaker 1: through the scientific method. Exactly. Here's the leap that a 598 00:35:21,920 --> 00:35:25,160 Speaker 1: lot of people make, scientists included, that really is a 599 00:35:25,200 --> 00:35:30,160 Speaker 1: great disservice to science. He didn't prove cell theory, right. 600 00:35:30,400 --> 00:35:34,200 Speaker 1: What he did was take that cell hypothesis and present 601 00:35:34,360 --> 00:35:41,440 Speaker 1: some really persuasive evidence that it's probably right. Yeah, But 602 00:35:41,560 --> 00:35:44,239 Speaker 1: like this article you sent points out, disproving something is 603 00:35:44,320 --> 00:35:46,799 Speaker 1: just as important as proving something. So here's the thing, 604 00:35:47,200 --> 00:35:49,040 Speaker 1: that's the most you can hope for in science, is 605 00:35:49,120 --> 00:35:53,320 Speaker 1: disproving. Sure. With science, unless you're talking about math, with science, 606 00:35:53,440 --> 00:35:56,959 Speaker 1: there's no such thing as proof. A theory, even a law, a 607 00:35:57,239 --> 00:36:01,200 Speaker 1: universal law, still has the potential for being undermined by 608 00:36:01,280 --> 00:36:05,200 Speaker 1: one single experiment, one single observation, and therefore there is 609 00:36:05,400 --> 00:36:10,640 Speaker 1: no real ultimate proof in science.
There's just theories and 610 00:36:10,800 --> 00:36:15,239 Speaker 1: support for theories, and then ultimately laws, and further and 611 00:36:15,280 --> 00:36:20,080 Speaker 1: further support for laws, right, but they're not proven. What 612 00:36:20,239 --> 00:36:25,480 Speaker 1: science does ultimately is disprove things or lend support for 613 00:36:25,520 --> 00:36:29,680 Speaker 1: existing theories, or existing interpretations of why things happen the 614 00:36:29,719 --> 00:36:32,719 Speaker 1: way they do. And that's what Pasteur did. So 615 00:36:32,760 --> 00:36:36,600 Speaker 1: if you look at the experiment, he disproved spontaneous generation, 616 00:36:36,960 --> 00:36:39,880 Speaker 1: but he lent support to the cell theory, and probably 617 00:36:39,920 --> 00:36:42,719 Speaker 1: with his experiment it went from the cell hypothesis to 618 00:36:42,840 --> 00:36:46,759 Speaker 1: the cell theory because it was just so persuasive. And 619 00:36:46,800 --> 00:36:48,719 Speaker 1: that's what a theory is. It means that a lot 620 00:36:48,760 --> 00:36:53,239 Speaker 1: of people out there who are reasonable say this explanation 621 00:36:53,320 --> 00:36:55,759 Speaker 1: is probably the right one. Yeah, it's predictive. If you 622 00:36:55,800 --> 00:36:57,480 Speaker 1: do it over and over, you're probably going to get 623 00:36:57,480 --> 00:37:01,280 Speaker 1: the same result. But that's not to say that Pasteur 624 00:37:01,320 --> 00:37:03,799 Speaker 1: showed that if you do this a million and 625 00:37:03,920 --> 00:37:08,920 Speaker 1: one times that the S-shaped flask won't turn cloudy. 626 00:37:09,239 --> 00:37:11,840 Speaker 1: He didn't prove that. You can't prove that, which is, 627 00:37:11,920 --> 00:37:16,759 Speaker 1: again, science can disprove and lend support, can't prove. Very 628 00:37:16,800 --> 00:37:19,719 Speaker 1: good point. So right after this message break, we're going 629 00:37:19,760 --> 00:37:41,960 Speaker 1: to get into the actual steps of the scientific method. Alright, dude, 630 00:37:42,080 --> 00:37:46,000 Speaker 1: I guess at long last, we're there. Like you mentioned before, 631 00:37:46,560 --> 00:37:50,000 Speaker 1: the scientific method is fluid and it's not like when 632 00:37:50,040 --> 00:37:52,959 Speaker 1: you get your science degree they hand you a little 633 00:37:53,000 --> 00:37:57,799 Speaker 1: laminated card, like the Miranda rights that cops carry, that, 634 00:37:58,000 --> 00:37:59,920 Speaker 1: you know, lists out all the different steps you have 635 00:37:59,920 --> 00:38:04,719 Speaker 1: to take. Um. But generally, maybe, yeah, I would. We 636 00:38:04,760 --> 00:38:06,759 Speaker 1: should carry those around, all right. We should make little 637 00:38:06,800 --> 00:38:10,040 Speaker 1: wallet cards of the scientific method just to carry, put the Stuff 638 00:38:10,080 --> 00:38:11,719 Speaker 1: You Should Know logo on it. Oh yeah, I'll 639 00:38:11,760 --> 00:38:15,839 Speaker 1: make a million of them and sell them. Uh. 640 00:38:16,160 --> 00:38:19,560 Speaker 1: Generally speaking, though, it follows these steps. The first thing 641 00:38:19,560 --> 00:38:21,920 Speaker 1: you do, like we mentioned earlier, is you observe something. 642 00:38:22,440 --> 00:38:26,480 Speaker 1: You ask a question.
Uh next, like Darwin was known, 643 00:38:26,920 --> 00:38:29,080 Speaker 1: I think when we did our podcast on him too, 644 00:38:29,280 --> 00:38:31,960 Speaker 1: he would spend like a week on three square feet 645 00:38:32,480 --> 00:38:36,040 Speaker 1: of ground. It was like even longer than that. Remember, 646 00:38:36,120 --> 00:38:38,080 Speaker 1: remember, wasn't it, he said that he, he 647 00:38:38,080 --> 00:38:40,440 Speaker 1: wasn't gonna mow his lawn for like three years because he 648 00:38:40,480 --> 00:38:43,719 Speaker 1: wanted to see what, what happened? Yeah, so he's the 649 00:38:43,800 --> 00:38:47,719 Speaker 1: ultimate in qualitative data, of just observing, writing things down 650 00:38:47,840 --> 00:38:51,040 Speaker 1: and asking questions. And the reason you ask your question 651 00:38:51,920 --> 00:38:55,239 Speaker 1: is so you can narrow something down like that. I 652 00:38:55,239 --> 00:38:58,120 Speaker 1: think the example they use in here is on Galapagos, 653 00:38:58,160 --> 00:39:01,759 Speaker 1: like the beaks of what bird? Was it? Finches? Yeah, 654 00:39:01,800 --> 00:39:03,720 Speaker 1: the finch bird. He noticed a bunch of different beaks, 655 00:39:04,239 --> 00:39:07,120 Speaker 1: so he finally posed a question like, um, you know, 656 00:39:07,160 --> 00:39:10,040 Speaker 1: I think these beaks are different for a very specific reason, 657 00:39:10,480 --> 00:39:13,160 Speaker 1: and I aim to find out why. Yes, he said, 658 00:39:13,160 --> 00:39:18,920 Speaker 1: what caused the diversification of finches on Galapagos? I should 659 00:39:18,920 --> 00:39:21,640 Speaker 1: have done that with an accent. Well, yeah, he would 660 00:39:21,640 --> 00:39:24,160 Speaker 1: have had a British accent. Huh yeah, unless he was 661 00:39:24,280 --> 00:39:27,319 Speaker 1: pretending to be someone else. I think of him as like, um, 662 00:39:27,480 --> 00:39:31,240 Speaker 1: sounding like Hemingway or something. Oh yeah, just drunken, violent 663 00:39:31,640 --> 00:39:34,360 Speaker 1: kind of. But he wasn't. He was like the opposite 664 00:39:34,360 --> 00:39:37,120 Speaker 1: of that. Yeah. Well, I saw the movie, so 665 00:39:37,360 --> 00:39:41,760 Speaker 1: I picture his voice as the dude that played 666 00:39:41,840 --> 00:39:44,840 Speaker 1: him, who I can't remember right now. Ed Norton. No, 667 00:39:47,239 --> 00:39:50,359 Speaker 1: I finally saw Birdman though. Did you see that? Yeah? Yeah, 668 00:39:50,920 --> 00:39:54,319 Speaker 1: great movie? Um, I disagree, Oh you didn't like it? 669 00:39:54,840 --> 00:40:01,040 Speaker 1: What? Wow, that surprises me. Um, we'll get into that 670 00:40:01,239 --> 00:40:06,239 Speaker 1: off air. So, uh, sorry, you just threw me with that. 671 00:40:06,560 --> 00:40:09,680 Speaker 1: Make an observation. Yes, he's on Galapagos and he's like, 672 00:40:09,760 --> 00:40:12,400 Speaker 1: what the heck's with all these different finches on one small island? 673 00:40:12,640 --> 00:40:15,640 Speaker 1: Why would there be different species of finch? So he's asking, 674 00:40:15,719 --> 00:40:19,319 Speaker 1: and, and why are they all seeming to survive and 675 00:40:19,360 --> 00:40:23,560 Speaker 1: coexist so well? What's, what makes, yeah, then that leads 676 00:40:23,560 --> 00:40:26,799 Speaker 1: to the question, what's making all of these species of 677 00:40:26,840 --> 00:40:30,440 Speaker 1: finches so diverse? Right?
Or Bill Harris uses a pretty 678 00:40:30,440 --> 00:40:34,359 Speaker 1: good example that's something everyone can understand, like what car 679 00:40:34,400 --> 00:40:38,480 Speaker 1: body shape is the best for air resistance? Like one 680 00:40:38,520 --> 00:40:40,360 Speaker 1: that's shaped like a box, or one that's shaped, like, 681 00:40:40,400 --> 00:40:43,279 Speaker 1: aerodynamic like a bird. And he carries that out in 682 00:40:43,560 --> 00:40:48,759 Speaker 1: the next step. You formulate your hypothesis based on your, 683 00:40:49,239 --> 00:40:53,279 Speaker 1: you know, prior knowledge and maybe observations, like, so, you 684 00:40:53,280 --> 00:40:55,560 Speaker 1: know what, I think that a car shaped like a 685 00:40:55,600 --> 00:40:58,520 Speaker 1: bird is probably more aerodynamic than one shaped like a box. Yeah. 686 00:40:58,520 --> 00:41:00,160 Speaker 1: If you're thinking, if you're the type of person 687 00:41:00,160 --> 00:41:03,840 Speaker 1: who's sitting around asking questions about aerodynamics, you probably already 688 00:41:03,880 --> 00:41:06,680 Speaker 1: have some sort of sense that a box is less 689 00:41:06,719 --> 00:41:10,720 Speaker 1: aerodynamic than a bird. That's right. Boxes rarely fly unless 690 00:41:10,719 --> 00:41:13,600 Speaker 1: they're carried by one of those delightful Amazon delivery drones. 691 00:41:16,120 --> 00:41:17,759 Speaker 1: They don't have those yet, right, They're not gonna do that, 692 00:41:17,800 --> 00:41:23,200 Speaker 1: are they. There's like a pizza delivery drone service. Man. 693 00:41:23,360 --> 00:41:27,000 Speaker 1: I think where you have pizza or grilled cheese in 694 00:41:27,080 --> 00:41:29,399 Speaker 1: New York and you go stand on an X after 695 00:41:29,440 --> 00:41:31,640 Speaker 1: you order and it like comes and drops it. That 696 00:41:31,760 --> 00:41:34,439 Speaker 1: is the dumbest thing ever. And I can't wait 697 00:41:34,480 --> 00:41:37,160 Speaker 1: to do it. And I bet they're making a lot of money. 698 00:41:37,600 --> 00:41:40,239 Speaker 1: That's pretty funny. Um, Yet we can't get food to 699 00:41:40,239 --> 00:41:42,839 Speaker 1: the homeless somehow exactly. We can drop a grilled cheese 700 00:41:42,880 --> 00:41:45,359 Speaker 1: on someone's head. They're like, hey, homeless guy, get off 701 00:41:45,400 --> 00:41:50,400 Speaker 1: of that X, exactly. Um. Alright, So your hypothesis, I 702 00:41:50,440 --> 00:41:53,640 Speaker 1: don't think we ever mentioned, is typically represented as an 703 00:41:53,680 --> 00:41:57,120 Speaker 1: if-then statement. Yeah, if you're doing good science. Yeah, 704 00:41:57,160 --> 00:42:04,120 Speaker 1: like if the car's profile. Uh, well, the, the example 705 00:42:04,160 --> 00:42:06,440 Speaker 1: he uses is if the body's profile is related to the amount 706 00:42:06,440 --> 00:42:10,239 Speaker 1: of air resistance it produces, which is the more general statement. Yeah, 707 00:42:10,280 --> 00:42:12,520 Speaker 1: that's like based on a theory. Yeah, and it's gonna 708 00:42:12,520 --> 00:42:14,919 Speaker 1: get more specific: then a car designed like the body 709 00:42:14,920 --> 00:42:18,160 Speaker 1: of a bird will be more aerodynamic than one like 710 00:42:18,200 --> 00:42:21,320 Speaker 1: a box. So that's deductive reasoning, starting with the broad 711 00:42:21,480 --> 00:42:24,600 Speaker 1: statement and going to something narrow, and it's if-then 712 00:42:24,800 --> 00:42:28,280 Speaker 1: at the same time. Yeah, and now you have a test.
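To make that if-then structure concrete, here is a rough sketch in Python (not from the episode; the drag numbers are invented purely for illustration). The point is that once the hypothesis is phrased as a specific prediction, there is a definite observation that would falsify it.

```python
# Sketch: the if-then hypothesis written as a concrete, falsifiable prediction.
# General statement: a car body's profile is related to the air resistance it produces.
# Specific prediction: a bird-shaped body will produce less drag than a box-shaped body.

def hypothesis_holds(drag_bird: float, drag_box: float) -> bool:
    """Returns True if the measured drag is consistent with the prediction."""
    return drag_bird < drag_box

# Falsifiable: a single pair of measurements going the other way would show it to be false.
print(hypothesis_holds(drag_bird=0.30, drag_box=0.55))  # True: consistent with the hypothesis
print(hypothesis_holds(drag_bird=0.60, drag_box=0.55))  # False: this observation would falsify it
```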
713 00:42:28,320 --> 00:42:30,239 Speaker 1: You have a question that can be answered, you can 714 00:42:30,280 --> 00:42:32,200 Speaker 1: figure out a way to answer it. Yeah, and he 715 00:42:32,239 --> 00:42:36,640 Speaker 1: points out too, this is pretty important, that, uh, your hypothesis, 716 00:42:36,640 --> 00:42:40,560 Speaker 1: if it's formulated correctly, means it is testable and it's falsifiable, 717 00:42:40,960 --> 00:42:44,800 Speaker 1: which are often one and the same, you know. Yeah, 718 00:42:44,840 --> 00:42:47,600 Speaker 1: And that's, again, we go to the people who say 719 00:42:47,640 --> 00:42:53,040 Speaker 1: that the, the soft sciences aren't real science, they're pseudoscience, 720 00:42:53,080 --> 00:42:56,080 Speaker 1: because a lot of the data that they come up with, 721 00:42:56,120 --> 00:42:59,520 Speaker 1: a lot of the hypotheses they come up with, aren't falsifiable, 722 00:42:59,520 --> 00:43:03,480 Speaker 1: they're not testable. It's a, it's a thing. It's an issue, 723 00:43:03,880 --> 00:43:07,520 Speaker 1: it's a thing. So next up in the steps, you're 724 00:43:07,520 --> 00:43:11,640 Speaker 1: gonna experiment. And when you experiment, you can't just go 725 00:43:11,680 --> 00:43:14,080 Speaker 1: in there willy nilly and do whatever you want. Um, 726 00:43:14,120 --> 00:43:17,120 Speaker 1: you have to set up specific conditions and they must 727 00:43:17,120 --> 00:43:20,840 Speaker 1: be controlled, and you want, uh, everything that's supposed to 728 00:43:20,880 --> 00:43:24,120 Speaker 1: be identical needs to be identical. So basically you have 729 00:43:24,320 --> 00:43:28,040 Speaker 1: two variables. At least you have an independent variable and 730 00:43:28,080 --> 00:43:31,120 Speaker 1: you have a dependent variable, and if you're talking about 731 00:43:31,400 --> 00:43:35,799 Speaker 1: car shape, that is the independent variable in this study, 732 00:43:35,960 --> 00:43:39,920 Speaker 1: that's the one that's manipulated exactly, it's the one you're controlling. 733 00:43:39,920 --> 00:43:43,359 Speaker 1: The independent variable is the one you, the researcher, is controlling. 734 00:43:43,680 --> 00:43:46,800 Speaker 1: So in this case, you're controlling the shape of the car. 735 00:43:47,120 --> 00:43:50,120 Speaker 1: You have yourself a bird shaped car and you have 736 00:43:50,200 --> 00:43:52,560 Speaker 1: yourself a box shaped car. So the shape of the 737 00:43:52,560 --> 00:43:56,160 Speaker 1: car changed because you made it change. Now when you 738 00:43:56,200 --> 00:43:58,719 Speaker 1: blast a bunch of air over it during your experiment, 739 00:43:59,200 --> 00:44:02,560 Speaker 1: what you're measuring is the dependent variable. So you're 740 00:44:02,600 --> 00:44:06,959 Speaker 1: measuring what happens based on the change that you made. 741 00:44:07,200 --> 00:44:09,480 Speaker 1: That's right, and you want to, you, you want to 742 00:44:09,480 --> 00:44:12,520 Speaker 1: study one single variable at a time, basically. Yeah, don't 743 00:44:12,520 --> 00:44:16,400 Speaker 1: get fancy, just, just do good science, step by step, methodical. 744 00:44:16,920 --> 00:44:20,359 Speaker 1: You also have to have your control group in any experiment.
Uh, 745 00:44:20,360 --> 00:44:24,840 Speaker 1: and an experimental group, and the control group is what's 746 00:44:24,840 --> 00:44:28,080 Speaker 1: gonna allow you to compare the test results to that 747 00:44:28,160 --> 00:44:31,880 Speaker 1: baseline measurement. Yeah, and you need that baseline measurement, so 748 00:44:32,600 --> 00:44:36,160 Speaker 1: it's not just like chance, basically exactly. Like if Pasteur 749 00:44:36,200 --> 00:44:38,760 Speaker 1: had just done the S-shaped neck and nothing happened, 750 00:44:39,520 --> 00:44:42,799 Speaker 1: he wouldn't have necessarily been able to say that he 751 00:44:42,920 --> 00:44:46,320 Speaker 1: was right, even though he was right. He needed that control, 752 00:44:46,400 --> 00:44:49,440 Speaker 1: which was the open flask. Right. Or with the cars, 753 00:44:49,480 --> 00:44:52,360 Speaker 1: you need two cars, like you said, one bird shaped 754 00:44:52,360 --> 00:44:55,160 Speaker 1: and one box shaped, right, Or then maybe in this case, 755 00:44:55,200 --> 00:44:57,200 Speaker 1: since the bird shape and the box shape both show 756 00:44:57,280 --> 00:44:59,879 Speaker 1: up in the hypothesis, you need a third, like egg 757 00:45:00,040 --> 00:45:02,640 Speaker 1: shaped one or something like that. I bet that would 758 00:45:02,640 --> 00:45:07,359 Speaker 1: be pretty streamlined. Yeah, yeah, yeah. But the key, though, 759 00:45:07,600 --> 00:45:10,000 Speaker 1: is all of those variables have to be, um, all 760 00:45:10,040 --> 00:45:12,520 Speaker 1: the other variables have to be the same. Like, they 761 00:45:12,520 --> 00:45:14,600 Speaker 1: have to, they have to be the same weight, 762 00:45:14,719 --> 00:45:18,120 Speaker 1: they have to be painted the same, the tires, everything, 763 00:45:18,160 --> 00:45:21,120 Speaker 1: the windows. One can't have an antenna and the other not. 764 00:45:21,560 --> 00:45:23,640 Speaker 1: They've, they've got to be identical other than the 765 00:45:23,640 --> 00:45:26,960 Speaker 1: one variable, right, the independent variable, that's the 766 00:45:27,000 --> 00:45:29,480 Speaker 1: one you want different. Everything else you want the same, 767 00:45:29,600 --> 00:45:31,960 Speaker 1: or else it's possible that, oh, well, this one had 768 00:45:32,560 --> 00:45:36,120 Speaker 1: bigger tires, so that actually made it more aerodynamic. Yeah, 769 00:45:36,160 --> 00:45:38,239 Speaker 1: and you're just doing yourself a favor by doing all 770 00:45:38,239 --> 00:45:40,760 Speaker 1: that stuff. You know, you want to rule out everything 771 00:45:40,760 --> 00:45:44,839 Speaker 1: else but that one variable. After that, you want to 772 00:45:45,040 --> 00:45:50,120 Speaker 1: analyze your data so you can draw your conclusion, and 773 00:45:50,200 --> 00:45:54,360 Speaker 1: sometimes it's kind of straightforward and easy. Sometimes it takes a 774 00:45:54,400 --> 00:45:57,879 Speaker 1: lot of work and a lot of various tools to draw 775 00:45:57,840 --> 00:46:01,359 Speaker 1: all that out. Let's say you're just sticking a car 776 00:46:01,360 --> 00:46:05,040 Speaker 1: in a wind tunnel. You're measuring the wind resistance using 777 00:46:05,120 --> 00:46:08,280 Speaker 1: certain awesome instruments and that kind of stuff, and you're 778 00:46:08,320 --> 00:46:11,200 Speaker 1: taking that data, and then afterward you're going to analyze.
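Here is a minimal sketch of that experimental setup, assuming a made-up measure_drag() function standing in for a real wind-tunnel instrument (the baseline drag values are invented). The structure is what matters: the shape is the one independent variable you change, every other condition is pinned, and the measured drag is the dependent variable you record.

```python
import random

# Assumed baseline drag per body shape; a stand-in for the physics, not real data.
BASE_DRAG = {"bird": 0.30, "box": 0.60, "egg": 0.28}

def measure_drag(shape: str, wind_speed_kmh: float) -> float:
    """Dependent variable: the drag reading, with a little instrument noise."""
    return BASE_DRAG[shape] * (wind_speed_kmh / 100.0) + random.gauss(0, 0.01)

# Everything that is NOT the independent variable stays identical across trials.
controlled_conditions = {"wind_speed_kmh": 100.0, "tires": "standard", "antenna": False}

# Independent variable: the one thing the researcher deliberately changes.
shapes = ["bird", "box", "egg"]  # the egg shape plays the role of the extra comparison car

# Run several trials per shape so the analysis step has more than one reading to compare.
results = {
    shape: [measure_drag(shape, controlled_conditions["wind_speed_kmh"]) for _ in range(10)]
    for shape in shapes
}
print(results)
```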
779 00:46:11,239 --> 00:46:13,600 Speaker 1: You're going to compare the data that you gathered from 780 00:46:14,040 --> 00:46:17,399 Speaker 1: the bird-shaped car, the box-shaped car, and then 781 00:46:17,440 --> 00:46:20,120 Speaker 1: the control, the egg-shaped car. You're gonna compare them, 782 00:46:20,160 --> 00:46:23,120 Speaker 1: and you're gonna say, well, the wind resistance was less 783 00:46:23,640 --> 00:46:27,040 Speaker 1: for the bird-shaped car than the box-shaped car, 784 00:46:27,120 --> 00:46:31,880 Speaker 1: which means that my hypothesis was correct. And here are all 785 00:46:31,880 --> 00:46:34,399 Speaker 1: the data points. Whereas Louis Pasteur could just say, 786 00:46:34,800 --> 00:46:39,120 Speaker 1: look at the beakers. Exactly, don't be an idiot, I'm 787 00:46:39,160 --> 00:46:43,839 Speaker 1: a scientist. That one's got gross stuff. You can see it, right. 788 00:46:44,000 --> 00:46:47,719 Speaker 1: But the other thing about science, too, Chuck, ideally, is 789 00:46:47,840 --> 00:46:51,280 Speaker 1: let's say that egg-shaped one, the control 790 00:46:51,320 --> 00:46:54,839 Speaker 1: group, turned out to have better wind resistance than anything. Well, 791 00:46:54,920 --> 00:46:57,920 Speaker 1: just by virtue of carrying out this experiment correctly, you 792 00:46:57,920 --> 00:47:02,080 Speaker 1: would have stumbled upon an even better aerodynamic design, and 793 00:47:02,120 --> 00:47:04,760 Speaker 1: you would have come up with that little egg-shaped 794 00:47:04,800 --> 00:47:08,160 Speaker 1: Mercedes SUV that was so huge, like ten years ago, 795 00:47:08,480 --> 00:47:13,240 Speaker 1: the Mercedes egg, coming to a store near you. So, um, 796 00:47:13,280 --> 00:47:16,000 Speaker 1: that's a big, big part of the scientific method, is 797 00:47:16,880 --> 00:47:24,200 Speaker 1: carrying out a, an experiment, controlling the variables, analyzing 798 00:47:24,200 --> 00:47:27,280 Speaker 1: the data. And then there's a step that he missed 799 00:47:27,320 --> 00:47:31,280 Speaker 1: that is very rarely part of a scientific method list, 800 00:47:32,000 --> 00:47:35,399 Speaker 1: that is to share your data. And this is a 801 00:47:35,480 --> 00:47:38,600 Speaker 1: huge problem with science right now. Yeah, the article you sent, 802 00:47:38,600 --> 00:47:41,600 Speaker 1: it was really eye-opening. Uh, "Scientific research has changed 803 00:47:41,640 --> 00:47:43,799 Speaker 1: the world. Now it needs to change itself." It's an 804 00:47:43,800 --> 00:47:46,480 Speaker 1: Economist article. It's up on the internet. Yeah, it was 805 00:47:46,560 --> 00:47:50,799 Speaker 1: kind of scary. I mean, here's some of 806 00:47:50,840 --> 00:47:53,560 Speaker 1: the data he points out: one rule of thumb 807 00:47:53,640 --> 00:47:59,080 Speaker 1: among biotech venture capitalists is about half of published research 808 00:47:59,440 --> 00:48:02,839 Speaker 1: can't even be replicated. And the biotech firm 809 00:48:02,920 --> 00:48:05,879 Speaker 1: Amgen found that they could reproduce only six of 810 00:48:05,880 --> 00:48:09,880 Speaker 1: fifty-three landmark studies in cancer research. So you can't 811 00:48:09,880 --> 00:48:14,240 Speaker 1: repeat these things.
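And a rough sketch of that analysis step, with made-up numbers standing in for the drag readings collected in the earlier sketch: compare the dependent variable across the shapes and report whether the data support the hypothesis, not that they prove it. Publishing the raw numbers, not just the one-line verdict, is what lets another lab rerun the comparison and see whether the result replicates.

```python
from statistics import mean

# Made-up measurements standing in for the readings gathered in the wind-tunnel sketch.
results = {
    "bird": [0.31, 0.29, 0.30, 0.32],
    "box":  [0.61, 0.59, 0.60, 0.62],
    "egg":  [0.27, 0.28, 0.29, 0.28],
}

averages = {shape: mean(vals) for shape, vals in results.items()}

# Analysis: compare the dependent variable across the one thing that was varied.
if averages["bird"] < averages["box"]:
    print(averages, "-> the data SUPPORT the hypothesis (they do not prove it)")
else:
    print(averages, "-> the data FAIL TO SUPPORT the hypothesis")

# Sharing these raw numbers is what allows an independent replication attempt later.
```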
It's like everyone's fighting for dollars and fame, 812 00:48:14,880 --> 00:48:17,920 Speaker 1: and maybe not fame, but to some, career advancement, 813 00:48:18,680 --> 00:48:21,359 Speaker 1: such that they're kind of not doing that final step 814 00:48:21,400 --> 00:48:25,239 Speaker 1: any longer. No, and it's not necessarily just them, it's 815 00:48:25,360 --> 00:48:28,600 Speaker 1: the other scientists aren't going back and saying, well, let 816 00:48:28,600 --> 00:48:32,600 Speaker 1: me see if your results are reproducible. People are just 817 00:48:32,680 --> 00:48:35,960 Speaker 1: taking it on faith. We need another Roger Bacon to 818 00:48:36,000 --> 00:48:38,440 Speaker 1: come along and be like, dude, we can't just blindly 819 00:48:38,480 --> 00:48:41,839 Speaker 1: accept that one person carried out this one study and 820 00:48:41,880 --> 00:48:44,920 Speaker 1: then just go do clinical trials on it without anybody 821 00:48:44,960 --> 00:48:49,919 Speaker 1: reproducing it to see if the results can be verified independently. Yeah. 822 00:48:49,960 --> 00:48:52,400 Speaker 1: Because, uh, and this is a good time to mention bias. 823 00:48:52,480 --> 00:48:56,480 Speaker 1: There is such a thing as bias, and it still happens. Um. 824 00:48:56,640 --> 00:49:02,360 Speaker 1: A scientist is usually out to prove something or disprove something, 825 00:49:03,000 --> 00:49:06,839 Speaker 1: so they want a specific result. Like, even if you're 826 00:49:06,880 --> 00:49:10,919 Speaker 1: super open minded, you're probably hoping to disprove or prove 827 00:49:11,000 --> 00:49:14,880 Speaker 1: something one way or the other, and your confirmation bias 828 00:49:15,000 --> 00:49:17,600 Speaker 1: might, you know, even if you don't think you're doing it, 829 00:49:17,640 --> 00:49:20,960 Speaker 1: you might nudge out some results that don't support your 830 00:49:21,120 --> 00:49:27,360 Speaker 1: hypothesis, or else you won't make it into that awesome journal. Um, 831 00:49:27,440 --> 00:49:30,240 Speaker 1: which, this author points out that journals need to start, 832 00:49:31,080 --> 00:49:36,839 Speaker 1: uh, putting in what he calls uninteresting results and experiments, right, 833 00:49:36,960 --> 00:49:39,960 Speaker 1: or like the stuff that's not sexy, right, or studies 834 00:49:40,000 --> 00:49:43,920 Speaker 1: that failed to show that their hypothesis was correct. Yeah, 835 00:49:43,960 --> 00:49:46,960 Speaker 1: stuff that's disproved. Those things still need to, well, not 836 00:49:47,080 --> 00:49:49,719 Speaker 1: even disproved. Well, yeah, I guess it is disproved. But yes, 837 00:49:49,760 --> 00:49:52,520 Speaker 1: like, the guy set out to say, like, the red 838 00:49:53,160 --> 00:49:57,680 Speaker 1: balloon uses less helium than a silver balloon, and it 839 00:49:57,719 --> 00:49:59,759 Speaker 1: turns out that no, they use the same amount of 840 00:49:59,800 --> 00:50:03,239 Speaker 1: helium. Well, if that study gets published and put 841 00:50:03,239 --> 00:50:07,000 Speaker 1: out there into the scientific literature on helium and balloons, 842 00:50:07,480 --> 00:50:09,719 Speaker 1: then it's going to prevent some other scientists down the 843 00:50:09,800 --> 00:50:13,440 Speaker 1: road from wasting time, money, and helium, which, as you remember, 844 00:50:13,600 --> 00:50:18,279 Speaker 1: is an increasingly needed commodity.
Um, by carrying out the 845 00:50:18,320 --> 00:50:23,160 Speaker 1: same experiment, whether, whether the results are positive or negative 846 00:50:23,200 --> 00:50:25,920 Speaker 1: or what, the study is meant to be shared. And 847 00:50:25,960 --> 00:50:28,120 Speaker 1: that's the point of the scientific method, is to, to 848 00:50:28,280 --> 00:50:30,640 Speaker 1: reduce bias. And if you follow it all the way 849 00:50:30,680 --> 00:50:33,600 Speaker 1: through, ideally, and do all of the steps, including share 850 00:50:33,640 --> 00:50:39,120 Speaker 1: your research, whether it's happy or sad, then science benefits. 851 00:50:39,120 --> 00:50:41,880 Speaker 1: The world benefits, and by not doing that, the world 852 00:50:42,080 --> 00:50:45,400 Speaker 1: does not benefit. Yeah. He points out that these days 853 00:50:45,520 --> 00:50:50,880 Speaker 1: only fourteen percent of published papers are quote unquote negative results, 854 00:50:51,400 --> 00:50:55,200 Speaker 1: and it used to be like thirty percent or more, um. And 855 00:50:55,280 --> 00:50:56,799 Speaker 1: he says, because a lot of it has to 856 00:50:56,840 --> 00:51:01,600 Speaker 1: do with this sort of, you know, getting in these 857 00:51:01,640 --> 00:51:04,080 Speaker 1: journals and you're the rock star scientist and this study 858 00:51:04,160 --> 00:51:06,680 Speaker 1: is super sexy. Like if they kind of quit going 859 00:51:06,719 --> 00:51:08,879 Speaker 1: that route and made it what it should be, then 860 00:51:09,680 --> 00:51:13,680 Speaker 1: research dollars would be better spent and people could, you know, 861 00:51:14,080 --> 00:51:16,040 Speaker 1: he said, the peer-reviewed thing isn't even all it's 862 00:51:16,040 --> 00:51:18,560 Speaker 1: cracked up to be, you know. He mentioned a study 863 00:51:18,760 --> 00:51:21,480 Speaker 1: from a medical journal that gave a bunch of peer 864 00:51:21,480 --> 00:51:26,840 Speaker 1: reviewers some stuff with deliberate errors inserted into the research, 865 00:51:26,880 --> 00:51:29,560 Speaker 1: into the studies, and even when they were told that 866 00:51:29,600 --> 00:51:32,200 Speaker 1: they were being tested to find this, they still missed 867 00:51:32,200 --> 00:51:35,319 Speaker 1: a lot of it. Yeah. So yeah, science needs 868 00:51:35,360 --> 00:51:37,680 Speaker 1: to kind of re-evaluate the way it's carrying out science. 869 00:51:37,719 --> 00:51:42,400 Speaker 1: It's not science. The problem isn't science itself. The problem 870 00:51:42,440 --> 00:51:44,640 Speaker 1: isn't the scientific method. It's the way that it's being 871 00:51:44,719 --> 00:51:46,759 Speaker 1: used or not followed through. And a lot of it 872 00:51:46,800 --> 00:51:49,759 Speaker 1: has to do with academia and the people funding science. Yeah, 873 00:51:49,800 --> 00:51:51,960 Speaker 1: and he said, you know, these days there's up to seven 874 00:51:52,000 --> 00:51:55,400 Speaker 1: million researchers, and back in the day, even in like 875 00:51:55,480 --> 00:51:58,799 Speaker 1: the nineteen fifties, there were like a few thousand maybe, right. 876 00:51:58,960 --> 00:52:01,759 Speaker 1: So there's just a lot of career competition. He calls 877 00:52:01,760 --> 00:52:05,520 Speaker 1: it careerism. And so you fake a result or two, 878 00:52:05,640 --> 00:52:08,360 Speaker 1: or you just nudge out some results that don't support 879 00:52:08,360 --> 00:52:12,279 Speaker 1: your hypothesis.
You want the bigger paycheck or the fame 880 00:52:12,400 --> 00:52:16,800 Speaker 1: or notoriety, and all of a sudden, science is not science. Yeah, 881 00:52:16,960 --> 00:52:23,080 Speaker 1: you know, it's pseudoscience exactly. And speaking of pseudoscience, I 882 00:52:23,120 --> 00:52:25,520 Speaker 1: think we've reached the point where, um, we should talk 883 00:52:25,520 --> 00:52:28,640 Speaker 1: about the limitations of the scientific method, because it does 884 00:52:28,719 --> 00:52:32,160 Speaker 1: have its limits, right, like the way that the scientific 885 00:52:32,200 --> 00:52:34,879 Speaker 1: method is set up, especially if you go through, um, 886 00:52:34,960 --> 00:52:39,360 Speaker 1: if you include falsification, which most scientists now say is 887 00:52:39,400 --> 00:52:43,400 Speaker 1: a thing, like falsifiability of your hypothesis means that you 888 00:52:43,440 --> 00:52:47,239 Speaker 1: have a real scientific hypothesis there. If it can be 889 00:52:47,320 --> 00:52:51,080 Speaker 1: disproven by some observation or some measurement or whatever, then 890 00:52:51,120 --> 00:52:54,920 Speaker 1: it's falsifiable. And if it's not falsifiable, then it's not 891 00:52:55,000 --> 00:52:59,680 Speaker 1: really science. So the thing is, for something to be falsifiable, 892 00:52:59,680 --> 00:53:01,680 Speaker 1: and it was actually a philosopher that came up with 893 00:53:01,719 --> 00:53:04,520 Speaker 1: the concept of falsification, a guy named Karl Popper in 894 00:53:04,520 --> 00:53:07,799 Speaker 1: the nineteen thirties, and he was the one that said, like, 895 00:53:07,920 --> 00:53:11,319 Speaker 1: you're, you have to be able to falsify something for 896 00:53:11,400 --> 00:53:14,680 Speaker 1: it to be disproven or supported, and if not, then 897 00:53:14,680 --> 00:53:18,399 Speaker 1: it's pseudoscience. Well, part and parcel of that is that 898 00:53:18,560 --> 00:53:22,360 Speaker 1: what you're saying has to be able to be detected empirically. 899 00:53:22,640 --> 00:53:25,279 Speaker 1: There's some way that it has to, the presence of 900 00:53:25,280 --> 00:53:28,640 Speaker 1: it has to be measured or inferred. And so a 901 00:53:28,640 --> 00:53:32,560 Speaker 1: lot of people say, well, then with the scientific method 902 00:53:33,000 --> 00:53:36,279 Speaker 1: it reaches the limits of its current usefulness when 903 00:53:36,320 --> 00:53:40,120 Speaker 1: it tries to explain the supernatural. When somebody says, like, 904 00:53:40,480 --> 00:53:44,360 Speaker 1: ghosts are real, exactly, you can't prove that, well, you 905 00:53:44,400 --> 00:53:48,799 Speaker 1: also can't disprove it either, right. And so if you 906 00:53:48,880 --> 00:53:53,680 Speaker 1: are a scientist who says, uh, because the scientific method 907 00:53:53,760 --> 00:53:57,840 Speaker 1: can't prove or disprove the existence of ghosts or God, 908 00:53:58,680 --> 00:54:01,480 Speaker 1: there is no such thing as ghosts or God, you're 909 00:54:01,520 --> 00:54:04,880 Speaker 1: making a leap of faith just as much as the 910 00:54:04,880 --> 00:54:08,080 Speaker 1: person who says science can't prove or disprove the 911 00:54:08,120 --> 00:54:11,439 Speaker 1: existence of ghosts or God, therefore gods and ghosts are real. 912 00:54:12,560 --> 00:54:15,240 Speaker 1: They're both leaps of faith.
And really, the most 913 00:54:15,280 --> 00:54:19,080 Speaker 1: scientific approach to the existence of the supernatural, whether it 914 00:54:19,200 --> 00:54:22,560 Speaker 1: is ghosts or God, is that we simply don't know, 915 00:54:22,960 --> 00:54:26,520 Speaker 1: and that we cannot know scientifically. But that doesn't 916 00:54:26,560 --> 00:54:29,560 Speaker 1: mean that it does exist or doesn't exist. And 917 00:54:29,560 --> 00:54:32,720 Speaker 1: saying that science shows that it does or doesn't exist 918 00:54:32,840 --> 00:54:35,920 Speaker 1: is, by, by definition, the opposite of what science shows. 919 00:54:36,400 --> 00:54:41,480 Speaker 1: Science shows neither. It's not capable of showing 920 00:54:41,520 --> 00:54:44,799 Speaker 1: that something does or doesn't exist. That's a good point. Uh. The 921 00:54:44,840 --> 00:54:48,279 Speaker 1: other place where science can get corrupted is when it 922 00:54:48,480 --> 00:54:52,279 Speaker 1: blurs the lines, or when people blur the lines between, uh, 923 00:54:52,480 --> 00:54:57,800 Speaker 1: moral judgments and science, value judgments. Like you can study 924 00:54:57,840 --> 00:55:00,319 Speaker 1: global warming, you can study cause and effect, you can 925 00:55:00,320 --> 00:55:05,040 Speaker 1: report data, but when you make that second leap to say, 926 00:55:05,320 --> 00:55:07,319 Speaker 1: and this is as a scientist, I mean, someone can come 927 00:55:07,320 --> 00:55:09,840 Speaker 1: along and say global warming is bad, you shouldn't drive your SUV. 928 00:55:10,239 --> 00:55:13,799 Speaker 1: That's fine. But a scientist can't do a study and 929 00:55:13,840 --> 00:55:18,319 Speaker 1: say that, because that's, uh, that's a value judgment. And 930 00:55:18,400 --> 00:55:22,319 Speaker 1: that's where science can get corrupted, pretty much. Right. You 931 00:55:22,320 --> 00:55:25,040 Speaker 1: can, you can study global warming and its results until the 932 00:55:25,040 --> 00:55:28,719 Speaker 1: cows come home, but you can't assert that if you 933 00:55:28,840 --> 00:55:31,839 Speaker 1: use this lightbulb, you're a bad person, right, or, um, 934 00:55:31,920 --> 00:55:36,120 Speaker 1: ocean acidification is bad. It's not good for humans. But 935 00:55:36,160 --> 00:55:39,479 Speaker 1: if you're a jellyfish, it's awesome, you know. So yes, 936 00:55:39,760 --> 00:55:42,600 Speaker 1: and again you made a great point. It's not science, 937 00:55:42,880 --> 00:55:48,400 Speaker 1: it's people using science to make value judgments. Yeah. So, ultimately, 938 00:55:48,640 --> 00:55:52,040 Speaker 1: the scientific method, although it does have its limitations, in 939 00:55:52,160 --> 00:55:56,800 Speaker 1: that it needs empirical data, uh, to prove or disprove something, 940 00:55:57,680 --> 00:56:01,000 Speaker 1: it's not that it's flawed. That's not a flaw. 941 00:56:01,080 --> 00:56:05,520 Speaker 1: That's a limitation, and it's, it's when it's misused then 942 00:56:05,560 --> 00:56:09,560 Speaker 1: its results become flawed or skewed. And that's the people 943 00:56:09,600 --> 00:56:13,799 Speaker 1: doing it, man, not science. That's right. It's pretty interesting stuff. Yeah, man, 944 00:56:13,840 --> 00:56:15,719 Speaker 1: this is a good one. I thought so too. Man. 945 00:56:16,320 --> 00:56:18,759 Speaker 1: We let it start out with a bang, boom. It's all 946 00:56:18,840 --> 00:56:21,759 Speaker 1: downhill from here. Uh.
If you want to know more 947 00:56:21,760 --> 00:56:25,719 Speaker 1: about the scientific method, check out that article in The Economist, 948 00:56:25,800 --> 00:56:29,040 Speaker 1: check out Explorable, uh, and then of course check out 949 00:56:29,280 --> 00:56:31,560 Speaker 1: the scientific method in the search bar at how stuff 950 00:56:31,560 --> 00:56:34,120 Speaker 1: works dot com. And since I said that, it's time 951 00:56:34,160 --> 00:56:40,800 Speaker 1: for listener mail. That's right. But quickly, before listener mail, uh, 952 00:56:41,000 --> 00:56:44,399 Speaker 1: we get asked by listeners all the time, what can 953 00:56:44,440 --> 00:56:46,879 Speaker 1: we do? Since you have a free podcast, we can't 954 00:56:46,880 --> 00:56:49,560 Speaker 1: pay for it. What can we do to help you, guys? 955 00:56:49,600 --> 00:56:53,640 Speaker 1: And one thing you can do that we would appreciate is, uh, 956 00:56:53,840 --> 00:56:56,360 Speaker 1: go to iTunes and leave a rating and a review 957 00:56:56,480 --> 00:57:00,319 Speaker 1: for us. That makes a big, big difference and pins us 958 00:57:00,400 --> 00:57:03,000 Speaker 1: up there in the rankings, which means more people find 959 00:57:03,000 --> 00:57:06,240 Speaker 1: Stuff You Should Know. After they listen to Serial, they'll 960 00:57:06,280 --> 00:57:08,560 Speaker 1: just say, well, jeez, there's other podcasts in the world, 961 00:57:09,120 --> 00:57:13,399 Speaker 1: what is this podcast? So ratings and reviews really 962 00:57:13,480 --> 00:57:15,440 Speaker 1: help us out, and it doesn't cost you anything but 963 00:57:15,480 --> 00:57:17,960 Speaker 1: a few minutes. Um, be honest. We're not saying go 964 00:57:18,080 --> 00:57:20,520 Speaker 1: leave us some great review, but go leave us a 965 00:57:20,600 --> 00:57:25,000 Speaker 1: great review. You said it. Uh, and tell, tell one 966 00:57:25,000 --> 00:57:27,080 Speaker 1: person about Stuff You Should Know. We would appreciate that 967 00:57:27,120 --> 00:57:30,880 Speaker 1: too, turn somebody onto the show. And, um, that's it. 968 00:57:30,960 --> 00:57:34,280 Speaker 1: That's our version of a pledge drive, because, wow, 969 00:57:34,640 --> 00:57:36,960 Speaker 1: we do that, what, once every three years? No? Not, 970 00:57:36,960 --> 00:57:40,760 Speaker 1: not very obnoxious at least. All right. So onto 971 00:57:40,760 --> 00:57:43,040 Speaker 1: listener mail. This is from my sister-in-law, actually. 972 00:57:43,680 --> 00:57:47,480 Speaker 1: Oh yeah, this is some nepotism. Yeah, Jenny, Jenny Bryant, 973 00:57:47,560 --> 00:57:52,200 Speaker 1: who I mentioned in the homeschool episode homeschooled her kids for 974 00:57:52,240 --> 00:57:55,400 Speaker 1: a little while, and she sort of corrected me. Uh, 975 00:57:55,600 --> 00:57:58,080 Speaker 1: love the homeschooling episode, guys. One very big trend these 976 00:57:58,160 --> 00:58:02,760 Speaker 1: days in the homeschooling community is what Abbey, my 977 00:58:02,840 --> 00:58:06,760 Speaker 1: niece, does, which is hybrid homeschooling. So two to three 978 00:58:06,840 --> 00:58:09,680 Speaker 1: days a week she is at school and then the 979 00:58:09,720 --> 00:58:13,840 Speaker 1: rest of the time she's a plant. She's not a plant. Uh, 980 00:58:14,000 --> 00:58:15,720 Speaker 1: the rest of the time she's at home.
So she says 981 00:58:15,720 --> 00:58:18,880 Speaker 1: it's a great option, with curriculum provided and new topics 982 00:58:18,880 --> 00:58:22,040 Speaker 1: taught at school and then worked out at home. Many 983 00:58:22,080 --> 00:58:25,960 Speaker 1: of these schools are accredited, making getting into college, including 984 00:58:26,000 --> 00:58:29,960 Speaker 1: Ivy League schools, hassle-free. And Abbey's school has sports teams, 985 00:58:30,680 --> 00:58:34,600 Speaker 1: homecoming, Abbey is actually an excellent volleyball player, Beta Club, 986 00:58:34,640 --> 00:58:38,240 Speaker 1: newspaper staff, all the good stuff. The flexibility is great 987 00:58:38,240 --> 00:58:40,560 Speaker 1: for families, and we are huge fans of how the 988 00:58:40,640 --> 00:58:43,920 Speaker 1: hybrid approach prepares students for college by allowing them time 989 00:58:44,000 --> 00:58:46,600 Speaker 1: outside of class to manage their work and life schedules. 990 00:58:47,320 --> 00:58:53,080 Speaker 1: So that's from Jenny, actually via text. First listener mail 991 00:58:53,200 --> 00:58:55,640 Speaker 1: via text. How did you print that out? Did you 992 00:58:55,680 --> 00:58:59,280 Speaker 1: retype it and print it? Oh? Dude, are you serious? 993 00:58:59,800 --> 00:59:03,360 Speaker 1: You can print from texts? No, you just copy-pasted 994 00:59:03,440 --> 00:59:06,320 Speaker 1: it to an email? Oh yeah, yeah, I forgot about that. Man, then 995 00:59:06,480 --> 00:59:09,280 Speaker 1: how in the world did you, did you do that? 996 00:59:09,320 --> 00:59:13,160 Speaker 1: With your thoughts? I have a niece who is excellent 997 00:59:13,200 --> 00:59:16,680 Speaker 1: at volleyball too. Hold on, we should get them together. I 998 00:59:16,680 --> 00:59:19,040 Speaker 1: don't know. She's ten, eleven, okay, or something like that. 999 00:59:19,080 --> 00:59:21,760 Speaker 1: Abbey just turned thirteen, so there. Oh, maybe they face 1000 00:59:21,840 --> 00:59:24,880 Speaker 1: off against one another. Yeah, is she in Atlanta? Yes, 1001 00:59:24,920 --> 00:59:28,680 Speaker 1: she's up in Canton. You never know. Where's Abbey living? She's 1002 00:59:28,720 --> 00:59:31,240 Speaker 1: in Roswell. But I think with volleyball they kind of 1003 00:59:31,240 --> 00:59:34,040 Speaker 1: have played all over the state. It'd be bizarre if they play 1004 00:59:34,080 --> 00:59:35,880 Speaker 1: each other. Yeah, we'll just see each other at a 1005 00:59:35,880 --> 00:59:38,280 Speaker 1: match one day on opposite sides of the court with 1006 00:59:38,440 --> 00:59:43,480 Speaker 1: our arms folded. Yeah. Uh, what else? I got nothing else. Well, 1007 00:59:43,640 --> 00:59:46,880 Speaker 1: like Chuck said, go leave us a review. And if 1008 00:59:46,920 --> 00:59:48,440 Speaker 1: you want to get in touch with us, you can 1009 00:59:48,440 --> 00:59:51,120 Speaker 1: tweet to us at s y s K podcast. You 1010 00:59:51,160 --> 00:59:53,840 Speaker 1: can join us on Facebook dot com slash Stuff you 1011 00:59:53,840 --> 00:59:58,840 Speaker 1: Should Know. You can email us, and we still do that. Yeah, 1012 00:59:58,880 --> 01:00:02,240 Speaker 1: you can send it to, uh, stuff podcast at how 1013 01:00:02,280 --> 01:00:04,600 Speaker 1: Stuff Works dot com, and as always, join us at 1014 01:00:04,600 --> 01:00:09,720 Speaker 1: our home on the web, Stuff you Should Know dot Com.
1015 01:00:09,800 --> 01:00:11,560 Speaker 1: Stuff you Should Know is a production of iHeart 1016 01:00:11,600 --> 01:00:14,520 Speaker 1: Radio's How Stuff Works. For more podcasts from iHeart Radio, 1017 01:00:14,600 --> 01:00:17,280 Speaker 1: visit the iHeart Radio app, Apple podcasts, or wherever you 1018 01:00:17,320 --> 01:00:18,480 Speaker 1: listen to your favorite shows.