Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com.

Speaker 1: Hey there, and welcome to TechStuff's one thousandth episode. I am your host, who has been on every single one of the one thousand. Hi, I'm Jonathan Strickland. I'm an executive producer at HowStuffWorks. I've done a thousand of these. One thousand of these? How crazy is that? And I debated for a while on what I should cover for the one thousandth episode of TechStuff, because I've already covered a lot of my favorite tech companies and a lot of my favorite tech topics. Whether it's a specific technology or a person, a lot of my favorites are ones that I've already focused on in the past. Plus, choosing one company or technology or person over all the others seems like a pretty difficult decision. How could I say that one company or one technology deserved it more than all the others? And besides, assigning significance to the number one thousand is sort of an arbitrary concept anyway, when you think about it.
Speaker 1: So ultimately, I decided to skip all that and cover something I think is important that relates to everything in our lives, not just tech, but tech included. And that would be the concepts of skepticism and critical thinking. These are incredibly important tools we should familiarize ourselves with and use properly and frequently. First, let's define the terms, and I'm going to start with skepticism, because I want to be clear about what I mean when I use the term, as opposed to other usages, including common usages of skepticism. Now, philosophically speaking, skepticism refers to the belief that true knowledge is difficult and likely impossible for humans to achieve, and I'm pretty much on board with that. I think it's impossible for us to know everything. Thousands of years ago, we had to rely almost solely on our own senses to figure out how the world works. But our senses are limited. Even the most keen-eyed human can't see as well at a distance as a falcon, for example, and we aren't able to see light in the ultraviolet or infrared ranges.
Speaker 1: We're incapable of directly perceiving electromagnetic radiation outside of the visible spectrum. The same is true for our other senses: there is a limit to what we can perceive. Sounds below twenty hertz or above twenty kilohertz are essentially beyond the range of human hearing. Our sense of smell is nowhere near as sensitive as a dog's. So we are limited by the extent to which our senses can take in information around us. There's a lot of information out there that we are incapable of sensing. Now, as time has passed, we have created technologies that augment our senses and increase our ability to detect things we otherwise would never notice. But even so, we still don't know how much we don't know. Cosmological theories state that dark matter has to exist based on our observations of how galaxies move in our universe, and that this matter must make up the majority of all matter in our reality, but we can't detect it directly.
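Those hearing limits are simple enough to put in code. A minimal sketch: the 20 Hz and 20 kHz cutoffs come straight from the passage, while the helper name `is_audible` and the sample frequencies are invented for illustration:

```python
# Nominal limits of human hearing discussed above, in hertz.
HEARING_LOW_HZ = 20
HEARING_HIGH_HZ = 20_000

def is_audible(freq_hz: float) -> bool:
    """True if a pure tone at freq_hz falls within the nominal human range."""
    return HEARING_LOW_HZ <= freq_hz <= HEARING_HIGH_HZ

print(is_audible(440))     # concert A, well inside the range: True
print(is_audible(15))      # infrasound, below 20 Hz: False
print(is_audible(40_000))  # ultrasound, above 20 kHz: False
```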
Speaker 1: That leads me to believe that we will never possess true knowledge of the universe, because I think there's always going to be something more that's outside our ability to perceive and understand. Now, that should not stop us from trying. We should just be aware that there's always more to discover, and if anything, I think that should make us more determined to learn. Frequently we will use the word skeptic to refer to someone who doubts or denies the existence of something. But when I say it's good to be a skeptic, I do not mean you should doubt or deny everything. Rather, I use the word to refer to someone who applies critical thinking to subjects. They ask questions, they look for evidence to support claims. They do not accept a statement without supporting proof, and they look to make sure that any proof that is offered is of sound quality. This is why I won't call anyone a climate change skeptic. In that case, I would use the term denier rather than skeptic, because there is a mountain of research and there's a scientific consensus that climate change is a real thing.
Speaker 1: It is one thing to question a topic and demand evidence, which I think you should totally do. However, it's another thing to dismiss the conclusion despite the evidence. If you don't accept evidence despite a scientific consensus, that's pretty much denial. A skeptic who is confronted by sound evidence should be able to accept that information. So, for example, let's move away from climate change. Some people argue that that shouldn't count. I think they're wrong, but let's take something totally different. Let's talk about ghosts. I do not believe in ghosts at all. If you came up to me and you told me a story about how you once saw a ghost, I might humor you, but I wouldn't believe you actually saw a ghost. I might believe that you believe it, but I would be pretty sure that what you experienced wasn't anything supernatural. I would suspect that something more mundane had happened, and that over time you had formulated a narrative that was reinforced with retellings, perhaps even to the extent that the version you tell no longer even remotely resembles the thing that actually happened.
Speaker 1: Because we all do that when we remember stuff. It's not like our brains are acting like a computer. Right, with a computer, you tell the computer to retrieve a file, it's going to pull up the file, and that file is going to be exactly the same as it was the last time you saved it. It will be identical to the last time you saved it. It's preserved perfectly, assuming that there's not something wrong with your computer. But that's not how our brains work. When we remember something, our brains actually form neural pathways. They create connections between neurons in our brains, and that pathway resembles the one that formed when we first experienced the situation that created the memory in the first place. So we experience something, our brain creates a connection that is related to that something. When we remember, our brain tries to recreate that connection. But our brains aren't perfect at recreating those pathways. Detours happen. Our memories are faulty.
Speaker 1: To us, it seems like the things we're remembering are perfect recreations, but in reality, we are recreating a scenario in our heads, and our brains are not perfect, so we get things wrong, and sometimes we reinforce the wrong parts. We make those detours more permanent parts of the pathway. So I'm not necessarily going to think you're trying to hoodwink me. I'm not thinking you're lying if you tell me you saw a ghost. But still, I'm not going to believe your account that you saw a ghost. I'm going to think something else happened that made you think you saw a ghost, but in reality it was something more natural and mundane. However, if someone were to provide to me real evidence of the existence of ghosts, not an anecdote, but real, incontrovertible evidence of ghosts, I would have to reevaluate my beliefs. I would need to reconcile my disbelief in ghosts and then incorporate a new belief into my worldview. That's my responsibility as a skeptic.
Speaker 1: If you came up to me and you said, here's the proof, and you laid it out, and I looked at it and it was undeniable, I would have to say, you know what, I was wrong, you are right, there are ghosts. This probably would not be easy for me to do, but I would like to think I would do it, because I would be confronted by evidence. However, the claim that ghosts exist is an extraordinary claim, so it requires extraordinary proof. So if you are confronted with evidence and the evidence seems reliable, then you should take whatever the claim is and think, this may very well have merit. Climate change deniers do not do this. They deny the mountain of evidence and a scientific consensus. They do not take that and incorporate it. They instead say, this does not align with my worldview, and so I reject it. That is not skepticism, that is denialism.
Speaker 1: Now, one of my favorite episodes of TechStuff was all about ghost hunting technology, and that's because Chris Pollette and I went through the equipment that is frequently used by people in the ghost hunting profession, and we explained why that technology did not actually prove the existence of ghosts. For one thing, ghost hunters tend to use equipment that was intended to do other work. One example would be a meter to detect electromagnetic fluctuations, which you would use in order to look for faulty wiring in a house, for example. If wiring didn't have proper insulation, then whenever a current runs through that wire, it would generate an electromagnetic field, and you would be able to pick it up, and that would tell you, hey, maybe we need to fix the wiring here. That's what those meters are for. However, ghost hunters would say, this meter shows fluctuations, that means there's a ghost here. But that puts the cart about a mile in front of the horse, because at no point did the ghost hunter actually provide evidence that ghosts exist.
Speaker 1: They're saying, here is an effect; the cause is this other thing that I have yet to prove exists. That's not how science works. You have to prove that a ghost exists first. Then you have to prove that a ghost can affect or generate electromagnetic fields. Then and only then can you say, I just found a fluctuating electromagnetic field, and one possible explanation is ghosts. You can't just say, because this needle moved, ghosts exist. That's not how science works. You have to establish the existence first, then establish that they can in fact produce the effect you claim. Then you can use actual instances of that effect to potentially detect the presence of such a thing. So that's the way that would have to work. Ghost hunting does not do that. It ignores that. It skips it. It presupposes the existence of the thing they're looking for without first establishing that it actually is a thing. Now, I want to stress that I don't mean to say ghosts definitely do not exist. What I'm saying is I do not believe they exist.
Speaker 1: But as I said at the beginning, I think it is impossible for humans to know everything, so I could be wrong about this. However, I have seen no evidence that cannot be attributed to much more mundane causes than ghosts, and that would be necessary for me to change my mind. I would need that extraordinary evidence. I do not deny the existence of ghosts outright, I just don't believe in them, because skepticism and denial are two different things. Now, some skeptics do deny certain things. That might be climate change; there are skeptics who don't believe in it. That might be the supernatural. And some skeptics believe in certain things that are not supported by evidence as a matter of faith. They might be devout in a religion, and other skeptics would say, oh, you're not a skeptic, because you actually believe in this religious faith and there's no evidence to support your beliefs. I don't want to say there's only one way to be a skeptic.
Speaker 1: I do think, however, the application of critical thinking and the scientific method is really important for everyone, whether you consider yourself a skeptic or otherwise. Now, when we come back, I'm going to go into the scientific method, how it works, and why I think it's so important. And I'm also going to probably talk about some other stuff, like anti-intellectualism and why it's so demoralizing. Maybe I'll even get around to explaining why I think the term boffins is incredibly insulting. So tune in, Brits, because I've got a bone to pick with you guys. But first, let's take a quick break and thank our sponsor.

Speaker 1: The scientific method is a process by which we test ideas to ascertain their validity. So, in other words, it's how we make sure the stuff we believe in has some basis in actual reality. The history of the scientific method is complex. I can't really go into all of it, because you could do a full podcast series just about the evolution of the scientific method. It is fascinating, but it is also exhaustive.
Speaker 1: Cultures from all around the world developed approaches to testing ideas, so this wasn't something that arose specifically out of one place in the world. Lots of different cultures came up with different ways to take this approach. However, generally speaking, we trace the scientific method as we understand it today back to the Greek philosopher Aristotle. He gets the credit for laying out the basics of the scientific method. He proposed combining empirical measurements and observations with inductive reasoning. Now, a lot of Greek philosophers before Aristotle essentially said all you need is to be able to think. And if you think, and you think well enough, if you're really good at the thinking, then you can think things that you never thought before and figure out how the world works. All it takes is pure thought. Aristotle said, hang on, what if we see stuff and we measure stuff, and we take those empirical measurements and we incorporate that into what we think, so that we can actually see how the world really works, and base our knowledge off of that?
Speaker 1: This was pretty rebellious among Greek philosophers at the time. Inductive reasoning, by the way, is where you take specific points of data and you make generalizations from those specific points. Contrast that with deductive reasoning; that's where you start with a general idea and then you work your way towards specifics. Now, typically scientists use both inductive and deductive reasoning in order to form hypotheses and then test them to make sure they are true. The Islamic scholar Ibn al-Haytham further developed the scientific method. His basic steps are pretty much the foundation of the scientific method today. He said, well, first you have to state an explicit problem. This would essentially be a claim, usually based off observations and experimentation. Then you test a hypothesis through various experiments to see if it holds validity. You interpret the data from those experiments, and then you conclude whether the hypothesis held up under experimentation or whether the experimentation didn't support the hypothesis, and then you publish what you found so that other people can read it.
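Those steps, state an explicit claim, experiment, interpret the data, conclude, map neatly onto a tiny simulation. This is only an illustrative sketch: the coin-flip setup, the trial count, and the tolerance are all invented for the example, not part of the episode:

```python
import random

def run_experiment(flip, trials=10_000):
    # 1. State an explicit claim: this coin is fair, so heads
    #    should come up about half the time.
    expected = 0.5
    # 2. Experiment: gather observations.
    heads = sum(flip() for _ in range(trials))
    # 3. Interpret the data: compare the observed rate to the claim.
    observed = heads / trials
    # 4. Conclude: did the experiment support the hypothesis?
    supported = abs(observed - expected) < 0.02
    return observed, supported

random.seed(42)
fair_coin = lambda: random.random() < 0.5    # behaves as claimed
loaded_coin = lambda: random.random() < 0.7  # secretly biased

print(run_experiment(fair_coin))    # hypothesis holds up
print(run_experiment(loaded_coin))  # hypothesis fails the test
```

The final step, publishing so other people can check your work, is the one part a program can't do for you.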
Speaker 1: The scientific method developed further throughout the Renaissance, and it developed in the Enlightenment and the modern age of science, and basically the principles have remained fairly stable, although the particulars have obviously changed over time. Now, you can't use the scientific method for absolutely everything, but it does provide a great model to follow as a skeptic. Basically, you take a testable claim, and it is important that the claim is testable. It's a claim that can be tested for accuracy. If a claim is not testable, then it does not belong to the realm of science. We would also call this falsifiable, by the way. If a claim is falsifiable, then it is scientific. That means if the claim is such that you could design a test where the claim could be proven untrue, then you would say it's falsifiable and thus scientific. If it's a claim where you cannot come up with a test that would have an outcome where the claim could be untrue, it is not part of science.
Speaker 1: So, for example, if I claim there is a pixie that follows you around, and it's always just over your right shoulder, however it is completely undetectable by any means that we know, that claim is not testable. There's nothing you can do to see if there's a pixie there, because I just told you it's completely undetectable. It's there, but you can't detect it. Well, there's no experiment you can conduct to test that claim. It does not fall in the realm of science. But if I said there's a pixie that floats over your right shoulder that can only be detected by some particularly expensive piece of scanning equipment, that claim is testable, because you could go out and secure the appropriate scanning equipment and scan yourself and look for the evidence. And if there's no evidence of a pixie there, it would show that my claim was not substantiated. There was no pixie, and I had told you, hey, if you get this expensive scanner, then you'll find a pixie.
Speaker 1: If I'm the one selling you the expensive scanner, it just means I scammed you, and I'd probably change my claim slightly, because that's how things often go. It's called moving the goalposts: I make a claim, someone tests the claim, the claim does not hold up to the test, so I change the claim slightly in order to make it seem like it's still valid. That's moving the goalposts. That's not good cricket, y'all. Anyway, let's say I make a testable claim, such as: this extremely expensive sound system can play back a recording at such high fidelity that it leaves all other sound systems behind. Now, you could perhaps measure the performance of this sound system with very precise, sensitive measurement devices, and it might even show that, according to those devices, my system is outperforming other systems. But it may not actually matter. And that's because, again, our senses are limited, right? Human senses have limitations. So there comes a point where replication will be meaningless, because we humans are physically incapable of detecting anything at that level of fidelity or resolution.
303 00:19:33,119 --> 00:19:36,400 Speaker 1: So with a claim, like that the way we perceive 304 00:19:36,520 --> 00:19:40,040 Speaker 1: sound might matter more than the empirical measurements we gather 305 00:19:40,119 --> 00:19:43,640 Speaker 1: from the device. And not only that, but the psychology 306 00:19:43,840 --> 00:19:46,720 Speaker 1: of listening to a sound system might matter more. If 307 00:19:46,760 --> 00:19:50,000 Speaker 1: I believe I'm going to get a better experience from 308 00:19:50,040 --> 00:19:53,280 Speaker 1: a particular sound system, my perception might be that I 309 00:19:53,320 --> 00:19:56,720 Speaker 1: did get a better experience, even if objectively, if I 310 00:19:56,760 --> 00:20:00,840 Speaker 1: had no knowledge of the quality or back thereof of 311 00:20:00,920 --> 00:20:03,719 Speaker 1: the sound system, I might not ever have noticed it. 312 00:20:04,800 --> 00:20:07,679 Speaker 1: Brains are funny things. So how can you test a 313 00:20:07,800 --> 00:20:11,160 Speaker 1: claim that says this sound system is better than all 314 00:20:11,240 --> 00:20:14,239 Speaker 1: others so that it makes sense to us that we 315 00:20:14,280 --> 00:20:17,439 Speaker 1: can actually see if there's something there. Well, this is 316 00:20:17,480 --> 00:20:22,040 Speaker 1: where formulating a proper experiment is incredibly important. So a 317 00:20:22,080 --> 00:20:26,840 Speaker 1: blind experiment helps reduce bias in a process. In a 318 00:20:26,880 --> 00:20:30,480 Speaker 1: blind experiment, the subject does not know if he or 319 00:20:30,520 --> 00:20:34,480 Speaker 1: she is in a test group or a control group. Uh. 320 00:20:34,520 --> 00:20:37,399 Speaker 1: And with a sound system issue, you might be in 321 00:20:37,400 --> 00:20:40,240 Speaker 1: a control group sometimes in a test group other times, 322 00:20:40,280 --> 00:20:44,080 Speaker 1: and you're recording your responses to a questionnaire. 
Let's say 323 00:20:44,080 --> 00:20:46,800 Speaker 1: it's designed to see if you detect a difference in 324 00:20:46,880 --> 00:20:49,960 Speaker 1: sound quality. So if I 325 00:20:50,000 --> 00:20:54,119 Speaker 1: were to design an experiment to test the claims someone 326 00:20:54,160 --> 00:20:57,080 Speaker 1: makes about a particular sound system, I might make it 327 00:20:57,160 --> 00:21:00,680 Speaker 1: like this. You are invited to participate in a 328 00:21:00,680 --> 00:21:04,080 Speaker 1: test to see if a particular sound system is perceptibly 329 00:21:04,200 --> 00:21:08,120 Speaker 1: better than other sound systems. So you're led to a room. 330 00:21:08,200 --> 00:21:11,080 Speaker 1: You cannot see the system, or the speakers, or anything 331 00:21:11,080 --> 00:21:13,880 Speaker 1: in this room. The room is designed in such 332 00:21:13,920 --> 00:21:17,440 Speaker 1: a way that the entire sound system is hidden from your view. 333 00:21:18,160 --> 00:21:21,119 Speaker 1: You're told to listen to a recording. You do so: 334 00:21:21,240 --> 00:21:23,400 Speaker 1: you listen to the recording. Maybe you make some notes. 335 00:21:24,119 --> 00:21:27,200 Speaker 1: Then you're led to a second room and you repeat 336 00:21:27,240 --> 00:21:30,359 Speaker 1: the procedure. You can't see the sound system in 337 00:21:30,400 --> 00:21:33,200 Speaker 1: this room. You don't know if you're listening to an 338 00:21:33,200 --> 00:21:36,399 Speaker 1: identical sound system or a totally different sound system. You 339 00:21:36,480 --> 00:21:39,240 Speaker 1: make notes. Maybe you go to several rooms and you're 340 00:21:39,240 --> 00:21:42,040 Speaker 1: asked to rate the audio quality experience in each of 341 00:21:42,040 --> 00:21:44,359 Speaker 1: the rooms you visit.
Some of the rooms might have 342 00:21:44,440 --> 00:21:48,879 Speaker 1: duplicate systems calibrated precisely the same way for their respective 343 00:21:48,920 --> 00:21:53,639 Speaker 1: spaces to give you the same sort of replicated sound, 344 00:21:54,280 --> 00:21:56,080 Speaker 1: but you don't know that. You don't know, going 345 00:21:56,080 --> 00:21:58,800 Speaker 1: from room to room, which one it is. And there 346 00:21:58,840 --> 00:22:02,000 Speaker 1: might be as many variables eliminated as possible. So you're 347 00:22:02,040 --> 00:22:04,719 Speaker 1: trying to, you know, keep the rooms the exact same size, 348 00:22:04,760 --> 00:22:08,199 Speaker 1: with the exact same soundproofing. Everything needs to 349 00:22:08,200 --> 00:22:11,400 Speaker 1: be as close to identical as possible to eliminate variables. 350 00:22:12,400 --> 00:22:15,640 Speaker 1: And it might be that rooms two and three out 351 00:22:15,680 --> 00:22:20,560 Speaker 1: of seven have the supposedly remarkable system in them, but 352 00:22:20,640 --> 00:22:25,000 Speaker 1: the other five have regular sound systems in them. Now, you, 353 00:22:25,160 --> 00:22:27,200 Speaker 1: as the subject, don't know that. You're just going through 354 00:22:27,240 --> 00:22:31,200 Speaker 1: the experiment. The experimenters will repeat this process with as 355 00:22:31,240 --> 00:22:34,680 Speaker 1: many test subjects as they possibly can. The more subjects 356 00:22:34,680 --> 00:22:37,760 Speaker 1: you have, the better: it increases the sample size. It 357 00:22:37,800 --> 00:22:42,119 Speaker 1: helps account for outliers. Then the experimenters collate all the data, 358 00:22:42,520 --> 00:22:46,520 Speaker 1: they analyze it, and they draw conclusions. And perhaps the 359 00:22:46,560 --> 00:22:50,040 Speaker 1: superlative system really is superior, and people are regularly 360 00:22:50,160 --> 00:22:53,439 Speaker 1: and reliably picking it out.
They're saying rooms two and 361 00:22:53,520 --> 00:22:56,479 Speaker 1: three sounded really good, like better than all the others. 362 00:22:57,000 --> 00:22:59,040 Speaker 1: If everyone is saying that, or if enough people are 363 00:22:59,040 --> 00:23:02,600 Speaker 1: saying that reliably, then you might say, hey, the claims 364 00:23:02,680 --> 00:23:06,080 Speaker 1: that were made seem to hold true; people are able 365 00:23:06,119 --> 00:23:10,360 Speaker 1: to identify it. Or you might find that there's no 366 00:23:10,400 --> 00:23:12,920 Speaker 1: conclusive support for the claim, not enough people were able 367 00:23:12,960 --> 00:23:15,760 Speaker 1: to tell a difference, or maybe some other people said, 368 00:23:15,960 --> 00:23:18,840 Speaker 1: you know, rooms two and six sounded better than all 369 00:23:18,840 --> 00:23:21,680 Speaker 1: the other rooms, and you're thinking to yourself, huh, only 370 00:23:21,720 --> 00:23:25,040 Speaker 1: two and three had the sound system that was really 371 00:23:25,040 --> 00:23:27,720 Speaker 1: good in them. The other rooms had regular sound systems 372 00:23:27,720 --> 00:23:30,280 Speaker 1: in them. So if you say two and six, that's 373 00:23:30,280 --> 00:23:33,800 Speaker 1: not conclusive. Or if you say five and six, now 374 00:23:33,840 --> 00:23:35,720 Speaker 1: that's a real problem because you've picked out two of 375 00:23:35,720 --> 00:23:38,359 Speaker 1: the sound systems that were not supposed to be as good. 376 00:23:38,800 --> 00:23:41,800 Speaker 1: So that's one way of testing it. Now, I would 377 00:23:41,800 --> 00:23:44,760 Speaker 1: actually suggest you go a step further in your experiment 378 00:23:44,800 --> 00:23:48,520 Speaker 1: design and you create what is called a double-blind experiment.
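A quick way to see why subjects reliably naming rooms two and three is meaningful is to work out the chance level. Here is a minimal sketch in Python, assuming the seven-room, two-premium-room setup described above (the numbers are just the episode's example):

```python
from math import comb

# 7 rooms, 2 of which (rooms two and three in the example) hide the
# premium system. If a listener who hears no real difference simply
# guesses two rooms, how often do they name exactly the premium pair?
rooms, premium = 7, 2
pairs = comb(rooms, premium)  # C(7, 2) = 21 equally likely pairs
chance = 1 / pairs
print(f"{pairs} possible pairs; pure guessing picks the right pair {chance:.1%} of the time")
```

If many subjects name the premium rooms at a rate far above that roughly five percent chance level, the claim gains support; scattered answers like "two and six" or "five and six" look like guessing.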
379 00:23:49,000 --> 00:23:52,359 Speaker 1: In a double-blind experiment, the person in charge of 380 00:23:52,840 --> 00:23:57,040 Speaker 1: conducting the test is unaware of which subjects are in 381 00:23:57,040 --> 00:24:00,199 Speaker 1: a control group or a test group as well. So, 382 00:24:00,520 --> 00:24:03,480 Speaker 1: in other words, the person who leads you, the subject, 383 00:24:03,560 --> 00:24:06,280 Speaker 1: from room to room also does not know which rooms 384 00:24:06,320 --> 00:24:09,160 Speaker 1: have the really good system in them and which rooms 385 00:24:09,200 --> 00:24:12,880 Speaker 1: have normal systems in them. They have no idea. This way, 386 00:24:13,040 --> 00:24:15,639 Speaker 1: the person who's leading you from room to room can't 387 00:24:16,560 --> 00:24:23,160 Speaker 1: unconsciously introduce bias into the experiment by perhaps giving away 388 00:24:23,240 --> 00:24:25,919 Speaker 1: that rooms two and three have the really good system 389 00:24:25,960 --> 00:24:29,600 Speaker 1: in them, because they don't know. Double-blind experiments, in 390 00:24:29,600 --> 00:24:32,720 Speaker 1: my opinion, tend to be the way to go for 391 00:24:32,800 --> 00:24:36,280 Speaker 1: testable claims, especially claims that are on the extraordinary side. 392 00:24:36,320 --> 00:24:39,360 Speaker 1: You want to design a test in such a way 393 00:24:39,440 --> 00:24:42,679 Speaker 1: that you don't influence the subjects to give you the 394 00:24:42,720 --> 00:24:47,320 Speaker 1: result you want to get. Whether you want a test 395 00:24:47,359 --> 00:24:50,280 Speaker 1: to come out positive or negative, you want to avoid 396 00:24:50,400 --> 00:24:54,600 Speaker 1: introducing bias into the actual experiment.
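One way to picture how that blinding gets enforced is as a hypothetical Python sketch (the function name and the seven-room setup are illustrative, not from the episode): a coordinator secretly randomizes which rooms get the premium system, while the guide leading subjects around only ever sees bare room numbers.

```python
import random

def assign_rooms(n_rooms, n_premium, seed=None):
    """Coordinator's secret step: randomly choose which rooms get the
    premium system. Only the coordinator keeps this mapping; the guide
    who walks subjects from room to room gets bare room numbers, so
    they cannot unconsciously hint at which rooms are special."""
    rng = random.Random(seed)
    premium_rooms = set(rng.sample(range(1, n_rooms + 1), n_premium))
    secret = {room: room in premium_rooms for room in range(1, n_rooms + 1)}
    guide_view = sorted(secret)  # just [1, 2, ..., n] -- nothing to give away
    return secret, guide_view

secret, guide_view = assign_rooms(7, 2, seed=1)
print(guide_view)            # the guide's schedule carries no labels
print(sum(secret.values()))  # the coordinator knows exactly 2 rooms are premium
```

Only after all the ratings are in does the coordinator reveal the secret mapping for analysis.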
These tests where 397 00:24:54,640 --> 00:24:58,120 Speaker 1: you're testing something that's subjective, like the quality of sound, 398 00:24:58,920 --> 00:25:02,440 Speaker 1: those are particularly tricky because it's not like it's just 399 00:25:02,680 --> 00:25:05,760 Speaker 1: a measurement you can read off a meter. It's the 400 00:25:05,840 --> 00:25:11,440 Speaker 1: experience someone is having, and subjective experiences are very difficult 401 00:25:11,480 --> 00:25:15,320 Speaker 1: to quantify in a meaningful way. But this is the 402 00:25:15,320 --> 00:25:18,840 Speaker 1: way I would go about designing such an experiment. Now, 403 00:25:18,880 --> 00:25:23,200 Speaker 1: not everything can be tested. Some things go beyond observations 404 00:25:23,280 --> 00:25:26,200 Speaker 1: or empirical measurements. Now that does not necessarily mean those 405 00:25:26,200 --> 00:25:30,440 Speaker 1: claims are untrue. They might be true. It just means 406 00:25:30,480 --> 00:25:34,040 Speaker 1: we cannot apply the scientific method to test those claims. 407 00:25:34,520 --> 00:25:39,480 Speaker 1: String theory falls into that category. Mathematically, string theory holds 408 00:25:39,560 --> 00:25:43,320 Speaker 1: up if you allow for certain claims such as additional dimensions, 409 00:25:43,960 --> 00:25:47,960 Speaker 1: but we have no means of observing or measuring those claims. 410 00:25:48,480 --> 00:25:52,120 Speaker 1: There's no way for us to test it scientifically. Mathematically, 411 00:25:52,160 --> 00:25:56,359 Speaker 1: it all makes sense, but we can't actually practically test 412 00:25:56,400 --> 00:25:59,119 Speaker 1: it or observe it. That's led some people to argue 413 00:25:59,359 --> 00:26:02,560 Speaker 1: that string theory is not really a scientific theory but 414 00:26:02,640 --> 00:26:05,479 Speaker 1: more of a philosophy. Now that does not mean that 415 00:26:05,520 --> 00:26:08,640 Speaker 1: it's untrue.
It just means we do not as yet 416 00:26:08,840 --> 00:26:12,359 Speaker 1: have a means of testing it to apply the scientific method. 417 00:26:12,800 --> 00:26:15,240 Speaker 1: So that's something to keep in mind as well. That 418 00:26:15,440 --> 00:26:19,040 Speaker 1: step that we heard about, publishing your findings, 419 00:26:19,080 --> 00:26:23,280 Speaker 1: that's also very important. A good scientist will submit his 420 00:26:23,480 --> 00:26:26,400 Speaker 1: or her work to a peer-reviewed journal for publication. 421 00:26:27,000 --> 00:26:30,600 Speaker 1: Peer review means that other scientists will take the work, 422 00:26:30,880 --> 00:26:34,040 Speaker 1: they'll examine it. They'll look at the claim, 423 00:26:34,119 --> 00:26:36,919 Speaker 1: how the experiment was designed, how it was 424 00:26:36,960 --> 00:26:40,240 Speaker 1: carried through, the methodology that was used to collect data, 425 00:26:40,600 --> 00:26:43,879 Speaker 1: the process the researchers used to analyze the information 426 00:26:43,920 --> 00:26:46,800 Speaker 1: that was collected, and then the conclusions that were drawn 427 00:26:46,920 --> 00:26:51,320 Speaker 1: by the researchers based on that analysis. Ideally, that weeds 428 00:26:51,359 --> 00:26:55,400 Speaker 1: out bad experiments. Now, in reality, sometimes stuff slips through, 429 00:26:55,440 --> 00:26:58,359 Speaker 1: at least temporarily, but the purpose of peer review is 430 00:26:58,359 --> 00:27:02,120 Speaker 1: really important. It gives scientists the opportunity to poke 431 00:27:02,320 --> 00:27:06,240 Speaker 1: and prod at an experiment to make sure it holds 432 00:27:06,320 --> 00:27:09,960 Speaker 1: up to scrutiny. And if it can't hold up to scrutiny, 433 00:27:10,119 --> 00:27:12,879 Speaker 1: that doesn't necessarily mean the conclusions are false.
It just 434 00:27:12,960 --> 00:27:15,480 Speaker 1: means there's a lack of support to validate the conclusions, 435 00:27:15,480 --> 00:27:19,360 Speaker 1: and a better experiment should be designed. The publication process 436 00:27:19,400 --> 00:27:22,240 Speaker 1: also allows for another important step in science. It gives 437 00:27:22,320 --> 00:27:26,600 Speaker 1: other scientists the opportunity to attempt to replicate the experiment. 438 00:27:27,160 --> 00:27:30,000 Speaker 1: If an experiment has a proper design and the researchers 439 00:27:30,040 --> 00:27:33,879 Speaker 1: did everything correctly, their work should be replicable by anyone 440 00:27:33,960 --> 00:27:38,320 Speaker 1: else who follows that exact same methodology. If another scientist 441 00:27:38,520 --> 00:27:43,120 Speaker 1: follows your procedure exactly but arrives at a completely different result, 442 00:27:43,600 --> 00:27:47,760 Speaker 1: something has gone wrong. Moreover, scientists should be able to 443 00:27:47,800 --> 00:27:50,840 Speaker 1: come at the experiment from different angles, and with a 444 00:27:50,880 --> 00:27:54,760 Speaker 1: properly designed experiment, their work should arrive at a similar 445 00:27:54,800 --> 00:27:58,720 Speaker 1: conclusion to the previously established experiments. There will always be 446 00:27:58,800 --> 00:28:02,520 Speaker 1: variability in results, but as long as that variability isn't 447 00:28:02,560 --> 00:28:06,359 Speaker 1: statistically significant, or as long as it's not outside the 448 00:28:06,480 --> 00:28:09,399 Speaker 1: range of error, it should still be seen as supporting 449 00:28:09,440 --> 00:28:13,280 Speaker 1: the conclusions of the first experiment. So you want to 450 00:28:13,280 --> 00:28:15,639 Speaker 1: be able to replicate an experiment.
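That idea, that a replication's results can vary and still support the original as long as the variation stays inside the range of error, can be sketched like this (hypothetical rating data and a crude confidence-interval overlap check, not a full significance test):

```python
from math import sqrt
from statistics import mean, stdev

def within_error_range(original, replication, z=1.96):
    """Do the two experiments' rough 95% confidence intervals for the
    mean overlap? If so, the replication is consistent with (supports)
    the original result, even though the numbers aren't identical."""
    def ci(data):
        m = mean(data)
        se = stdev(data) / sqrt(len(data))  # standard error of the mean
        return m - z * se, m + z * se
    lo1, hi1 = ci(original)
    lo2, hi2 = ci(replication)
    return lo1 <= hi2 and lo2 <= hi1  # True when the intervals overlap

# Hypothetical listener ratings from the first lab and from a replication:
first_lab = [7.1, 6.8, 7.4, 7.0, 7.2, 6.9]
replication = [6.9, 7.3, 7.1, 7.0, 6.8, 7.2]
print(within_error_range(first_lab, replication))  # some variability, still consistent
```

A replication whose interval lands nowhere near the original's would be the "something has gone wrong" case.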
You want to be 451 00:28:15,680 --> 00:28:19,480 Speaker 1: able to design new experiments to test the same claim 452 00:28:19,480 --> 00:28:21,840 Speaker 1: in a slightly different way to make sure that the 453 00:28:21,880 --> 00:28:25,080 Speaker 1: claim still holds true. And if all of that ends 454 00:28:25,160 --> 00:28:28,879 Speaker 1: up being the case, then you know you're onto something, 455 00:28:29,040 --> 00:28:33,320 Speaker 1: because you're starting to get consistent results, with different people 456 00:28:33,600 --> 00:28:36,119 Speaker 1: trying to do the same experiment and coming up with 457 00:28:36,200 --> 00:28:40,440 Speaker 1: the same conclusion. That's a good thing in general. Now, 458 00:28:40,480 --> 00:28:44,440 Speaker 1: clearly we can't all design scientific experiments to test every 459 00:28:44,480 --> 00:28:48,680 Speaker 1: claim we encounter. That's not practical. It's not something that 460 00:28:48,720 --> 00:28:50,560 Speaker 1: you and I are going to do on a day 461 00:28:50,560 --> 00:28:52,680 Speaker 1: to day basis. It would be silly, it would be 462 00:28:52,680 --> 00:28:55,200 Speaker 1: a huge time sink. But we can apply our knowledge 463 00:28:55,200 --> 00:28:59,160 Speaker 1: of the scientific method to ask questions, either of ourselves 464 00:28:59,400 --> 00:29:02,840 Speaker 1: or of other people, to look into a matter more thoroughly, 465 00:29:03,320 --> 00:29:06,600 Speaker 1: and if we encounter an extraordinary claim, we can look 466 00:29:06,600 --> 00:29:09,640 Speaker 1: for the support for that claim. In some cases we 467 00:29:09,680 --> 00:29:13,360 Speaker 1: may find the support is logical, it's consistent, and it's sufficient, 468 00:29:13,840 --> 00:29:17,280 Speaker 1: and then we might find ourselves able to accept this 469 00:29:17,360 --> 00:29:20,200 Speaker 1: new claim, even if it is extraordinary.
But in other 470 00:29:20,280 --> 00:29:23,440 Speaker 1: cases we might not see any support at all. We 471 00:29:23,520 --> 00:29:26,920 Speaker 1: might look and say, you know, the premise that you 472 00:29:27,000 --> 00:29:30,640 Speaker 1: have here is faulty, so your conclusion is not really reliable. 473 00:29:31,120 --> 00:29:34,000 Speaker 1: Or you might say you don't even present premises to 474 00:29:34,080 --> 00:29:37,640 Speaker 1: support your argument, so how do I know your argument 475 00:29:37,680 --> 00:29:41,480 Speaker 1: is sound? Then we become less eager to accept a 476 00:29:41,480 --> 00:29:44,640 Speaker 1: claim at face value. Well, I've got a lot more 477 00:29:44,680 --> 00:29:48,600 Speaker 1: to say about critical thinking and scientific method and skepticism, 478 00:29:48,600 --> 00:29:50,840 Speaker 1: but before I go any further, let's take another quick 479 00:29:50,880 --> 00:30:00,840 Speaker 1: break to thank our sponsors. So how do we know 480 00:30:01,680 --> 00:30:05,120 Speaker 1: science works? Why have I put so much stock 481 00:30:05,400 --> 00:30:10,320 Speaker 1: in science? Well, we know science works because our stuff works. 482 00:30:10,360 --> 00:30:17,320 Speaker 1: Our computers work, our smartphones work, cellular technology, satellites, rockets, sensors, medicine. 483 00:30:17,400 --> 00:30:21,040 Speaker 1: These things work because men and women have used a 484 00:30:21,120 --> 00:30:25,240 Speaker 1: scientific process to develop these over the course of many 485 00:30:25,360 --> 00:30:29,360 Speaker 1: years of research and development and prototyping.
If science did 486 00:30:29,400 --> 00:30:32,480 Speaker 1: not work, we would have a much higher failure rate 487 00:30:32,600 --> 00:30:35,959 Speaker 1: on our exploits because without science, the odds of us 488 00:30:35,960 --> 00:30:39,880 Speaker 1: ever developing something like WiFi just by pure luck are 489 00:30:40,000 --> 00:30:44,600 Speaker 1: astronomically bad. So we know science works because we have 490 00:30:44,680 --> 00:30:48,960 Speaker 1: stuff based on science, and that stuff actually works as 491 00:30:49,040 --> 00:30:51,880 Speaker 1: it's supposed to. Now, a quick word about some more 492 00:30:52,000 --> 00:30:57,560 Speaker 1: terminology like laws versus theories versus hypotheses, because these get 493 00:30:57,760 --> 00:31:00,880 Speaker 1: kind of conflated too. So in science, a law is 494 00:31:00,920 --> 00:31:04,600 Speaker 1: a generalization about data that describes what we might expect 495 00:31:04,640 --> 00:31:08,200 Speaker 1: will happen in a given situation. So the laws of 496 00:31:08,240 --> 00:31:14,640 Speaker 1: thermodynamics are generalizations about fundamental elements like temperature and entropy 497 00:31:14,640 --> 00:31:18,520 Speaker 1: of thermodynamic systems that are at equilibrium. These laws are 498 00:31:18,560 --> 00:31:22,880 Speaker 1: based off of countless observations, and they are well established. 499 00:31:23,200 --> 00:31:27,960 Speaker 1: They are not immutable. Scientific laws can in fact change 500 00:31:28,040 --> 00:31:31,720 Speaker 1: if evidence supports such a thing, but they tend to 501 00:31:31,760 --> 00:31:34,400 Speaker 1: serve as the foundation for much of our knowledge, which 502 00:31:34,440 --> 00:31:37,520 Speaker 1: means if we do need to make changes, we also 503 00:31:37,520 --> 00:31:40,760 Speaker 1: have to reevaluate all the knowledge we base off of 504 00:31:40,800 --> 00:31:43,880 Speaker 1: those ideas.
You can think of it as the foundation 505 00:31:43,960 --> 00:31:46,960 Speaker 1: for a house. It holds everything else up. This is 506 00:31:47,000 --> 00:31:50,360 Speaker 1: one of the reasons why scientifically minded folks are pretty 507 00:31:50,360 --> 00:31:54,720 Speaker 1: comfortable saying things like perpetual motion machines are impossible, because 508 00:31:55,040 --> 00:31:57,600 Speaker 1: for a perpetual motion machine to work, it would have 509 00:31:57,640 --> 00:32:01,440 Speaker 1: to violate the laws of thermodynamics, and while it is 510 00:32:01,560 --> 00:32:05,360 Speaker 1: at least possible, in the strictest sense of the word, 511 00:32:05,760 --> 00:32:09,360 Speaker 1: that our understanding of the laws of thermodynamics is not correct, 512 00:32:09,680 --> 00:32:15,200 Speaker 1: to date our observations and tests have all validated those laws, 513 00:32:15,240 --> 00:32:19,680 Speaker 1: so it would take extraordinary proof to the extreme to 514 00:32:19,880 --> 00:32:22,720 Speaker 1: overturn that, and it would mean that many of our 515 00:32:22,720 --> 00:32:25,920 Speaker 1: advances in science and technology over the years have worked 516 00:32:26,000 --> 00:32:29,680 Speaker 1: largely by good luck, because if we were off base 517 00:32:29,720 --> 00:32:33,800 Speaker 1: about something so fundamental, it would be amazing that all 518 00:32:33,840 --> 00:32:37,040 Speaker 1: the things we've built that at least relied partly on 519 00:32:37,120 --> 00:32:43,040 Speaker 1: those laws actually work, because our understanding would be faulty. 520 00:32:43,160 --> 00:32:47,480 Speaker 1: So our technology shouldn't work if the principles it was 521 00:32:47,520 --> 00:32:52,440 Speaker 1: built upon were unsound. Next, we have scientific theories. This 522 00:32:52,440 --> 00:32:55,200 Speaker 1: one is tricky because we have different meanings for theory.
523 00:32:55,600 --> 00:33:00,600 Speaker 1: In science, a theory is something specific. The University of California, Berkeley, 524 00:33:00,640 --> 00:33:04,120 Speaker 1: has a really great glossary of scientific terms, and this 525 00:33:04,200 --> 00:33:08,280 Speaker 1: is their definition for a scientific theory. Quote: in science, 526 00:33:08,480 --> 00:33:12,720 Speaker 1: a broad natural explanation for a wide range of phenomena. 527 00:33:13,040 --> 00:33:19,880 Speaker 1: Theories are concise, coherent, systematic, predictive, and broadly applicable, often 528 00:33:19,960 --> 00:33:25,240 Speaker 1: integrating and generalizing many hypotheses. Theories accepted by the scientific 529 00:33:25,280 --> 00:33:30,000 Speaker 1: community are generally strongly supported by many different lines of evidence, 530 00:33:30,320 --> 00:33:34,360 Speaker 1: but even theories may be modified or overturned if warranted 531 00:33:34,480 --> 00:33:38,840 Speaker 1: by new evidence and perspectives. So again, science allows that 532 00:33:39,520 --> 00:33:42,560 Speaker 1: we can't know everything, and it may turn out that 533 00:33:42,600 --> 00:33:46,520 Speaker 1: one day we find some form of evidence that contradicts 534 00:33:46,560 --> 00:33:50,200 Speaker 1: a previously established theory, and that means we have to 535 00:33:50,280 --> 00:33:54,000 Speaker 1: test it and make certain that in fact, this anomaly 536 00:33:54,600 --> 00:33:57,160 Speaker 1: is a real thing, and if so, we have to 537 00:33:57,200 --> 00:34:00,360 Speaker 1: revisit our theory and we have to change it because 538 00:34:00,440 --> 00:34:04,120 Speaker 1: it clearly does not reflect reality. Now, contrast this with 539 00:34:04,200 --> 00:34:08,040 Speaker 1: the more casual use of the word theory to mean idea.
540 00:34:08,320 --> 00:34:11,600 Speaker 1: For example, "I have a theory about why the peanut 541 00:34:11,680 --> 00:34:15,719 Speaker 1: butter keeps going missing" is a different statement than "the 542 00:34:15,719 --> 00:34:18,640 Speaker 1: theory of gravity," unless you mean the peanut butter jar 543 00:34:18,760 --> 00:34:20,800 Speaker 1: keeps falling to the ground because you're releasing it in 544 00:34:20,840 --> 00:34:23,040 Speaker 1: midair, in which case your theory about why the 545 00:34:23,040 --> 00:34:25,680 Speaker 1: peanut butter goes missing and the theory of gravity are 546 00:34:25,719 --> 00:34:31,600 Speaker 1: actually kind of aligned. The difference causes problems. Sometimes people 547 00:34:31,680 --> 00:34:35,480 Speaker 1: might dismiss a scientific claim by saying, oh, that's only 548 00:34:35,520 --> 00:34:38,880 Speaker 1: a theory, as if to say, that's just your opinion, 549 00:34:39,200 --> 00:34:41,719 Speaker 1: except in science, a theory is an explanation that has 550 00:34:41,760 --> 00:34:45,080 Speaker 1: stood up to numerous tests along different lines of evidence. 551 00:34:45,360 --> 00:34:49,560 Speaker 1: It's not just a proposed explanation that lacks support. A 552 00:34:49,640 --> 00:34:55,240 Speaker 1: hypothesis is an explanation for something. Typically in science, 553 00:34:55,320 --> 00:34:58,680 Speaker 1: a good hypothesis will explain a fairly limited set 554 00:34:58,680 --> 00:35:03,480 Speaker 1: of phenomena. It's narrowly focused.
Hypotheses are testable, so 555 00:35:03,560 --> 00:35:06,160 Speaker 1: you should be able to take a hypothesis, create a 556 00:35:06,200 --> 00:35:09,640 Speaker 1: test that would produce results that either show the hypothesis 557 00:35:09,680 --> 00:35:13,200 Speaker 1: has merit or the hypothesis does not apply, and then 558 00:35:13,239 --> 00:35:15,680 Speaker 1: you should be able to carry out the experiment and 559 00:35:15,840 --> 00:35:20,600 Speaker 1: observe the results and then determine whether the hypothesis is good 560 00:35:20,719 --> 00:35:24,000 Speaker 1: or not. The scientific method is sort of a finely 561 00:35:24,040 --> 00:35:27,399 Speaker 1: tuned approach to critical thinking. Critical thinking is all about 562 00:35:27,440 --> 00:35:30,480 Speaker 1: trying to be as objective in your analysis as possible 563 00:35:30,520 --> 00:35:34,680 Speaker 1: to form a judgment about something, and ideally you should 564 00:35:34,680 --> 00:35:38,560 Speaker 1: apply critical thinking to many areas of your life, particularly 565 00:35:38,840 --> 00:35:42,439 Speaker 1: when you encounter various claims about stuff. This can come 566 00:35:42,560 --> 00:35:46,480 Speaker 1: in many forms. For example, politics is a great place 567 00:35:46,520 --> 00:35:51,480 Speaker 1: to apply critical thinking. Politics deals with some incredibly important 568 00:35:51,520 --> 00:35:54,480 Speaker 1: aspects of our lives, things that affect us day to 569 00:35:54,560 --> 00:35:58,080 Speaker 1: day and affect other people, thousands or millions of people 570 00:35:58,280 --> 00:36:03,319 Speaker 1: every day. Political matters are often emotionally charged, and there's 571 00:36:03,400 --> 00:36:07,359 Speaker 1: rhetoric on all sides of issues. It doesn't matter if 572 00:36:07,400 --> 00:36:11,760 Speaker 1: you identify yourself as being conservative or liberal or whatever. 573 00:36:12,360 --> 00:36:17,160 Speaker 1: There's rhetoric on every side.
People get passionate. Using critical 574 00:36:17,200 --> 00:36:20,120 Speaker 1: thinking is important, not just in an effort to 575 00:36:20,160 --> 00:36:22,479 Speaker 1: try and pick apart the arguments that you don't agree 576 00:36:22,520 --> 00:36:25,920 Speaker 1: with philosophically, but also to make certain that those who 577 00:36:25,960 --> 00:36:31,640 Speaker 1: claim to align themselves with your own worldview actually do 578 00:36:31,880 --> 00:36:35,560 Speaker 1: align themselves to your worldview. So let's say I get 579 00:36:35,600 --> 00:36:38,560 Speaker 1: a message from a politician saying I should vote for 580 00:36:38,640 --> 00:36:42,520 Speaker 1: her because she believes the same things I believe. And 581 00:36:42,560 --> 00:36:45,040 Speaker 1: I might go and look at this politician's voting record 582 00:36:45,040 --> 00:36:47,600 Speaker 1: to check and see if that's actually true, and I 583 00:36:47,680 --> 00:36:51,680 Speaker 1: might discover that while the candidate aligns herself with a 584 00:36:51,719 --> 00:36:55,239 Speaker 1: particular party, one that I identify with, when it comes 585 00:36:55,239 --> 00:36:57,040 Speaker 1: to voting, her point of view and my point of 586 00:36:57,120 --> 00:36:59,719 Speaker 1: view may not be aligned at all. Well, that would 587 00:36:59,760 --> 00:37:03,440 Speaker 1: be me using critical thinking, saying: just because she says 588 00:37:03,560 --> 00:37:07,200 Speaker 1: that she holds the same beliefs I do doesn't necessarily 589 00:37:07,200 --> 00:37:09,760 Speaker 1: mean that's the case. I should really look into this further.
590 00:37:10,280 --> 00:37:12,919 Speaker 1: Or to bring it back around to technology, I might 591 00:37:12,920 --> 00:37:17,480 Speaker 1: hear claims about a company saying that they have a 592 00:37:17,560 --> 00:37:22,520 Speaker 1: particularly awesome cable that delivers superior performance to 593 00:37:22,680 --> 00:37:26,319 Speaker 1: any other cable. Let's say it's an HDMI cable. And 594 00:37:26,400 --> 00:37:29,440 Speaker 1: to evaluate that claim, I might look at some other information, 595 00:37:29,680 --> 00:37:32,440 Speaker 1: such as what are the limits of the pieces of 596 00:37:32,440 --> 00:37:36,080 Speaker 1: tech the cable will connect, like a set-top box 597 00:37:36,120 --> 00:37:39,120 Speaker 1: and a television. Let's say that I've got, you know, 598 00:37:39,239 --> 00:37:41,400 Speaker 1: my own setup at home, and I've got a particular 599 00:37:41,480 --> 00:37:44,920 Speaker 1: kind of set-top box and a particular kind of 600 00:37:44,920 --> 00:37:47,520 Speaker 1: TV at home, and they have limits themselves, 601 00:37:47,680 --> 00:37:51,719 Speaker 1: right? They are not able to accept all forms 602 00:37:51,880 --> 00:37:55,200 Speaker 1: of media at all forms of resolution. They have a limit. 603 00:37:55,480 --> 00:37:59,719 Speaker 1: Let's say that I've got an HDTV, not a four K, 604 00:38:00,080 --> 00:38:02,360 Speaker 1: not two K or anything like that, just an HDTV. 605 00:38:03,640 --> 00:38:06,319 Speaker 1: No matter how good a cable is, it's not going 606 00:38:06,360 --> 00:38:11,000 Speaker 1: to make my HDTV show video at four K resolution. 607 00:38:11,239 --> 00:38:15,279 Speaker 1: That's impossible. The television cannot do that. Now, there are 608 00:38:15,320 --> 00:38:18,800 Speaker 1: companies out there that sell the idea of a superior experience.
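The bottleneck logic behind that HDMI example is easy to sketch (hypothetical numbers; the function is illustrative, not from the episode): what reaches your eyes is capped by the weakest link in the chain, so no cable can push an HDTV past its native resolution.

```python
def delivered_resolution(source_lines, cable_limit_lines, display_lines):
    """Vertical resolution that actually reaches the viewer: the minimum
    of what the source outputs, the cable carries, and the display shows."""
    return min(source_lines, cable_limit_lines, display_lines)

# A 4K (2160-line) source and a cable marketed as practically limitless,
# plugged into an ordinary 1080-line HDTV:
print(delivered_resolution(2160, 100_000, 1080))  # still 1080 -- the TV is the cap
```

Swapping in an even fancier cable changes nothing in that chain unless the cable was the weakest link to begin with.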
609 00:38:18,800 --> 00:38:21,400 Speaker 1: They're going to say this cable is going to provide 610 00:38:21,520 --> 00:38:25,600 Speaker 1: a superior experience to any other cable. They don't actually 611 00:38:25,680 --> 00:38:29,000 Speaker 1: have to deliver on this promise. They don't have to deliver 612 00:38:29,080 --> 00:38:32,840 Speaker 1: that experience. Or maybe the experience they promise is 613 00:38:32,880 --> 00:38:35,799 Speaker 1: beyond the ability for humans to perceive, so there's no 614 00:38:35,880 --> 00:38:38,840 Speaker 1: way for you to actually tell if that experience is 615 00:38:38,880 --> 00:38:41,799 Speaker 1: what they say it is, because you can't discern the 616 00:38:41,840 --> 00:38:44,960 Speaker 1: difference between that and the next step down; it's 617 00:38:44,960 --> 00:38:48,479 Speaker 1: beyond our ability to perceive at that resolution. But that's 618 00:38:48,480 --> 00:38:50,920 Speaker 1: not what's important to the company that's making the product. 619 00:38:51,000 --> 00:38:54,240 Speaker 1: They're selling an idea. The idea is what you are buying. 620 00:38:55,000 --> 00:38:58,880 Speaker 1: So the reason I wanted to dedicate an episode to 621 00:38:58,920 --> 00:39:01,120 Speaker 1: this topic is that I see a lot of misinformation 622 00:39:01,160 --> 00:39:03,399 Speaker 1: out there, and I've fallen for some of it. I'm 623 00:39:03,400 --> 00:39:06,560 Speaker 1: a human being. I make mistakes, but I try to 624 00:39:06,600 --> 00:39:09,960 Speaker 1: apply critical thinking so that I avoid those situations as 625 00:39:09,960 --> 00:39:12,759 Speaker 1: frequently as I can, knowing that I'm not always going 626 00:39:12,800 --> 00:39:16,279 Speaker 1: to succeed, but always trying to. I think it's an 627 00:39:16,280 --> 00:39:20,279 Speaker 1: important practice.
If you practice good critical thinking, you might 628 00:39:20,320 --> 00:39:24,320 Speaker 1: find yourself saving money because you're not following an unsupported trend. 629 00:39:24,480 --> 00:39:27,520 Speaker 1: You're not just buying a new trendy thing because people 630 00:39:27,560 --> 00:39:31,319 Speaker 1: think it's cool. You're actually critically thinking about this thing 631 00:39:31,360 --> 00:39:33,080 Speaker 1: and whether or not it really has a place in 632 00:39:33,120 --> 00:39:35,840 Speaker 1: your life, whether or not it can do the things 633 00:39:35,920 --> 00:39:39,000 Speaker 1: that people promise it can do. You might even improve 634 00:39:39,040 --> 00:39:41,400 Speaker 1: your health, or at the very least you might avoid 635 00:39:41,640 --> 00:39:45,719 Speaker 1: endangering your health. You could even save a life. Just 636 00:39:45,800 --> 00:39:50,680 Speaker 1: be sure to be skeptical without being a denialist, right? 637 00:39:51,160 --> 00:39:54,560 Speaker 1: Ask for evidence, look for evidence. But if you get 638 00:39:54,640 --> 00:39:58,400 Speaker 1: evidence and that evidence supports the claim, don't just deny 639 00:39:58,480 --> 00:40:01,960 Speaker 1: the claim. If you feel the evidence warrants the 640 00:40:02,000 --> 00:40:05,200 Speaker 1: claim being true, then you need to incorporate that into 641 00:40:05,280 --> 00:40:07,480 Speaker 1: your worldview. And don't be afraid to do a little 642 00:40:07,520 --> 00:40:10,680 Speaker 1: digging to verify claims, especially claims that seem to 643 00:40:10,760 --> 00:40:15,440 Speaker 1: validate your philosophy. That's one of the go-tos of 644 00:40:15,480 --> 00:40:19,399 Speaker 1: any snake oil salesman. Find out what your mark believes, 645 00:40:19,960 --> 00:40:24,640 Speaker 1: reinforce that belief, and claim that whatever you're selling is 646 00:40:24,680 --> 00:40:28,839 Speaker 1: aligned with that belief.
You need to take claims that 647 00:40:28,880 --> 00:40:31,799 Speaker 1: seem to align with your worldview, and you need to 648 00:40:31,880 --> 00:40:34,280 Speaker 1: question them just as much as you need to question 649 00:40:34,320 --> 00:40:37,440 Speaker 1: the claims that come in that conflict with your worldview, 650 00:40:38,160 --> 00:40:41,279 Speaker 1: because otherwise you're just gonna buy into something that may 651 00:40:41,280 --> 00:40:45,840 Speaker 1: not be true because it conveniently reinforces what you already 652 00:40:45,840 --> 00:40:50,040 Speaker 1: believe about the world. That is unfortunate. That's like at 653 00:40:50,040 --> 00:40:53,480 Speaker 1: the very heart of the concept of fake news, which 654 00:40:53,719 --> 00:40:56,719 Speaker 1: I hate to even talk about. But fake news, I 655 00:40:56,760 --> 00:40:59,040 Speaker 1: mean, it is a thing, and it's true for all 656 00:40:59,120 --> 00:41:02,719 Speaker 1: different types of news. If you see an article out 657 00:41:02,719 --> 00:41:05,720 Speaker 1: there that seems to confirm a belief you already hold, 658 00:41:06,719 --> 00:41:10,480 Speaker 1: then you might be less inclined to question the validity 659 00:41:10,520 --> 00:41:14,759 Speaker 1: of that article, but it behooves you to do so, 660 00:41:15,480 --> 00:41:19,440 Speaker 1: because either you're going to question it and it'll turn out, oh, 661 00:41:19,480 --> 00:41:23,799 Speaker 1: it's absolutely accurate, it is true, or as true as 662 00:41:23,800 --> 00:41:26,600 Speaker 1: anything we can determine, and therefore I feel 663 00:41:26,640 --> 00:41:29,799 Speaker 1: good about this validating my worldview; or you could say, 664 00:41:30,239 --> 00:41:33,680 Speaker 1: turns out this is not true, no matter how much 665 00:41:33,680 --> 00:41:36,279 Speaker 1: I might wish it to be true, and knowing that 666 00:41:36,360 --> 00:41:38,720 Speaker 1: makes it better, because I'm not gonna 667
00:41:38,800 --> 00:41:42,080 Speaker 1: use this to support an argument that later on can 668 00:41:42,120 --> 00:41:44,239 Speaker 1: get undermined because someone else is going to point to 669 00:41:44,280 --> 00:41:46,400 Speaker 1: that thing and say, yeah, but you're citing this as 670 00:41:46,480 --> 00:41:49,439 Speaker 1: your source, and that source has proven to be unreliable. 671 00:41:50,520 --> 00:41:54,360 Speaker 1: It's better to actually use this practice in all areas. Also, 672 00:41:54,400 --> 00:41:58,560 Speaker 1: critical thinking leads to an actual understanding of a subject, 673 00:41:59,080 --> 00:42:03,399 Speaker 1: not just of technology, but all sorts of stuff. People 674 00:42:03,440 --> 00:42:06,640 Speaker 1: talked about critical thinking in my literature classes when I 675 00:42:06,680 --> 00:42:09,759 Speaker 1: was in college. Honestly, that critical thinking concept should have 676 00:42:09,760 --> 00:42:13,640 Speaker 1: been taught much earlier in my educational experience, but uh, 677 00:42:13,719 --> 00:42:16,359 Speaker 1: you know, it just wasn't. I think for a lot 678 00:42:16,360 --> 00:42:19,080 Speaker 1: of kids these days, it's introduced much earlier, which is 679 00:42:19,120 --> 00:42:21,640 Speaker 1: great. For me, I didn't get into it until college. 680 00:42:22,080 --> 00:42:26,439 Speaker 1: But critical thinking allows you to question things and then 681 00:42:26,560 --> 00:42:29,400 Speaker 1: through the questioning get a deeper understanding of the subject matter, 682 00:42:29,640 --> 00:42:33,200 Speaker 1: as opposed to just memorizing stuff. Like, you can memorize 683 00:42:33,200 --> 00:42:35,759 Speaker 1: a list of facts and rattle those facts off, but 684 00:42:35,800 --> 00:42:39,240 Speaker 1: if you don't have an actual understanding, then there's nothing 685 00:42:39,320 --> 00:42:43,000 Speaker 1: really going on.
You're just regurgitating information, you're 686 00:42:43,000 --> 00:42:49,000 Speaker 1: not really incorporating it. But when you question, there comes 687 00:42:49,000 --> 00:42:50,880 Speaker 1: a point where a little lightbulb comes on in your 688 00:42:50,920 --> 00:42:54,040 Speaker 1: head and you get it, and that's way more powerful. 689 00:42:54,960 --> 00:42:58,200 Speaker 1: To conclude, I'd like to recommend some interesting books about 690 00:42:58,239 --> 00:43:02,000 Speaker 1: critical thinking and skepticism. The best one I've ever 691 00:43:02,080 --> 00:43:05,200 Speaker 1: read is The Demon-Haunted World, written by 692 00:43:05,280 --> 00:43:09,160 Speaker 1: Carl Sagan. There's a great audiobook version of this, 693 00:43:09,239 --> 00:43:12,479 Speaker 1: by the way; Cary Elwes of Princess Bride fame does 694 00:43:12,600 --> 00:43:15,440 Speaker 1: the narration for it. I highly recommend that if 695 00:43:15,440 --> 00:43:18,440 Speaker 1: you don't want to read it, you can listen to it. 696 00:43:18,560 --> 00:43:22,319 Speaker 1: Sagan is wonderful at not only advocating for careful critical thought, 697 00:43:22,719 --> 00:43:26,520 Speaker 1: but also he is really good at expressing the wonders 698 00:43:27,080 --> 00:43:30,360 Speaker 1: of the universe and of science. He's great at engaging 699 00:43:30,400 --> 00:43:33,160 Speaker 1: the imagination and grounding us in the real world at 700 00:43:33,160 --> 00:43:36,279 Speaker 1: the same time. And I think his message is ultimately 701 00:43:37,000 --> 00:43:41,000 Speaker 1: one of wonder and hope, because it's all about how 702 00:43:41,200 --> 00:43:44,560 Speaker 1: phenomenal the universe is and how much we can stand 703 00:43:44,600 --> 00:43:47,560 Speaker 1: to learn from it.
A lot of people, I think, 704 00:43:47,880 --> 00:43:50,880 Speaker 1: think of skepticism as taking the magic out of 705 00:43:50,920 --> 00:43:55,520 Speaker 1: the world, but really it's just making sure the stuff 706 00:43:55,680 --> 00:43:59,440 Speaker 1: you are putting stock into is real. But that doesn't 707 00:43:59,440 --> 00:44:03,680 Speaker 1: mean reality can't be amazing. It can be. It's 708 00:44:03,719 --> 00:44:09,600 Speaker 1: just not magic, because magic is not real. Asking the 709 00:44:09,719 --> 00:44:12,680 Speaker 1: Right Questions: A Guide to Critical Thinking by M. Neil 710 00:44:12,760 --> 00:44:15,920 Speaker 1: Browne and Stuart M. Keeley is also a good read. 711 00:44:16,719 --> 00:44:20,560 Speaker 1: Mistakes Were Made (But Not by Me) by Carol Tavris 712 00:44:20,719 --> 00:44:24,319 Speaker 1: and Elliot Aronson explores why we are so adept at 713 00:44:24,320 --> 00:44:27,160 Speaker 1: passing the buck when we make a mistake. It's a 714 00:44:27,239 --> 00:44:31,120 Speaker 1: very human thing, when we make a mistake, to justify 715 00:44:31,200 --> 00:44:36,640 Speaker 1: that mistake or gloss over it, or try to deflect 716 00:44:36,680 --> 00:44:39,040 Speaker 1: it so it doesn't seem like we've made a mistake. 717 00:44:39,320 --> 00:44:42,600 Speaker 1: This book specifically goes into 718 00:44:42,719 --> 00:44:45,799 Speaker 1: why that is. Why are we so reluctant to say 719 00:44:45,840 --> 00:44:48,520 Speaker 1: I was wrong? Because if we could say I was 720 00:44:48,560 --> 00:44:53,120 Speaker 1: wrong more frequently when we are legitimately wrong, then we 721 00:44:53,200 --> 00:44:57,239 Speaker 1: could proceed toward the truth much more quickly. But we're 722 00:44:57,239 --> 00:45:01,000 Speaker 1: really reluctant to do that in general.
Flim-Flam! by 723 00:45:01,080 --> 00:45:05,680 Speaker 1: James Randi is a really interesting book about various hoaxes, scams, 724 00:45:05,680 --> 00:45:09,760 Speaker 1: and unsupported claims, most of which are in the supernatural realm. 725 00:45:09,880 --> 00:45:11,759 Speaker 1: But I think it is very helpful if you want 726 00:45:11,800 --> 00:45:15,360 Speaker 1: to look at critical thinking. Randi is a professional magician, 727 00:45:15,680 --> 00:45:19,480 Speaker 1: and as such he has a working understanding of human psychology. 728 00:45:19,600 --> 00:45:23,839 Speaker 1: You know, not necessarily a scholarly understanding, but human psychology 729 00:45:23,880 --> 00:45:27,520 Speaker 1: is something magicians work on. Like, they have to understand it. 730 00:45:27,520 --> 00:45:29,760 Speaker 1: They have to understand how people think and how people 731 00:45:29,800 --> 00:45:33,960 Speaker 1: perceive in order to trick them. So they're all about 732 00:45:34,080 --> 00:45:37,640 Speaker 1: misdirection and deception, and a lot of magicians actually have 733 00:45:37,719 --> 00:45:41,560 Speaker 1: created really great work calling for critical thinking and using 734 00:45:41,600 --> 00:45:45,680 Speaker 1: their understanding of human psychology and misdirection to point out 735 00:45:45,760 --> 00:45:48,399 Speaker 1: how we humans are really good at fooling each other 736 00:45:48,760 --> 00:45:52,000 Speaker 1: and ourselves. Also, I feel I need to point this out: 737 00:45:52,200 --> 00:45:56,400 Speaker 1: Randi is a bit of an irascible fellow. He's grouchy, 738 00:45:56,520 --> 00:45:59,640 Speaker 1: and it comes across in his writing. Just throwing that 739 00:45:59,680 --> 00:46:01,240 Speaker 1: out there because I don't want it to be a surprise 740 00:46:01,280 --> 00:46:03,839 Speaker 1: to anybody if they pick up the book. Another one 741 00:46:03,880 --> 00:46:07,560 Speaker 1: is Pseudoscience and the Paranormal by Terence Hines.
That's 742 00:46:07,560 --> 00:46:10,600 Speaker 1: a book I've heard recommended numerous times, but I have 743 00:46:10,640 --> 00:46:13,600 Speaker 1: not yet read it myself. It's on my to-read list. 744 00:46:13,719 --> 00:46:16,040 Speaker 1: I just haven't read it yet, but I keep hearing 745 00:46:16,080 --> 00:46:19,120 Speaker 1: Pseudoscience and the Paranormal is a very good one, so 746 00:46:19,640 --> 00:46:22,200 Speaker 1: I hope it is. That's my next book to read 747 00:46:22,239 --> 00:46:24,520 Speaker 1: once I'm done with the one I'm on now. In 748 00:46:24,560 --> 00:46:27,920 Speaker 1: my next episode, I'm going to be joined by the 749 00:46:27,960 --> 00:46:31,680 Speaker 1: fabulous Mr. Ben Bowlin to cover some of the scams 750 00:46:31,719 --> 00:46:36,000 Speaker 1: and hoaxes and flim flam in technology that in 751 00:46:36,120 --> 00:46:38,520 Speaker 1: some cases robbed people of money, in some cases 752 00:46:38,640 --> 00:46:41,160 Speaker 1: just deluded a person into thinking they had found something 753 00:46:41,200 --> 00:46:44,319 Speaker 1: interesting when really they hadn't, and in some cases led 754 00:46:44,360 --> 00:46:48,520 Speaker 1: to even worse outcomes. And I really want to do 755 00:46:48,560 --> 00:46:54,040 Speaker 1: that to explore again why scientific thinking, why skepticism, 756 00:46:54,080 --> 00:46:57,480 Speaker 1: why critical thinking, why all that's so important. And the next episode 757 00:46:57,520 --> 00:46:59,080 Speaker 1: is going to kind of illustrate that with a bunch 758 00:46:59,120 --> 00:47:02,239 Speaker 1: of examples, but it'll be a pretty entertaining story as well. 759 00:47:02,760 --> 00:47:06,280 Speaker 1: Thank you guys so much. We have a thousand episodes 760 00:47:06,320 --> 00:47:09,600 Speaker 1: of tech Stuff and there's no stopping. We're gonna keep 761 00:47:09,640 --> 00:47:13,799 Speaker 1: on going. But one thousand, I can't believe it.
That's 762 00:47:13,840 --> 00:47:16,360 Speaker 1: a lot of Jonathan Strickland talking. I feel like I 763 00:47:16,360 --> 00:47:19,920 Speaker 1: should apologize. But if you guys have any suggestions for 764 00:47:19,960 --> 00:47:22,480 Speaker 1: future episodes of tech Stuff, write me, send me a 765 00:47:22,520 --> 00:47:25,560 Speaker 1: message at tech Stuff at how stuff works dot com, 766 00:47:25,680 --> 00:47:28,000 Speaker 1: or drop me a line on Twitter or Facebook. The 767 00:47:28,040 --> 00:47:30,320 Speaker 1: handle at both of those is tech Stuff H S W. 768 00:47:30,960 --> 00:47:33,680 Speaker 1: Don't forget to follow us on Instagram, and I'll talk 769 00:47:33,680 --> 00:47:42,840 Speaker 1: to you again really soon. For more on this and 770 00:47:42,920 --> 00:47:45,440 Speaker 1: thousands of other topics, visit how stuff works dot 771 00:47:45,480 --> 00:47:55,440 Speaker 1: com.