Welcome to Stuff You Should Know, a production of iHeartRadio.

Hey, and welcome to the podcast. I'm Josh Clark, and there's Charles W. "Chuck" Bryant, and Jerry's here. Jerry's back. Everybody's looking well rested and sun-kissed and everything, and this is Stuff You Should Know. She's like a beautiful, juicy orange. That's right. That's right, Chuck, that's a really apt description. Ready to be squoze. I wish I could squeeze her, but we're still not squeezing. No, not in this pandemic. Are you crazy? You out of your mind? No, no squeezing. Yeah, even Robert... Robert Plant wouldn't let you anywhere near him. Okay, Chuck, figure out that joke. Robert Plant? "Squeeze my lemon," the Lemon Song? No... "till the juice runs down my leg." Yeah, that's the Lemon Song, right? No, I don't think so. I think it is. I don't think it is. Okay, I don't think it is the Lemon Song, man. I think it's "Whole Lotta Love." It's "Whole Lotta Love," all right. It's maybe the dirtiest thing that was ever said in, like, a top ten song. Okay, regardless, I'll just let the emails take care of this, I really think. No, the Lemon Song is "I love those lemons," right? No, the Lemon Song is all about how you have friends, like you want to have friends, and like friends are good to have. Okay, yeah, I may be all wrong. No, I think it's "Whole Lotta Love." Yeah, it is. I'm decently sure, buddy. All right, well, I encourage you not to google the lyrics, then. Well, we could ask our good friend and Stuff You Should Know writer Ed Grabianowski, the Grabster. Look at that. Because he is in a band and has been for a while. We've mentioned it before: Space Lord, which has just a super cool Zeppelin-esque sound to them. And they just released a new single, which you can find on Bandcamp by searching Space Lord. Not the Space Lords. No, not Vinny and the Space Lords. Yeah, Space Lord.
Just look for Space Lord with some cool graphics, and that'll, you know, that's Ed. You'll know it's the Grabster. But yeah, good stuff. We also have a game out that Trivial Pursuit made. Yeah, we should plug our own stuff every now and then. We just did. Yes, it is a co-branded game with Trivial Pursuit from Hasbro, and it is not a Trivial Pursuit game that you are used to. It is a Stuff You Should Know game that Trivial Pursuit was happy to co-brand with. So I just don't want the emails that are like, "This isn't a Trivial Pursuit game, this is some other, different game." You're always worried about the emails, aren't you? Don't you just ignore them? Let 'em roll off, roll off my back. Like, "I'm disappointed in you guys for this. I haven't even listened to the episode, but I'm disappointed about that." I just got one of those. Did you see that? It just rolls off your back? Yeah, yeah, those are always great. "I didn't listen, but here's what was wrong." I wrote that person back, actually. I was like, we actually kind of did exactly what you hoped we would do. And they're like, "Oh, sorry for being presumptuous." Anyway, all is forgiven.

So we're talking today about bias, Chuck, and I want to set the scene a little bit, because, you know, one of the things that I'm always harping on about is the death of expertise, right? And it's a real problem, this idea that science can't be trusted, that people who go and spend a decade or more learning about a specific thing, that they go out and become an expert in, or that's their profession, that's their training, that what those people have to say is basically meaningless, or that it's no better than somebody on the internet's opinion about that specific subject that that person spent ten or twelve years being trained to be an expert in.
Like, that kind of stuff to me is super dangerous. There's an erosion of something, and it's an erosion of intelligence to start with, but it's also an erosion of just believing in facts and knowing that you're not being taken for a ride or hustled. And it is a huge, enormous problem that we're just beginning to wake up to, and it is still unfolding. It's not like it happened and now we're reeling from it; it's still happening in real time, and it is a massive, huge issue, one of the biggest issues that humanity faces, I think, because it encompasses so many other large issues, like climate change, existential risks, the pandemic, politics. All of them kind of fall under this erosion of belief in facts and that there are people out there who know more than you do. It's a big problem. Yeah, imagine being someone who studied and researched something intensely for ten or fifteen years, and then, when presenting facts, to be met with, "I don't know about that." That's a response I hear a lot in the South. Yeah, or that they saw something on YouTube that flatly contradicts it. And it doesn't matter what you just said. It is ridiculous that somebody posted something on YouTube and that that has as much weight as what somebody who spent ten or twelve years studying this very thing, who knows exactly what they're talking about, has to say about it. It's maddening. Yeah, there's something about people from the South in general, I think, that are in this group. I have literally heard that response from a lot of different people when I've been like, "Oh no, no, no, here are the facts, actually," and then, when presented with something that they can't refute, they say, "I don't know about that," and that's it. That's the end of the conversation. That's different than the people I've encountered.
The people I've encountered, like, their brow furrows and they start pointing fingers and their tone goes up. Like, are you hanging out at the country club or something? I think it's different types of people. You know, there's ignorance, and then there's also people that actually think they're better informed, that will fire back with YouTube clips. Right. So the reason I brought that up is because one of the reasons that that is being allowed to exist, that that does exist, I think, is that it's a reaction to something else that's going on simultaneously, which is: there are a lot of experts out there who are performing really sloppy science, sometimes outright fraudulent science, and they're frittering away whatever faith the general public or society has in their expertise and in their profession. And there are a ton of scientists out there, I would say the vast majority by far of scientists, who are legitimate, upstanding, upright dedicants to science, right? That's where they hang their hat, that's where their heart is, that's what they believe in, and that's what they work to support. But science has kind of a problem, Chuck, in that it's allowing way too much bias, which is what we're gonna talk about, to creep into science and undermine science and basically produce papers that are just useless and trash. And there's a whole lot of reasons for it, but it's something that needs to be addressed if we're ever going to get back on a footing with a faith in experts and expertise and just facts, that there are such things as objective facts. Yeah, I mean, a lot of times it's financially related, whether it's a lack of funding, a desire for more funding, a desire just to keep your lab running and the people on staff paid. Which, you know, all this stuff is understandable. You want to keep doing this work, but you can't let that get in the way.
It's like in Rushmore, at the end, when Margaret Yang faked the results of that science experiment because she didn't want it to be wrong. You know, I don't remember that part. Was that like a deleted scene? No, no, no. Was it at the end when they meet up? I think he's flying the kite with Dirk, and she's talking about her science fair project, and he was really impressed with it, and she was like, "I faked the results." And the reason why was because she didn't want to be wrong. And I think a lot of times people will get into a certain body of research or data, too, because they want to prove a certain thing, and if they can't, it might be really hard to live with that. So that weighs into it. Money for personal gain, advancing your career, you know, publish or perish, that whole thing. Like, we're gonna talk about all this, but there are a lot of reasons that it's been allowed to creep in. But all of it is a disservice to the fundamentals of what they based their careers on to begin with. Yeah, it's a disservice to science itself, right? Because the whole point of science, and of scientific publishing, the whole publishing industry, is to basically create a hypothesis, test your hypothesis, and then share the results with the world. And that's ideally what would happen, because you're building this body of scientific knowledge. But money and corporate interests and academic publishing have all kind of come in and taken control of this whole thing, and as a result, a lot of the stuff that gets published are trash papers that shouldn't be published, a lot of the really good papers that don't come up with sexy results don't get published, and then, like you said, people use science for personal gain.
There is a very small cadre of thoroughly evil people who are willing to use their scientific credentials to create doubt in the general public, to prevent people from understanding that climate change was real for twenty years, or that fossil fuels actually do contribute to anthropogenic climate change. But what we're mainly focusing on is bias in the sense that people carrying out studies are human beings, and human beings are flawed. We're just flawed, and we bring those flaws to our studies, and you really have to work hard at rooting those flaws and those biases out to produce a really good, thorough scientific study with good, reliable results that can be reproduced by anybody using the same methods. And science is just starting to wake up to the idea that it is really biased and that it needs to take these biases into account in order to progress forward from the point that it's at right now, which is tenuous, I think, perhaps more tenuous than ever, the point that science is at. I think so. Science isn't going away. It's not going anywhere. It's probably the greatest force humans have ever come up with, right? It's not going anywhere, but it is a terrible position that it's in, and it's going to take some genuine leadership in the scientific community, from a bunch of different quarters in a bunch of different fields, to basically step up and be like, guys, this is really bad and we need to change it now. And a lot of people need to be called out. And science typically shies away from naming names and calling out fraudulent scientists by name, because scientists seem to like to suppose the best in people, which is not always the case, right? And having said all of this, we could root out every bias and really clean up the scientific publishing community.
And there's still a certain set of people in this country and in the world who that wouldn't matter to, and who would still shut down facts because it doesn't fit their narrative. For sure. But those people have always been there, right, and they're always going to be there. You can call them contrarians, you can call them free thinkers, you can call them stubborn, you can call them purposefully ignorant, who knows. They're always going to exist. The problem with this crisis that science finds itself in right now is that it's allowed that population to grow and grow, and people who otherwise never really questioned science have been allowed to kind of trickle into that fold. And those are the people that we should be worried about, the ones who would know better if they believed in science again. Right. And our way into this is to talk about different kinds of biases, in true Stuff You Should Know fashion: a top ten that is not a top ten. That's exactly right. We ate into at least three in this intro. And hopefully, by shining a light on some of this stuff, people will at least be more aware of different biases. Well, yeah, you know, the first one is good old confirmation bias. I mean, these aren't ranked, because confirmation bias would probably be number one as far as people's awareness of it. But there are different examples that people use for confirmation bias, and I kind of enjoyed the one from the HowStuffWorks article, even though it was from 1903. After X-rays were discovered in Germany, there was a French scientist named René Blondlot. Yeah, he looked at the X-rays and said, "Wow... hey, I see N rays. I've discovered N rays." And everyone's like, "What's an N ray?" He said, "Well, it's like a corona when electricity discharges from a crystal, and you can only see it in your peripheral vision."
And an American, Robert Wood, laid the wood and said, "I'm gonna come to your lab and check this out," and secretly removed the crystals during one of the experiments, and Blondlot still saw these N rays. And so that's confirmation bias. He wanted to see those N rays. And then later, even though it was disproved, other French scientists supposedly published papers, or did publish papers, based on that research, because they wanted it to be true. So that's what confirmation bias is: when you're starting out with a hypothesis that is going to shape the methodology of your study to confirm it, right. And then it can also occur where you're interpreting info to fit your hypothesis, so you're seeking out stuff that supports your hypothesis, and then the results that are just there in front of you, you read like, "Ah, this thing proves that those N rays actually exist," or "This phenomenon cannot be due to anything but N rays, therefore N rays exist." All of it's confirmation bias. And like you said, that's number one, because that's not just a scientific bias. I mean, every human uses confirmation bias, and it's twofold: we avoid contradictory information because we don't like to be wrong, and we find information that confirms our point of view because we like to be right. That's confirmation bias, and it's everywhere, among everyone. That's right. Although I will say, I know it happens a lot politically, but myself and the people that I congregate with question their own leaders as much as they do leaders from the other parties. Oh, it's good, it's very good to do. And I don't know, there shouldn't be sacred cows in politics. That's a bad jam, you know.
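To make the N-ray story concrete, here is a minimal Python sketch of how confirmation bias can manufacture an "effect" out of pure noise. Every number and the drop-the-bad-trials rule are invented for illustration; the only point is that selectively keeping results that fit the hypothesis shifts the estimate away from the truth.

```python
# A minimal, hypothetical sketch of confirmation bias in data analysis.
# There is no real effect: every "observation" is pure noise.
import random

random.seed(42)

TRIALS = 1000
observations = [random.gauss(0, 1) for _ in range(TRIALS)]  # no N rays here

# Honest analyst: reports the average of everything that was measured.
honest_estimate = sum(observations) / len(observations)

# Biased analyst: treats borderline-negative trials as "equipment glitches"
# and quietly drops them, keeping anything that hints at the hoped-for effect.
kept = [x for x in observations if x > -0.2]
biased_estimate = sum(kept) / len(kept)

print(f"honest estimate of effect: {honest_estimate:+.3f}")
print(f"biased estimate of effect: {biased_estimate:+.3f}  (from {len(kept)} 'good' trials)")
```

Run it with a few different seeds: the honest estimate hovers around zero, while the curated one stays reliably positive.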
And I've always been at the forefront of calling out my own party's wrongs and saying, "No, no, no, you need to do better than that," whereas I see a lot of other people in other situations truly bury and ignore those things, because they just don't want to face that. Yeah. And it's not even like "I don't want to face it." It just doesn't fit their worldview, so they just don't include it. It just gets tossed out. But the point is, it's not an active process, necessarily, right. I think we should probably take our first break. I think so too, Chuck. All right, we'll be right back and talk about sampling bias right after this.

All right, Chuck, we're back, and we're coming back with something called sampling bias, which is, it turns out, a subtype of a larger thing called selection bias. And one other thing we should say, we kind of got into it before I could say this: there are different stages in a study where bias can occur. It can happen in the planning, the pre-study phase, it can happen during the actual study, and then it can happen after the study as well. And so when we're talking about any kind of selection bias, including sampling bias, this is pre-study bias; when you're actually setting up the study, that's where the bias is coming in. Yeah, and you know what, I think it also bears saying that bias is something you have to work really hard to avoid, because it's almost like a disease that's always trying to get involved. And it's not like, "Just do better, everybody, and quit being biased." It's way more complicated than that, because it is always knocking at the door, like you said, in all three phases, trying to sneak in there, and it takes a lot of work in all three phases to avoid it.
So I don't want it to come across as if it's as easy as us just saying, "You shouldn't do that, stop it." No. But the first step is recognizing that there's a lot of bias, and different kinds of bias, that are just sitting there waiting for a scientist. And then if you start admitting that it's there, you can start being on the lookout for it, and you can start adjusting for it, and then other people who read your papers, or, you know, read news articles about your papers, can be on the lookout for that kind of thing. Yeah, exactly. So sampling bias is, you know, your sample set not being an accurate, good representation of the whole. A lot of times you'll find this in studies that are really small scale, because you don't have a large sample and you don't have the kind of money, so you use who's near you. Like, maybe you work for a university, so you work with university students as your first sample set, who are not indicative of anything but, you know, people eighteen to twenty-one years old or so. Now, remember we talked about WEIRD: Western, educated, industrialized, rich, and democratic. That's exactly the thing. It's a decent place to start if you don't have much money and you want to get the ball rolling. It's not like, "Oh, you shouldn't do university studies at all using students," but those findings definitely don't represent the wider nation, and it needs to grow and get more funding if you want to actually have a legitimate claim to something. Another way that sampling bias can come up is from the group that you're recruiting from. Like, if you're doing a strictly online survey but you're trying to apply your findings to the wider society, that's just not gonna happen, because there's so many people who aren't internet savvy enough to take an internet survey.
Like, by nature, you are a little savvier than the average person if you're hanging out on the internet and taking a survey. I like to tell myself that, at least. And then tangential to that is something called self-selection bias. Say you're doing a study on wellness and, you know, what eating tuna can do for your health. People who are interested in wellness and health are going to be much more likely to volunteer for that study than people who couldn't care less about health and have no desire whatsoever to further science's understanding of what makes you healthy. So you would have to go out and find those people and recruit them, rather than just relying on the people who volunteered based on the flyer you put up in the student center. Right, or, you know, study all financial demographics, or poll all financial demographics, rather than just one. And, you know, sometimes the methodology with which they try to recruit people steers them in that direction unknowingly. I know in the article they talked about the presidential campaign with Roosevelt and Alf Landon, Republican Alf Landon. They were doing polling with, like, country club rosters and people who drove cars and stuff. At the time that was kind of a luxury, so it was all out of whack. Everyone was like, "Landon's gonna win in a landslide." It's because you basically stuck your polling to, you know, I don't know about wealthy individuals, but people who were a little more well off. And I think we talked about that in our polling episode, that fiasco with polling.
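Here is a small Python sketch of the self-selection problem just described. The population size, the wellness scores, and the volunteering rates are all invented; the takeaway is only that when the people most likely to opt in also differ on the thing being measured, the sample mean drifts away from the population mean.

```python
# Hypothetical sketch of self-selection / sampling bias in a wellness survey.
import random

random.seed(0)

population = []
for _ in range(100_000):
    health_conscious = random.random() < 0.3
    # Health-conscious people score higher on the wellness measure on average...
    score = random.gauss(70 if health_conscious else 55, 10)
    # ...and are far more likely to answer the flyer or online survey.
    volunteered = random.random() < (0.40 if health_conscious else 0.05)
    population.append((score, volunteered))

pop_mean = sum(s for s, _ in population) / len(population)
sample = [s for s, v in population if v]
sample_mean = sum(sample) / len(sample)

print(f"true population mean wellness: {pop_mean:5.1f}")
print(f"self-selected sample mean:     {sample_mean:5.1f}  (n={len(sample)})")
```

Actively recruiting or reweighting the under-represented group is how real surveys try to pull that sample mean back toward the population value.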
I also saw one more, too, that I want to mention, because it has a really great anecdote attached to it. It's called survivorship bias, where, when you're studying something, say like business or something, you're probably going to just be looking at the extant businesses, the businesses that have survived twenty years, thirty years, fifty years, or something like that, and you're not taking into account all of the failures. So when you put together, like, a prognosis for business in America, it might have a sunnier outlook than it should, because all you're looking at are the ones that managed to survive and thrive. And that's survivorship bias. And did you see the anecdote about the World War Two fighter planes? It was actually pretty funny, because they studied planes that had returned, that had been fired upon and managed to get back safely, and they're like, "Well, let's look at all these different bullet holes and where this plane was hit, and let's beef up all those areas on the body." And a mathematician named Abraham Wald said, "No, those are the places where they got shot and did okay. What you should really do is find the planes that actually went down and beef up those sections of the plane." And that's survivorship bias. It's failing to take into account the failures that have to do with what you're trying to study. What about channeling bias? Channeling bias is another kind of selection bias. Did you get this one? It wasn't the best example of channeling bias. Yeah, I mean, I got it, all right. Did you not get it? I got it, but it took a lot of work before I finally did. Well, it's basically when, let's say you have a patient, their degree of illness might influence what group they're put into.
So if a doctor or a surgeon was trying to study outcomes of a particular surgery, they might, because they're surgeons and they want to help people out, perform that surgery on maybe younger, healthier people, who might have better outcomes than someone who is in a different, higher age group. Right. And the article kind of ends it there, and I was like, so what's the problem? And I finally found this example where it says, okay, let's say you're studying a new heartburn medicine, something to treat, like, GERD, and it's new, it's hardcore, it's cutting edge. And the people who are likeliest to get this new hardcore antacid are the ones who are probably in worse shape, right? Say they're on the verge of going to the ER anyway. Well, if you look back at all of the people who have ever been prescribed this new hardcore antacid, you're gonna see, like, a lot of them ended up in the ER, even though it had nothing to do with this hardcore antacid. And then, similarly, the people who have so-so GERD, it's not particularly bad, they'll probably be prescribed the old drug, the standby that everybody knows is fine, that's going to work. So if you compare the old drug and the new drug, it looks like the old drug is super safe, but the new drug will put you into the ER. That's channeling. You've channeled different people with different prognoses into different groups, and they're kind of pitted against each other in a way that obscures the truth. If you wanted to really know the genuine health outcomes for that antacid, you would have to give it to people with not-so-bad GERD and people with really bad GERD and see what happens.
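As a rough illustration of that antacid example, here is a hypothetical Python simulation of channeling bias. In it the drug has no effect at all on ER visits; severity alone drives them, and severity also drives which drug gets prescribed, so the naive comparison makes the new drug look dangerous until you stratify by severity.

```python
# Hypothetical sketch of channeling bias: the new antacid gets prescribed to
# sicker patients, so a naive comparison makes it look dangerous even though
# the drug itself has no effect on ER visits in this simulation.
import random

random.seed(1)

patients = []
for _ in range(50_000):
    severe = random.random() < 0.3                           # how bad the GERD is
    drug = "new" if random.random() < (0.8 if severe else 0.1) else "old"
    er_visit = random.random() < (0.30 if severe else 0.05)  # driven by severity only
    patients.append((severe, drug, er_visit))

def er_rate(rows):
    return sum(er for *_, er in rows) / len(rows)

for d in ("old", "new"):
    rows = [p for p in patients if p[1] == d]
    print(f"{d} drug, all patients:     ER rate {er_rate(rows):.3f}")

# Stratifying by severity removes the apparent difference.
for d in ("old", "new"):
    for sev in (False, True):
        rows = [p for p in patients if p[1] == d and p[0] == sev]
        label = "severe" if sev else "mild  "
        print(f"{d} drug, {label} patients: ER rate {er_rate(rows):.3f}")
```

Stratifying after the fact, or randomizing who gets which drug in the first place, is what separates the drug's effect from the severity effect.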
See if the ER visits continue for people who wouldn't otherwise be going to the ER. And that matters for everyone, right, not just for you. Like, if you're debating surgery and you're like, "Oh, well, it shows really good outcomes," it's like, "Well, yeah, but who are they operating on?" Right. Right? Yes. So I would like to invite anyone who got what I was saying, or got channeling because of what I was saying, I invite you to email and let me know. I'm doing a little bit of a survey here, and I'd like to know if I confused things more or made it more understandable. Well, I know it's funny either way. I got that part. I'm just trying to figure out if it's understandable. But here's the problem with your methodology: you're talking about the Stuff You Should Know listener, who by nature is smarter than your average bear. Well, I'm not going to publish it. I'm gonna file-drawer it either way. Oh, what a teaser. Question order bias is the next one. And this is mainly, obviously, when you're doing, like, polling and stuff, or an online survey, or, you know, it could be just asking people a set of questions, like in a social science research setting, and the way you order things can affect the outcome. And this has to do with everything from the brain's tendency to organize information into patterns to the brain simply paying attention more and being more interested early on. Like, I know there was one, the General Social Survey, a big long-term study of American attitudes, where people were asked to identify the three most important qualities for a child to have, and they were given a list of these qualities. When honesty was listed higher on the list, it was picked sixty-six percent of the time. When it was further down on the list, it was picked forty-eight percent of the time.
And that's simply because people are just reading the list, going, "important, important," and then, yeah, by the time they got, you know, three quarters of the way through the list, they started thinking about what they're gonna have for dinner. People get pooped when you're giving them lists of stuff. Or you can prime people and get them all sort of worked up. Like, if you have a question like, "During the Trump administration, how mad were you at that guy about stuff he did?" and you're like, "Super mad," and then you were asked, "Well, how did you feel generally about how your life was affected during his administration?" you might say it was awful. Whereas if they hadn't asked that first question, if they were just like, "What was your life like from two thousand and..." I'm blocking out the dates. You might say, "Well, you know, it was okay, man. I had a lot of sandwiches over those four years, though." Yeah. So, like you said, that's priming, which is a big thing that you have to worry about when you're doing any kind of survey. So there's some of the ways that you can combat that. You can randomize your question order. Sometimes you'll have a survey where one question is predicated on a previous question, so one thing you might want to do is ask that set of questions in a few different ways, so that you can kind of compare the answers to all three, add them up and divide them by three, and there's your average answer, kind of thing. There's a lot of things you can do to kind of, I guess, manipulate, to de-manipulate, your respondent when you're doing a survey like that. De-manipulate? De-manipulate. Look it up. You won't find anything on it, but you could look it up still. Oh, it's a Roxy Music album. Wow, Chuck. Wow, that was great. What's next?
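Below is a toy Python model of that list-position effect and of the randomization fix just mentioned. The qualities, the attention-decay weights, and the resulting percentages are all invented; the point is that a fixed presentation order bakes position into the result, while randomizing the order per respondent averages it out.

```python
# Hypothetical sketch of question-order / list-position bias and how
# randomizing the presentation order per respondent smooths it out.
import random

random.seed(2)

QUALITIES = ["honesty", "curiosity", "kindness", "grit",
             "obedience", "humor", "patience", "thrift"]

def pick_three(presented):
    # Respondents in this toy model pay more attention to early items:
    # the selection weight falls off with position in the presented list.
    weights = [1.0 / (1 + 0.35 * i) for i in range(len(presented))]
    chosen = set()
    while len(chosen) < 3:
        chosen.add(random.choices(presented, weights=weights, k=1)[0])
    return chosen

def share_picking_honesty(order_fn, n=20_000):
    hits = sum("honesty" in pick_three(order_fn()) for _ in range(n))
    return hits / n

fixed_top  = lambda: QUALITIES                      # honesty listed first
fixed_last = lambda: QUALITIES[1:] + QUALITIES[:1]  # honesty listed last
randomized = lambda: random.sample(QUALITIES, len(QUALITIES))

print("honesty picked, listed first:", share_picking_honesty(fixed_top))
print("honesty picked, listed last: ", share_picking_honesty(fixed_last))
print("honesty picked, randomized:  ", share_picking_honesty(randomized))
```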
So with question order bias, we've entered the during-the-study kind of bias, like, this is while you're actually conducting the study, and so is interviewer bias. Interviewer bias, well, question order bias has to do more with the study design, but it's a bias that emerges during the study; interviewer bias just straight up is in the middle of the study, and it has to do with the person actually asking the questions in an interview. It can also, I think, apply to somebody conducting, like, a clinical trial on a drug. If they know whether somebody is getting placebo or not, it might affect their behavior. But ultimately what it is, is the person who's wearing the white lab coat influencing the outcome of the study just through their behavior, through their tone of voice, through the way that they're asking a question. Sometimes it can be really overt. Like, let's say a super devout Christian is, you know, doing a study on what part of the population believes Jesus saves, and they might be like, "You know Jesus... 'Do you think Jesus saves?' is the question. Don't you? It seems like it, huh?" That kind of thing would be a pretty extreme example, but sometimes how you understand things is in the absurdities, you know. Yeah, I thought this example in the HowStuffWorks article was kind of funny. It was about just, like, a medical questionnaire, where the interviewer knows that the subject has the disease that they're talking about, and they may probe more intensely for the known risk factors. And they gave smoking as an example, and it said they may say something like, "Are you sure you've never smoked? Never? Not even one?" Like, if I heard that coming from a researcher, even without knowing a lot about this, I would say, "What kind of a researcher are you?"
Like, it seems like you're looking for an answer. You should say, "You are ethically compromised." Or even facial expressions, or, you know, body language, all that stuff weighs in. I don't know, an eyebrow raise. Like, why don't they just have the robots, Alexa or Google or Siri or somebody, ask them? Well, that's one good thing about something like an internet survey: it's just questions, and as long as you design the questions and you randomize their presentation, it's gonna be fairly helpful in that respect. But then it has kind of its own pitfalls and pratfalls. You can attract a lot of people who are just taking it to mess with you, and there's a lot of problems with all of it. But again, if you're aware of all the problems, you can plan for them. And then even if you can't plan for them or control them, you can write about it in the actual study. I remember running across studies before where they're basically like, "There was a kind of bias that we couldn't control for, so we can't really say whether it affected the outcome or not." And I thought, wow, this is really refreshing and even daring, kind of. Like, I was thrilled. But you don't see that very often. But that, from what I understand, is the direction that science is going toward now. Well, and the reason you don't see that, and it's something we'll talk about, is what actually ends up getting published. It may be less likely to get published if they're like, hey, you know what... You know what I'm saying? Yeah, I know. So let's do recall and acquiescence bias, because they're very much related, and then we'll take a break. That's our plan. What do you think of it? Everyone says it sounds good to me. All right.
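One standard guard against the lab-coat effect described above is blinding, so the person asking the questions or handing out pills doesn't know who is in which group. Here is a small, purely illustrative Python sketch of generating a double-blind assignment; the participant IDs and group sizes are made up.

```python
# Hypothetical sketch of one guard against interviewer bias: double-blinding.
# The interviewer only ever sees opaque participant IDs; the placebo/drug key
# is held separately until all the data have been collected.
import random

random.seed(3)

participants = [f"P{i:03d}" for i in range(1, 21)]
arms = ["drug", "placebo"] * (len(participants) // 2)
random.shuffle(arms)

# The key stays with a third party, not with whoever runs the interviews.
blinding_key = dict(zip(participants, arms))

# What the interviewer gets: just participant IDs in a random visit order.
visit_order = random.sample(participants, len(participants))
print("interviewer sees:", visit_order)

# Only after all responses are recorded is the key merged back in:
# results = {p: blinding_key[p] for p in visit_order}  # done at analysis time
```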
So this is also during-study, and in very much the way that an interviewer can influence the outcome, the participant can actually influence the outcome, too, especially if they're being asked questions or they're being asked to self-report. There's a couple of ways that us just being humans can foul up the works on the findings of a study. The first is recall bias. Yeah, this is when, obviously, you're trying to recall something from the past, and it's amazing what might jump out at you from your past when probed with a certain question, certain correlations that really have nothing to do with it, or with you. But you may be like, "Oh, well, you know what, now that I think back, I remember around that time I was watching a lot of Dancing with the Stars. I kind of binged that show. So maybe that's why I had homicidal tendencies." I don't think you need a study to prove that. I think that's just intuition, you know. Yeah. And if enough people do that, especially if there's something out kind of in, like, the zeitgeist about that, how, like, people who watch too much Dancing with the Stars want to kill other people, a number of your participants might recall that thing, whereas other people who don't watch Dancing with the Stars aren't going to recall that. And so, in the same way that survivorship bias works, those people who don't have that memory to recall can't possibly be included in the study results, which means that Dancing with the Stars is going to kind of percolate to the top as, like, a major risk factor in homicidal tendencies. Right. That's not good. You don't want Dancing with the Stars unfairly canceled. You want it to be canceled because it is terrible. I've never seen it. I'm sure it's great if you're into dancing. I haven't either.
But watch, we're gonna be asked to 604 00:35:28,520 --> 00:35:33,080 Speaker 1: be on. Oh my God, and they'd have to change 605 00:35:33,120 --> 00:35:35,480 Speaker 1: the name of the show to Dancing with the Mid 606 00:35:35,560 --> 00:35:42,440 Speaker 1: level Internet Famous. Right, exactly. Wow, dust off my jazz shoes. 607 00:35:43,520 --> 00:35:47,000 Speaker 1: It would be us and Chocolate Rain. And you know 608 00:35:47,280 --> 00:35:50,239 Speaker 1: I love that guy. Tay Zonday is his name. Yeah, 609 00:35:50,239 --> 00:35:54,200 Speaker 1: we actually met him that time. I remember, it was great. Man. Um. 610 00:35:54,280 --> 00:35:56,759 Speaker 1: Another thing, Chuck, that has to do with 611 00:35:56,800 --> 00:36:00,440 Speaker 1: recall bias is that, um, like, we just tend to have 612 00:36:00,560 --> 00:36:04,480 Speaker 1: faulty memories with stuff that makes us look bad, 613 00:36:04,920 --> 00:36:09,000 Speaker 1: like, say, unhealthy habits. So if you're doing a 614 00:36:09,080 --> 00:36:15,239 Speaker 1: study on, um, junk food and health outcomes, and 615 00:36:15,360 --> 00:36:18,120 Speaker 1: you interview a bunch of people who are in terrible health, 616 00:36:18,719 --> 00:36:21,799 Speaker 1: and all of them are like, I only eat Cheez-Its 617 00:36:21,840 --> 00:36:23,960 Speaker 1: like once in a blue moon, or something like that, 618 00:36:24,000 --> 00:36:25,960 Speaker 1: to the researcher, right. So, once-in-a-blue-moon Cheez-Its. 619 00:36:26,000 --> 00:36:30,360 Speaker 1: Um, the results of the study are gonna 620 00:36:30,480 --> 00:36:35,080 Speaker 1: suggest that it takes just a very small amount of Cheez-Its 621 00:36:35,080 --> 00:36:38,120 Speaker 1: to put you in the hospital with long term 622 00:36:38,239 --> 00:36:43,400 Speaker 1: chronic health conditions. And that is a problem with 623 00:36:43,520 --> 00:36:46,880 Speaker 1: recall bias. Like, it is. It's the participants affecting 624 00:36:46,880 --> 00:36:49,560 Speaker 1: it in this case, because they just aren't paying attention 625 00:36:49,600 --> 00:36:52,480 Speaker 1: to, aren't really thinking about, no, you've eaten a 626 00:36:52,480 --> 00:36:54,839 Speaker 1: lot of Cheez-Its, and it takes a lot of Cheez-Its 627 00:36:54,880 --> 00:36:56,680 Speaker 1: to put you in the hospital, not a very 628 00:36:56,680 --> 00:36:59,279 Speaker 1: little amount. It's not the best example, but it kind 629 00:36:59,280 --> 00:37:01,200 Speaker 1: of gets the point across, I think. Now, is this 630 00:37:01,239 --> 00:37:04,880 Speaker 1: part of acquiescence bias? No, that was the 631 00:37:05,000 --> 00:37:08,319 Speaker 1: end of recall bias. Acquiescence bias is different, but it's 632 00:37:08,320 --> 00:37:11,440 Speaker 1: certainly related. Both of them kind of fall under an 633 00:37:11,520 --> 00:37:15,560 Speaker 1: umbrella of participant bias. Yeah, and acquiescence bias. I feel 634 00:37:15,560 --> 00:37:18,040 Speaker 1: like there's the opposite too.
I just don't know if they 635 00:37:18,040 --> 00:37:22,160 Speaker 1: have a name, um, because acquiescence bias is generally like, 636 00:37:23,040 --> 00:37:25,920 Speaker 1: people want to be agreeable, and they want to 637 00:37:26,000 --> 00:37:29,799 Speaker 1: answer in the affirmative, and they want to, especially, they 638 00:37:29,840 --> 00:37:34,560 Speaker 1: found, um, if you are maybe less educated, you might 639 00:37:34,600 --> 00:37:36,759 Speaker 1: be more willing to just go along with something and 640 00:37:36,760 --> 00:37:40,880 Speaker 1: say yeah, sure, yeah, yeah, um, to maybe appear smarter 641 00:37:41,160 --> 00:37:43,799 Speaker 1: or just to be more agreeable. I do think 642 00:37:43,840 --> 00:37:48,680 Speaker 1: the opposite can happen too, especially with political, um, research 643 00:37:48,920 --> 00:37:51,959 Speaker 1: and social studies, and I think there are also 644 00:37:51,960 --> 00:37:55,960 Speaker 1: people that are like, oh, you're from the what? Well, yeah, sure, 645 00:37:56,000 --> 00:37:58,399 Speaker 1: I'd love to be interviewed, and then they go into 646 00:37:58,440 --> 00:38:01,440 Speaker 1: it with a sort of opposite mentality, where they're completely 647 00:38:01,440 --> 00:38:05,160 Speaker 1: disagreeable no matter what anyone says or asks. Yeah, I 648 00:38:05,160 --> 00:38:08,200 Speaker 1: didn't run across that, but I'm absolutely sure that 649 00:38:08,360 --> 00:38:10,440 Speaker 1: is a bias out there. But you can avoid these 650 00:38:10,520 --> 00:38:15,080 Speaker 1: by doing it more smartly, right, more smartly. Yeah, there's 651 00:38:15,120 --> 00:38:19,160 Speaker 1: ways that you can, um, you can frame your questions, 652 00:38:19,200 --> 00:38:21,600 Speaker 1: like, people don't like to admit that they 653 00:38:21,600 --> 00:38:26,640 Speaker 1: didn't actually vote in American democracy, so instead of saying, 654 00:38:26,640 --> 00:38:29,640 Speaker 1: there was a Pew suggestion, 655 00:38:30,400 --> 00:38:35,440 Speaker 1: um, where they said, um, rather than saying, like, 656 00:38:35,480 --> 00:38:37,880 Speaker 1: did you vote in the last election, a lot of 657 00:38:37,960 --> 00:38:39,960 Speaker 1: people who didn't vote are gonna be like, sure, yeah, 658 00:38:39,960 --> 00:38:43,560 Speaker 1: of course, why would you ask that? Um, instead 659 00:38:43,600 --> 00:38:46,600 Speaker 1: you would phrase it as like, um, in the two 660 00:38:46,600 --> 00:38:51,120 Speaker 1: thousand and twelve presidential election, uh, did things come up 661 00:38:51,160 --> 00:38:53,560 Speaker 1: that prevented you from voting? Or were you able to vote? 662 00:38:54,400 --> 00:38:57,440 Speaker 1: And you would probably actually want to train your researcher 663 00:38:57,560 --> 00:39:02,120 Speaker 1: to use that same intonation to make it seem casual 664 00:39:02,200 --> 00:39:04,440 Speaker 1: either way. Like, you want to give the person a 665 00:39:04,560 --> 00:39:07,720 Speaker 1: sense of comfort that they're not being judged no matter 666 00:39:07,800 --> 00:39:11,040 Speaker 1: how they answer. That's a good way 667 00:39:11,080 --> 00:39:16,759 Speaker 1: to get around acquiescence bias. Absolutely. Yeah, the old 668 00:39:16,800 --> 00:39:21,680 Speaker 1: backdoor policy. That's right, where you can squeeze a woman. 669 00:39:23,840 --> 00:39:26,680 Speaker 1: All right, are we taking a break?
I think 670 00:39:27,160 --> 00:39:30,480 Speaker 1: we're mandated by the FCC to do that after that joke, 671 00:39:30,480 --> 00:39:32,120 Speaker 1: all right, well, we'll be back and finish up with 672 00:39:32,160 --> 00:39:56,319 Speaker 1: our final two biases right after this. I also 673 00:39:56,320 --> 00:39:58,319 Speaker 1: want to apologize to all the parents who listen with 674 00:39:58,360 --> 00:40:01,879 Speaker 1: their six year olds these days. My daughter is six. 675 00:40:01,920 --> 00:40:05,040 Speaker 1: She doesn't care about what we do. That's great. So 676 00:40:05,080 --> 00:40:07,799 Speaker 1: it's still just flying overhead, right? I mean, she doesn't 677 00:40:07,800 --> 00:40:10,480 Speaker 1: even listen. She likes Movie Crush a little bit. Well, 678 00:40:10,560 --> 00:40:13,239 Speaker 1: some kids that are six listen, and hey, shout out 679 00:40:13,280 --> 00:40:15,120 Speaker 1: to all of you guys listening. No, I'm always, whenever 680 00:40:15,120 --> 00:40:18,120 Speaker 1: I see that, whenever someone writes in and says their kid 681 00:40:18,239 --> 00:40:22,040 Speaker 1: my daughter's age actually listens, I'm like, really? Oh, yes, 682 00:40:22,440 --> 00:40:25,960 Speaker 1: this is my daughter. She loves it. Right? Yeah, and 683 00:40:26,000 --> 00:40:29,279 Speaker 1: she voted in the last election too. Like, uh, my 684 00:40:29,360 --> 00:40:32,760 Speaker 1: daughter likes to watch videos of kids playing with toys 685 00:40:32,920 --> 00:40:36,239 Speaker 1: on YouTube Kids. Is she into that now? Those are 686 00:40:36,280 --> 00:40:38,200 Speaker 1: the worst videos. I'm starting to get on her now, 687 00:40:38,239 --> 00:40:40,680 Speaker 1: just in terms of taste. I'm like, hey, you 688 00:40:40,680 --> 00:40:42,720 Speaker 1: can watch something, but like, watch something with a story 689 00:40:42,800 --> 00:40:46,319 Speaker 1: that's good. It's like, this is garbage. I like it. 690 00:40:48,000 --> 00:40:51,120 Speaker 1: I can totally see her saying it just like that, a 691 00:40:51,160 --> 00:40:56,040 Speaker 1: defiant and happy impression. All right. Publication bias is one 692 00:40:56,040 --> 00:40:57,920 Speaker 1: we kind of poked around earlier a little bit 693 00:40:57,960 --> 00:41:01,240 Speaker 1: with the whole publish-or-perish mentality. Can I add something 694 00:41:01,280 --> 00:41:04,120 Speaker 1: more to that real quick before we get into publication bias, 695 00:41:04,840 --> 00:41:08,319 Speaker 1: you know, just to add to 696 00:41:08,400 --> 00:41:12,680 Speaker 1: talking about publication in general? So I don't think that 697 00:41:12,760 --> 00:41:16,759 Speaker 1: it's fully grasped by most people.
It certainly wasn't by 698 00:41:16,760 --> 00:41:21,040 Speaker 1: me until really diving into this, that the academic publishing 699 00:41:21,080 --> 00:41:25,560 Speaker 1: industry has a stranglehold on science right now, with a 700 00:41:25,760 --> 00:41:30,399 Speaker 1: very similar effect to what twenty four hour cable news had 701 00:41:30,760 --> 00:41:35,480 Speaker 1: on, like, journalism, to where it was like it became 702 00:41:35,520 --> 00:41:38,799 Speaker 1: this voracious beast that was willing to just spit out 703 00:41:38,880 --> 00:41:42,279 Speaker 1: money constantly. Feed it, in exchange for, yeah, feed it, 704 00:41:42,320 --> 00:41:44,560 Speaker 1: give me more stories, give me more, give me more pundits, 705 00:41:44,560 --> 00:41:46,960 Speaker 1: give me, like, that was the rise of pundits. Pundits 706 00:41:47,000 --> 00:41:50,000 Speaker 1: didn't really exist prior to that. They just hung out 707 00:41:50,000 --> 00:41:52,719 Speaker 1: on the editorial pages of newspapers. And then twenty four 708 00:41:52,719 --> 00:41:56,400 Speaker 1: hour news came along, and there's not possibly enough news stories, 709 00:41:56,800 --> 00:41:59,359 Speaker 1: like good news stories, to keep going for twenty four hours, 710 00:41:59,360 --> 00:42:00,800 Speaker 1: so you have to talk about the news stories and 711 00:42:00,840 --> 00:42:03,279 Speaker 1: analyze them, and then you start getting into who's wrong 712 00:42:03,320 --> 00:42:06,520 Speaker 1: and all that stuff. The publishing industry is very much 713 00:42:06,560 --> 00:42:09,400 Speaker 1: like that now, where it's this beast that must be fed, 714 00:42:09,719 --> 00:42:13,480 Speaker 1: and so there can't possibly be that many high 715 00:42:13,600 --> 00:42:18,160 Speaker 1: quality scientific papers. So scientific papers have just kind of 716 00:42:18,239 --> 00:42:22,200 Speaker 1: dipped down in quality. And then, um, one of the 717 00:42:22,200 --> 00:42:25,040 Speaker 1: other things that the publishing industry has done is said, 718 00:42:26,320 --> 00:42:30,879 Speaker 1: we really like studies that have results. They're called 719 00:42:30,920 --> 00:42:34,680 Speaker 1: positive results, where, like, it turned out that you 720 00:42:34,800 --> 00:42:39,520 Speaker 1: found a correlation between something, or the compound you tried 721 00:42:39,560 --> 00:42:42,279 Speaker 1: on that tumor shrunk the tumor. Like, those are what 722 00:42:42,320 --> 00:42:45,560 Speaker 1: we're interested in. The whole furthering of science with positive 723 00:42:45,600 --> 00:42:48,840 Speaker 1: and negative outcomes, just to say this did work, this 724 00:42:48,920 --> 00:42:52,440 Speaker 1: doesn't work, don't bother trying it, we don't care about 725 00:42:52,440 --> 00:42:54,920 Speaker 1: that kind of stuff. And that's a huge issue for 726 00:42:55,000 --> 00:42:57,960 Speaker 1: the scientific community. Like, they have to get control of 727 00:42:58,080 --> 00:43:01,440 Speaker 1: the publishing community again, um, if they're going to 728 00:43:01,480 --> 00:43:03,560 Speaker 1: come out from under this dark cloud.
Yeah, I mean 729 00:43:03,560 --> 00:43:05,760 Speaker 1: they found in a two thousand and ten study 730 00:43:06,400 --> 00:43:11,160 Speaker 1: that papers in social sciences especially were about two and 731 00:43:11,160 --> 00:43:14,400 Speaker 1: a half, I'm sorry, two point three, times more likely 732 00:43:14,440 --> 00:43:19,239 Speaker 1: to show positive results than papers in physical sciences. Even 733 00:43:19,320 --> 00:43:22,920 Speaker 1: so, some bodies of research are even more apt, 734 00:43:23,600 --> 00:43:27,200 Speaker 1: uh, to publish positive results. And that means 735 00:43:27,239 --> 00:43:30,839 Speaker 1: you know this going into your profession, and you 736 00:43:30,880 --> 00:43:35,000 Speaker 1: know this going into your set of research, and, 737 00:43:35,080 --> 00:43:37,640 Speaker 1: you know, that's when it becomes sort of put up 738 00:43:37,719 --> 00:43:41,000 Speaker 1: or shut up time, as far as standing firm on 739 00:43:42,200 --> 00:43:46,680 Speaker 1: doing good work even if it doesn't get published, right, 740 00:43:46,719 --> 00:43:49,680 Speaker 1: and so that confirmation bias can really come in, 741 00:43:50,040 --> 00:43:56,960 Speaker 1: where you start, hopefully inadvertently, but certainly not inadvertently in all cases, 742 00:43:56,960 --> 00:43:59,839 Speaker 1: cherry picking data to get a positive outcome where 743 00:44:00,239 --> 00:44:03,239 Speaker 1: there really wasn't one there before. Or you use 744 00:44:03,360 --> 00:44:06,759 Speaker 1: kind of a weird statistical method to suss out 745 00:44:06,880 --> 00:44:11,839 Speaker 1: the correlation between the variables so that you can have 746 00:44:12,000 --> 00:44:16,319 Speaker 1: a positive outcome. Because if you're not publishing papers, like, 747 00:44:16,400 --> 00:44:19,320 Speaker 1: your academic career is not progressing, and you can actually, 748 00:44:19,360 --> 00:44:22,239 Speaker 1: like, lose jobs, so you need to be published. The 749 00:44:22,280 --> 00:44:26,040 Speaker 1: publishing industry wants your paper, but they just want positive outcomes. 750 00:44:26,200 --> 00:44:31,120 Speaker 1: So a high quality, well designed, well executed study that 751 00:44:31,160 --> 00:44:34,759 Speaker 1: found a negative outcome, to where they said, well, this 752 00:44:34,840 --> 00:44:38,840 Speaker 1: compound we tried didn't actually shrink the tumor, that's 753 00:44:38,880 --> 00:44:41,280 Speaker 1: going to be ignored in favor of a low quality 754 00:44:41,320 --> 00:44:45,000 Speaker 1: paper that found some compound that shrunk a tumor, just 755 00:44:45,080 --> 00:44:48,000 Speaker 1: because they like positive outcomes. It's ridiculous. Yeah. And I 756 00:44:48,000 --> 00:44:49,480 Speaker 1: mean that kind of goes hand in hand with the 757 00:44:49,560 --> 00:44:52,000 Speaker 1: last one. You know, there's a lot of overlap with 758 00:44:52,040 --> 00:44:54,160 Speaker 1: these and a lot that work sort of in concert 759 00:44:54,200 --> 00:44:58,040 Speaker 1: with one another. And file drawer bias is, you know, 760 00:44:58,320 --> 00:45:00,040 Speaker 1: it is what it sounds like.
It's like, you 761 00:45:00,200 --> 00:45:03,359 Speaker 1: got a negative outcome, and whether or not you were 762 00:45:03,400 --> 00:45:06,719 Speaker 1: being funded by a company that definitely doesn't want that 763 00:45:06,760 --> 00:45:09,560 Speaker 1: information getting out there, or if it's just as a 764 00:45:09,560 --> 00:45:13,520 Speaker 1: result of it being less likely to be published because 765 00:45:13,520 --> 00:45:16,160 Speaker 1: it doesn't have a positive outcome, you just stick it 766 00:45:16,160 --> 00:45:18,719 Speaker 1: in the file drawer and it goes bye-bye. Right, 767 00:45:18,760 --> 00:45:20,840 Speaker 1: and again, like, part of the point of science and 768 00:45:20,840 --> 00:45:23,960 Speaker 1: scientific publishing is to generate this body of knowledge. So 769 00:45:24,280 --> 00:45:26,320 Speaker 1: if you're about to do a study, you can search 770 00:45:26,360 --> 00:45:28,520 Speaker 1: and say, oh, somebody already tried the same exact thing 771 00:45:28,520 --> 00:45:30,759 Speaker 1: and they found that it doesn't work. I'm gonna not 772 00:45:30,880 --> 00:45:33,279 Speaker 1: try to reproduce that. I'm just gonna not go with it, 773 00:45:33,719 --> 00:45:36,239 Speaker 1: um, and move on to try something else. It's a 774 00:45:36,320 --> 00:45:40,520 Speaker 1: huge waste of resources otherwise. And then also, 775 00:45:40,960 --> 00:45:45,400 Speaker 1: if you aren't publishing that kind of stuff, um, 776 00:45:45,480 --> 00:45:48,839 Speaker 1: you're missing out on, well, I mean, you're missing out 777 00:45:48,880 --> 00:45:53,120 Speaker 1: on the real data. If the bad data is file 778 00:45:53,200 --> 00:45:56,240 Speaker 1: drawered, like, you're missing out on the truth. You're missing 779 00:45:56,239 --> 00:46:01,919 Speaker 1: out on the whole picture. Right. And also, again, it's 780 00:46:02,000 --> 00:46:04,600 Speaker 1: not just that the poor negative outcomes need to 781 00:46:04,640 --> 00:46:08,000 Speaker 1: be included too. Yes, that's true, but you're also promoting 782 00:46:08,440 --> 00:46:13,560 Speaker 1: positive outcome studies that actually aren't good studies. There's this 783 00:46:13,560 --> 00:46:17,000 Speaker 1: thing called the Proteus effect, where the initial studies, these 784 00:46:17,040 --> 00:46:22,160 Speaker 1: initial papers on a subject, um, in sevent cases, a 785 00:46:22,320 --> 00:46:25,919 Speaker 1: follow up study that seeks to reproduce them can't reproduce them. 786 00:46:26,000 --> 00:46:28,319 Speaker 1: They don't come to the same finding, the same conclusions, 787 00:46:28,560 --> 00:46:32,440 Speaker 1: which suggests that the study was really terrible. Um, if 788 00:46:32,480 --> 00:46:35,359 Speaker 1: it can't be reproduced, or if it's reproduced and somebody comes 789 00:46:35,360 --> 00:46:38,880 Speaker 1: to a different finding, a different conclusion, that's not a good study. 790 00:46:39,239 --> 00:46:43,320 Speaker 1: So the idea of publishing positive and negative outcomes together 791 00:46:43,800 --> 00:46:47,960 Speaker 1: would definitely kind of slow that whole crazy twenty four 792 00:46:47,960 --> 00:46:51,680 Speaker 1: hour news cycle of positive outcome studies.
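The file drawer and publication bias discussion above can be illustrated with a toy simulation. Nothing here comes from the episode or its sources; the sample size, the zero true effect, and the rough two-standard-error significance cutoff are assumptions chosen only to show how a literature that prints nothing but "positive" results overstates an effect that is not actually there.

```python
import random
import statistics

def run_study(rng, n=30, true_effect=0.0):
    """Simulate one two-group study; here the true effect is zero."""
    control = [rng.gauss(0.0, 1.0) for _ in range(n)]
    treated = [rng.gauss(true_effect, 1.0) for _ in range(n)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(control) / n + statistics.variance(treated) / n) ** 0.5
    significant = abs(diff) > 2 * se   # crude z-test-style cutoff, roughly p < 0.05
    return diff, significant

rng = random.Random(7)
results = [run_study(rng) for _ in range(1000)]
published = [diff for diff, sig in results if sig]   # file drawer: only "hits" see print

print("studies run:             ", len(results))
print("studies published:       ", len(published))
print("mean |effect|, all:      ", round(statistics.mean(abs(d) for d, _ in results), 3))
print("mean |effect|, published:", round(statistics.mean(abs(d) for d in published), 3))
```

With a true effect of zero, roughly five percent of the simulated studies clear the cutoff by chance, and that published subset shows a noticeably larger average effect than the full set, which is the distortion a design-first acceptance policy like the one described below tries to remove.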
I don't see how 793 00:46:51,719 --> 00:46:57,200 Speaker 1: it's even legal to bury, not bury, but I guess 794 00:46:57,280 --> 00:47:00,239 Speaker 1: just, not even just file drawer, a study 795 00:47:01,000 --> 00:47:04,799 Speaker 1: that included, like, a drug having negative effects. Like, and 796 00:47:04,840 --> 00:47:06,879 Speaker 1: I know that Congress has stepped up to try and 797 00:47:07,400 --> 00:47:11,680 Speaker 1: pass laws too. I think there was one, uh, in 798 00:47:11,840 --> 00:47:17,240 Speaker 1: two thousand seven requiring researchers to report results of human 799 00:47:17,239 --> 00:47:21,680 Speaker 1: studies of experimental treatments. Uh, and then they tried to 800 00:47:21,680 --> 00:47:25,640 Speaker 1: strengthen that in twenty sixteen. Basically it's like, you know, even 801 00:47:25,680 --> 00:47:27,640 Speaker 1: if your drug doesn't come to market, like, we need 802 00:47:27,680 --> 00:47:31,439 Speaker 1: to have these studies and the results. Like, how's 803 00:47:31,440 --> 00:47:33,839 Speaker 1: it even legal? It seems like you're burying it, and it's 804 00:47:33,840 --> 00:47:37,400 Speaker 1: almost falsification. Well, it is, for sure, because you're also, 805 00:47:37,520 --> 00:47:40,040 Speaker 1: like, if you're talking about studies where you 806 00:47:40,120 --> 00:47:44,840 Speaker 1: have multiple studies on, say, one drug that's an antidepressant, 807 00:47:45,560 --> 00:47:48,040 Speaker 1: and all you're doing is publishing the ones that have 808 00:47:48,160 --> 00:47:52,040 Speaker 1: positive outcomes for that antidepressant, and you're just not publishing 809 00:47:52,080 --> 00:47:55,480 Speaker 1: the ones that showed no outcomes or maybe even harm, 810 00:47:55,960 --> 00:47:58,520 Speaker 1: then yeah, that should be illegal, especially when you're talking 811 00:47:58,520 --> 00:48:01,760 Speaker 1: about something like an antidepressant or in the biomedical field. 812 00:48:02,080 --> 00:48:06,600 Speaker 1: But it's certainly unethical for any field of science in particular 813 00:48:06,719 --> 00:48:08,760 Speaker 1: to just bury the stuff you don't like that doesn't support 814 00:48:08,760 --> 00:48:11,000 Speaker 1: your conclusion. It's kind of a meta form of, 815 00:48:11,440 --> 00:48:15,040 Speaker 1: um, confirmation bias, just putting aside the stuff that doesn't 816 00:48:15,040 --> 00:48:18,840 Speaker 1: fit your hypothesis or your worldview, and, um, just promoting 817 00:48:18,840 --> 00:48:23,280 Speaker 1: the stuff that does. I saw one way around 818 00:48:23,280 --> 00:48:26,120 Speaker 1: this is the Lancet, you know, the very respected medical journal. 819 00:48:26,160 --> 00:48:30,080 Speaker 1: I think it's British. The Lancet, um, has taken to 820 00:48:30,840 --> 00:48:35,919 Speaker 1: um, accepting papers based on the study design and methodology 821 00:48:35,920 --> 00:48:38,920 Speaker 1: and goals. So when you first plan your study and 822 00:48:38,920 --> 00:48:41,440 Speaker 1: you have it all together, before you ever start, that's 823 00:48:41,520 --> 00:48:44,440 Speaker 1: when you would apply to have your paper considered and 824 00:48:44,560 --> 00:48:47,319 Speaker 1: published in the Lancet, and that's when they decide whether 825 00:48:47,360 --> 00:48:50,359 Speaker 1: it's a high quality enough study to publish or not.
826 00:48:50,760 --> 00:48:53,239 Speaker 1: So then they're locked into publishing your study, whether your 827 00:48:53,280 --> 00:48:57,520 Speaker 1: outcome is negative or positive, and it has the knock-on 828 00:48:57,600 --> 00:49:00,680 Speaker 1: effect of the Lancet basically being like, this 829 00:49:00,760 --> 00:49:03,279 Speaker 1: is a trash study, we would never publish this, don't 830 00:49:03,280 --> 00:49:06,080 Speaker 1: even bother. So it's saving funds, and then the high 831 00:49:06,160 --> 00:49:08,840 Speaker 1: quality studies are the ones that are going to get published. 832 00:49:09,040 --> 00:49:11,840 Speaker 1: And then also the positive outcomes and the negative outcomes 833 00:49:11,880 --> 00:49:15,080 Speaker 1: get published regardless, because they have no idea what 834 00:49:15,120 --> 00:49:17,160 Speaker 1: the outcome is going to be, because they accept the paper 835 00:49:17,400 --> 00:49:20,400 Speaker 1: before the study has even been conducted. 836 00:49:20,840 --> 00:49:24,200 Speaker 1: I saw another thing that said that, uh, a paper would 837 00:49:24,200 --> 00:49:26,480 Speaker 1: be more likely to get published in the Lancet if 838 00:49:26,480 --> 00:49:30,400 Speaker 1: it had cool illustrations. That's right, that never hurts. Everybody 839 00:49:30,400 --> 00:49:33,680 Speaker 1: knows that. That's not unethical, especially in color. Just 840 00:49:33,719 --> 00:49:36,880 Speaker 1: put in a few of those New Yorker cartoons, forget about it. 841 00:49:36,960 --> 00:49:39,960 Speaker 1: Everybody loves those. Uh, you got anything else? I've got 842 00:49:40,040 --> 00:49:42,360 Speaker 1: nothing else. You know, this is a little soapboxy, but 843 00:49:42,440 --> 00:49:44,279 Speaker 1: this is something that we believe in. It's kind 844 00:49:44,280 --> 00:49:48,960 Speaker 1: of like our episode on the Scientific Method a little bit. 845 00:49:49,680 --> 00:49:52,399 Speaker 1: Mm hmm. I like it too. Yeah, thanks for doing 846 00:49:52,400 --> 00:49:54,759 Speaker 1: it with me, man. Thank you for doing it with me. 847 00:49:55,280 --> 00:49:59,400 Speaker 1: Thank you for squeezing my lemons. Sure. Uh, if you 848 00:49:59,440 --> 00:50:03,160 Speaker 1: want to know more about scientific bias, there are, fortunately, 849 00:50:03,320 --> 00:50:06,759 Speaker 1: a lot of sites and great articles dedicated, um, to 850 00:50:07,280 --> 00:50:09,720 Speaker 1: rooting that stuff out and to making you a smarter 851 00:50:09,880 --> 00:50:13,799 Speaker 1: consumer of science. And so go check that out and 852 00:50:13,920 --> 00:50:15,839 Speaker 1: learn more about it. And since I said learn more 853 00:50:15,880 --> 00:50:17,959 Speaker 1: about it, it means it's time for listener mail. 854 00:50:21,360 --> 00:50:25,279 Speaker 1: You know, sometimes the listener mail dovetails quite nicely with 855 00:50:25,440 --> 00:50:30,040 Speaker 1: the topic, and that was the case today with our inclusion, 856 00:50:30,760 --> 00:50:35,240 Speaker 1: oh yeah, on the Media Bias List, which was pretty exciting. Yeah, 857 00:50:35,280 --> 00:50:38,040 Speaker 1: what an honor. You know, there's something called the media 858 00:50:38,080 --> 00:50:41,040 Speaker 1: bias... Is it called the Media Bias List? I believe so.
859 00:50:41,719 --> 00:50:44,560 Speaker 1: And it's, you know, what it does is it takes 860 00:50:44,600 --> 00:50:47,640 Speaker 1: news outlets and newspapers and, you know, TV and stuff 861 00:50:47,680 --> 00:50:49,839 Speaker 1: like that, and they just sort of, it's a big 862 00:50:49,920 --> 00:50:53,920 Speaker 1: chart where they're ranked according to, like, how biased they are, 863 00:50:54,320 --> 00:50:56,560 Speaker 1: you know, kind of up, down, left and right. And 864 00:50:56,560 --> 00:50:59,439 Speaker 1: they included podcasts this year, they did, and we were 865 00:50:59,600 --> 00:51:01,239 Speaker 1: on the list, and it was really kind of cool. 866 00:51:01,239 --> 00:51:02,600 Speaker 1: We had a bunch of people write in. And this 867 00:51:02,640 --> 00:51:05,520 Speaker 1: is from Nicholas Bett. Oh, he said, I found this 868 00:51:05,520 --> 00:51:08,840 Speaker 1: post while I was scrolling through Facebook, uh, and waiting 869 00:51:08,880 --> 00:51:12,719 Speaker 1: for the NFL season to start. Ad Fontes Media? Is 870 00:51:12,760 --> 00:51:15,439 Speaker 1: it Fonts or Fontes? We should know. I'm not sure, 871 00:51:15,520 --> 00:51:19,600 Speaker 1: one of the two. It's a watchdog organization 872 00:51:19,800 --> 00:51:22,600 Speaker 1: known for the Media Bias Chart. Um, they do a 873 00:51:22,640 --> 00:51:25,440 Speaker 1: media bias chart where they rank every news outlet's political bias, 874 00:51:25,440 --> 00:51:28,040 Speaker 1: and in the recent update they included you guys, and 875 00:51:28,040 --> 00:51:30,239 Speaker 1: wouldn't you know it, the most politically fair piece of 876 00:51:30,280 --> 00:51:32,520 Speaker 1: media you can possibly consume in all of the known 877 00:51:32,600 --> 00:51:36,480 Speaker 1: universe is Stuff You Should Know. You guys say you're liberal, 878 00:51:36,560 --> 00:51:39,200 Speaker 1: but until I heard Chuck outright state it, I didn't even 879 00:51:39,280 --> 00:51:44,360 Speaker 1: know. Wow. Well, I think it slips 880 00:51:44,360 --> 00:51:49,080 Speaker 1: through there some. Well, yeah, we're certainly human beings and 881 00:51:49,160 --> 00:51:51,120 Speaker 1: we have our own biases, but we definitely try to 882 00:51:51,200 --> 00:51:53,000 Speaker 1: keep them in check. Yes, we try to. And I 883 00:51:53,040 --> 00:51:54,960 Speaker 1: think it's just been confirmed, because they're not just, like, 884 00:51:55,719 --> 00:51:57,400 Speaker 1: listen to a couple of shows and go, oh, 885 00:51:57,440 --> 00:52:00,120 Speaker 1: these guys seem okay. Like, they really listen, then 886 00:52:00,120 --> 00:52:03,480 Speaker 1: they really rank people. Um, they probably saw that too, 887 00:52:03,600 --> 00:52:06,200 Speaker 1: or perhaps they listened to the North Korea episode where 888 00:52:06,239 --> 00:52:08,640 Speaker 1: Josh suggested Wolf Blitzer apply hot paper clips to his 889 00:52:08,719 --> 00:52:13,520 Speaker 1: inner thighs while writing a nice piece on Trump's Korean relations. 890 00:52:13,600 --> 00:52:16,719 Speaker 1: Hilarious either way. Thank you guys for your fairness and 891 00:52:16,760 --> 00:52:18,920 Speaker 1: hilarity all these years. You're both the best. That 892 00:52:19,000 --> 00:52:22,320 Speaker 1: is from Nicholas Bett. Thanks a lot, Nicholas. 893 00:52:22,320 --> 00:52:24,120 Speaker 1: Thanks to everybody who wrote in to say that they 894 00:52:24,120 --> 00:52:26,799 Speaker 1: saw that. We appreciate it. Uh,
and it was neat 895 00:52:26,840 --> 00:52:28,680 Speaker 1: to see ourselves right in the middle of the rainbow. 896 00:52:28,800 --> 00:52:31,080 Speaker 1: Love being in the middle of that rainbow. I do 897 00:52:31,200 --> 00:52:33,399 Speaker 1: too, Chuck. It's nice and warm and cozy in there, 898 00:52:33,440 --> 00:52:35,680 Speaker 1: isn't it? Yes. Well, if you want to get in 899 00:52:35,719 --> 00:52:38,040 Speaker 1: touch with us, like Nicholas and the gang did, you 900 00:52:38,080 --> 00:52:41,440 Speaker 1: can send us an email to Stuff Podcast at iHeart 901 00:52:41,520 --> 00:52:47,560 Speaker 1: radio dot com. Stuff You Should Know is a production 902 00:52:47,560 --> 00:52:51,000 Speaker 1: of iHeart Radio. For more podcasts from iHeart Radio, visit 903 00:52:51,040 --> 00:52:54,120 Speaker 1: the iHeart Radio app, Apple Podcasts, or wherever you listen 904 00:52:54,200 --> 00:53:00,319 Speaker 1: to your favorite shows.