Welcome to Stuff You Should Know from HowStuffWorks dot com. Hey, and welcome to the podcast. I'm Josh Clark, with Charles W. Chuck Bryant and Jerry. This is Stuff You Should Know. Josh, we're gonna do something weird today. I'm gonna do a listener mail at the head of the podcast. I know. All right, what? All right, let's do it. Okay, this is from B... wait, hold on, can you have the listener mail music going? Oh, I don't know. Jerry, should we go the whole nine yards? Let's do it. People might freak out. All right, this is from Bianca. Voice is what I'm gonna say. I think that's great. Hey guys, I wrote you not too long ago asking about how you research your own podcast. I just got back from a class where we talked about research misrepresentation in journal articles. Apparently journals don't publish everything that is submitted, and a lot of researchers don't even publish their studies if they don't like the results. Some laws have been put into place to prevent misrepresentation, such as researchers having to register their studies before they get results and journals only accepting preregistered studies. But apparently this is not happening at all, even though it is now technically law. This ends with the general public being misinformed about methods and drugs that work. For example, if there are twenty-five studies proving a drug works and twenty-five that don't, it's more likely that twenty of the positive results have been published and only one or two of the negative. And that is from Bianca, and that led us to this article on our own website, "Ten Signs That Study Is Bogus," and here it is. Nice, Chuck. Well, we get asked a lot about research from people, usually in college. They're like, you guys are professional researchers, how do I know I'm doing a good job and getting good info? And it's getting harder and harder these days. It really is, you know.
One sign that I've learned is if you are searching about a study, and all of the hits that come back are from different news organizations, and they're all within like a two-, three-day period from a year ago, nothing more recent than that, then somebody released a sensational study, no one put any actual effort into investigating it, and there was no follow-up. If you dig deep enough, somebody might have done a follow-up or something like that, but for the most part it was just something that splashed across the headlines, which more often than not is the case as far as science reporting goes. So that's a bonus. That's number eleven. Boom. How about that? Yeah, should we just start banging these out? Let's do it. Do you have some other clever... Well, part and parcel with that, I don't know if it's clever. You do come across people who you know can be trusted and relied upon to do good science reporting. So, like, Ed Yong is one. Another guy named Ben Goldacre has something called Bad Science, I don't remember what outlet he's with. And then there's a guy, I think with Scientific American, named John Horgan who's awesome. Yeah, or some journalism organizations that have been around and stood the test of time that you know are really doing it right, like Nature. Yeah, Scientific American, they're, like, really science. Yeah, like, I feel really good about using those sources. Yeah, but even they can, you know... there's something called scientism, where there's a lot of, like, faith and dogma associated with the scientific process, and you know, you have to root through that as well. Try it. I'm done. The first one that they have here on the list is that it's unrepeatable, and that's a big one.
The Center for Open Science did a study, a project really, where they took two hundred and seventy researchers and they said, you know what, take these one hundred studies that have been published already, psychological studies, and pore over them. And just last year — it took them a while, took them several years — they said, you know what, more than half of these can't even be repeated using the same methods. They're not reproducible. Nope, not reproducible. That's a big one. And that means that when they carried it out, they followed the methodology — Scientific Method podcast, you should listen to that one, that was a good one — and they found that their results were just not what the people published, not anywhere near them. For example, they used one as an example where a study found that men were terrible at determining whether a woman was giving them some sort of, like, clues to attraction or just being friendly. Sexy, sexy stuff, or yeah, or good to meet you, or buzz off, jerky. And they did the study again, as part of this Center for Open Science study or survey, and they found that it was not reproducible, or that they came up with totally different results. And that was just one of many. Yeah, and in this case specifically, they looked into that study and they found that one was in the United Kingdom, one was in the United States. That may have something to do with it. But the point is, Chuck, if you're talking about humanity... I don't think that the study was, like, the American male is terrible at it. It's, men are terrible at it. Right. So that means that whether it's in the UK, which is basically the US with an accent and a penchant for tea — I'm just kidding, UK, see you soon — it should be universal. Yeah, you know, unless you're saying, no, this only applies to American men, right, then it's not even a study. Yeah.
The next one we have is, it's plausible, not necessarily provable. And this is a big one, and I think we're talking about observational studies here more than lab experiments, because with observational studies, you know, you sit in a room and get asked three questions about something, and all these people get asked the same questions, and then they pore over the data and they draw out their own observations. And very famously, one observational study that led to false results found a correlation between having a type A personality and being prone to risk for heart attacks. For a long time, you know, the news outlets were like, oh yes, of course, that makes total sense, this study proved what we've all known all along. And then it came out that no, actually what was going on was a well-known anomaly where you have a five percent risk that chance will produce something that looks like a statistically significant correlation when really it's just total chance. And science is aware of this, especially with observational studies, because the more questions you have, the more opportunity you have for that five percent chance to create a seemingly statistically significant correlation when really it's not there. It was just random chance, where if somebody else goes back and does the same study, they're not going to come up with the same results. But if a researcher is, I would guess, willfully blind to that five percent chance, they will go ahead and produce the study and be like, no, it's true, here's the results right here, go ahead and report on it and make my career. Yeah. Well, and they also might be looking for something in there. In fact, chances are they are. It's not just some random study, like, let's just see what we get if we ask a bunch of weird questions.
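An editorial aside, not from the episode: a minimal Python sketch of the false-positive effect described above, assuming a conventional five percent significance threshold and a hypothetical forty-question observational survey. Ask enough unrelated questions and chance alone will usually hand you at least one result that looks statistically significant.

```python
import random

# Rough illustrative sketch (not from the episode): with a 5% significance
# threshold, a study that asks many unrelated questions will "find" some
# statistically significant correlations purely by chance. The threshold,
# question count, and study count below are illustrative assumptions.

random.seed(42)
ALPHA = 0.05        # conventional significance threshold
N_QUESTIONS = 40    # unrelated questions asked in one observational study
N_STUDIES = 1000    # how many such studies we simulate

studies_with_false_positive = 0
for _ in range(N_STUDIES):
    # Under the null hypothesis each question's p-value is uniform on [0, 1],
    # so each question independently has a 5% chance of looking "significant".
    p_values = [random.random() for _ in range(N_QUESTIONS)]
    if any(p < ALPHA for p in p_values):
        studies_with_false_positive += 1

print(f"Chance per question: {ALPHA:.0%}")
print(f"Expected rate of >=1 false positive per study: {1 - (1 - ALPHA) ** N_QUESTIONS:.0%}")
print(f"Simulated rate: {studies_with_false_positive / N_STUDIES:.0%}")
```

With these assumed numbers, roughly 87 percent of such studies would turn up at least one spurious "significant" correlation, which is the anomaly being described.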
It's like, hey, we're looking to try and prove something, most likely. So that Baader-Meinhof thing might come into play, where you're kind of cherry-picking data. Yeah, that's a big problem that kind of comes up. A lot of these are really kind of interrelated. Totally. The other big thing that's interrelated is how the media reports on science these days. Yeah, you know, it's a big deal. Yeah. Like, John Oliver just recently went off on this and NPR did a thing on it. Like, the researcher might even say plausible, but it doesn't get portrayed that way in the media. Sure, remember that poor kid who thought he found the ancient Mayan city? The media just took it and ran with it. You know, yeah, I think there was a lot of maybe, or it's possible, we need to go check, kind of thing. The media is like, no, he discovered an ancient Mayan city never known before. Yeah, and let's put it in a headline. And that's, I mean, that's just kind of the way it is these days. So you have to be able to sort through it. I guess that's what we're doing here, aren't we, Chuck? We're telling everybody how to sort through it, or at the very least take scientific reporting with a grain of salt. Right. Like, you don't necessarily have the time to go through and double-check that research and then check on that research, and you know... Right, so take it with a grain of salt. Unsound samples. There was this study that basically said how you lost your virginity is going to have a very large impact and play a role in how you feel about sex and experience sex for the rest of your life. Yeah, it's possible. Sure, it seems logical, so we'll just go with it. But when you only interview college students, and you only interview heterosexual people, then you can't really say you've done a robust study, now, can you?
Plus you also take out of the sample size, your sample population, anybody who reports having had a violent encounter. Throw them out, throw that data out. That's not gonna inform how you feel about sex, right? Exactly. You're just narrowing it down further and further and, again, cherry-picking the data by throwing people out of your population sample who will throw off the data that you want. Yeah, and I've never heard of this acronym, WEIRD. A lot of these studies are conducted by professors and academics, so a lot of times you've got college students as your sample, and there's something called WEIRD: Western, educated, from industrialized, rich, and democratic countries. Right, those are the participants in the studies, the study subjects. But then they will say "men," right? Well, what about the gay man in Africa? Like, you didn't ask him. So that's actually a really, really big deal. In two thousand and ten, these three researchers did a survey of a ton of social science and behavioral science studies and found that eighty percent of them used WEIRD study participants. So basically it was college kids for eighty percent of these papers. And they surveyed a bunch of papers and they took it a little further, and they said that people who fit into the WEIRD category only make up twelve percent of the world population, but they represent eighty percent of the population of these studies. And a college student, Chuck, in North America, Europe, Israel, or Australia is four thousand times more likely to be in a scientific study than anyone else on the planet. And psychology and the behavioral sciences are basing their findings about everybody else on this small tranche of humanity. And that's a big problem. That's extremely misleading. Yeah, and it's also a little insulting, because what they're essentially saying is, like, this is who matters.
Well also, yeah, but what's sad is, this is who I am going to go to the trouble of recruiting for my study. It's just sheer laziness. And I'm sure a lot of them are like, well, I don't have the funding to do that. I guess I see that. But at the same time, I guarantee there's a tremendous amount of laziness involved. Yeah, or maybe if you don't have the money, maybe don't do that study. Is it that simple? I'm probably oversimplifying, I don't know. I'm sure we're going to hear from some people in academia about this one. Well, stop using WEIRD participants, or at the very least say, like, this is sexually active Dartmouth students, this applies to them, not everybody in the world. Eighty percent of these studies used those people as study participants, and they're not even emblematic of the rest of the human race. Like, college students are shown to see the world differently than other people around the world. And so it's not like you can be like, well, it still works, you can still extrapolate. It's, like, flawed in every way, shape, and form. We should probably take a break. Yeah, let's take a break, because you're getting a little hot under the collar. I love it, man. We'll be right back after this. All right, what's next, buddy? Very small sample sizes. Right, if you do a study with twenty mice, then you're not doing a good enough study. No. So in the article they use the idea of ten thousand smokers and ten thousand non-smokers, and they said, okay, if you have a population sample that size, that's not bad, it's a pretty good start. And if you find that fifty percent of the smokers developed lung cancer but only five percent of non-smokers did, then your study has what's called high power.
If you had something like ten smokers and ten non-smokers, and two of the smokers developed lung cancer and one of the non-smokers developed lung cancer as well, you have very little power and you should have very little confidence in your findings. But regardless, it's still going to get reported if it's a sexy idea. Yeah, for sure. And because these are kind of overlapping in a lot of ways, I want to mention this guy, a scientist named Ulrich Dirnagl. He and his colleague Malcolm Macleod have been trying — I mean, there are a lot of scientists that are trying to clean this up because they know it's a problem — but he co-wrote an article in Nature that's called "Robust research," colon, "Institutions must do their part for reproducibility." So this kind of ties back into the reproducibility thing, like we said earlier, and his whole idea is, you know what, they should tie funding to good institutional practices, like you shouldn't get the money if you can't show that you're doing it right. And he said that would just weed out a lot of stuff. Here's one staggering stat for reproducibility and small sample size: biomedical researchers for drug companies reported that only a fraction of the papers that are published are even reproducible. That was like an insider stat. And it doesn't matter, drugs are still going to market. Yeah, which is a really good example of why this does matter to the average person. You know, like, if you hear something like, monkeys like to cuddle with one another because they are reminded of their mothers, study shows, you could just be like, oh, that's great, I'm going to share that on the internet. It doesn't really affect you in any way. But when there are studies being conducted that are creating drugs that could kill you or not treat you, or that kind of thing, and it's attracting money and funding and that kind of stuff, that's harmful. Yeah, absolutely.

I found another survey. Did you like that terrible study idea that came up, the monkeys like to cuddle? A hundred and forty trainees at the MD Anderson Cancer Center in Houston, Texas — thank you, Houston, for being so kind to us at a recent show — they found that nearly a third of these trainees felt pressure to support their mentor's work, like, to get ahead or not get fired. So that's another issue. You've got these trainees or residents, and you have these mentors, and even if you disagree or don't think it's a great study, you're pressured into just going along with it. I could see that for sure. There seems to be a huge hierarchy in science, in a lab. You know, you've got the person who runs the lab, it's their lab, and you don't go against them. But there are people... like, Science and Nature, two great journals, are updating their guidelines right now. They're introducing checklists. Science hired statisticians to their panel of reviewing editors, not just other, you know, peer reviewers. Like, they actually hired numbers people specifically, because that's a big part of the process, that's a huge part of studies. It's like this mind-breaking statistical analysis that can be used for good or ill. And I mean, I don't think the average scientist necessarily is a whiz at that, although I guess it has to be part of training, but not necessarily, and that's a different kind of beast altogether. Stats, we talked about it earlier. I took a stats class in college, had so much trouble, I was awful at it. It really is just a special kind of... does it even matter? Yeah, I didn't get it. I passed it, though. I passed it because my professor took pity on me. That Ulrich Dirnagl... Dirnagl... he's a big-time crusader, and his jam is making sure that science is good science.
One of the things he crusades against — remember that virginity study where they just threw out anybody who had a violent encounter for their first sexual experience? Apparently that's a big deal with animal studies as well. If you're studying the effects of a drug or something — like, there was this one in the article: if you're studying the effects of a stroke drug and you've got a control group of mice that aren't taking the drug, and then a test group that are getting the drug, and then, like, three mice from the test group die even though they're on the stroke drug, they die of a massive stroke, and you just literally and figuratively throw them out of the study and don't include them in the results. That changes the data. And he's been on a peer review of a paper before and he's like, no, this doesn't pass peer review. You can't just throw out... what happened to these three rodents? You started with ten, there's only seven reported in the end. What happened to those three? And how many of them just don't report the ten? They're like, oh, we only started with seven, we're good, you know. Well, I was about to say I get the urge. I don't get it, because it's not right. But I think what happens is you work so hard at something, yeah, and you're like, how can I just walk away from two years of this because it didn't get a result? That's the point of real science, though. You have to walk away from it. Well, you have to publish that. And that's the other thing too, and I guarantee scientists will say, hey man, try getting a negative paper published in a good journal these days. They don't want that kind of stuff. But part of it also is, I don't think it's enough to just be published in, like, a journal. You want to make the news cycle as well.
That makes it even better, right. So, I think there's a lot of factors involved. But ultimately, if you take all that stuff away, if you take the culture away from it, if you get negative results, you're supposed to publish that so that some other scientist can come along and be like, oh, somebody else already did this using these methods that I was going to use. I'm not gonna waste two years of my career, because somebody else already did. Thank you, buddy, for saving me the time and trouble and effort, to know that this does not work. You've proven this doesn't work. When you sought to prove it does work, you actually proved it didn't work. That's part of science. Yeah, I wish there wasn't a negative connotation to a negative result, because to me, the value is the same: proving something does work, proving something doesn't work. Right, again, it's just not... Yeah, but I'm not sexy either, so maybe that's why I get it.

Here's one that I didn't know was a thing: predatory publishing. You never heard of this? So here's the scenario. You're a doctor or a scientist, and you get an email from a journal that says, hey, you got anything interesting for us? I've heard about your work. And you say, well, actually, I do have this study right here. They say, cool, we'll publish it. You go, great, my career is taking off. Then you get a bill that says, where's my three grand for publishing your article? And you're like, I don't owe you three grand. All right, give us two. And you know, I can't even give you two. And if you fight them long enough, maybe they'll drop it and never work with you again. Or maybe it'll just be like, well, we'll talk to you in court. Exactly. That's called predatory publishing. And I'm not sure how new it is. Maybe it's pretty new. Is it pretty new? But it's a thing now where you can pay, essentially, to get something published.
Yes, you can. It's kind of like a "Who's Who in Behavioral Science" kind of thing, you know. And apparently it's new because it's a result of open-source academic journals, which a lot of people push for, including Aaron Swartz, very famously, who, like, took a bunch of academic articles and published them online and was prosecuted heavily for it. Persecuted, you could even say. But the idea that science is behind this paywall — which is another great article from Priceonomics, by the way — really just ticks a lot of people off. So they started open-source journals, right, and as a result, predatory publishers came about and said, okay, yeah, let's make this free, but we need to make our money anyway, so we're going to charge the academic who wrote the study for publishing it. Well, yeah, and sometimes now it's just a flat-out scam operation. There's this guy named Jeffrey Beall who is a research librarian. He is my new hero, because he's truly, like, one of these dudes that's trying to make a difference, and he's not profiting from this, but he's spending a lot of time creating a list of predatory publishers. Yeah, a significant list too. Yeah, how many? Four thousand of them right now. Some of these companies flat-out lie, like, they're literally based out of Pakistan or Nigeria and they say, no, we're in New York, yeah, a publisher. So it's just a flat-out scam. Or they lie about their review practices. Like, they might not have any review practices and they straight up lie and say they do. There was one called Scientific Journals International out of Minnesota that he found out was just one guy, like, literally working out of his home, just lobbying for articles, charging to get them published, not reviewing anything, and just saying, I'm a journal, I'm a scientific journal. He shut it down apparently, or tried to sell it. I think he was found out.
And this other one, the International Journal of Engineering Research and Applications, they created an award and then gave it to themselves, and even modeled the award on an Australian TV award, like, the physical statuette. That's fascinating. I didn't know they could do that. We're gonna give ourselves... yeah, let's do the Best Podcast in the Universe Award. It's gonna look like the Oscar. Yeah, okay, the Oscar crossed with the Emmy. And this other publisher actually confused the meaning of STM, science, technology, medicine. They thought it meant sports technology in medicine. No. Well, a lot of science journalists, or scientists too, but watchdogs, like to send, like, gibberish articles into those things to see if they publish them, and sometimes they do. Frequently they do. Sniff 'em off the case. It's the big time. How about that callback? It's been a while. It needs to be a T-shirt. Did we take a break? Yeah. All right, we'll be back and finish up right after this.

So here's a big one. You ever heard the term follow the money? That's applicable to a lot of realms of society, and most certainly in journals. If something looks hinky, just do a little investigating and see who's sponsoring their work. Well, especially if that person is like, no, everyone else is wrong, climate change is not man-made, kind of thing. Sure. You know, if you look at where their funding is coming from, you might be unsurprised to find that it's coming from people who would benefit from the idea that anthropogenic climate change isn't real. Yeah, well, we might as well talk about him, Willie Soon. Yeah, Mr. Soon. Is he a doctor? He's a physicist of some sort. Yeah, all right. I'm just gonna say Mr. or Dr. Soon, because I'm not positive. He is one of a few people on the planet Earth, professionals that is, who deny human climate change, human-influenced climate change. Like you said, the fancier word for it, though: anthropogenic. And he works at the Harvard-Smithsonian Center for Astrophysics. So hey, he's with Harvard, he's got the cred, right. Turns out, when you look into where he's getting his funding, he received one point two million dollars over the past decade from ExxonMobil, the Southern Company, the Koch brothers, their foundation, the Charles G. Koch Foundation. Exxon stopped funding him, but the bulk of his money and his funding came — and I'm sorry, I forgot the American Petroleum Institute — came from people who clearly had a dog in this fight. And it's just, how can you trust this, you know? Yeah, well, you trust it because there's a guy and he has a PhD in aerospace engineering, by the way. All right, he's a doc. He works with this organization, the Harvard-Smithsonian Center for Astrophysics, which is a legitimate place. It doesn't get any funding from Harvard, but it gets a lot from NASA and from the Smithsonian. Well, and Harvard's very clear to point this out when people ask them about Willie Soon. They're kind of like, well, here's the quote: Willie Soon is a Smithsonian staff researcher at the Harvard-Smithsonian Center for Astrophysics, a collaboration of the Harvard College Observatory and the Smithsonian Astrophysical Observatory. Like, they just want to be real clear: even though he uses a Harvard email address, he's not our employee. No, but again, he's getting lots of funding from NASA and lots of funding from the Smithsonian. This guy, if his scientific beliefs are what they are, and he's a smart guy, then yeah, I don't know about, like, getting fired for saying, you know, here's a paper on the idea that climate change is not human-made. Yeah, he thinks it's the Sun's fault. But he didn't... he doesn't reveal, in any of his conflicts of interest —
that should go at the end of the paper — he didn't reveal where his funding was coming from. And I get the impression that in academia, if you are totally cool with everybody thinking, like, you're a shill, you can get away with it. Right. Well, a lot of this stuff is not illegal. Right. Even predatory publishing is not illegal, just unethical. And if you're counting on people to police themselves with ethics, a lot of times they will disappoint you. The Heartland Institute gave Willie Soon a Courage Award, and if you're not caring about what other scientists think about you... If you've heard of the Heartland Institute, you might remember them. They are a conservative think tank. You might remember them from the nineties, when they worked alongside Philip Morris to deny the risks of secondhand smoke. Yeah, that's all chronicled in that book I've talked about, Merchants of Doubt. A bunch of scientists, legitimate, bona fide scientists, who are, like, up for being bought by groups like that. It is sad. And the whole thing is, they're saying, like, well, you can't say, beyond a shadow of a doubt, with absolute certainty, that that's the case. And science is like, no, science doesn't do that, science doesn't do absolute certainty. But the average person reading a newspaper sees that — oh, you can't say with absolute certainty? Well, then maybe it isn't man-made. And then there's that doubt. And the people just go and get the money for saying that, for writing papers about it. It's millions of dollars. Yeah, it really is.

Self-reviewed. You've heard of peer review, we've talked about it quite a bit. Peer review is when you have a study, and then one or more, ideally more, of your peers reviews your study and says, you know what, you had best practices, you did it right, it was reproducible, you followed the scientific method, I'm gonna give it my stamp of approval and put my name on it. Not literally. Or is it? I think so.
It says who reviewed it in the journal when it's published, but not my name as the author of the study, you know what I mean, but as the peer reviewer. Yeah, as a peer reviewer. And that's a wonderful thing. But people have faked this and been their own peer reviewer, which is not how it works. No. Who is this guy? Well, I'm terrible at pronouncing Korean names, so all apologies, but I'm gonna say Hyung-In Moon. Nice. Dr. Moon, I think. Yeah, let's call him Dr. Moon. Okay. So Dr. Moon worked on natural medicine, I believe, and was submitting all these papers that were getting accepted very quickly, because apparently part of the process of peer review is to say, this paper is great, can you recommend some people in your field that can review your paper? And Dr. Moon said, I sure can. He was on fire. Let me go make up some people and make up some email addresses that actually come to my inbox. And he just posed as all of his own peer reviewers. He was lazy, though, is the thing. Like, I don't know that he would have been found out if he hadn't been careless, I guess, because he was returning the reviews within, like, twenty-four hours. Sometimes a peer review of, like, a real study should take, I would guess, weeks, if not months. Like, the publication schedule for the average study or paper, I don't think it's a very quick thing, there's not a lot of quick turnaround. And this guy was like twenty-four hours. Dr. Moon, I see your paper was reviewed and accepted by Dr. Mooney. It's like, I just added a Y to the end. It seemed easy. If you google peer review fraud, you will be shocked at how often this happens and how many legit science publishers are having to retract studies. And it doesn't mean they're bad, they're getting duped as well. But there was one based in Berlin that had sixty-four retractions because of fraudulent reviews. And they're just one publisher of many.
Every publisher out there probably has been duped. Um. 545 00:33:37,320 --> 00:33:41,680 Speaker 1: Maybe not everyone, I'm surmising that, but it's a big problem. 546 00:33:44,160 --> 00:33:47,520 Speaker 1: I'll review it. It'll end up in the headlines. Now, 547 00:33:48,360 --> 00:33:52,760 Speaker 1: every single publisher duped, says Chuck. Uh. And speaking of 548 00:33:52,960 --> 00:33:56,400 Speaker 1: um the headlines, Chuck. One of the problems with science 549 00:33:56,440 --> 00:34:01,560 Speaker 1: reporting or reading science reporting is that what you usually 550 00:34:01,600 --> 00:34:04,240 Speaker 1: are hearing, especially if it's making a big splash, is 551 00:34:04,280 --> 00:34:07,880 Speaker 1: what's called the initial findings. Somebody carried out a study, 552 00:34:07,960 --> 00:34:10,399 Speaker 1: and this is what they found, and it's amazing and 553 00:34:10,560 --> 00:34:15,120 Speaker 1: mind blowing and it um, it supports everything everyone's always known. 554 00:34:15,160 --> 00:34:17,799 Speaker 1: But now there's a scientific study that says, yes, that's 555 00:34:17,840 --> 00:34:21,560 Speaker 1: the case. And then if you wait a year or two, 556 00:34:21,600 --> 00:34:25,399 Speaker 1: when people follow up and reproduce the study and find 557 00:34:25,440 --> 00:34:28,080 Speaker 1: that it's actually not the case, it doesn't get reported on. 558 00:34:28,239 --> 00:34:33,799 Speaker 1: Usually yeah, and and sometimes the science scientists or the 559 00:34:33,800 --> 00:34:37,760 Speaker 1: publisher is they're doing it right, and they say initial findings, 560 00:34:38,320 --> 00:34:41,879 Speaker 1: but the public and sometimes even the reporter will say 561 00:34:41,920 --> 00:34:45,960 Speaker 1: initial findings. But we as a people that ingest this 562 00:34:46,000 --> 00:34:49,759 Speaker 1: stuff need to understand what that means, um, And the 563 00:34:49,800 --> 00:34:52,759 Speaker 1: fine print is always like you know, you know, more 564 00:34:52,800 --> 00:34:56,000 Speaker 1: studies needed, but no one if it's something that you 565 00:34:56,040 --> 00:34:58,960 Speaker 1: want to be true, you'll just say, hey, look at 566 00:34:59,000 --> 00:35:02,920 Speaker 1: the study, right. You know it's brand new and they 567 00:35:02,960 --> 00:35:05,560 Speaker 1: need to study for twenty more years, but hey, look 568 00:35:05,600 --> 00:35:08,919 Speaker 1: what it says. And the more the more you start 569 00:35:08,920 --> 00:35:10,759 Speaker 1: paying attention to this kind of thing, the more kind 570 00:35:10,760 --> 00:35:14,560 Speaker 1: of disdain you have for that kind of just off 571 00:35:14,640 --> 00:35:20,840 Speaker 1: hand um sensationalist science reporting. But you'll still get caught 572 00:35:20,920 --> 00:35:22,640 Speaker 1: up in it. Like every once in a while, I'll 573 00:35:22,680 --> 00:35:24,680 Speaker 1: catch myself like saying something you'd be like, oh, did 574 00:35:24,719 --> 00:35:26,520 Speaker 1: you hear this? And then as I'm saying it out loud, 575 00:35:26,560 --> 00:35:29,680 Speaker 1: I'm like, that's preposterous. Yeah, there's no way that's going 576 00:35:29,719 --> 00:35:33,799 Speaker 1: to pan out to be true. I got collebated, I know. 577 00:35:34,080 --> 00:35:36,759 Speaker 1: I mean, we we have to avoid this stuff. It's 578 00:35:36,800 --> 00:35:40,680 Speaker 1: stuff because we have our name on this podcast. 
But 579 00:35:40,800 --> 00:35:43,960 Speaker 1: luckily we've given ourselves the back door of saying, hey, 580 00:35:44,000 --> 00:35:46,880 Speaker 1: we make mistakes a lot. It's true though, we're not 581 00:35:48,400 --> 00:35:51,520 Speaker 1: we're not scientists. Uh. And then finally we're gonna finish 582 00:35:51,600 --> 00:35:55,000 Speaker 1: up with the header on this one is it's a 583 00:35:55,000 --> 00:35:59,480 Speaker 1: cool story. And that's a big one because, um, it's 584 00:35:59,520 --> 00:36:02,600 Speaker 1: not enough these days. And this all ties in with 585 00:36:02,640 --> 00:36:05,520 Speaker 1: the media and how we read things as people. But 586 00:36:05,640 --> 00:36:08,080 Speaker 1: it's not enough just to have a study that might 587 00:36:08,120 --> 00:36:10,480 Speaker 1: prove something. You have to wrap it up in a 588 00:36:10,560 --> 00:36:13,960 Speaker 1: nice package to deliver to people, get it in the news cycle. 589 00:36:14,040 --> 00:36:18,720 Speaker 1: And the cooler the better. Yep. It almost doesn't matter 590 00:36:19,640 --> 00:36:23,200 Speaker 1: about the science as far as the media is concerned. 591 00:36:23,239 --> 00:36:26,799 Speaker 1: They just want a good headline and a scientist who 592 00:36:26,800 --> 00:36:29,840 Speaker 1: will say, yeah, that's that's cool. Here's what I found. 593 00:36:30,480 --> 00:36:35,160 Speaker 1: This is going to change the world. Loch Ness Monster is real. 594 00:36:35,680 --> 00:36:38,160 Speaker 1: This kind of ended up being depressing somehow 595 00:36:38,800 --> 00:36:45,520 Speaker 1: not somehow Yeah, like, yeah, it's kind of depressing. We'll 596 00:36:45,520 --> 00:36:47,960 Speaker 1: figure it out, Chuck. Well, we do our best. I'll 597 00:36:48,000 --> 00:36:52,080 Speaker 1: say that science will prevail, I hope so. Uh, if 598 00:36:52,160 --> 00:36:55,160 Speaker 1: you want to know more about science and scientific studies 599 00:36:55,200 --> 00:36:57,440 Speaker 1: and research fraud and all that kind of stuff, just 600 00:36:57,520 --> 00:37:00,040 Speaker 1: type some random words into the search bar at how stuff 601 00:37:00,160 --> 00:37:02,560 Speaker 1: works dot com. See what comes up. And since I 602 00:37:02,600 --> 00:37:06,600 Speaker 1: said random, it's time for a listener mail. Oh no, oh, yeah, 603 00:37:06,719 --> 00:37:16,720 Speaker 1: you know what. It's time for administrative details. All right, Josh, 604 00:37:16,840 --> 00:37:19,319 Speaker 1: administrative details. If you're new to the show, you don't 605 00:37:19,320 --> 00:37:22,479 Speaker 1: know what it is. That's the very clunky title. We're saying 606 00:37:22,480 --> 00:37:25,840 Speaker 1: thank you to listeners who send us neat things. It 607 00:37:26,000 --> 00:37:28,799 Speaker 1: is clunky and generic and I've totally gotten used to 608 00:37:28,840 --> 00:37:30,200 Speaker 1: it by now. Well you're the one who made it 609 00:37:30,280 --> 00:37:33,640 Speaker 1: up to be clunky and generic, and it's stuck. Yeah.
610 00:37:33,800 --> 00:37:35,719 Speaker 1: So people send us stuff from time to time, and 611 00:37:35,760 --> 00:37:38,239 Speaker 1: it's just very kind of you to do so, yes, 612 00:37:38,440 --> 00:37:40,520 Speaker 1: and we like to give shout outs whether or not 613 00:37:40,600 --> 00:37:42,560 Speaker 1: it's just out of the goodness of your heart, or 614 00:37:42,560 --> 00:37:44,879 Speaker 1: if you have a little small business that you're trying 615 00:37:44,920 --> 00:37:47,360 Speaker 1: to plug. Either way, it's a sneaky way of getting 616 00:37:47,360 --> 00:37:49,400 Speaker 1: it in there. Yeah, but I mean I think we 617 00:37:49,400 --> 00:37:51,480 Speaker 1: we brought that on, didn't we? Didn't we say, like, 618 00:37:51,520 --> 00:37:53,640 Speaker 1: if you have a small business, then you send us something, 619 00:37:53,719 --> 00:37:56,920 Speaker 1: we'll we'll be happy to say something. Exactly. Thank you. 620 00:37:56,960 --> 00:37:58,920 Speaker 1: All right, so let's get it going here. We got 621 00:37:58,960 --> 00:38:02,520 Speaker 1: some coffee right from uh, from One Thousand Faces right 622 00:38:02,520 --> 00:38:07,920 Speaker 1: here in Athens, Georgia from Kayla. Yeah, delicious, Yes it was. 623 00:38:08,200 --> 00:38:10,560 Speaker 1: We also got some other coffee too, from Jonathan at 624 00:38:10,600 --> 00:38:13,760 Speaker 1: Steamworks Coffee. He came up with a Josh and Chuck blend. 625 00:38:13,920 --> 00:38:16,239 Speaker 1: Oh yeah, it's pretty awesome. I believe it's available for 626 00:38:16,280 --> 00:38:19,160 Speaker 1: sale too. Yeah. That Josh and Chuck blend is dark 627 00:38:19,200 --> 00:38:26,640 Speaker 1: and bitter. Uh. Jim Simmons, he's a retired teacher who 628 00:38:26,719 --> 00:38:30,719 Speaker 1: sent us some lovely handmade wooden bowls and a very 629 00:38:30,800 --> 00:38:34,560 Speaker 1: nice handwritten letter, which is always great. Thanks a lot, Jim. Uh. 630 00:38:34,680 --> 00:38:38,960 Speaker 1: Let's see. Chamberlayne sent us homemade pasta, including a delicious 631 00:38:39,000 --> 00:38:44,440 Speaker 1: savory pumpkin fettuccine. It was very nice. Um. Jake Graff, 632 00:38:44,680 --> 00:38:47,600 Speaker 1: two F's, sent us a postcard from the Great Wall of China. 633 00:38:47,920 --> 00:38:50,440 Speaker 1: It's kind of neat. Sometimes we get those postcards from 634 00:38:50,440 --> 00:38:56,080 Speaker 1: places we've talked about. I was like, thanks. Here, let's 635 00:38:56,080 --> 00:38:58,319 Speaker 1: see, the Hammer Press team. They sent us a bunch 636 00:38:58,360 --> 00:39:00,719 Speaker 1: of Mother's Day cards that are wonderful. Oh those were 637 00:39:00,719 --> 00:39:02,640 Speaker 1: really nice, really great. You should check them out. The 638 00:39:02,719 --> 00:39:07,759 Speaker 1: Hammer Press team. Yeah. Uh, Misty, Billy and Jessica. They 639 00:39:07,800 --> 00:39:09,919 Speaker 1: sent us a care package of a lot of things. 640 00:39:10,040 --> 00:39:13,360 Speaker 1: There were some cookies, um, including one of my favorites, 641 00:39:13,400 --> 00:39:17,759 Speaker 1: white chocolate dipped Ritz, and peanut butter crackers. Oh yeah, man, 642 00:39:17,800 --> 00:39:21,040 Speaker 1: I love those homemade, right? Yeah. And uh, then some 643 00:39:21,120 --> 00:39:27,280 Speaker 1: seventies macramé for you, along with seventies macramé magazines because 644 00:39:27,320 --> 00:39:30,600 Speaker 1: you're obsessed with macramé.
We have a macramé plant holder 645 00:39:30,640 --> 00:39:36,200 Speaker 1: hanging from my um microphone arm. A coffee mug sent 646 00:39:36,280 --> 00:39:38,279 Speaker 1: to us by Joe and Lynda Heckt. Oh that's right, 647 00:39:38,600 --> 00:39:40,839 Speaker 1: and it has some pens in it. Uh. And they 648 00:39:40,880 --> 00:39:43,480 Speaker 1: also sent us, Misty, Billy and Jessica, a lovely 649 00:39:43,520 --> 00:39:45,600 Speaker 1: little hand drawn picture of us with their family, which 650 00:39:45,600 --> 00:39:48,839 Speaker 1: was so sweet, awesome. Um. We've said it before, we'll 651 00:39:48,880 --> 00:39:52,160 Speaker 1: say it again. Huge thank you to Jim Ruaine, I 652 00:39:52,200 --> 00:39:54,440 Speaker 1: believe that's how you say his name, and the Crown 653 00:39:54,520 --> 00:39:57,799 Speaker 1: Royal people for sending us all the Crown Royal. We 654 00:39:57,840 --> 00:40:02,320 Speaker 1: are running low. Uh. Mark Silberg at the Rocky Mountain 655 00:40:02,320 --> 00:40:06,800 Speaker 1: Institute sent us a book called Reinventing Fire. They're great 656 00:40:06,800 --> 00:40:09,000 Speaker 1: out there, man, they know what they're talking about. And 657 00:40:09,040 --> 00:40:12,880 Speaker 1: I think it's Reinventing Fire, colon, Bold Business 658 00:40:12,880 --> 00:40:16,400 Speaker 1: Solutions for the New Energy Era. Yeah, they're they're basically 659 00:40:16,440 --> 00:40:20,799 Speaker 1: like um, green energy observers, but I think they um, 660 00:40:20,840 --> 00:40:23,120 Speaker 1: they're experts in like all sectors of energy, but they 661 00:40:23,120 --> 00:40:25,239 Speaker 1: have a focus on green energy, which is awesome. Yeah, 662 00:40:25,239 --> 00:40:30,440 Speaker 1: they're pretty cool. Um, John, whose wife makes Delightfully Delicious 663 00:40:30,480 --> 00:40:33,280 Speaker 1: doggie treats. Delightfully Delicious is the name of the company. 664 00:40:33,560 --> 00:40:36,600 Speaker 1: There's no artificial colors or flavors. And they got um 665 00:40:36,640 --> 00:40:40,640 Speaker 1: sweet little Momo hooked on sweet potato dog treats. I 666 00:40:40,680 --> 00:40:43,360 Speaker 1: thought you were gonna say, hooked on the junk, the the 667 00:40:43,520 --> 00:40:47,359 Speaker 1: sweet potato junk. She's crazy cuckoo for sweet potatoes. Nice. 668 00:40:47,960 --> 00:40:50,080 Speaker 1: That's good for a dog too. It is, very. Uh, 669 00:40:50,160 --> 00:40:53,719 Speaker 1: Strat Johnson sent us his band's LP. And if you're 670 00:40:53,760 --> 00:40:57,240 Speaker 1: in a band and your name is Strat, that's pretty cool. Uh, 671 00:40:57,600 --> 00:41:04,120 Speaker 1: Diomaea still, mhm. I think that was great. Yeah, 672 00:41:04,280 --> 00:41:05,799 Speaker 1: I'm not sure if I pronounced it all right, D I 673 00:41:05,880 --> 00:41:11,120 Speaker 1: O M A E A. Uh. Frederick, this is long overdue. 674 00:41:11,120 --> 00:41:15,319 Speaker 1: Frederick at the store one five to one store dot 675 00:41:15,400 --> 00:41:19,759 Speaker 1: com sent us some awesome low profile cork iPhone cases 676 00:41:20,040 --> 00:41:23,279 Speaker 1: and passport holders, and I was telling him, Jerry walks 677 00:41:23,280 --> 00:41:26,400 Speaker 1: around with her iPhone in the cork holder and it 678 00:41:26,440 --> 00:41:29,120 Speaker 1: looks pretty sweet. Yeah, so he said, awesome, I'm glad 679 00:41:29,160 --> 00:41:31,919 Speaker 1: to hear.
Joe and Holly Harper sent us some really 680 00:41:31,960 --> 00:41:35,799 Speaker 1: cool three D printed Stuff You Should Know things, like 681 00:41:36,080 --> 00:41:39,160 Speaker 1: S Y S K uh, you know, like a little 682 00:41:39,160 --> 00:41:42,799 Speaker 1: desk one, oh, as, like, after Robert Indiana's LOVE sculpture. Yeah, 683 00:41:42,800 --> 00:41:44,839 Speaker 1: that's it, I couldn't think of what that was from. Yeah, 684 00:41:44,880 --> 00:41:48,759 Speaker 1: it's awesome. It's really neat. And like a bracelet um 685 00:41:48,760 --> 00:41:51,400 Speaker 1: made out of stuff you should know, three D carved 686 00:41:51,920 --> 00:41:54,400 Speaker 1: like plastics, really neat. Yeah, they did some good stuff. 687 00:41:54,560 --> 00:41:57,040 Speaker 1: Thanks Joe and Holly Harper for that. And then last 688 00:41:57,080 --> 00:42:01,560 Speaker 1: for this one, we got a postcard from Yosemite National 689 00:42:01,640 --> 00:42:04,040 Speaker 1: Park from Laura Jackson, so thanks a lot for that. 690 00:42:04,440 --> 00:42:06,719 Speaker 1: Thanks to everybody who sends us stuff. It's nice to 691 00:42:06,760 --> 00:42:09,880 Speaker 1: know we're thought of and we appreciate it. Yeah, we're 692 00:42:09,920 --> 00:42:13,520 Speaker 1: gonna finish up with another set on the next episode 693 00:42:13,760 --> 00:42:17,080 Speaker 1: of Administrative Details. You got anything else? No, that's it. 694 00:42:17,480 --> 00:42:19,400 Speaker 1: Oh yeah. If you guys want to hang out with 695 00:42:19,480 --> 00:42:21,960 Speaker 1: us on social media, you can go to S Y 696 00:42:22,120 --> 00:42:26,200 Speaker 1: S K Podcast on Twitter or on Instagram. You can 697 00:42:26,239 --> 00:42:28,760 Speaker 1: hang out with us at Facebook dot com slash Stuff 698 00:42:28,760 --> 00:42:30,640 Speaker 1: You Should Know. You can send us an email to 699 00:42:30,800 --> 00:42:33,399 Speaker 1: Stuff Podcast at how stuff Works dot com, and as 700 00:42:33,400 --> 00:42:35,360 Speaker 1: always, join us at our home on the web, Stuff 701 00:42:35,360 --> 00:42:43,880 Speaker 1: You Should Know dot com. For more on this and 702 00:42:43,920 --> 00:42:53,960 Speaker 1: thousands of other topics, visit how stuff Works dot com.