1 00:00:00,040 --> 00:00:04,080 Speaker 1: Hey everybody, it's me Josh and for this week's 2 00:00:04,200 --> 00:00:07,240 Speaker 1: SYSK Selects, I've chosen our guide to 3 00:00:07,440 --> 00:00:11,800 Speaker 1: research tips. It's a surprisingly good episode that shares the 4 00:00:11,840 --> 00:00:15,000 Speaker 1: ins and outs of keeping from being duped online by 5 00:00:15,040 --> 00:00:18,119 Speaker 1: bad information and how to read between the lines on 6 00:00:18,200 --> 00:00:22,079 Speaker 1: sensational science reporting, all sorts of stuff like that. And 7 00:00:22,120 --> 00:00:25,160 Speaker 1: you might notice in this episode Chuck sounds different than usual. 8 00:00:25,480 --> 00:00:27,560 Speaker 1: That's because this is during the period that he was 9 00:00:27,600 --> 00:00:30,480 Speaker 1: transitioning into a person with a full set of teeth, 10 00:00:31,360 --> 00:00:33,199 Speaker 1: so that adds to the hilarity of the whole thing. 11 00:00:33,479 --> 00:00:35,800 Speaker 1: I hope you enjoy this as much as we did 12 00:00:36,240 --> 00:00:42,599 Speaker 1: making it. Welcome to Stuff You Should Know, a production 13 00:00:42,640 --> 00:00:50,880 Speaker 1: of iHeartRadio's HowStuffWorks. Hey, and welcome 14 00:00:50,920 --> 00:00:53,559 Speaker 1: to the podcast. I'm Josh Clark with Charles W. Chuck 15 00:00:53,640 --> 00:01:00,600 Speaker 1: Bryant and Jerry. This is Stuff You Should Know. Um, Josh, 16 00:01:00,640 --> 00:01:03,480 Speaker 1: we're gonna do something weird today and we're gonna do 17 00:01:03,520 --> 00:01:05,840 Speaker 1: a listener mail at the head of the podcast, right, 18 00:01:05,880 --> 00:01:10,759 Speaker 1: all right, what, all right, let's do it. Okay, this 19 00:01:10,840 --> 00:01:14,200 Speaker 1: is from... is the listener mail music going? Oh, I 20 00:01:14,200 --> 00:01:18,720 Speaker 1: don't know, should we go the whole nine yards? People 21 00:01:18,800 --> 00:01:23,000 Speaker 1: might freak out. All right, this is from Bianca. Uh, 22 00:01:23,840 --> 00:01:26,720 Speaker 1: Bianca is what I'm gonna say. I think that's great. Hey 23 00:01:26,720 --> 00:01:28,640 Speaker 1: guys, wrote you not too long ago asking about how you 24 00:01:28,720 --> 00:01:31,360 Speaker 1: research your own podcast. I just got back from a 25 00:01:31,400 --> 00:01:36,240 Speaker 1: class where we talked about research misrepresentation in journal articles. 26 00:01:36,440 --> 00:01:39,280 Speaker 1: Apparently journals don't publish everything that is submitted. A lot 27 00:01:39,280 --> 00:01:41,800 Speaker 1: of researchers don't even publish their studies if they don't like 28 00:01:41,840 --> 00:01:44,400 Speaker 1: the results. Some laws have been put into place to 29 00:01:44,440 --> 00:01:48,720 Speaker 1: prevent misrepresentation, such as researchers having to register their studies 30 00:01:48,760 --> 00:01:52,600 Speaker 1: before they get results and journals only accepting preregistered studies. 31 00:01:52,640 --> 00:01:55,480 Speaker 1: But apparently this is not happening at all, even though 32 00:01:55,560 --> 00:01:58,360 Speaker 1: it is now technically law. This ends with the general 33 00:01:58,400 --> 00:02:01,640 Speaker 1: public being misinformed about methods and drugs that work. 34 00:02:01,880 --> 00:02:04,720 Speaker 1: For example, there are twenty five studies proving a drug 35 00:02:04,760 --> 00:02:09,440 Speaker 1: works and some that don't.
It's more likely that twenty of the 36 00:02:09,480 --> 00:02:11,880 Speaker 1: positive results have been published and only one or two 37 00:02:11,919 --> 00:02:16,040 Speaker 1: of the negative. Uh. And that is from Bianca, and 38 00:02:16,160 --> 00:02:20,120 Speaker 1: that led us to this article on our own website, 39 00:02:20,560 --> 00:02:23,840 Speaker 1: Ten Signs That Study Is Bogus, and here it 40 00:02:23,880 --> 00:02:26,720 Speaker 1: is. Nice, Chuck. Well, we get asked a lot about 41 00:02:26,760 --> 00:02:29,919 Speaker 1: research from people, usually in college, they're like, you guys 42 00:02:29,960 --> 00:02:33,360 Speaker 1: are professional researchers. How do I know I'm doing a 43 00:02:33,360 --> 00:02:36,959 Speaker 1: good job and getting good info? And it's getting harder 44 00:02:36,960 --> 00:02:39,959 Speaker 1: and harder these days, it really is. You know, one 45 00:02:40,000 --> 00:02:43,080 Speaker 1: sign that I've learned is if you are searching about 46 00:02:43,080 --> 00:02:47,800 Speaker 1: a study and all of the hits that come back 47 00:02:48,480 --> 00:02:51,760 Speaker 1: are from different news organizations and they're all within like 48 00:02:51,760 --> 00:02:55,680 Speaker 1: a two- or three-day period from a year ago, nothing like, 49 00:02:55,960 --> 00:02:59,560 Speaker 1: nothing more recent than that, then somebody released the sensational 50 00:02:59,600 --> 00:03:04,000 Speaker 1: study and no one put any actual effort into investigating it, 51 00:03:04,040 --> 00:03:06,639 Speaker 1: and there was no follow up. If you dig deep enough, 52 00:03:06,720 --> 00:03:08,640 Speaker 1: somebody might have done follow up or something like that, 53 00:03:08,639 --> 00:03:11,280 Speaker 1: but for the most part, it was just something that 54 00:03:11,400 --> 00:03:14,440 Speaker 1: splashed across the headlines, which more often than not is 55 00:03:14,520 --> 00:03:17,000 Speaker 1: the, is the case as far as science reporting goes. 56 00:03:17,320 --> 00:03:22,840 Speaker 1: So that's a bonus, that's the eleventh. Boom. How about that? Yeah, 57 00:03:23,080 --> 00:03:25,519 Speaker 1: should we just start banging these out, let's do it? 58 00:03:25,600 --> 00:03:30,519 Speaker 1: Or do you have some other clever... Well, part and 59 00:03:30,600 --> 00:03:32,840 Speaker 1: parcel with that, I don't know if it's clever, but you 60 00:03:32,919 --> 00:03:35,600 Speaker 1: do come across people who you know can be trusted 61 00:03:35,640 --> 00:03:39,640 Speaker 1: and relied upon to do good science reporting. So like, 62 00:03:39,800 --> 00:03:43,440 Speaker 1: Ed Yong is one. Another guy named Ben Goldacre 63 00:03:43,520 --> 00:03:46,520 Speaker 1: has something called Bad Science. I don't remember what outlet 64 00:03:46,560 --> 00:03:49,000 Speaker 1: he's with. And then there's a guy, I think at Scientific 65 00:03:49,000 --> 00:03:53,160 Speaker 1: American, named John Horgan, who's awesome. Yeah. Or some journalism 66 00:03:53,280 --> 00:03:56,040 Speaker 1: organizations that have been around and stood the test of 67 00:03:56,120 --> 00:03:57,880 Speaker 1: time that you know are really doing it right, like 68 00:03:58,200 --> 00:04:02,920 Speaker 1: a Nature. Yeah, Scientific American. Those guys are like really science. Yeah, 69 00:04:03,000 --> 00:04:05,840 Speaker 1: Like I feel, I feel really good about using those sources.
Yeah, 70 00:04:05,880 --> 00:04:08,760 Speaker 1: but even they can, you know, there's, there's something called 71 00:04:08,840 --> 00:04:11,760 Speaker 1: scientism, um, where there's a lot of like faith in 72 00:04:11,880 --> 00:04:15,440 Speaker 1: dogma associated with the scientific process, and you know you 73 00:04:15,480 --> 00:04:19,880 Speaker 1: have to root through that as well. Right. I'm done. Uh. 74 00:04:20,040 --> 00:04:21,839 Speaker 1: The first one that they have here on the list 75 00:04:21,920 --> 00:04:25,479 Speaker 1: is that it's unrepeatable, and that's a big one. UM. 76 00:04:25,560 --> 00:04:29,080 Speaker 1: The Center for Open Science did a study, uh, was 77 00:04:29,120 --> 00:04:31,760 Speaker 1: a project really, where they took two hundred and seventy 78 00:04:32,000 --> 00:04:34,400 Speaker 1: researchers and they said, you know what, take these one 79 00:04:34,440 --> 00:04:39,000 Speaker 1: hundred studies that have been published already, psychological studies, and 80 00:04:39,120 --> 00:04:42,839 Speaker 1: just pore over them. And uh, just last year, 81 00:04:43,000 --> 00:04:45,440 Speaker 1: it took them a while, took them several years, they said, 82 00:04:45,440 --> 00:04:47,520 Speaker 1: you know what, more than half of these can't even 83 00:04:47,560 --> 00:04:51,119 Speaker 1: be repeated using the same methods. They're not reproducible. Nope, 84 00:04:51,160 --> 00:04:54,960 Speaker 1: not reproducible. That's a big one. And, and that 85 00:04:55,040 --> 00:04:57,680 Speaker 1: means that when they carried it out, they followed the 86 00:04:57,720 --> 00:05:01,520 Speaker 1: methodology, UM, Scientific Method podcast, you should listen to 87 00:05:01,600 --> 00:05:03,880 Speaker 1: that one, that was a good one, they, they 88 00:05:03,920 --> 00:05:06,680 Speaker 1: found that their results were just not what the, what 89 00:05:06,720 --> 00:05:10,239 Speaker 1: the people published, not anywhere near them. UM. For example, 90 00:05:10,279 --> 00:05:14,000 Speaker 1: they used one as an example where a study found 91 00:05:14,040 --> 00:05:19,599 Speaker 1: that men were terrible at determining whether a woman 92 00:05:19,760 --> 00:05:23,320 Speaker 1: was giving them, um, some sort of like clues 93 00:05:23,360 --> 00:05:27,520 Speaker 1: to attraction or just being friendly, sexy sexy stuff or 94 00:05:28,040 --> 00:05:30,640 Speaker 1: friends, or yeah, or good to meet you, or buzz 95 00:05:30,640 --> 00:05:34,559 Speaker 1: off jerk. Yeah. Um. And they did the study again 96 00:05:34,600 --> 00:05:37,360 Speaker 1: as part of this, uh, Center for Open 97 00:05:37,400 --> 00:05:41,160 Speaker 1: Science study or survey, and they found that that was 98 00:05:41,760 --> 00:05:45,800 Speaker 1: not reproducible, or that they came up with totally different results. 99 00:05:45,800 --> 00:05:47,720 Speaker 1: And that was just one of many. Yeah. And in 100 00:05:47,760 --> 00:05:50,480 Speaker 1: this case specifically, they looked into that study and they 101 00:05:50,520 --> 00:05:53,960 Speaker 1: found that it was, UM, one was in the United Kingdom, 102 00:05:54,040 --> 00:05:56,159 Speaker 1: one was in the United States. May have something to 103 00:05:56,160 --> 00:05:58,320 Speaker 1: do with it. But the point is, Chuck, is if 104 00:05:58,360 --> 00:06:02,480 Speaker 1: you're talking about humanity, I don't think that the study 105 00:06:02,600 --> 00:06:05,880 Speaker 1: was like the American male is terrible at it.
It's 106 00:06:06,080 --> 00:06:08,200 Speaker 1: men are terrible at it. Right. So that means that 107 00:06:08,240 --> 00:06:10,840 Speaker 1: whether it's in the UK, which is basically the US 108 00:06:11,320 --> 00:06:15,040 Speaker 1: with an accent and a penchant for tea, I'm just 109 00:06:15,120 --> 00:06:21,880 Speaker 1: kidding, UK, see you soon, UM, the, it should be universal. Yeah, 110 00:06:22,040 --> 00:06:24,560 Speaker 1: you know, I agreed, unless you're saying no, it's just, 111 00:06:24,760 --> 00:06:28,680 Speaker 1: this only applies to American men, right, or the American 112 00:06:28,720 --> 00:06:33,159 Speaker 1: men, right, then it's not even a study. Yeah. Uh. The 113 00:06:33,200 --> 00:06:38,840 Speaker 1: next one we have is, uh, it's, it's plausible, not 114 00:06:38,880 --> 00:06:42,640 Speaker 1: necessarily provable. And this is a big one because, and 115 00:06:42,680 --> 00:06:45,680 Speaker 1: I think, um, we're talking about observational studies here more 116 00:06:45,720 --> 00:06:48,960 Speaker 1: than lab experiments, because with observational studies, you know, you 117 00:06:49,040 --> 00:06:52,320 Speaker 1: sit in a room and get asked three questions about something, 118 00:06:53,000 --> 00:06:54,880 Speaker 1: and all these people get asked the same questions, and 119 00:06:54,880 --> 00:06:57,520 Speaker 1: then they pore over the data and they draw out 120 00:06:57,560 --> 00:07:02,160 Speaker 1: their own observations. And one very famously, an observational study 121 00:07:02,160 --> 00:07:05,640 Speaker 1: that led to false results found a correlation between having 122 00:07:05,640 --> 00:07:10,240 Speaker 1: a type A personality and, um, being prone to risk 123 00:07:10,320 --> 00:07:14,440 Speaker 1: for heart attacks. And um, for a long time, you 124 00:07:14,480 --> 00:07:16,920 Speaker 1: know, the news outlets were like, oh, yes, of 125 00:07:16,960 --> 00:07:19,840 Speaker 1: course that makes total sense. This study proved what we've 126 00:07:19,840 --> 00:07:23,360 Speaker 1: all known all along, um. And then it came out 127 00:07:23,480 --> 00:07:27,160 Speaker 1: that no, actually what was going on was a well 128 00:07:27,200 --> 00:07:33,240 Speaker 1: known anomaly where you have a five percent, um, risk 129 00:07:33,320 --> 00:07:37,120 Speaker 1: that chance will produce something that looks like a statistically 130 00:07:37,200 --> 00:07:43,000 Speaker 1: significant correlation when really it's just total chance. And science 131 00:07:43,080 --> 00:07:46,880 Speaker 1: is aware of this, especially with observational studies, because the 132 00:07:46,960 --> 00:07:49,840 Speaker 1: more questions you have, the more opportunity you have for 133 00:07:49,920 --> 00:07:54,560 Speaker 1: that five percent chance to create a seemingly statistically significant 134 00:07:54,920 --> 00:07:59,679 Speaker 1: correlation when really it's not there. It was just random chance, 135 00:07:59,720 --> 00:08:02,560 Speaker 1: where if somebody else goes back and does the same 136 00:08:02,760 --> 00:08:04,960 Speaker 1: same study, they're not going to come up with the 137 00:08:04,960 --> 00:08:11,400 Speaker 1: same results.
But the, if a researcher is, I would guess, 138 00:08:11,440 --> 00:08:16,360 Speaker 1: willfully blind to that five percent chance, um, they will 139 00:08:16,360 --> 00:08:18,200 Speaker 1: go ahead and produce the study and be like, no, 140 00:08:18,280 --> 00:08:20,440 Speaker 1: it's true, here's the results right here, go ahead and 141 00:08:20,440 --> 00:08:22,680 Speaker 1: report on it and make my career. Yeah. Well, and 142 00:08:22,680 --> 00:08:25,200 Speaker 1: they also might be looking for something, in effect. Chances 143 00:08:25,240 --> 00:08:29,360 Speaker 1: are they are. Um, it's not just some random study, 144 00:08:29,680 --> 00:08:30,960 Speaker 1: let's just see what we get if we ask a 145 00:08:30,960 --> 00:08:33,280 Speaker 1: bunch of weird questions. It's like, hey, we're looking to 146 00:08:33,280 --> 00:08:36,920 Speaker 1: try and prove something, most likely, so that Baader-Meinhof 147 00:08:36,960 --> 00:08:39,320 Speaker 1: thing might come into play where you're kind of cherry 148 00:08:39,360 --> 00:08:42,840 Speaker 1: picking data. Yeah, that's a big problem that kind of 149 00:08:42,880 --> 00:08:44,520 Speaker 1: comes up. A lot of these are really kind of 150 00:08:44,559 --> 00:08:47,440 Speaker 1: interrelated. Totally. The other big thing that's interrelated 151 00:08:47,520 --> 00:08:50,880 Speaker 1: is how the media reports on science these days. Yeah, 152 00:08:50,960 --> 00:08:54,400 Speaker 1: you know, it's a big deal. Yeah. Like, John Oliver 153 00:08:54,559 --> 00:08:56,800 Speaker 1: just recently went off on this and NPR did a 154 00:08:56,840 --> 00:09:00,840 Speaker 1: thing on it, like, they might even, like, the researcher 155 00:09:00,920 --> 00:09:03,719 Speaker 1: might say plausible, but it doesn't get portrayed that way 156 00:09:03,880 --> 00:09:06,720 Speaker 1: in the media. Sure. Remember that poor kid who thought 157 00:09:06,800 --> 00:09:10,840 Speaker 1: he found the ancient Mayan city. The media just took 158 00:09:10,840 --> 00:09:13,280 Speaker 1: it and ran with it. You know, Yeah, I think 159 00:09:13,280 --> 00:09:16,080 Speaker 1: there was a lot of maybe, or it's possible, we 160 00:09:16,120 --> 00:09:18,160 Speaker 1: need to go check kind of thing. The media's like, no, 161 00:09:18,480 --> 00:09:21,480 Speaker 1: he discovered an ancient Mayan city never known before. Yeah, 162 00:09:21,520 --> 00:09:23,480 Speaker 1: and let's put it in a headline. And that's, I mean, 163 00:09:23,520 --> 00:09:25,600 Speaker 1: that's the, that's just kind of the way it is 164 00:09:25,679 --> 00:09:28,520 Speaker 1: these days. You do have to be able to sort 165 00:09:28,559 --> 00:09:30,680 Speaker 1: through. And I guess that's what we're doing here, aren't we, Chuck, 166 00:09:30,679 --> 00:09:33,120 Speaker 1: We're telling everybody how to sort through it, or at 167 00:09:33,120 --> 00:09:38,920 Speaker 1: the very least take scientific reporting with a grain of salt, yes, right, 168 00:09:38,920 --> 00:09:41,319 Speaker 1: and not, well, like, you don't necessarily have the time 169 00:09:41,360 --> 00:09:44,760 Speaker 1: to go through and double-check that research and then check 170 00:09:44,800 --> 00:09:47,960 Speaker 1: on that research, and you know, right, so take it 171 00:09:48,000 --> 00:09:53,120 Speaker 1: with a grain of salt. Um, unsound samples. Uh.
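(A quick sketch in Python of the five percent problem described above. This is our own toy illustration, not anything from the article or the Center for Open Science: every simulated question below has no real effect at all, yet with the usual 0.05 significance cutoff, the odds that at least one question looks "statistically significant" climb fast as you ask more of them.)

```python
# Toy simulation of the "five percent chance" problem: each question has NO
# real effect, but a 0.05 threshold still "finds" something if you ask enough
# questions. All numbers here are illustrative only.
import random

random.seed(42)

P_THRESHOLD = 0.05  # the conventional significance cutoff


def spurious_hit() -> bool:
    """One question with no real effect still 'reaches significance' 5% of the time."""
    return random.random() < P_THRESHOLD


for n_questions in (1, 10, 20, 60):
    trials = 10_000
    studies_with_a_hit = sum(
        any(spurious_hit() for _ in range(n_questions)) for _ in range(trials)
    )
    print(f"{n_questions:>2} questions per study -> "
          f"{studies_with_a_hit / trials:.0%} of simulated studies report a 'finding'")
```

That is why one striking correlation buried in a long questionnaire is not surprising, and why a repeat study usually fails to find it again.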
Here was this study that basically said, um, how you lost 173 00:09:57,240 --> 00:10:00,800 Speaker 1: your virginity is going to have a very, a large 174 00:10:00,960 --> 00:10:04,600 Speaker 1: impact and play a role in how you feel about 175 00:10:04,840 --> 00:10:08,239 Speaker 1: sex and experience sex for the rest of your life. Yeah, 176 00:10:08,520 --> 00:10:13,200 Speaker 1: it's possible. Sure, it seems logical, so we'll just go 177 00:10:13,320 --> 00:10:19,600 Speaker 1: with it. But when you, um, only interview college students 178 00:10:20,400 --> 00:10:25,880 Speaker 1: and, uh, you don't, you only interview heterosexual people, then 179 00:10:25,920 --> 00:10:27,840 Speaker 1: you can't really say you've done a robust study now, 180 00:10:27,880 --> 00:10:31,439 Speaker 1: can you. Plus you also take out of the sample 181 00:10:31,520 --> 00:10:35,600 Speaker 1: size, your sample population, anybody who reports having had a 182 00:10:35,679 --> 00:10:39,439 Speaker 1: violent encounter. Throw them out, throw that data out, because that's 183 00:10:39,440 --> 00:10:41,640 Speaker 1: not gonna inform how you feel about sex, right, exactly. 184 00:10:41,679 --> 00:10:44,560 Speaker 1: You're just narrowing it down further and further and again 185 00:10:44,720 --> 00:10:48,079 Speaker 1: cherry picking the data by throwing people out of your 186 00:10:48,120 --> 00:10:51,360 Speaker 1: population sample that will throw off the data 187 00:10:51,400 --> 00:10:53,560 Speaker 1: that you want. Yeah, and I've never heard of this 188 00:10:53,640 --> 00:10:57,679 Speaker 1: acronym WEIRD. And UM, a lot of these studies are 189 00:10:57,800 --> 00:11:00,920 Speaker 1: conducted by professors in academia, so a lot of 190 00:11:00,960 --> 00:11:04,160 Speaker 1: times you've got college students as your sample, and there's 191 00:11:04,160 --> 00:11:10,839 Speaker 1: something called WEIRD: Western, educated, from industrialized, rich and democratic countries. Right, 192 00:11:10,840 --> 00:11:14,080 Speaker 1: those are the participants in the studies, the study subjects. But 193 00:11:14,080 --> 00:11:18,559 Speaker 1: then they will say, men, right, well, what about the 194 00:11:18,640 --> 00:11:22,920 Speaker 1: gay man in Africa? Right, like, you didn't ask him, 195 00:11:23,120 --> 00:11:26,280 Speaker 1: so that was, that's actually a really really big deal. Um. 196 00:11:26,320 --> 00:11:29,240 Speaker 1: In two thousand and ten, these three researchers did 197 00:11:29,280 --> 00:11:34,719 Speaker 1: a survey of a ton of social science and behavioral 198 00:11:34,760 --> 00:11:39,480 Speaker 1: science studies and found that eighty percent of them used WEIRD 199 00:11:39,840 --> 00:11:43,200 Speaker 1: study participants. So basically it was college kids for eighty 200 00:11:43,320 --> 00:11:47,200 Speaker 1: percent of these papers. And they surveyed a bunch of 201 00:11:47,200 --> 00:11:49,960 Speaker 1: papers and they took it a little further and they 202 00:11:50,000 --> 00:11:54,599 Speaker 1: said that, um, people who fit into the WEIRD category 203 00:11:55,000 --> 00:11:57,560 Speaker 1: only make up twelve percent of the world population, but 204 00:11:57,600 --> 00:12:00,680 Speaker 1: they represent eighty percent of the population of these studies.
205 00:12:01,040 --> 00:12:07,040 Speaker 1: And a college student, Chuck, in North America, Europe, Israel, 206 00:12:07,280 --> 00:12:11,760 Speaker 1: or Australia is four thousand times more likely to be 207 00:12:11,880 --> 00:12:15,079 Speaker 1: in a scientific study than anyone else on the planet. 208 00:12:15,679 --> 00:12:20,120 Speaker 1: And psychology and behavioral sciences are basing their 209 00:12:20,160 --> 00:12:24,880 Speaker 1: findings onto everybody else based on this, this small tranche 210 00:12:25,160 --> 00:12:28,640 Speaker 1: of humanity, and that's a, that's a big problem. That's 211 00:12:28,640 --> 00:12:31,959 Speaker 1: extremely misleading. Yeah, and it's also a little insulting, because 212 00:12:32,000 --> 00:12:35,559 Speaker 1: what they're essentially saying is like, this is who matters. 213 00:12:36,440 --> 00:12:39,520 Speaker 1: Well also, yeah, but what's sad is, this is who 214 00:12:39,679 --> 00:12:42,679 Speaker 1: I am going to go to the trouble of recruiting 215 00:12:42,760 --> 00:12:47,000 Speaker 1: for my study. It's just sheer laziness. And I'm sure 216 00:12:47,000 --> 00:12:48,200 Speaker 1: a lot of them are like, well, I don't have 217 00:12:48,240 --> 00:12:52,280 Speaker 1: the funding to do that. I guess I see that, 218 00:12:52,320 --> 00:12:55,640 Speaker 1: But at the same time, I guarantee there's a tremendous 219 00:12:55,679 --> 00:12:58,800 Speaker 1: amount of laziness involved. Yeah, or maybe if you don't 220 00:12:58,800 --> 00:13:03,640 Speaker 1: have the money, maybe don't do that study. Is it 221 00:13:03,720 --> 00:13:06,120 Speaker 1: that simple? I'm probably over simplifying. I don't know. I'm 222 00:13:06,120 --> 00:13:08,280 Speaker 1: sure we're going to hear from some people in academia 223 00:13:08,280 --> 00:13:13,160 Speaker 1: about this one. Well, stop using WEIRD participants, or at 224 00:13:13,200 --> 00:13:19,400 Speaker 1: the very least say, um, like, this is heterosexual Dartmouth students. Yeah, 225 00:13:19,720 --> 00:13:25,400 Speaker 1: this applies to them, not everybody in the world. All of 226 00:13:25,440 --> 00:13:29,400 Speaker 1: these studies where they use those people as study participants, 227 00:13:29,440 --> 00:13:31,960 Speaker 1: and they're not even, they're not even emblematic of the 228 00:13:31,960 --> 00:13:35,400 Speaker 1: rest of the human race. Like, college students are shown 229 00:13:35,720 --> 00:13:39,960 Speaker 1: to see the world differently than other people around the world. 230 00:13:40,679 --> 00:13:42,640 Speaker 1: So it's not like you can be like, well, it 231 00:13:42,640 --> 00:13:45,400 Speaker 1: still works, you can still extrapolate. It's like flawed in 232 00:13:45,440 --> 00:13:48,920 Speaker 1: every way, shape and form. Should we take a break? Come on, yeah, 233 00:13:48,960 --> 00:13:51,280 Speaker 1: let's take a break because you can get a little 234 00:13:51,280 --> 00:13:54,520 Speaker 1: hot under the collar. I love it, man. Uh, we'll 235 00:13:54,559 --> 00:14:09,800 Speaker 1: be right back after this. Just got so much. Sorry, 236 00:14:19,720 --> 00:14:25,800 Speaker 1: all right, what's next, buddy? Uh? Very small sample sizes. Right, 237 00:14:26,800 --> 00:14:32,800 Speaker 1: if you do a study with twenty mice, then you're 238 00:14:32,840 --> 00:14:37,400 Speaker 1: not doing a good enough study.
No. So they used 239 00:14:37,400 --> 00:14:41,480 Speaker 1: this, um, in the, in the article, they use the 240 00:14:41,600 --> 00:14:45,040 Speaker 1: idea of ten thousand smokers and ten thousand non smokers, 241 00:14:45,920 --> 00:14:49,040 Speaker 1: and they said, okay, if you have a population sample 242 00:14:49,120 --> 00:14:51,560 Speaker 1: that size, that's not bad. It's a pretty good start. 243 00:14:51,920 --> 00:14:54,840 Speaker 1: And if you find that a lot more of the smokers developed lung cancer, 244 00:14:54,880 --> 00:14:58,360 Speaker 1: but only five percent of non smokers did, then your study 245 00:14:58,400 --> 00:15:02,840 Speaker 1: has what's called high power. Yeah. Um, it's, if, 246 00:15:02,880 --> 00:15:06,000 Speaker 1: if you had something like ten smokers and ten non smokers, 247 00:15:06,360 --> 00:15:09,200 Speaker 1: and two of the smokers developed lung cancer and one non smoker developed 248 00:15:09,680 --> 00:15:13,800 Speaker 1: lung cancer as well, you have very little power, and 249 00:15:13,960 --> 00:15:18,360 Speaker 1: you should have very little confidence in your findings. But regardless, 250 00:15:18,760 --> 00:15:22,880 Speaker 1: it's still going to get reported if it's a sexy idea. Yeah, 251 00:15:23,160 --> 00:15:27,360 Speaker 1: for sure. Um. And because these are kind of overlapping 252 00:15:27,400 --> 00:15:29,080 Speaker 1: in a lot of ways, I want to 253 00:15:29,080 --> 00:15:34,560 Speaker 1: mention this guy, a scientist named Ulrich, uh, Dirnagl. Uh, 254 00:15:34,600 --> 00:15:37,240 Speaker 1: he and his colleague Malcolm Macleod have been trying, I mean, 255 00:15:37,360 --> 00:15:39,400 Speaker 1: and there are a lot of scientists that are trying 256 00:15:39,440 --> 00:15:41,640 Speaker 1: to clean this up because they know it's a problem. 257 00:15:42,200 --> 00:15:45,280 Speaker 1: But he co wrote an article in Nature, uh, that's 258 00:15:45,320 --> 00:15:51,880 Speaker 1: called Robust Research, colon, Institutions Must Do Their Part for Reproducibility. 259 00:15:51,960 --> 00:15:55,280 Speaker 1: So this kind of ties back into the reproducing things, 260 00:15:55,280 --> 00:15:57,760 Speaker 1: like we said earlier, and his whole idea is, you know 261 00:15:57,800 --> 00:16:00,960 Speaker 1: what, they should tie funding to good 262 00:16:01,000 --> 00:16:04,640 Speaker 1: institutional practices, like, you shouldn't get the money if you 263 00:16:04,680 --> 00:16:08,440 Speaker 1: can't show that you're doing it right. Um. And he 264 00:16:08,480 --> 00:16:10,200 Speaker 1: said that would just weed out a lot of stuff. 265 00:16:10,920 --> 00:16:15,720 Speaker 1: Here's one staggering stat for reproducibility and small sample size. 266 00:16:16,240 --> 00:16:20,800 Speaker 1: Biomedical researchers for drug companies reported that only 267 00:16:22,240 --> 00:16:25,120 Speaker 1: a fraction of the papers that they publish are even reproducible, and 268 00:16:25,200 --> 00:16:29,040 Speaker 1: that was like an insider stat. And it doesn't matter. 269 00:16:29,840 --> 00:16:33,320 Speaker 1: The drugs are still going to market. Yeah, which 270 00:16:33,320 --> 00:16:35,760 Speaker 1: is, that's a really good example of why this does 271 00:16:35,880 --> 00:16:38,880 Speaker 1: matter to the average person. You know, like, if you 272 00:16:38,960 --> 00:16:44,640 Speaker 1: hear something like, um, uh, monkeys like to cuddle with 273 00:16:44,720 --> 00:16:50,000 Speaker 1: one another because they are reminded of their mothers,
study shows, right, 274 00:16:50,080 --> 00:16:51,960 Speaker 1: you can just be like, oh, that's great, I'm 275 00:16:52,000 --> 00:16:54,760 Speaker 1: going to share that on the internet. Doesn't really affect 276 00:16:54,760 --> 00:16:58,600 Speaker 1: you in any way. But when there are studies being conducted 277 00:16:58,640 --> 00:17:03,240 Speaker 1: that are, that are creating drugs that could kill you 278 00:17:03,360 --> 00:17:06,080 Speaker 1: or not treat you or that kind of thing, and 279 00:17:06,240 --> 00:17:10,439 Speaker 1: it's attracting money and funding and that kind of stuff, 280 00:17:10,760 --> 00:17:16,320 Speaker 1: that's, like, that's harmful. Yeah. Absolutely. I found another survey. 281 00:17:16,560 --> 00:17:19,960 Speaker 1: Did you like that terrible study idea that I came 282 00:17:20,040 --> 00:17:26,119 Speaker 1: up with, like the monkeys like to cuddle? A hundred and 283 00:17:26,160 --> 00:17:30,560 Speaker 1: forty trainees at the MD Anderson Cancer Center in Houston, Texas, 284 00:17:31,560 --> 00:17:33,879 Speaker 1: thank you Houston for being so kind to us at 285 00:17:33,920 --> 00:17:37,760 Speaker 1: a recent show. They found that nearly a third of 286 00:17:37,800 --> 00:17:42,800 Speaker 1: these, um, trainees felt pressure to support their mentors' work, 287 00:17:44,040 --> 00:17:46,840 Speaker 1: like to get ahead or not get fired. So that's 288 00:17:46,840 --> 00:17:50,480 Speaker 1: another issue. Because you've got these trainees or residents, uh, 289 00:17:50,480 --> 00:17:52,560 Speaker 1: and you have these mentors, and even if you disagree 290 00:17:52,680 --> 00:17:55,280 Speaker 1: or don't think it's a great study, you're, you're pressured 291 00:17:55,320 --> 00:17:57,480 Speaker 1: into just going along with it. I could see that 292 00:17:58,119 --> 00:18:00,560 Speaker 1: for sure. There, there seems to be a huge hierarchy 293 00:18:00,640 --> 00:18:03,880 Speaker 1: in, UM, science. Yeah, in a lab. You know, you've got 294 00:18:03,880 --> 00:18:06,000 Speaker 1: the person who runs the lab. It's their lab, and 295 00:18:06,400 --> 00:18:10,120 Speaker 1: you don't go against them. But there are people, UM, like Science 296 00:18:10,240 --> 00:18:13,640 Speaker 1: and Nature, two great journals, are updating their guidelines right now. 297 00:18:13,680 --> 00:18:19,359 Speaker 1: They're introducing checklists. UM, Science hired statisticians to their panel 298 00:18:20,280 --> 00:18:24,119 Speaker 1: of reviewing editors, not just other, you know, peer reviewers, like, 299 00:18:24,160 --> 00:18:28,960 Speaker 1: actual hard numbers people specifically, because that's a 300 00:18:29,200 --> 00:18:31,800 Speaker 1: process that's a huge part of studies. It's like 301 00:18:31,880 --> 00:18:36,359 Speaker 1: this mind breaking statistical analysis. It can be used 302 00:18:36,359 --> 00:18:39,400 Speaker 1: for good or ill. And I mean, I don't think 303 00:18:39,400 --> 00:18:43,879 Speaker 1: the average scientist necessarily is a whiz at that, although 304 00:18:44,119 --> 00:18:46,560 Speaker 1: it has to be part of training, but not necessarily. 305 00:18:46,560 --> 00:18:49,960 Speaker 1: And that's a different kind of beast altogether. UM, stats, 306 00:18:50,119 --> 00:18:53,000 Speaker 1: we talked about it earlier. I took a stats class 307 00:18:53,000 --> 00:18:56,560 Speaker 1: in college. It gave me so much trouble. I was awful at it. 308 00:18:56,560 --> 00:19:03,920 Speaker 1: It really is. It's just a special kind of evil, man.
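(Here is a rough Python sketch of the power point from the smokers example earlier. The 15 percent and 5 percent lung cancer rates are assumed purely for illustration, and the test is a plain two-proportion z-test, so treat this as a sketch of the general idea rather than the article's actual numbers.)

```python
# Toy power comparison: how often does a simple two-proportion z-test detect
# a real difference, at different sample sizes? Rates below are assumptions
# for illustration only.
import math
import random

random.seed(0)


def two_sided_p(hits_a: int, n_a: int, hits_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in proportions (normal approximation)."""
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (hits_a / n_a - hits_b / n_b) / se
    return math.erfc(abs(z) / math.sqrt(2))


def estimated_power(n_per_group: int, rate_smokers: float = 0.15,
                    rate_nonsmokers: float = 0.05, trials: int = 1000) -> float:
    """Fraction of simulated studies that detect the (real) difference at p < 0.05."""
    detections = 0
    for _ in range(trials):
        sick_smokers = sum(random.random() < rate_smokers for _ in range(n_per_group))
        sick_nonsmokers = sum(random.random() < rate_nonsmokers for _ in range(n_per_group))
        if two_sided_p(sick_smokers, n_per_group, sick_nonsmokers, n_per_group) < 0.05:
            detections += 1
    return detections / trials


for n in (10, 100, 10_000):
    print(f"{n:>6} per group -> real effect detected in ~{estimated_power(n):.0%} of simulated studies")
```

With ten per group the real difference is usually missed; with ten thousand per group it is detected essentially every time, which is what "high power" means in practice.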
Yeah, 309 00:19:03,920 --> 00:19:07,879 Speaker 1: I didn't get it. I passed it, though. I passed 310 00:19:07,880 --> 00:19:11,760 Speaker 1: it because my professor took pity on me. Um, that 311 00:19:12,000 --> 00:19:20,640 Speaker 1: Ulrich Durna, Dirnagl, Ulrich Dirnagl, um. He is a, he's 312 00:19:20,680 --> 00:19:24,919 Speaker 1: a big time crusader. His jam is making sure that 313 00:19:24,960 --> 00:19:27,560 Speaker 1: science is good science. One of the things, um, he 314 00:19:27,600 --> 00:19:31,439 Speaker 1: crusades against is the idea of, remember that virginity study 315 00:19:31,440 --> 00:19:33,720 Speaker 1: where they just threw out anybody who had a violent 316 00:19:33,800 --> 00:19:38,159 Speaker 1: encounter for their first sexual experience? UM, apparently that's a 317 00:19:38,160 --> 00:19:41,440 Speaker 1: big deal with animal studies as well. If you're studying 318 00:19:41,440 --> 00:19:43,679 Speaker 1: the effects of a drug or something, like there was, 319 00:19:43,800 --> 00:19:46,200 Speaker 1: there was one in the article. Um, if you're studying the 320 00:19:46,200 --> 00:19:49,880 Speaker 1: effects of a stroke drug and you've got a control 321 00:19:49,880 --> 00:19:53,000 Speaker 1: group of mice that are taking the drug, or that 322 00:19:53,040 --> 00:19:55,159 Speaker 1: aren't taking the drug, and then a test group that 323 00:19:55,200 --> 00:19:59,160 Speaker 1: are getting the drug, um, and then like three mice 324 00:19:59,280 --> 00:20:02,920 Speaker 1: from the group die even though they're on the stroke drug, 325 00:20:02,920 --> 00:20:05,960 Speaker 1: they die of a massive stroke, and you just literally 326 00:20:06,000 --> 00:20:09,480 Speaker 1: and figuratively throw them out of the study, um, and 327 00:20:09,520 --> 00:20:13,040 Speaker 1: don't include them in the results. That changes the data. 328 00:20:13,160 --> 00:20:15,760 Speaker 1: And he's been on a peer review on a 329 00:20:15,800 --> 00:20:18,480 Speaker 1: paper before. He's like, no, this doesn't pass peer review. 330 00:20:18,480 --> 00:20:21,399 Speaker 1: You can't just throw out what happened to these three rodents. 331 00:20:21,400 --> 00:20:24,360 Speaker 1: You started with ten, there's only seven reported in the end. 332 00:20:24,560 --> 00:20:27,200 Speaker 1: What happened to those three? And how many of them 333 00:20:27,480 --> 00:20:30,520 Speaker 1: just don't report the ten? They're like, oh, we only 334 00:20:30,520 --> 00:20:33,960 Speaker 1: started with seven. We're going, you know... Well, I was 335 00:20:34,000 --> 00:20:35,720 Speaker 1: about to say I get the urge. I don't get 336 00:20:35,720 --> 00:20:37,920 Speaker 1: it, because it's not right. But I think what happens 337 00:20:37,960 --> 00:20:40,320 Speaker 1: is you work so hard at something. Yeah, yeah, and 338 00:20:40,359 --> 00:20:42,640 Speaker 1: you're like, how can I just walk away from two 339 00:20:42,720 --> 00:20:46,360 Speaker 1: years of this because it didn't get a result? Okay, 340 00:20:46,640 --> 00:20:49,680 Speaker 1: but the point of real science, though, is you have to walk 341 00:20:49,680 --> 00:20:52,800 Speaker 1: away from it. Well, you have to publish that. And 342 00:20:52,840 --> 00:20:56,280 Speaker 1: that's the other thing too, and I guarantee scientists will say, hey, man, 343 00:20:56,920 --> 00:20:59,919 Speaker 1: try getting a negative paper published in a good journal 344 00:21:00,040 --> 00:21:02,399 Speaker 1: these days. You don't want that kind of stuff.
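(A toy version of that stroke-drug mouse example, with completely made-up numbers: counting all ten treated mice versus quietly dropping the three that died.)

```python
# Hypothetical numbers: dropping the animals that did worst makes the treated
# group look better than it actually was.
treated = ["recovered"] * 5 + ["stroke"] * 2 + ["died of stroke"] * 3  # 10 mice
control = ["recovered"] * 4 + ["stroke"] * 6                           # 10 mice


def recovery_rate(group):
    return sum(outcome == "recovered" for outcome in group) / len(group)


all_counted = recovery_rate(treated)                                           # 5/10
deaths_dropped = recovery_rate([m for m in treated if m != "died of stroke"])  # 5/7

print(f"control recovery rate:                 {recovery_rate(control):.0%}")
print(f"treated, all ten mice reported:        {all_counted:.0%}")
print(f"treated, three dead mice 'thrown out': {deaths_dropped:.0%}")
```

Dropping the deaths makes the drug look far better than the honest count does, which is exactly the kind of thing a peer reviewer is supposed to catch.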
But 345 00:21:02,480 --> 00:21:05,080 Speaker 1: part of it also is, I don't think it's enough 346 00:21:05,119 --> 00:21:07,720 Speaker 1: to just be published in, like, a journal. 347 00:21:08,000 --> 00:21:09,960 Speaker 1: You want to make the news cycle as well. That 348 00:21:10,080 --> 00:21:13,120 Speaker 1: makes it even better, right. Um, so, I think there's 349 00:21:13,119 --> 00:21:15,880 Speaker 1: a lot of factors involved, but ultimately, if you take 350 00:21:15,920 --> 00:21:18,000 Speaker 1: all that stuff away, if you take the culture away 351 00:21:18,000 --> 00:21:21,800 Speaker 1: from it: if you get negative results, you're supposed 352 00:21:21,840 --> 00:21:24,239 Speaker 1: to publish that so that some other scientist can come 353 00:21:24,240 --> 00:21:26,959 Speaker 1: along and be like, oh, somebody else already did this 354 00:21:27,320 --> 00:21:29,560 Speaker 1: using these methods that I was going to use. I'm 355 00:21:29,560 --> 00:21:32,680 Speaker 1: not gonna waste two years of my career because somebody 356 00:21:32,680 --> 00:21:35,760 Speaker 1: else already did. Thank you, buddy, for saving me the 357 00:21:35,840 --> 00:21:38,360 Speaker 1: time and trouble and effort to know that this does 358 00:21:38,359 --> 00:21:41,880 Speaker 1: not work. You've proven this doesn't work. When you start 359 00:21:42,000 --> 00:21:44,160 Speaker 1: out to prove it does work and you actually prove it didn't work, 360 00:21:44,200 --> 00:21:48,560 Speaker 1: that's part of science. Yeah. And I wish there wasn't 361 00:21:48,600 --> 00:21:53,320 Speaker 1: a negative connotation to a negative result, because to me, 362 00:21:53,359 --> 00:21:56,520 Speaker 1: the value is the same: proving something doesn't 363 00:21:56,560 --> 00:21:59,679 Speaker 1: work is as good as proving something does work. Right. Again, it's just 364 00:21:59,800 --> 00:22:03,639 Speaker 1: not as sexy. Yeah, but I'm not sexy either, so 365 00:22:03,680 --> 00:22:06,119 Speaker 1: maybe that's why I get it. Uh, here's one that 366 00:22:06,160 --> 00:22:10,040 Speaker 1: I didn't know was a thing. Predatory publishing. You never 367 00:22:10,080 --> 00:22:12,960 Speaker 1: heard of this? So here's the scenario. You're a doctor 368 00:22:13,040 --> 00:22:17,440 Speaker 1: or a scientist and, um, you get an email from 369 00:22:17,480 --> 00:22:20,760 Speaker 1: a journal that says, hey, you got anything interesting for us? 370 00:22:20,800 --> 00:22:22,520 Speaker 1: I've heard about your work. And you say, well, I 371 00:22:22,520 --> 00:22:25,920 Speaker 1: actually do, I have this, this study right here. They say, cool, 372 00:22:25,960 --> 00:22:29,120 Speaker 1: we'll publish it. You go, great, my career is taking off. 373 00:22:29,320 --> 00:22:32,280 Speaker 1: Then you get a bill that says, where's my three 374 00:22:32,359 --> 00:22:36,160 Speaker 1: grand for publishing your article? And you're like, I don't 375 00:22:36,160 --> 00:22:39,760 Speaker 1: owe you three grand. All right, give us two. And 376 00:22:39,760 --> 00:22:41,479 Speaker 1: you're like, I can't even give you two. And if 377 00:22:41,520 --> 00:22:44,600 Speaker 1: you fight them long enough, maybe they'll drop it and 378 00:22:44,960 --> 00:22:48,560 Speaker 1: never work with you again, or maybe it'll just be like, well, 379 00:22:48,600 --> 00:22:53,280 Speaker 1: we'll talk to you next quarter. Exactly.
That's called predatory publishing, 380 00:22:53,320 --> 00:22:55,600 Speaker 1: and it's a, I'm not sure how new it is, 381 00:22:55,920 --> 00:22:58,800 Speaker 1: maybe it's, is it pretty new? But it's a thing 382 00:22:58,840 --> 00:23:07,040 Speaker 1: now where, uh, you can pay essentially to get something published. Yes, 383 00:23:07,119 --> 00:23:09,800 Speaker 1: you can. Um, it kind of, it's kind of like a 384 00:23:09,840 --> 00:23:14,160 Speaker 1: Who's Who in Behavioral Science kind of thing, you know. Um, 385 00:23:14,160 --> 00:23:16,919 Speaker 1: and apparently it's new because it's a result of open 386 00:23:17,000 --> 00:23:20,199 Speaker 1: source academic journals, which a lot of people push for, 387 00:23:20,400 --> 00:23:24,200 Speaker 1: including Aaron Swartz very famously, who, like, took a bunch 388 00:23:24,200 --> 00:23:28,000 Speaker 1: of academic articles and published them online and was prosecuted 389 00:23:28,040 --> 00:23:31,720 Speaker 1: heavily for it. Persecuted, you could even say. Um, but 390 00:23:31,800 --> 00:23:34,560 Speaker 1: the idea that science is behind this paywall, which is 391 00:23:34,600 --> 00:23:38,719 Speaker 1: another great article from Priceonomics, by the way, um, it 392 00:23:38,760 --> 00:23:40,719 Speaker 1: really just ticks a lot of people off. So they 393 00:23:40,720 --> 00:23:45,000 Speaker 1: started open source journals, right, and as a result, predatory 394 00:23:45,080 --> 00:23:48,399 Speaker 1: publishers came about and said, okay, yeah, let's make this free, 395 00:23:49,240 --> 00:23:50,919 Speaker 1: but we need to make our money anyway, so we're 396 00:23:50,920 --> 00:23:54,159 Speaker 1: going to charge the academic who wrote the study for 397 00:23:54,240 --> 00:23:57,719 Speaker 1: publishing it. Well, yeah, and, and sometimes now it's just 398 00:23:57,800 --> 00:24:02,480 Speaker 1: a flat out scam operation. There's this guy named Jeffrey 399 00:24:02,520 --> 00:24:07,240 Speaker 1: Beall who is a research librarian. He is my new 400 00:24:07,280 --> 00:24:10,439 Speaker 1: hero because he's truly like one of these dudes that 401 00:24:10,840 --> 00:24:14,200 Speaker 1: has, uh, he's trying to make a difference, and he's 402 00:24:14,240 --> 00:24:17,080 Speaker 1: not profiting from this, but he's spending a lot of 403 00:24:17,119 --> 00:24:23,160 Speaker 1: time by creating a list of, of predatory publishers. Yeah, 404 00:24:23,359 --> 00:24:26,359 Speaker 1: a significant list too. Yeah, how many, four thousand of 405 00:24:26,440 --> 00:24:32,240 Speaker 1: them right now? Um. Some of these companies flat out 406 00:24:32,440 --> 00:24:36,160 Speaker 1: lie, like they're literally based out of Pakistan or Nigeria 407 00:24:36,280 --> 00:24:39,560 Speaker 1: and they say, no, we're a New York publisher. So 408 00:24:39,600 --> 00:24:42,240 Speaker 1: it's just a flat out scam. Or they lie about 409 00:24:42,280 --> 00:24:45,600 Speaker 1: their review practices, um, like they might not have any 410 00:24:45,800 --> 00:24:48,680 Speaker 1: review practices and they straight up lie and say they do.
411 00:24:49,400 --> 00:24:53,040 Speaker 1: There was one called Scientific Journals International out of Minnesota 412 00:24:53,480 --> 00:24:56,960 Speaker 1: that he found out was just one guy, like literally 413 00:24:57,000 --> 00:25:00,840 Speaker 1: working out of his home, just, like, begging for articles, 414 00:25:01,080 --> 00:25:04,639 Speaker 1: charging to get them published, not reviewing anything, and just 415 00:25:04,720 --> 00:25:09,119 Speaker 1: saying, I'm a journal. Yeah, I'm a scientific journal. He 416 00:25:09,160 --> 00:25:11,520 Speaker 1: shut it down apparently, or tried to sell it. I 417 00:25:11,560 --> 00:25:15,960 Speaker 1: think he was found out. UM, and this other one, 418 00:25:16,200 --> 00:25:20,520 Speaker 1: the International Journal of Engineering Research and Applications, they created 419 00:25:20,560 --> 00:25:24,920 Speaker 1: an award and then gave it to itself, and even 420 00:25:24,960 --> 00:25:28,200 Speaker 1: modeled the award on an Australian TV award, like the 421 00:25:28,520 --> 00:25:32,440 Speaker 1: physical statuette. That's fascinating. Didn't that make you want to do that? 422 00:25:32,480 --> 00:25:36,600 Speaker 1: We're gonna give ourselves, yeah, the best podcast in 423 00:25:36,640 --> 00:25:41,760 Speaker 1: the universe award, and it's gonna look like the Oscar. Yeah, okay, 424 00:25:42,240 --> 00:25:45,040 Speaker 1: the Oscar crossed with the Emmy. Uh, this other one, 425 00:25:45,160 --> 00:25:50,520 Speaker 1: Med No Publications, actually confused the meaning of STM, 426 00:25:50,680 --> 00:25:55,480 Speaker 1: Science, Technology, Medicine. They thought it meant Sports Technology in Medicine. Well, 427 00:25:55,520 --> 00:25:59,600 Speaker 1: a lot of, UM, science journalists, or scientists too, but 428 00:26:00,040 --> 00:26:04,480 Speaker 1: watchdogs like to send, like, gibberish articles into those 429 00:26:04,520 --> 00:26:06,600 Speaker 1: things to see if they publish them, and sometimes they do. 430 00:26:06,760 --> 00:26:09,000 Speaker 1: Frequently they do. They sniff them off the case. It's 431 00:26:09,160 --> 00:26:12,560 Speaker 1: the big time. How about that callback. It's been 432 00:26:12,560 --> 00:26:16,240 Speaker 1: a while. It needs to be a T shirt. Did 433 00:26:16,240 --> 00:26:18,160 Speaker 1: we take a break? Yeah, all right, we'll be back 434 00:26:18,359 --> 00:26:21,240 Speaker 1: and finish up right after this. Just like the number 435 00:26:21,359 --> 00:26:44,640 Speaker 1: of stars in the sky, so much. Sorry, George. So here's 436 00:26:44,680 --> 00:26:46,600 Speaker 1: a big one. You ever heard the term follow the 437 00:26:46,640 --> 00:26:52,199 Speaker 1: money? Hm. That's applicable to a lot of realms of 438 00:26:52,240 --> 00:26:58,200 Speaker 1: society, and most certainly in journals. Um, if something looks hinky, 439 00:26:59,080 --> 00:27:03,119 Speaker 1: just do a little investigating and see who's sponsoring their work. Well, 440 00:27:03,200 --> 00:27:07,560 Speaker 1: especially if that person is like, no, everyone else is wrong, right, 441 00:27:07,960 --> 00:27:10,720 Speaker 1: climate change is not man made kind of thing. Sure.
442 00:27:11,080 --> 00:27:13,639 Speaker 1: You know, if you look at where their funding 443 00:27:13,680 --> 00:27:16,720 Speaker 1: is coming from, you might be unsurprised to find that 444 00:27:16,800 --> 00:27:19,840 Speaker 1: it's coming from people who would benefit from the idea 445 00:27:19,920 --> 00:27:23,320 Speaker 1: that anthropogenic climate change isn't real. Yeah, well, we might 446 00:27:23,359 --> 00:27:27,240 Speaker 1: as well talk about him, Willie Soon. Yeah, Mr. Soon? 447 00:27:27,440 --> 00:27:30,600 Speaker 1: Is he a doctor? He's a, he's a physicist of 448 00:27:30,640 --> 00:27:34,880 Speaker 1: some sort. Yeah, all right. Mm. I'm just gonna 449 00:27:34,920 --> 00:27:39,399 Speaker 1: say Mr. or Dr. Soon, because I'm not positive. Uh, 450 00:27:39,440 --> 00:27:43,800 Speaker 1: he is one of a few people on the planet Earth, um, 451 00:27:43,960 --> 00:27:50,399 Speaker 1: professionals that is, who deny human climate change, human influenced 452 00:27:50,440 --> 00:27:52,840 Speaker 1: climate change. Like you said, you said the fancier word 453 00:27:52,840 --> 00:27:57,720 Speaker 1: for it, though: anthropogenic. Yeah, it's a good word. Um, 454 00:27:57,760 --> 00:28:00,840 Speaker 1: and he works at the Harvard Smithsonian Center for Astrophysics. 455 00:28:01,680 --> 00:28:06,720 Speaker 1: So hey, he's with Harvard, he's got the cred, right? Right. Um, 456 00:28:06,880 --> 00:28:09,119 Speaker 1: turns out, when you look into where he's getting his funding, 457 00:28:09,720 --> 00:28:12,560 Speaker 1: he received one point two million dollars over the past 458 00:28:12,600 --> 00:28:17,639 Speaker 1: decade from ExxonMobil, the Southern Company, the Kochs, the 459 00:28:17,680 --> 00:28:21,440 Speaker 1: Koch brothers, their foundation, the Charles G. Koch Foundation. 460 00:28:21,960 --> 00:28:25,639 Speaker 1: Exxon stopped, stopped funding him. But the bulk of 461 00:28:25,720 --> 00:28:29,040 Speaker 1: his money and his funding came, and I'm sorry, I 462 00:28:29,040 --> 00:28:32,919 Speaker 1: forgot the American Petroleum Institute, came from people who clearly 463 00:28:33,000 --> 00:28:38,000 Speaker 1: had a dog in this fight. And it's just, how 464 00:28:38,040 --> 00:28:41,160 Speaker 1: can you trust this? You know? Yeah, well, you trust it 465 00:28:41,200 --> 00:28:43,280 Speaker 1: because there's a guy, and he has a PhD in 466 00:28:43,320 --> 00:28:45,600 Speaker 1: aerospace engineering by the way, all right, he's a doc, 467 00:28:45,760 --> 00:28:50,200 Speaker 1: he works with this, um, this organization, the Harvard Smithsonian 468 00:28:50,280 --> 00:28:54,400 Speaker 1: Center for Astrophysics, which is a legitimate place. Um, it 469 00:28:54,440 --> 00:28:56,760 Speaker 1: doesn't get any funding from Harvard, but it gets a 470 00:28:56,800 --> 00:28:59,840 Speaker 1: lot from NASA and from the Smithsonian. Well, and Harvard's 471 00:28:59,880 --> 00:29:01,720 Speaker 1: very, very clear to point this out when people ask 472 00:29:01,800 --> 00:29:05,280 Speaker 1: them about Willie Soon. Um, they're kind of like, well, 473 00:29:05,320 --> 00:29:08,280 Speaker 1: here's the quote: Willie Soon is a Smithsonian staff researcher 474 00:29:08,880 --> 00:29:12,720 Speaker 1: at the Harvard Smithsonian Center for Astrophysics, a collaboration of the 475 00:29:12,760 --> 00:29:17,880 Speaker 1: Harvard College Observatory and the Smithsonian Astrophysical Observatory.
Like, they 476 00:29:17,920 --> 00:29:19,960 Speaker 1: just want to be real clear. Even though he uses 477 00:29:19,960 --> 00:29:23,640 Speaker 1: a Harvard email address, he's not our employee. No, but again, 478 00:29:23,680 --> 00:29:25,760 Speaker 1: he's getting lots of funding from NASA and lots of 479 00:29:25,800 --> 00:29:31,480 Speaker 1: funding from the Smithsonian. This guy, um, if his scientific 480 00:29:31,520 --> 00:29:34,240 Speaker 1: beliefs are what they are, and he's a smart guy, 481 00:29:34,840 --> 00:29:38,360 Speaker 1: then, yeah, I don't know about, like, getting fired for saying, 482 00:29:38,480 --> 00:29:42,520 Speaker 1: you know, here's a paper on, on the idea that 483 00:29:42,880 --> 00:29:45,720 Speaker 1: climate change is not human made. Yeah, he thinks it's 484 00:29:45,920 --> 00:29:50,000 Speaker 1: the Sun's fault. But he didn't, he doesn't reveal in 485 00:29:50,080 --> 00:29:54,480 Speaker 1: any of his, um, conflicts of interest, uh, that should 486 00:29:54,520 --> 00:29:56,160 Speaker 1: go at the end of the paper, he didn't reveal 487 00:29:56,200 --> 00:29:59,840 Speaker 1: where his funding was coming from. And I get the 488 00:30:00,000 --> 00:30:03,160 Speaker 1: impression that in academia, if you are totally cool with 489 00:30:03,200 --> 00:30:07,760 Speaker 1: everybody thinking, like, you're a shill, you can get away 490 00:30:07,800 --> 00:30:11,920 Speaker 1: with it. Right. Well, this stuff, a lot of this 491 00:30:11,920 --> 00:30:15,600 Speaker 1: stuff is not illegal, right? Even predatory publishing is not illegal, 492 00:30:16,000 --> 00:30:19,320 Speaker 1: just unethical. And if you're counting on people to police 493 00:30:19,360 --> 00:30:21,959 Speaker 1: themselves with ethics, a lot of times they will disappoint you. 494 00:30:22,680 --> 00:30:27,080 Speaker 1: The Heartland Institute gave Willie Soon a Courage Award, and 495 00:30:27,120 --> 00:30:29,960 Speaker 1: that's if you're not caring about what other scientists think about 496 00:30:30,040 --> 00:30:32,760 Speaker 1: you. If you've heard of the Heartland Institute, you might remember them. 497 00:30:33,120 --> 00:30:35,720 Speaker 1: They are a conservative think tank. You might remember them 498 00:30:35,720 --> 00:30:39,400 Speaker 1: in the nineties when they worked alongside Philip Morris to 499 00:30:40,120 --> 00:30:43,400 Speaker 1: deny the risks of secondhand smoke. Yeah, that's all chronicled 500 00:30:43,520 --> 00:30:46,160 Speaker 1: in that book I've talked about, Merchants of Doubt. That's 501 00:30:46,240 --> 00:30:50,080 Speaker 1: really just a bunch of scientists, legitimate, bona fide scientists, 502 00:30:50,160 --> 00:30:54,960 Speaker 1: who are, like, up for, for, um, being bought by 503 00:30:55,240 --> 00:31:00,400 Speaker 1: groups like that. Sad. It is sad, um. And the whole, 504 00:31:00,440 --> 00:31:02,960 Speaker 1: the whole thing is, they're saying, like, well, you can't 505 00:31:02,960 --> 00:31:07,440 Speaker 1: say, beyond a shadow of a doubt, with absolute certainty, 506 00:31:07,760 --> 00:31:11,000 Speaker 1: that that's the case. And science is like, no, science 507 00:31:11,000 --> 00:31:13,440 Speaker 1: doesn't do that. Science doesn't do absolute certainty. But the 508 00:31:13,480 --> 00:31:16,840 Speaker 1: average person reading a newspaper sees that, oh, you can't 509 00:31:16,840 --> 00:31:19,880 Speaker 1: say with absolute certainty, well, then maybe it isn't man made.
510 00:31:20,320 --> 00:31:22,600 Speaker 1: And then there's that doubt. And the people just go 511 00:31:22,640 --> 00:31:24,800 Speaker 1: and get the money for, for saying that, for writing 512 00:31:24,800 --> 00:31:29,160 Speaker 1: papers about it. It's millions of dollars. Despicable. Yeah, it really is. 513 00:31:30,080 --> 00:31:35,200 Speaker 1: Um, self reviewed. Uh, you've heard of peer review. We've 514 00:31:35,200 --> 00:31:37,240 Speaker 1: talked about it quite a bit. Peer review is when you 515 00:31:37,240 --> 00:31:40,400 Speaker 1: have a study and then one or more, ideally more, 516 00:31:40,560 --> 00:31:43,520 Speaker 1: of your peers reviews your study and says, you know what, 517 00:31:43,800 --> 00:31:46,400 Speaker 1: you had best practices, you did it right, um, it 518 00:31:46,440 --> 00:31:49,960 Speaker 1: was reproducible, you followed the scientific method. Um, I'm gonna 519 00:31:49,960 --> 00:31:52,000 Speaker 1: give it my stamp of approval and put my name 520 00:31:52,040 --> 00:31:54,640 Speaker 1: on it. Not literally, or is it? I think so. 521 00:31:55,320 --> 00:31:58,520 Speaker 1: It says who reviewed it, I believe, in the journal 522 00:31:58,520 --> 00:32:00,880 Speaker 1: when it's published, but not my name as the author 523 00:32:00,880 --> 00:32:03,600 Speaker 1: of the study, you know what I mean? Um, and the 524 00:32:03,680 --> 00:32:06,600 Speaker 1: peer reviewer as a peer reviewer. And that's a wonderful thing. 525 00:32:07,080 --> 00:32:12,560 Speaker 1: But people have faked this and been their own peer reviewer, 526 00:32:13,000 --> 00:32:19,280 Speaker 1: which is not how it works. No. Who's this guy? Uh, well, 527 00:32:19,320 --> 00:32:24,400 Speaker 1: I'm terrible at pronouncing Korean names, so all apologies, but 528 00:32:24,440 --> 00:32:29,640 Speaker 1: I'm gonna say Hyung-In Moon. Nice. Dr. Moon, I think, yeah, 529 00:32:29,720 --> 00:32:32,840 Speaker 1: let's call him Dr. Moon. Okay. So Dr. Moon, um, 530 00:32:33,200 --> 00:32:38,120 Speaker 1: worked on natural medicine, I believe, and was submitting all 531 00:32:38,120 --> 00:32:41,400 Speaker 1: these papers that were getting reviewed very quickly, because apparently 532 00:32:41,480 --> 00:32:43,640 Speaker 1: part of the process of peer review is to say, this 533 00:32:43,720 --> 00:32:46,320 Speaker 1: paper is great, can you recommend some people in your 534 00:32:46,360 --> 00:32:51,080 Speaker 1: field that can review your paper? And Dr. Moon said, 535 00:32:51,160 --> 00:32:53,440 Speaker 1: I sure can. He was on fire. Let me go 536 00:32:53,560 --> 00:32:56,640 Speaker 1: make up some people and make up some email addresses 537 00:32:56,640 --> 00:32:59,600 Speaker 1: that actually come to my inbox and just pose as 538 00:32:59,640 --> 00:33:03,480 Speaker 1: all of his own peer reviewers. He was lazy, though, is 539 00:33:03,520 --> 00:33:05,480 Speaker 1: the thing. Like, I don't know that he would have 540 00:33:05,520 --> 00:33:10,640 Speaker 1: been found out if he hadn't been, um, careless, I 541 00:33:10,680 --> 00:33:14,680 Speaker 1: guess, because he was returning the reviews within like twenty 542 00:33:14,720 --> 00:33:18,120 Speaker 1: four hours.
Sometimes a peer review of, like, a real, 543 00:33:18,840 --> 00:33:22,520 Speaker 1: um, study should take, I would guess, weeks, if not months. 544 00:33:22,960 --> 00:33:28,080 Speaker 1: Like, the, the publication schedule for the average study 545 00:33:28,160 --> 00:33:30,080 Speaker 1: or paper, I don't think it's a very quick thing. 546 00:33:30,160 --> 00:33:32,360 Speaker 1: There's not a lot of quick turnaround, right. And this 547 00:33:32,400 --> 00:33:35,880 Speaker 1: guy was like twenty four hours. When they're like, Dr. Moon, 548 00:33:35,960 --> 00:33:40,240 Speaker 1: I see your paper was reviewed and accepted by Dr. Mooney, 549 00:33:40,960 --> 00:33:43,560 Speaker 1: it's like, I just added a Y to the end. 550 00:33:43,840 --> 00:33:48,680 Speaker 1: It seemed easy. Uh, if you google peer review fraud, 551 00:33:49,240 --> 00:33:52,840 Speaker 1: you will be shocked at how often this happens and 552 00:33:52,880 --> 00:34:00,280 Speaker 1: how many legit science publishers are having to retract studies. Uh, 553 00:34:00,280 --> 00:34:02,760 Speaker 1: and it doesn't mean they're bad. Um, they're getting duped 554 00:34:02,760 --> 00:34:06,680 Speaker 1: as well. But there's one based in Berlin that had 555 00:34:06,720 --> 00:34:11,480 Speaker 1: sixty four retractions because of fraudulent reviews. And they're just 556 00:34:11,560 --> 00:34:14,680 Speaker 1: one publisher of many. Every publisher out there probably has 557 00:34:14,719 --> 00:34:19,600 Speaker 1: been duped. Um, maybe not every one, I'm surmising that, but 558 00:34:19,680 --> 00:34:23,640 Speaker 1: it's a big problem. We should do a study on it. I'll 559 00:34:23,680 --> 00:34:27,960 Speaker 1: review it. It'll end up in the headlines. Now, every 560 00:34:28,040 --> 00:34:32,560 Speaker 1: single publisher duped, says Chuck. Uh, and speaking of, um, 561 00:34:32,640 --> 00:34:36,160 Speaker 1: the headlines, Chuck. One of the problems with science reporting, 562 00:34:36,800 --> 00:34:41,279 Speaker 1: or reading science reporting, is that what you usually are hearing, 563 00:34:41,360 --> 00:34:44,000 Speaker 1: especially if it's making a big splash, is what's called 564 00:34:44,000 --> 00:34:47,560 Speaker 1: the initial findings. Somebody carried out a study and this 565 00:34:47,640 --> 00:34:50,600 Speaker 1: is what they found, and it's amazing and mind blowing 566 00:34:50,640 --> 00:34:54,520 Speaker 1: and it, um, it supports everything everyone's always known. But 567 00:34:54,560 --> 00:34:57,520 Speaker 1: now there's a scientific study that says, yes, that's the case. 568 00:34:58,080 --> 00:35:01,120 Speaker 1: And then if you wait a year or two, when 569 00:35:01,719 --> 00:35:04,839 Speaker 1: people follow up and reproduce the study and find that 570 00:35:04,840 --> 00:35:07,360 Speaker 1: it's actually not the case, it doesn't get reported on, 571 00:35:07,520 --> 00:35:13,080 Speaker 1: usually. Yeah, and, and sometimes the scientists or the 572 00:35:13,080 --> 00:35:17,600 Speaker 1: publisher, they're doing it right and they say initial findings, 573 00:35:17,640 --> 00:35:21,160 Speaker 1: but the public, and sometimes even the reporter, will say 574 00:35:21,200 --> 00:35:25,239 Speaker 1: initial findings.
But we, as the people who ingest this 575 00:35:25,280 --> 00:35:29,080 Speaker 1: stuff, need to understand what that means, um, and the 576 00:35:29,080 --> 00:35:32,080 Speaker 1: fine print is always like, you know, more 577 00:35:32,080 --> 00:35:35,120 Speaker 1: study is needed. But if it's something that 578 00:35:35,160 --> 00:35:38,120 Speaker 1: you want to be true, you'll just say, hey, look 579 00:35:38,160 --> 00:35:42,080 Speaker 1: at the study, right. You know it's brand new and 580 00:35:42,120 --> 00:35:44,640 Speaker 1: they need to study it for twenty more years, but hey, 581 00:35:44,680 --> 00:35:47,920 Speaker 1: look what it says. And the more you 582 00:35:47,960 --> 00:35:49,799 Speaker 1: start paying attention to this kind of thing, the more 583 00:35:49,880 --> 00:35:52,600 Speaker 1: kind of disdain you have for that kind of just 584 00:35:53,560 --> 00:36:00,160 Speaker 1: offhand, um, sensationalist science reporting. But you'll still get 585 00:36:00,239 --> 00:36:01,919 Speaker 1: caught up in it. Like every once in a while, I'll 586 00:36:01,960 --> 00:36:04,000 Speaker 1: catch myself like saying something. You'd be like, oh, did 587 00:36:04,000 --> 00:36:05,839 Speaker 1: you hear this? And then as I'm saying it out loud, 588 00:36:05,840 --> 00:36:08,960 Speaker 1: I'm like, that's preposterous. Yeah, there's no way that's going 589 00:36:09,000 --> 00:36:13,080 Speaker 1: to pan out to be true. I got clickbaited, I know. 590 00:36:13,360 --> 00:36:16,040 Speaker 1: I mean, we have to avoid this stuff. It's 591 00:36:16,080 --> 00:36:19,960 Speaker 1: tough because we have our name on this podcast. But 592 00:36:20,080 --> 00:36:23,239 Speaker 1: luckily we've given ourselves the back door of saying, hey, 593 00:36:23,280 --> 00:36:26,680 Speaker 1: we make mistakes a lot. It's true though, we're not experts, 594 00:36:27,680 --> 00:36:30,800 Speaker 1: we're not scientists. Uh. And then finally we're gonna finish 595 00:36:30,880 --> 00:36:34,279 Speaker 1: up. The header on this one is, it's a 596 00:36:34,280 --> 00:36:38,799 Speaker 1: cool story. And that's a big one because, um, it's 597 00:36:38,840 --> 00:36:41,880 Speaker 1: not enough these days. And this all ties in with 598 00:36:42,000 --> 00:36:44,880 Speaker 1: media and how we read things as people. But 599 00:36:44,920 --> 00:36:47,359 Speaker 1: it's not enough just to have a study that might 600 00:36:47,400 --> 00:36:49,759 Speaker 1: prove something. You have to wrap it up in a 601 00:36:49,840 --> 00:36:52,640 Speaker 1: nice package. Yeah, to deliver it so people get it in the 602 00:36:52,640 --> 00:36:57,320 Speaker 1: news cycle. And the cooler the better. Yep. It almost 603 00:36:57,360 --> 00:37:01,760 Speaker 1: doesn't matter about the science as far as the media 604 00:37:01,840 --> 00:37:05,239 Speaker 1: is concerned. They just want a good headline and a 605 00:37:05,320 --> 00:37:09,120 Speaker 1: scientist who'll say, yeah, that's cool. Here's what I found. 606 00:37:09,760 --> 00:37:14,439 Speaker 1: This is going to change the world. Loch Ness monster is real. 607 00:37:14,960 --> 00:37:17,440 Speaker 1: This kind of ended up being depressing somehow. 608 00:37:17,760 --> 00:37:22,600 Speaker 1: Yeah, not somehow. Yeah, like, yeah, it's kind of depressing. 609 00:37:24,600 --> 00:37:27,040 Speaker 1: We'll figure it out, Chuck. Well, we do our best. 610 00:37:27,080 --> 00:37:31,080 Speaker 1: I'll say that science will prevail, I hope.
So, uh, 611 00:37:31,160 --> 00:37:33,920 Speaker 1: if you want to know more about science and scientific 612 00:37:34,000 --> 00:37:36,600 Speaker 1: studies and research fraud and all that kind of stuff, 613 00:37:36,640 --> 00:37:39,160 Speaker 1: just type some random words into the search bar at how 614 00:37:39,200 --> 00:37:41,759 Speaker 1: stuff works dot com. See what comes up. And since 615 00:37:41,800 --> 00:37:45,879 Speaker 1: I said random, it's time for listener mail. Oh no. Oh, yeah, 616 00:37:46,000 --> 00:37:56,000 Speaker 1: you know what, it's time for administrative details. All right, Josh, 617 00:37:56,120 --> 00:37:58,600 Speaker 1: administrative details. If you're new to the show, you don't 618 00:37:58,600 --> 00:38:01,759 Speaker 1: know what it is. That's the very clunky title. We're saying 619 00:38:01,760 --> 00:38:05,160 Speaker 1: thank you to listeners who send us neat things. It 620 00:38:05,280 --> 00:38:08,120 Speaker 1: is clunky and generic, and I've totally gotten used to 621 00:38:08,120 --> 00:38:09,480 Speaker 1: it by now. Well, you're the one who made it 622 00:38:09,560 --> 00:38:12,920 Speaker 1: up to be clunky and generic, and it's stuck. Yeah. 623 00:38:13,080 --> 00:38:15,040 Speaker 1: So people send us stuff from time to time, and 624 00:38:15,040 --> 00:38:17,600 Speaker 1: it's just very kind of you to do so. Yes. 625 00:38:17,760 --> 00:38:19,799 Speaker 1: And we like to give shout outs, whether 626 00:38:19,880 --> 00:38:21,840 Speaker 1: it's just out of the goodness of your heart or 627 00:38:21,880 --> 00:38:24,200 Speaker 1: if you have a little small business that you're trying 628 00:38:24,200 --> 00:38:26,640 Speaker 1: to plug. Either way, it's a sneaky way of getting 629 00:38:26,640 --> 00:38:28,680 Speaker 1: it in there. Yeah, but I mean, I think we 630 00:38:28,680 --> 00:38:30,799 Speaker 1: brought that on, didn't we? Didn't we say, like, 631 00:38:30,840 --> 00:38:32,959 Speaker 1: if you have a small business and you send us something, 632 00:38:33,000 --> 00:38:36,200 Speaker 1: we'll be happy to say something? Exactly. Thank you. 633 00:38:36,239 --> 00:38:38,200 Speaker 1: All right, so let's get it going here. We got 634 00:38:38,239 --> 00:38:42,360 Speaker 1: some coffee right from One Thousand Faces right here in Athens, 635 00:38:42,400 --> 00:38:47,919 Speaker 1: Georgia, from Kayla. Yeah, delicious. Yes it was. We also 636 00:38:48,000 --> 00:38:50,840 Speaker 1: got some other coffee too, from Jonathan at Steamworks Coffee. 637 00:38:51,000 --> 00:38:53,560 Speaker 1: He came up with a Josh and Chuck blend. Oh yeah, 638 00:38:53,640 --> 00:38:56,520 Speaker 1: it's pretty awesome. I believe it's available for sale too. Yeah. 639 00:38:56,520 --> 00:39:02,799 Speaker 1: That Josh and Chuck blend is dark and bitter. Uh, 640 00:39:02,920 --> 00:39:07,000 Speaker 1: Jim Simmons, he's a retired teacher who sent us some 641 00:39:07,120 --> 00:39:11,000 Speaker 1: lovely handmade wooden bowls and a very nice handwritten letter, 642 00:39:11,120 --> 00:39:14,160 Speaker 1: which is always great. Thanks a lot, Jim. Uh, let's 643 00:39:14,160 --> 00:39:18,720 Speaker 1: see, Chamberlayne sent us homemade pasta, including a delicious savory 644 00:39:18,760 --> 00:39:24,239 Speaker 1: pumpkin fettuccine. It was very nice. Yum. Jay Graff, two 645 00:39:24,239 --> 00:39:26,879 Speaker 1: F's, sent us a postcard from the Great Wall of China.
646 00:39:27,200 --> 00:39:29,520 Speaker 1: It's kind of neat, sometimes we get those postcards from 647 00:39:29,719 --> 00:39:33,040 Speaker 1: places we've talked about. I was like, ah, thanks, James. 648 00:39:33,200 --> 00:39:37,080 Speaker 1: Right here, let's see, the Hammer Press team. They sent 649 00:39:37,200 --> 00:39:39,640 Speaker 1: us a bunch of Mother's Day cards that are wonderful. Oh, 650 00:39:39,680 --> 00:39:41,759 Speaker 1: those are really nice, really great. You should check them out, 651 00:39:41,880 --> 00:39:46,160 Speaker 1: the Hammer Press team. Yeah. Uh, Misty, Billie and Jessica, 652 00:39:46,880 --> 00:39:49,239 Speaker 1: they sent us a care package of a lot of things. 653 00:39:49,320 --> 00:39:52,680 Speaker 1: There were some cookies, um, including one of my favorites, 654 00:39:52,680 --> 00:39:57,000 Speaker 1: white chocolate dipped Ritz and peanut butter crackers. Oh yeah, man, 655 00:39:57,080 --> 00:40:00,319 Speaker 1: I love those. Homemade, right? Yeah. And, uh, then some 656 00:40:00,400 --> 00:40:06,560 Speaker 1: seventies macramé for you, along with a seventies macramé magazine, because 657 00:40:06,600 --> 00:40:09,880 Speaker 1: you're obsessed with macramé. We have a macramé plant holder 658 00:40:09,960 --> 00:40:15,120 Speaker 1: hanging from my, um, microphone arm holding a coffee mug 659 00:40:15,320 --> 00:40:17,560 Speaker 1: sent to us by Joe and Lynda Hecht, that's right, 660 00:40:17,920 --> 00:40:20,120 Speaker 1: and it has some pens in it. Uh, and they 661 00:40:20,160 --> 00:40:22,800 Speaker 1: also sent us, Misty, Billie and Jessica did, a lovely 662 00:40:22,800 --> 00:40:24,879 Speaker 1: little hand drawn picture of us with their family, which 663 00:40:24,920 --> 00:40:27,960 Speaker 1: was so sweet. That's very awesome. Um, we've said it before, 664 00:40:28,000 --> 00:40:30,719 Speaker 1: we'll say it again. Huge thank you to Jim Ruwaine, 665 00:40:31,320 --> 00:40:33,400 Speaker 1: I believe that's how you say his name, and the 666 00:40:33,400 --> 00:40:36,160 Speaker 1: Crown Royal people for sending us all the Crown Royal. 667 00:40:36,880 --> 00:40:41,239 Speaker 1: We are running low. Uh, Mark Silberg at the Rocky 668 00:40:41,280 --> 00:40:45,799 Speaker 1: Mountain Institute sent us a book called Reinventing Fire. They're 669 00:40:45,840 --> 00:40:48,040 Speaker 1: great out there, man, they know what they're talking about. 670 00:40:48,200 --> 00:40:51,839 Speaker 1: And I think it's Reinventing Fire, colon, Bold 671 00:40:51,840 --> 00:40:55,239 Speaker 1: Business Solutions for the New Energy Era. Yeah, they're 672 00:40:55,280 --> 00:40:58,920 Speaker 1: basically like, um, green energy observers, but I think 673 00:40:58,960 --> 00:41:02,200 Speaker 1: they're experts in, like, all sectors of energy, but 674 00:41:02,239 --> 00:41:04,520 Speaker 1: they have a focus on green energy, which is awesome. Yeah, 675 00:41:04,520 --> 00:41:09,719 Speaker 1: they're pretty cool. Um, John, whose wife makes Delightfully Delicious 676 00:41:09,760 --> 00:41:12,600 Speaker 1: doggie treats. Delightfully Delicious is the name of the company. 677 00:41:12,840 --> 00:41:15,879 Speaker 1: There are no artificial colors or flavors. And they got, um, 678 00:41:15,920 --> 00:41:19,920 Speaker 1: sweet little Momo hooked on sweet potato dog treats. I 679 00:41:20,000 --> 00:41:22,640 Speaker 1: thought you were gonna say hooked on the junk, the 680 00:41:22,800 --> 00:41:26,720 Speaker 1: sweet potato junk.
She's crazy cuckoo for sweet potatoes. Nice. 681 00:41:26,800 --> 00:41:28,600 Speaker 1: Oh man, that's good for a dog too. It is. 682 00:41:28,719 --> 00:41:32,719 Speaker 1: Uh, Strat Johnson sent us his band's LP, and if 683 00:41:32,840 --> 00:41:34,319 Speaker 1: you're in a band and your name is Strat, that's 684 00:41:34,320 --> 00:41:42,640 Speaker 1: pretty cool. Uh, Diomaea, still, mhm. I think that 685 00:41:42,719 --> 00:41:44,759 Speaker 1: was great. Yeah, I'm not sure if I pronounced it right. 686 00:41:44,840 --> 00:41:49,560 Speaker 1: D I O M A E A. Uh, Frederick, this 687 00:41:49,640 --> 00:41:53,480 Speaker 1: is long overdue. Frederick at the store, one five to 688 00:41:53,760 --> 00:41:57,280 Speaker 1: one store dot com, sent us some awesome low profile 689 00:41:57,400 --> 00:42:01,520 Speaker 1: cork iPhone cases and passport holders. And I was telling 690 00:42:01,600 --> 00:42:04,960 Speaker 1: him Jerry walks around with her iPhone in the cork 691 00:42:05,000 --> 00:42:08,040 Speaker 1: holder and it looks pretty sweet. Oh yeah, so he said, awesome, 692 00:42:08,040 --> 00:42:10,719 Speaker 1: I'm glad to hear. Joe and Holly Harper sent us 693 00:42:10,800 --> 00:42:13,759 Speaker 1: some really cool three D printed Stuff You Should Know 694 00:42:13,880 --> 00:42:18,080 Speaker 1: things, like S Y S K, uh, you know, like 695 00:42:18,120 --> 00:42:22,040 Speaker 1: a little desk piece, like after Robert Indiana's LOVE sculpture. Yeah, 696 00:42:22,080 --> 00:42:24,120 Speaker 1: that's it, I couldn't think of what that was from. Yeah, 697 00:42:24,160 --> 00:42:28,040 Speaker 1: it's awesome. It's really neat. And like a bracelet, um, 698 00:42:28,080 --> 00:42:30,680 Speaker 1: made out of Stuff You Should Know three D carved 699 00:42:31,200 --> 00:42:33,720 Speaker 1: plastic, really neat. Yeah, they did some good stuff, 700 00:42:33,719 --> 00:42:35,960 Speaker 1: so thanks Joe and Holly Harper for that. And then, 701 00:42:36,040 --> 00:42:40,360 Speaker 1: last for this one, we got a postcard from Yosemite 702 00:42:40,440 --> 00:42:43,399 Speaker 1: National Park from Laura Jackson, so thanks a lot for that. 703 00:42:43,760 --> 00:42:46,000 Speaker 1: Thanks to everybody who sends us stuff. It's nice to 704 00:42:46,040 --> 00:42:49,160 Speaker 1: know we're thought of, and we appreciate it. Yeah. We're 705 00:42:49,200 --> 00:42:52,840 Speaker 1: gonna finish up with another set on the next episode 706 00:42:53,040 --> 00:42:56,360 Speaker 1: of Administrative Details. You got anything else? No, that's it. 707 00:42:56,760 --> 00:42:58,680 Speaker 1: Oh yeah. If you guys want to hang out with 708 00:42:58,760 --> 00:43:01,239 Speaker 1: us on social media, you can go to S Y 709 00:43:01,400 --> 00:43:05,480 Speaker 1: S K Podcast on Twitter or on Instagram. You can 710 00:43:05,520 --> 00:43:08,040 Speaker 1: hang out with us at Facebook dot com slash Stuff 711 00:43:08,040 --> 00:43:09,920 Speaker 1: You Should Know. You can send us an email to 712 00:43:10,080 --> 00:43:12,680 Speaker 1: stuff podcast at how stuff works dot com, and as 713 00:43:12,680 --> 00:43:14,680 Speaker 1: always, join us at home on the web, Stuff You 714 00:43:14,680 --> 00:43:20,759 Speaker 1: Should Know dot com. Stuff You Should Know is a 715 00:43:20,760 --> 00:43:23,880 Speaker 1: production of iHeart Radio's
How Stuff Works. For more podcasts 716 00:43:23,920 --> 00:43:25,839 Speaker 1: from iHeart Radio, visit the iHeart Radio app, 717 00:43:25,920 --> 00:43:28,520 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.