1 00:00:07,840 --> 00:00:10,840 Speaker 1: Science is supposed to be our best way of revealing 2 00:00:10,840 --> 00:00:14,360 Speaker 1: the truth about the universe. But science is constantly being 3 00:00:14,440 --> 00:00:17,920 Speaker 1: updated and corrected, and sometimes we learn after the fact 4 00:00:17,960 --> 00:00:20,439 Speaker 1: that a study was flawed, or was even shoddy to 5 00:00:20,480 --> 00:00:20,920 Speaker 1: begin with. 6 00:00:21,480 --> 00:00:23,880 Speaker 2: So how do scientists decide. 7 00:00:23,440 --> 00:00:26,560 Speaker 1: Whether a new result is robust or not? And how 8 00:00:26,600 --> 00:00:29,000 Speaker 1: does the general public know when the science is settled 9 00:00:29,160 --> 00:00:32,559 Speaker 1: or about to be upended? No system is perfect, but 10 00:00:32,600 --> 00:00:34,800 Speaker 1: it's important that the process is transparent. 11 00:00:35,240 --> 00:00:36,080 Speaker 2: So today we're. 12 00:00:35,920 --> 00:00:37,920 Speaker 1: Going to shine a light on the inner workings of 13 00:00:37,920 --> 00:00:41,280 Speaker 1: one of the crucial parts of the scientific process: peer review. 14 00:00:41,960 --> 00:00:44,960 Speaker 1: How does it work? Who are these peers? Can peer 15 00:00:45,000 --> 00:00:48,200 Speaker 1: reviewed papers be wrong? And does that mean that all 16 00:00:48,280 --> 00:00:52,320 Speaker 1: of science is questionable? Welcome to Daniel and Kelly's Peer 17 00:00:52,360 --> 00:01:06,800 Speaker 1: Reviewed Universe. Hi. 18 00:01:07,040 --> 00:01:10,639 Speaker 3: I'm Daniel. I'm a particle physicist and I've published over 19 00:01:10,720 --> 00:01:14,160 Speaker 3: a thousand papers, but haven't read most of them. 20 00:01:14,360 --> 00:01:14,520 Speaker 4: Hi. 21 00:01:14,600 --> 00:01:18,600 Speaker 2: I'm Kelly Weinersmith and I'm confused. What does that mean, Daniel? 22 00:01:18,640 --> 00:01:21,839 Speaker 2: How have you published over a thousand papers at the LHC?
23 00:01:22,040 --> 00:01:27,360 Speaker 1: Does everybody who is reading... Oh, that's cheating, man, that's cheating. 24 00:01:27,600 --> 00:01:31,280 Speaker 3: Well, the way the collaboration works is everybody who's contributed 25 00:01:31,319 --> 00:01:33,559 Speaker 3: to the project is an author. And when you're an author, 26 00:01:34,040 --> 00:01:37,119 Speaker 3: you get included on every paper that comes out during 27 00:01:37,160 --> 00:01:40,959 Speaker 3: your authorship period. And ATLAS is a huge collaboration of 28 00:01:41,000 --> 00:01:43,039 Speaker 3: lots of clever people. We put out about one hundred 29 00:01:43,040 --> 00:01:45,959 Speaker 3: papers a year. Every single one has my name on it, 30 00:01:46,000 --> 00:01:48,400 Speaker 3: and I haven't read most of them. I couldn't even 31 00:01:48,440 --> 00:01:50,280 Speaker 3: explain the titles of some of them to you. 32 00:01:50,840 --> 00:01:54,480 Speaker 1: So fields have their own cultures. In 33 00:01:54,520 --> 00:02:00,720 Speaker 1: my field, in my culture, that would ethically not be okay. 34 00:02:01,120 --> 00:02:03,680 Speaker 1: You should be responsible for what is in every paper 35 00:02:03,960 --> 00:02:06,120 Speaker 1: that your name is on, in my opinion. 36 00:02:06,080 --> 00:02:09,400 Speaker 3: And I was in another collaboration previously where things worked differently, 37 00:02:09,440 --> 00:02:11,720 Speaker 3: where you weren't an author on a paper unless you 38 00:02:11,800 --> 00:02:14,200 Speaker 3: asked to be, and nobody questioned it. If you had 39 00:02:14,200 --> 00:02:16,040 Speaker 3: the right to be an author, if you were on 40 00:02:16,080 --> 00:02:17,560 Speaker 3: the list, you could be. But it was an opt 41 00:02:17,600 --> 00:02:20,160 Speaker 3: in, and I thought that was much better. The length 42 00:02:20,160 --> 00:02:22,960 Speaker 3: of the list was shorter, and authorship meant something.
43 00:02:23,760 --> 00:02:26,360 Speaker 3: But here you can ask to be removed, but you 44 00:02:26,360 --> 00:02:28,080 Speaker 3: have to go through this process, and it would be 45 00:02:28,080 --> 00:02:30,640 Speaker 3: like I'd be doing it twice a week, every week. 46 00:02:30,720 --> 00:02:35,120 Speaker 1: So wow. So what does your CV include? All of 47 00:02:35,160 --> 00:02:37,520 Speaker 1: those papers, or just the ones that you feel you've 48 00:02:37,560 --> 00:02:39,160 Speaker 1: actually contributed to? 49 00:02:39,080 --> 00:02:41,880 Speaker 3: So, that depends. The CV I make and share with people only 50 00:02:41,919 --> 00:02:44,839 Speaker 3: includes papers that I actually did anything on, or had 51 00:02:44,880 --> 00:02:48,840 Speaker 3: ideas for, or really contributed to. But sometimes an institution 52 00:02:49,120 --> 00:02:52,200 Speaker 3: wants a list of your official papers, and then I 53 00:02:52,280 --> 00:02:54,600 Speaker 3: have to include all, I don't even know the 54 00:02:54,680 --> 00:02:57,079 Speaker 3: number, one thousand plus papers at this point. 55 00:02:57,320 --> 00:02:58,080 Speaker 2: Oh, poor dude. 56 00:02:58,440 --> 00:03:01,240 Speaker 3: Now, so. 57 00:03:02,800 --> 00:03:04,320 Speaker 2: How do you get them all? Do you just copy 58 00:03:04,400 --> 00:03:05,400 Speaker 2: them from Google Scholar? 59 00:03:05,880 --> 00:03:08,880 Speaker 3: Yeah. Particle physics is pretty good at automation and stuff 60 00:03:08,919 --> 00:03:12,760 Speaker 3: like this, so we have our own databases of papers 61 00:03:12,800 --> 00:03:14,760 Speaker 3: and it's pretty easy to download that kind of stuff. 62 00:03:15,120 --> 00:03:17,480 Speaker 3: But yeah, it's pretty tough to have so many papers. 63 00:03:17,840 --> 00:03:20,600 Speaker 2: Oh, so many pages you have to print out. The 64 00:03:20,680 --> 00:03:23,160 Speaker 2: file for the PDF for your resume is so big.
65 00:03:23,720 --> 00:03:25,600 Speaker 3: How about you, Kelly, how many papers do you have? 66 00:03:25,680 --> 00:03:27,639 Speaker 3: Have you finished that paper from your thesis yet? 67 00:03:27,919 --> 00:03:32,000 Speaker 2: Oh, I'm feeling angry now, Dan. I have not. 68 00:03:32,800 --> 00:03:36,200 Speaker 1: I've been busy with other things, including my new ducks 69 00:03:36,240 --> 00:03:39,800 Speaker 1: and geese. And goats, and goats, which I won't 70 00:03:39,840 --> 00:03:40,200 Speaker 1: mention. 71 00:03:40,200 --> 00:03:42,400 Speaker 3: My co-authors? Do your ducks and geese and goats 72 00:03:42,520 --> 00:03:46,440 Speaker 3: go on your CV? That's my question. No? Uh, students supervised? 73 00:03:46,480 --> 00:03:46,720 Speaker 3: Come on. 74 00:03:46,960 --> 00:03:48,600 Speaker 2: Oh yeah, no, I've got that list. 75 00:03:48,720 --> 00:03:53,800 Speaker 1: But in the Great, the Great Goat, I think I've 76 00:03:53,840 --> 00:03:56,160 Speaker 1: got... I haven't counted. I think I've got 77 00:03:56,240 --> 00:03:59,800 Speaker 1: something like thirty. My pace has slowed since I 78 00:04:00,560 --> 00:04:03,160 Speaker 1: started writing pop sci, but you know I still get a couple 79 00:04:03,200 --> 00:04:03,880 Speaker 1: in there every year. 80 00:04:04,240 --> 00:04:07,400 Speaker 3: My CV has something whimsical and unserious on it. Yours 81 00:04:07,560 --> 00:04:08,840 Speaker 3: is totally professional. 82 00:04:09,440 --> 00:04:16,839 Speaker 2: Ah, yes, what whimsical thing do you have on yours? 83 00:04:17,120 --> 00:04:17,240 Speaker 4: Oh? 84 00:04:17,360 --> 00:04:19,120 Speaker 3: I'm just gonna leave that as an easter egg. Oh. 85 00:04:19,279 --> 00:04:21,000 Speaker 3: I put it in there to see if anybody actually 86 00:04:21,040 --> 00:04:23,560 Speaker 3: reads my CV, because if they don't respond to that, 87 00:04:23,680 --> 00:04:26,120 Speaker 3: I'm like, hmm, you didn't really read it.
Okay, yeah 88 00:04:26,400 --> 00:04:26,840 Speaker 3: check it out. 89 00:04:26,880 --> 00:04:29,120 Speaker 2: All right, well, I'll be looking up your CV. Do 90 00:04:29,120 --> 00:04:30,440 Speaker 2: you have it updated? Here's the question. 91 00:04:30,960 --> 00:04:34,320 Speaker 1: Most scholars don't update their CV but like once every 92 00:04:34,360 --> 00:04:35,000 Speaker 1: half decade. 93 00:04:35,960 --> 00:04:37,960 Speaker 2: Is your online CV updated? 94 00:04:38,800 --> 00:04:40,520 Speaker 3: Yeah, I think I updated it a couple months ago. 95 00:04:40,720 --> 00:04:42,680 Speaker 2: Oh, all right, bravo. Good on you. 96 00:04:43,560 --> 00:04:45,799 Speaker 1: All right. Well, today we're not talking about the thousands 97 00:04:45,839 --> 00:04:48,559 Speaker 1: of publications that Daniel has, but we're all very proud 98 00:04:48,600 --> 00:04:50,359 Speaker 1: of Daniel for all of his hard. 99 00:04:50,200 --> 00:04:54,279 Speaker 3: Work, or the tens of geese and ducks that Kelly 100 00:04:54,400 --> 00:04:56,600 Speaker 3: is currently raising, though we're very proud of her and 101 00:04:56,720 --> 00:04:58,680 Speaker 3: her literally extended family. 102 00:04:58,920 --> 00:05:01,720 Speaker 1: Well, I don't have that many, and I wouldn't claim 103 00:05:01,839 --> 00:05:03,840 Speaker 1: other people's ducks that I didn't help out with as 104 00:05:03,920 --> 00:05:04,559 Speaker 1: my own ducks. 105 00:05:04,680 --> 00:05:07,040 Speaker 3: Oh wow. Now, no. 106 00:05:09,240 --> 00:05:13,440 Speaker 1: But anyway, today we are answering a question from a 107 00:05:13,680 --> 00:05:16,599 Speaker 1: superfan, and we're going to go into a bit more detail. 108 00:05:16,680 --> 00:05:19,400 Speaker 1: This question inspired us to make a whole episode about 109 00:05:19,440 --> 00:05:21,520 Speaker 1: the process of peer review. So let's go ahead and 110 00:05:21,600 --> 00:05:25,159 Speaker 1: hear what Stephen from British Columbia has to say.
111 00:05:25,960 --> 00:05:29,800 Speaker 4: Hi, Daniel and Kelly, this is show super fan Stephen 112 00:05:29,960 --> 00:05:33,960 Speaker 4: from British Columbia, Canada. My question today has to do 113 00:05:34,160 --> 00:05:37,640 Speaker 4: with how scientific studies are peer reviewed? 114 00:05:38,360 --> 00:05:38,880 Speaker 1: Is it me? 115 00:05:39,720 --> 00:05:43,920 Speaker 4: Or is science denial spreading these days? And it came 116 00:05:44,040 --> 00:05:48,640 Speaker 4: to my attention recently that not all peer reviewed studies 117 00:05:48,880 --> 00:05:53,080 Speaker 4: are replicated, And then it occurred to me, doesn't this 118 00:05:53,320 --> 00:05:57,800 Speaker 4: automatically cast doubt on a discovery if a study is 119 00:05:57,920 --> 00:06:01,800 Speaker 4: not replicated but is still considered peer reviewed. 120 00:06:02,880 --> 00:06:04,080 Speaker 3: I would love to. 121 00:06:04,200 --> 00:06:09,040 Speaker 5: Hear an explanation about how the peer reviewing process works 122 00:06:09,520 --> 00:06:13,440 Speaker 5: and how scientists determine when a fact is a fact, 123 00:06:14,040 --> 00:06:18,200 Speaker 5: a finding is a finding, and how the community comes. 124 00:06:18,000 --> 00:06:20,599 Speaker 4: To a consensus on new discoveries. 125 00:06:21,360 --> 00:06:23,680 Speaker 3: I also think your audience. 126 00:06:23,320 --> 00:06:27,719 Speaker 4: Could use some tools for how to determine what studies 127 00:06:27,839 --> 00:06:32,760 Speaker 4: are legitimate and how to spot a bad study. How 128 00:06:33,000 --> 00:06:38,679 Speaker 4: should one respond to someone who claims a scientific study 129 00:06:39,200 --> 00:06:42,960 Speaker 4: is bogus. Thank you Daniel and Kelly for this great 130 00:06:43,040 --> 00:06:47,000 Speaker 4: podcast and for explaining science topics to the masses. 131 00:06:47,320 --> 00:06:49,960 Speaker 3: Thank you Stephen for asking this question. 
I think it's 132 00:06:50,040 --> 00:06:52,680 Speaker 3: really the right moment to dig into this. You know, 133 00:06:52,760 --> 00:06:56,160 Speaker 3: when we see institutions being attacked, when we see science 134 00:06:56,279 --> 00:06:59,440 Speaker 3: being denied, when we see the whole process of science 135 00:06:59,520 --> 00:07:02,240 Speaker 3: being questioned, I think it's important to shine some 136 00:07:02,400 --> 00:07:04,479 Speaker 3: light on how does this all work, what does it mean, 137 00:07:04,720 --> 00:07:07,200 Speaker 3: especially for those folks who are not scientists, who don't know: 138 00:07:07,640 --> 00:07:09,560 Speaker 3: what does it mean for a paper to be peer reviewed? 139 00:07:09,920 --> 00:07:12,560 Speaker 3: What is bad science? Thank you, Stephen, for asking this 140 00:07:12,680 --> 00:07:15,080 Speaker 3: question and giving us the chance to dig into all 141 00:07:15,160 --> 00:07:15,280 Speaker 3: of this. 142 00:07:15,640 --> 00:07:18,000 Speaker 1: That's right, and so the question in particular is about 143 00:07:18,200 --> 00:07:22,320 Speaker 1: peer review for scientific manuscripts. But peer review actually happens 144 00:07:22,360 --> 00:07:25,200 Speaker 1: at multiple stages in the process of doing science, and 145 00:07:25,320 --> 00:07:27,600 Speaker 1: so we thought we would take a step back and 146 00:07:27,720 --> 00:07:29,960 Speaker 1: talk about peer review from the very beginning, which is 147 00:07:30,000 --> 00:07:34,640 Speaker 1: when your idea becomes a grant. And this is one 148 00:07:34,680 --> 00:07:37,960 Speaker 1: of the less fun parts about science, writing this 149 00:07:38,120 --> 00:07:40,960 Speaker 1: grant, because peer review for grants is harsh. 150 00:07:41,480 --> 00:07:43,560 Speaker 3: This is like the science version of when a bill 151 00:07:43,600 --> 00:07:45,400 Speaker 3: becomes a law. We should have written a song for this.
152 00:07:45,640 --> 00:07:47,920 Speaker 1: No, well we still can. You know, we could write 153 00:07:47,920 --> 00:07:51,320 Speaker 1: it and then we can add it to the top. No, 154 00:07:51,440 --> 00:07:52,520 Speaker 1: we won't do that to everyone. 155 00:07:52,840 --> 00:07:55,640 Speaker 3: So Kelly, tell me what is your process of grant writing? 156 00:07:55,680 --> 00:07:57,640 Speaker 3: First of all, who do you submit your grants to? 157 00:07:58,240 --> 00:07:59,760 Speaker 3: And what's it like for you to prepare that? 158 00:08:00,440 --> 00:08:03,080 Speaker 1: So I submit my grants to the National Science Foundation. 159 00:08:03,360 --> 00:08:06,640 Speaker 1: And actually it's possible that in the last six months 160 00:08:06,720 --> 00:08:09,120 Speaker 1: the process has changed a bit because I know the 161 00:08:09,200 --> 00:08:11,600 Speaker 1: new administration is shaking things up. 162 00:08:11,720 --> 00:08:14,560 Speaker 3: The agency formerly known as the National Science Foundation. 163 00:08:15,160 --> 00:08:16,400 Speaker 2: No, what is it called now? 164 00:08:16,520 --> 00:08:17,040 Speaker 3: I'm joking? 165 00:08:17,480 --> 00:08:20,240 Speaker 2: Oh god, anything is believable, Daniel. 166 00:08:20,280 --> 00:08:24,520 Speaker 1: At this point, all right, the institution still known as 167 00:08:24,560 --> 00:08:28,040 Speaker 1: the NSF or the National Science Foundation. 
So for this 168 00:08:28,280 --> 00:08:31,320 Speaker 1: you write a grant, and there are different programs that accept 169 00:08:31,360 --> 00:08:33,640 Speaker 1: different kinds of grants, but often the grant is like 170 00:08:33,760 --> 00:08:38,319 Speaker 1: fifteen pages where you go into extreme detail on what 171 00:08:38,559 --> 00:08:41,319 Speaker 1: we know about the question you want to ask based 172 00:08:41,360 --> 00:08:43,959 Speaker 1: on the literature that's already out there, extreme detail on 173 00:08:44,040 --> 00:08:46,840 Speaker 1: the experimental design, and then a lot of detail about 174 00:08:47,040 --> 00:08:49,920 Speaker 1: what you think the implications will be of your results. 175 00:08:50,160 --> 00:08:53,360 Speaker 3: So you talk about why this question is interesting, you 176 00:08:53,600 --> 00:08:56,440 Speaker 3: argue that your work will help answer it, and then 177 00:08:56,480 --> 00:08:58,360 Speaker 3: you lay out in detail how you're going to 178 00:08:58,360 --> 00:09:00,800 Speaker 3: spend every dollar, who's going to do what, and when. 179 00:09:00,840 --> 00:09:02,559 Speaker 3: You get like a whole Gantt chart for like when 180 00:09:02,600 --> 00:09:04,760 Speaker 3: everything is going to happen. Yep. And then you have 181 00:09:04,800 --> 00:09:07,520 Speaker 3: to lay out also like the products for the grant, right? Yeah.
182 00:09:07,440 --> 00:09:09,400 Speaker 1: Yeah, the budgets. And so there's 183 00:09:09,440 --> 00:09:12,080 Speaker 1: a section called intellectual merit, where you talk about how 184 00:09:12,120 --> 00:09:14,120 Speaker 1: many papers you think you're going to end up producing, 185 00:09:14,200 --> 00:09:16,200 Speaker 1: what big questions you're going to answer. And then there's 186 00:09:16,240 --> 00:09:18,720 Speaker 1: also a section called broader impacts, where you talk about 187 00:09:18,760 --> 00:09:22,040 Speaker 1: how your work is going to benefit society writ large. 188 00:09:22,559 --> 00:09:24,559 Speaker 1: And you know, for my grants in the past, it's 189 00:09:24,600 --> 00:09:26,800 Speaker 1: been things like, you know, my information is going to 190 00:09:26,840 --> 00:09:30,880 Speaker 1: be helpful to fisheries managers who are managing fish populations, 191 00:09:30,920 --> 00:09:33,640 Speaker 1: which is like literally a multi-billion-dollar industry 192 00:09:34,320 --> 00:09:36,120 Speaker 1: for like people who want to go out fishing and stuff. 193 00:09:36,120 --> 00:09:38,599 Speaker 1: So most of my research has been on fish and 194 00:09:38,679 --> 00:09:42,320 Speaker 1: their parasites. Also intellectual merit will include things like how 195 00:09:42,360 --> 00:09:44,839 Speaker 1: many students you're going to train and what skills you're 196 00:09:44,880 --> 00:09:46,640 Speaker 1: going to give them that they can then use when 197 00:09:46,679 --> 00:09:49,160 Speaker 1: they go out in the workforce.
And depending on the 198 00:09:49,200 --> 00:09:52,440 Speaker 1: field you're in, sometimes your broader impacts will be things like, Hey, 199 00:09:52,520 --> 00:09:55,760 Speaker 1: what we learn about the brains of 200 00:09:55,840 --> 00:09:57,800 Speaker 1: this fish could one day help us produce a new 201 00:09:57,880 --> 00:10:01,720 Speaker 1: medication for anxiety or something. So you talk about 202 00:10:01,720 --> 00:10:04,320 Speaker 1: how your stuff would impact the broader society. 203 00:10:04,559 --> 00:10:07,040 Speaker 3: And so for those folks who might think, hey, scientists 204 00:10:07,080 --> 00:10:10,240 Speaker 3: are just cashing in on the government gravy train, tell 205 00:10:10,360 --> 00:10:12,040 Speaker 3: us like, is this the kind of thing you throw 206 00:10:12,120 --> 00:10:15,439 Speaker 3: together in an afternoon and then can confidently think it 207 00:10:15,559 --> 00:10:16,240 Speaker 3: will be funded? 208 00:10:16,440 --> 00:10:17,360 Speaker 2: Oh my god. Okay. 209 00:10:17,520 --> 00:10:19,959 Speaker 1: So one of the reasons, to be honest, one of 210 00:10:19,960 --> 00:10:23,360 Speaker 1: the reasons I sort of transitioned away from academia in 211 00:10:23,480 --> 00:10:27,000 Speaker 1: the purest sense, so I'm like academic adjacent now, 212 00:10:27,640 --> 00:10:31,800 Speaker 1: was because I spent too many Christmas Eves working 213 00:10:31,880 --> 00:10:35,040 Speaker 1: on my grants, because they were due like sometime in January, 214 00:10:35,720 --> 00:10:40,080 Speaker 1: and they were just, I would 215 00:10:40,080 --> 00:10:41,880 Speaker 1: say, months of work in a lot of cases if 216 00:10:41,880 --> 00:10:43,800 Speaker 1: it's a new idea. If you're just cleaning up an 217 00:10:43,840 --> 00:10:47,199 Speaker 1: old grant, then it can be less work.
And 218 00:10:47,320 --> 00:10:50,440 Speaker 1: then so many great grants are submitted, and when they 219 00:10:50,559 --> 00:10:54,000 Speaker 1: go to review, you have three peers of yours who 220 00:10:54,080 --> 00:10:57,200 Speaker 1: are in closely related fields review it, and then there's 221 00:10:57,320 --> 00:11:00,480 Speaker 1: a panel discussion where they talk about their views of 222 00:11:00,559 --> 00:11:02,840 Speaker 1: your grant in front of a bunch of other experts 223 00:11:02,840 --> 00:11:05,480 Speaker 1: who can also weigh in. And then all those reviews 224 00:11:05,559 --> 00:11:09,000 Speaker 1: go to program directors who pick, out of like the 225 00:11:09,120 --> 00:11:12,320 Speaker 1: twenty best grants, you know, maybe they can only fund 226 00:11:12,440 --> 00:11:15,640 Speaker 1: fifteen or something. And then they pick some on the 227 00:11:15,760 --> 00:11:17,679 Speaker 1: East coast, some on the West coast, some in the 228 00:11:17,800 --> 00:11:20,839 Speaker 1: center of the country, some from major research institutions, some 229 00:11:21,000 --> 00:11:24,000 Speaker 1: from institutions that focus a lot on undergraduate research, and 230 00:11:24,200 --> 00:11:27,960 Speaker 1: they have research programs as well. And so, so many 231 00:11:28,320 --> 00:11:32,000 Speaker 1: really great grants don't make it into the pile that 232 00:11:32,120 --> 00:11:33,040 Speaker 1: gets funded. 233 00:11:33,120 --> 00:11:35,880 Speaker 3: Meaning that people have read them, have said this is excellent, 234 00:11:36,000 --> 00:11:39,400 Speaker 3: the science is good, it's interesting, it's important. We would 235 00:11:39,440 --> 00:11:42,000 Speaker 3: benefit as a society if we did this. It's all 236 00:11:42,080 --> 00:11:42,840 Speaker 3: well thought. 237 00:11:42,679 --> 00:11:46,719 Speaker 2: Through. But no, that is exactly right, yes, and it 238 00:11:46,960 --> 00:11:49,679 Speaker 2: is soul crushing.
I know so many people who are like, 239 00:11:49,800 --> 00:11:51,640 Speaker 2: I gave up Christmas Eve on this grant, and this 240 00:11:51,760 --> 00:11:53,959 Speaker 2: is the third time I've submitted it, and it just 241 00:11:54,080 --> 00:11:56,720 Speaker 2: never makes it quite high enough, even though everybody thinks 242 00:11:56,760 --> 00:11:59,360 Speaker 2: it's amazing. And then quickly, just to mention the follow-up: 243 00:11:59,400 --> 00:12:01,920 Speaker 1: If you do get a grant, every year you have to 244 00:12:01,960 --> 00:12:04,640 Speaker 1: write a progress report saying what you've done: have you 245 00:12:04,679 --> 00:12:07,040 Speaker 1: stuck to your timeline, how have you spent the money 246 00:12:07,080 --> 00:12:09,920 Speaker 1: that you've been given, are you going to hit your timelines, 247 00:12:10,040 --> 00:12:12,760 Speaker 1: and what have you done that impacts society? And so 248 00:12:12,920 --> 00:12:15,320 Speaker 1: every year you have to report on that, and then 249 00:12:15,360 --> 00:12:17,560 Speaker 1: at the end you have to write a bigger report. 250 00:12:17,800 --> 00:12:19,800 Speaker 1: So you need to be justifying what you're doing with 251 00:12:19,880 --> 00:12:21,120 Speaker 1: the money every step of the way. 252 00:12:21,360 --> 00:12:23,559 Speaker 3: And before we turn it around and ask me questions, 253 00:12:24,000 --> 00:12:26,200 Speaker 3: who are the peers here? Who are the folks who are 254 00:12:26,320 --> 00:12:28,640 Speaker 3: reading your grants and commenting on them? 255 00:12:29,000 --> 00:12:31,480 Speaker 1: They are folks in the same discipline at other universities.
256 00:12:31,600 --> 00:12:34,800 Speaker 1: So you also have to fill out a detailed Excel 257 00:12:34,880 --> 00:12:37,240 Speaker 1: sheet where you talk about who in the field 258 00:12:37,280 --> 00:12:40,520 Speaker 1: does similar work, has been your student or your postdoc 259 00:12:40,840 --> 00:12:42,839 Speaker 1: or your mentor or a co-author. 260 00:12:42,679 --> 00:12:43,480 Speaker 3: You might have a conflict. 261 00:12:43,640 --> 00:12:47,160 Speaker 1: Yeah, and anyone who might like you and 262 00:12:47,240 --> 00:12:50,280 Speaker 1: who might be too nice to you when they review 263 00:12:50,320 --> 00:12:52,360 Speaker 1: the grant. So it has to be only people that 264 00:12:52,440 --> 00:12:54,520 Speaker 1: you haven't worked with or haven't co-authored a paper 265 00:12:54,559 --> 00:12:57,559 Speaker 1: with for something like five to ten years. I guess 266 00:12:57,600 --> 00:13:00,240 Speaker 1: they assume that the liking of someone wears off after about 267 00:13:00,240 --> 00:13:03,920 Speaker 1: half a decade. But it can be tough because, you know, 268 00:13:04,000 --> 00:13:05,880 Speaker 1: these fields are kind of small, but you know, usually 269 00:13:05,920 --> 00:13:09,600 Speaker 1: it's other people at research institutions who do similar work, 270 00:13:09,640 --> 00:13:12,000 Speaker 1: are familiar with the literature and can tell if what 271 00:13:12,080 --> 00:13:14,480 Speaker 1: you're proposing makes sense or not and is good or not. 272 00:13:14,760 --> 00:13:16,520 Speaker 3: And so the thing to take away from this is 273 00:13:16,960 --> 00:13:20,280 Speaker 3: getting a grant funded is hard, right? Not only is 274 00:13:20,320 --> 00:13:22,040 Speaker 3: it a huge amount of work, but you have to 275 00:13:22,120 --> 00:13:26,920 Speaker 3: survive an excruciating process where really only the best are selected.
276 00:13:27,400 --> 00:13:30,520 Speaker 3: It's sort of like watching the Olympics and you're wondering, like, oh, 277 00:13:30,640 --> 00:13:32,960 Speaker 3: here's somebody from this country and somebody from that country, 278 00:13:33,000 --> 00:13:35,959 Speaker 3: and you know that each person has already won some 279 00:13:36,080 --> 00:13:40,199 Speaker 3: sort of like really competitive national competition just to even 280 00:13:40,360 --> 00:13:44,079 Speaker 3: be there at the Olympics representing their country. And so 281 00:13:44,240 --> 00:13:46,679 Speaker 3: like every grant that's funded, even if it seems silly 282 00:13:46,840 --> 00:13:49,679 Speaker 3: and it's about like the sex lives of ducks, you 283 00:13:49,880 --> 00:13:54,480 Speaker 3: know that it's been like excruciatingly written, reviewed in detail, 284 00:13:54,679 --> 00:13:57,480 Speaker 3: questioned by experts without conflicts of interest, and found to 285 00:13:57,520 --> 00:14:00,840 Speaker 3: be excellent, beating out lots of other grants. These are 286 00:14:00,920 --> 00:14:04,839 Speaker 3: not slush funds shoveled to scientists to, you know, just 287 00:14:04,920 --> 00:14:07,559 Speaker 3: do whatever they want with. These are hard-won 288 00:14:07,760 --> 00:14:10,760 Speaker 3: funds to do science that the agency thinks is a 289 00:14:10,840 --> 00:14:11,360 Speaker 3: good idea. 290 00:14:11,720 --> 00:14:15,480 Speaker 1: Yeah, it is absolutely excruciating to do these things. It 291 00:14:15,559 --> 00:14:17,440 Speaker 1: feels great when you get it funded. And as I said, 292 00:14:17,440 --> 00:14:20,120 Speaker 1: I hate writing grants, but sometimes I enjoy the process of 293 00:14:20,280 --> 00:14:23,240 Speaker 1: like figuring out just the right experimental design for a question. 294 00:14:23,400 --> 00:14:26,080 Speaker 1: That's like fun for me. But in general, when the 295 00:14:26,120 --> 00:14:29,840 Speaker 1: grant doesn't get funded, it sucks.
But anyway, all right, 296 00:14:29,960 --> 00:14:31,560 Speaker 1: so what's your experience? 297 00:14:31,400 --> 00:14:34,000 Speaker 3: Yeah, my experience is very similar. You know, we come 298 00:14:34,080 --> 00:14:35,960 Speaker 3: up with an idea, we spend a lot of time 299 00:14:36,080 --> 00:14:39,720 Speaker 3: polishing it, often doing a lot of the work in 300 00:14:39,880 --> 00:14:43,600 Speaker 3: advance that we need to demonstrate that the work described 301 00:14:43,680 --> 00:14:47,800 Speaker 3: is reasonable. Right. If it's too far forward thinking, then 302 00:14:48,080 --> 00:14:50,120 Speaker 3: you won't get the money, because they're like, well, that 303 00:14:50,280 --> 00:14:52,400 Speaker 3: might work, but can you really prove it? It's too 304 00:14:52,480 --> 00:14:54,960 Speaker 3: much of a risk. If you've already done too much 305 00:14:55,000 --> 00:14:57,200 Speaker 3: of the work, then they're like, oh, this is already done, 306 00:14:57,200 --> 00:14:59,160 Speaker 3: why would we fund it? So it's really sort of 307 00:14:59,240 --> 00:15:01,320 Speaker 3: on the edge there of like you've done the initial 308 00:15:01,440 --> 00:15:04,480 Speaker 3: work for free, right, or on some other grant or whatever. 309 00:15:04,880 --> 00:15:07,920 Speaker 3: So you prove that the idea is valid and can fly, 310 00:15:08,200 --> 00:15:10,120 Speaker 3: but not so much that they're like, why would we 311 00:15:10,200 --> 00:15:12,080 Speaker 3: fund this? You've already done it. Which is often a 312 00:15:12,160 --> 00:15:16,200 Speaker 3: delicate balance.
And my feeling with grants is submit it 313 00:15:16,320 --> 00:15:19,720 Speaker 3: and forget it, because such a tiny fraction ever comes 314 00:15:19,800 --> 00:15:21,800 Speaker 3: back with money that you just got to like let 315 00:15:21,880 --> 00:15:24,680 Speaker 3: it go, like, hey, I've submitted it, and I never 316 00:15:24,800 --> 00:15:28,680 Speaker 3: expect to hear back, you know, anything positive. And so 317 00:15:28,800 --> 00:15:31,120 Speaker 3: I just sort of like give up emotionally on every one, 318 00:15:31,200 --> 00:15:34,200 Speaker 3: because otherwise it's too hard, you know, it's too hard 319 00:15:34,240 --> 00:15:36,760 Speaker 3: to deal with, yeah. So my process is very similar 320 00:15:36,920 --> 00:15:40,800 Speaker 3: to yours, with a couple of exceptions. Sometimes I submit 321 00:15:40,920 --> 00:15:44,240 Speaker 3: to private foundations. Like, I recently found a private foundation 322 00:15:44,440 --> 00:15:48,360 Speaker 3: that likes to fund projects that are too blue sky, 323 00:15:48,880 --> 00:15:52,280 Speaker 3: that were rejected by the NSF for being like way 324 00:15:52,360 --> 00:15:56,560 Speaker 3: too out there, and that submission process was like: write 325 00:15:56,560 --> 00:16:00,360 Speaker 3: a paragraph, send us your rejected NSF grant with the reviews, 326 00:16:00,640 --> 00:16:01,080 Speaker 3: and that's it. 327 00:16:01,360 --> 00:16:01,600 Speaker 4: Whoa. 328 00:16:02,040 --> 00:16:03,880 Speaker 3: And that one actually just came back, and they just 329 00:16:03,960 --> 00:16:07,800 Speaker 3: gave me some money to build my cosmic ray smartphone 330 00:16:07,840 --> 00:16:10,880 Speaker 3: telescope down here at Irvine. So yeah, that was actually 331 00:16:10,880 --> 00:16:13,040 Speaker 3: a really positive experience. 332 00:16:13,160 --> 00:16:14,040 Speaker 2: Congratulations, yeah, that's awesome.
333 00:16:14,680 --> 00:16:18,120 Speaker 3: Yeah exactly, because you know, private foundations can do whatever 334 00:16:18,160 --> 00:16:20,480 Speaker 3: they like with your money. What we talked about previously 335 00:16:20,600 --> 00:16:24,000 Speaker 3: was mostly the process of applying to government institutions, the 336 00:16:24,160 --> 00:16:27,800 Speaker 3: NIH the NSF. Most of my fund incomes from Department 337 00:16:27,840 --> 00:16:30,600 Speaker 3: of Energy, Office of Science, which funds particle physics and 338 00:16:30,680 --> 00:16:34,160 Speaker 3: lots of other stuff in the United States. But private 339 00:16:34,200 --> 00:16:36,480 Speaker 3: fundations can do anything. Like the Bill Gates Foundation, you 340 00:16:36,560 --> 00:16:38,600 Speaker 3: can't even apply to. They have to like reach out 341 00:16:38,640 --> 00:16:38,840 Speaker 3: to you. 342 00:16:39,320 --> 00:16:39,520 Speaker 2: Wow. 343 00:16:39,640 --> 00:16:42,120 Speaker 3: And you know Mackenzie Scott just like gives money to 344 00:16:42,160 --> 00:16:44,040 Speaker 3: people that don't even apply. They just get a phone 345 00:16:44,080 --> 00:16:46,680 Speaker 3: call saying like, by the way, here comes a million 346 00:16:46,720 --> 00:16:47,640 Speaker 3: dollars or more. 347 00:16:47,840 --> 00:16:48,040 Speaker 4: Whoa. 348 00:16:48,400 --> 00:16:50,600 Speaker 3: So yeah, private foundations are weird. 349 00:16:51,440 --> 00:16:55,200 Speaker 1: Yeah, so I'm sure that they make solid choices about 350 00:16:55,200 --> 00:16:56,800 Speaker 1: who they're going to give their money. 
But I 351 00:16:56,840 --> 00:16:58,760 Speaker 1: guess one way to sort of check how much you 352 00:16:58,960 --> 00:17:01,080 Speaker 1: can trust an article is you can look in the 353 00:17:01,160 --> 00:17:03,440 Speaker 1: acknowledgment section at the end and say, oh, this was 354 00:17:03,480 --> 00:17:05,600 Speaker 1: funded by the National Science Foundation, and then you know 355 00:17:06,200 --> 00:17:08,680 Speaker 1: this went through rigorous peer review. And just because something 356 00:17:08,800 --> 00:17:11,320 Speaker 1: doesn't go through rigorous peer review at the grant stage 357 00:17:11,640 --> 00:17:14,280 Speaker 1: doesn't mean it's bad science. But at least you know 358 00:17:14,320 --> 00:17:16,720 Speaker 1: if it went through a government agency, it has been 359 00:17:16,760 --> 00:17:19,199 Speaker 1: sort of gone over with a fine-tooth comb. 360 00:17:19,200 --> 00:17:22,879 Speaker 3: And the thing that's frustrating to me is this process is so inefficient. 361 00:17:23,200 --> 00:17:26,760 Speaker 3: Scientists spend so much of their time preparing grant proposals 362 00:17:26,920 --> 00:17:30,120 Speaker 3: and having them rejected. And you know, each of these 363 00:17:30,280 --> 00:17:32,320 Speaker 3: is a great idea; each of them, if we did 364 00:17:32,359 --> 00:17:35,600 Speaker 3: it, would benefit society in terms of sheer knowledge or, 365 00:17:36,240 --> 00:17:39,800 Speaker 3: you know, new technology or something. We have all these 366 00:17:39,840 --> 00:17:43,399 Speaker 3: smart people constantly pitching great ideas to the government, the 367 00:17:43,480 --> 00:17:45,960 Speaker 3: government saying, yeah, that's awesome, we'd love to do it, 368 00:17:46,119 --> 00:17:50,280 Speaker 3: but we can't. Like, why don't we just double or triple 369 00:17:51,160 --> 00:17:54,800 Speaker 3: funding for science.
It would be better for us. 370 00:17:54,840 --> 00:17:57,800 Speaker 3: Every dollar we spend, we know, comes back two- or threefold 371 00:17:57,880 --> 00:18:00,960 Speaker 3: in terms of like economic output. It just seems crazy 372 00:18:01,000 --> 00:18:04,160 Speaker 3: to me to reject all of these excellent ideas. 373 00:18:04,440 --> 00:18:07,240 Speaker 2: I mean, I gotta say, not every idea I've reviewed 374 00:18:07,320 --> 00:18:08,840 Speaker 2: on a panel has been excellent. 375 00:18:09,000 --> 00:18:10,600 Speaker 3: No, but there are plenty above threshold. 376 00:18:10,680 --> 00:18:11,960 Speaker 2: Yes, absolutely, absolutely. 377 00:18:12,040 --> 00:18:14,960 Speaker 1: There's always a grant every round where my heart breaks 378 00:18:15,000 --> 00:18:17,080 Speaker 1: a little that it didn't get funded because I thought, oh, 379 00:18:17,200 --> 00:18:19,760 Speaker 1: that is so cool, but there wasn't enough money to 380 00:18:19,960 --> 00:18:21,960 Speaker 1: cover it. And so yes, there's a lot of good 381 00:18:22,000 --> 00:18:23,040 Speaker 1: work that's not getting done. 382 00:18:23,160 --> 00:18:26,000 Speaker 3: And I agree. I'm a grant reviewer often and I 383 00:18:26,160 --> 00:18:28,240 Speaker 3: see grants and I'm like, no, this is not well thought out, 384 00:18:28,400 --> 00:18:30,840 Speaker 3: or there's a flaw here, or this isn't cutting edge. 385 00:18:30,920 --> 00:18:32,840 Speaker 3: You know, somebody did this last year. And it's important 386 00:18:33,080 --> 00:18:35,240 Speaker 3: that this stuff gets reviewed and gets reviewed in a 387 00:18:35,320 --> 00:18:38,320 Speaker 3: fair way. But there are so many that are above threshold, 388 00:18:38,520 --> 00:18:40,720 Speaker 3: and only a fraction of those get funded, and it 389 00:18:40,800 --> 00:18:42,440 Speaker 3: just seems to me to be a waste. But whatever. 390 00:18:42,880 --> 00:18:46,159 Speaker 3: I'm not a political person. I understand these things.
But 391 00:18:46,240 --> 00:18:48,359 Speaker 3: I want folks out there to realize that every grant 392 00:18:48,400 --> 00:18:51,880 Speaker 3: you've seen that's been funded has been reviewed in excruciating 393 00:18:51,960 --> 00:18:54,960 Speaker 3: detail before dollar one was even sent to the institution. 394 00:18:55,240 --> 00:18:55,639 Speaker 2: That's right. 395 00:18:55,760 --> 00:18:57,480 Speaker 1: I don't think we need to get into lots of 396 00:18:57,560 --> 00:18:59,359 Speaker 1: detail about this, but every once in a while, my 397 00:18:59,440 --> 00:19:01,480 Speaker 1: husband and I will have chats about what would be 398 00:19:01,560 --> 00:19:04,800 Speaker 1: a better system for funding grants. Like maybe every four 399 00:19:04,920 --> 00:19:07,359 Speaker 1: years you get a chance to submit, and that way 400 00:19:07,400 --> 00:19:09,720 Speaker 1: you don't have to write grants the other three years 401 00:19:09,840 --> 00:19:12,000 Speaker 1: and there's fewer people in the pool and you're more 402 00:19:12,119 --> 00:19:14,320 Speaker 1: likely to get it. I think that's not the answer, 403 00:19:14,480 --> 00:19:16,720 Speaker 1: but I wonder if there's some way we could save 404 00:19:16,800 --> 00:19:19,960 Speaker 1: scientists from spending all of this time on grants that 405 00:19:20,040 --> 00:19:22,680 Speaker 1: don't get funded, or maybe we should just throw twenty 406 00:19:22,760 --> 00:19:25,520 Speaker 1: times as much money at the scientific community. There's our solution. 407 00:19:25,760 --> 00:19:28,240 Speaker 3: Well, you know, it used to be, fifty years ago, 408 00:19:28,600 --> 00:19:30,879 Speaker 3: that the philosophy was a little bit different, that the 409 00:19:30,920 --> 00:19:34,639 Speaker 3: government funded people rather than projects. And they were like, oh, 410 00:19:34,760 --> 00:19:38,680 Speaker 3: scientist X at this university, you're a smart lady.
You've 411 00:19:38,720 --> 00:19:40,960 Speaker 3: done good work. We'll just keep giving you money and 412 00:19:41,240 --> 00:19:42,840 Speaker 3: it'll just go for a while. As long as you 413 00:19:42,960 --> 00:19:45,320 Speaker 3: keep doing something, we'll keep giving you money. And I 414 00:19:45,359 --> 00:19:47,520 Speaker 3: think that used to be the major model, and 415 00:19:47,640 --> 00:19:49,680 Speaker 3: it's just not that way anymore. Now it's projects, and 416 00:19:49,720 --> 00:19:52,040 Speaker 3: so if you have a lab at a university, it's 417 00:19:52,080 --> 00:19:53,920 Speaker 3: sort of like running a small business. You know, you 418 00:19:53,960 --> 00:19:57,359 Speaker 3: have to constantly be pitching grants to get funds. You know, 419 00:19:57,600 --> 00:19:59,719 Speaker 3: you're like running a store at the mall, Katrina likes 420 00:19:59,760 --> 00:20:03,840 Speaker 3: to say; you're constantly looking for new ways to contribute. 421 00:20:04,240 --> 00:20:07,080 Speaker 3: The one holdover that I'm aware of is actually experimental 422 00:20:07,160 --> 00:20:10,200 Speaker 3: particle physics, which is still a little bit of the older model. 423 00:20:10,760 --> 00:20:13,840 Speaker 3: They run competitions for junior faculty, and if you win 424 00:20:13,960 --> 00:20:17,480 Speaker 3: one of these awards, like I won an Outstanding Junior 425 00:20:17,520 --> 00:20:20,479 Speaker 3: Investigator Award when I was a very young professor, then 426 00:20:20,520 --> 00:20:22,800 Speaker 3: you sort of get in the club and then they 427 00:20:22,960 --> 00:20:25,040 Speaker 3: mostly fund you, and your funding can go up if 428 00:20:25,080 --> 00:20:27,280 Speaker 3: you do great, and go down if you're less productive. 429 00:20:27,880 --> 00:20:31,119 Speaker 3: But it's much more stable than is typical, I think, because 430 00:20:31,160 --> 00:20:34,119 Speaker 3: these particle physics projects last like twenty years.
You know, 431 00:20:34,240 --> 00:20:36,240 Speaker 3: we build a collider, we expect to use it for 432 00:20:36,320 --> 00:20:39,600 Speaker 3: twenty five years. That's my thinking for why they still 433 00:20:39,640 --> 00:20:41,440 Speaker 3: do it in the sort of older model. But it 434 00:20:41,480 --> 00:20:43,359 Speaker 3: means I don't have to write as many grants because 435 00:20:43,359 --> 00:20:45,639 Speaker 3: I do have one more stable source of funding. 436 00:20:45,920 --> 00:20:47,960 Speaker 2: Wow. So even to this day, you still get a 437 00:20:48,080 --> 00:20:49,159 Speaker 2: chunk of money every year. 438 00:20:49,520 --> 00:20:51,639 Speaker 3: I have to submit a grant every three years to 439 00:20:51,800 --> 00:20:53,720 Speaker 3: propose new work and to tell them what I did 440 00:20:53,760 --> 00:20:55,480 Speaker 3: in the last three years, and so far every time 441 00:20:55,600 --> 00:20:59,040 Speaker 3: my grant has been renewed and continued. So yeah, I've 442 00:20:59,040 --> 00:21:01,720 Speaker 3: been funded by the Department of Energy for a couple 443 00:21:01,760 --> 00:21:04,080 Speaker 3: of decades now, which is very nice. Yeah, and I'm 444 00:21:04,160 --> 00:21:06,320 Speaker 3: very grateful to the Department of Energy, thank you very much, 445 00:21:06,600 --> 00:21:08,440 Speaker 3: and to all the taxpayers who support them. 446 00:21:08,600 --> 00:21:13,040 Speaker 1: Oh, you government shill! I'm just kidding. I am so 447 00:21:13,200 --> 00:21:15,480 Speaker 1: happy for you. That's awesome. I know there's a couple of 448 00:21:15,680 --> 00:21:18,560 Speaker 1: grants for people in the medical field that work that 449 00:21:18,640 --> 00:21:21,440 Speaker 1: way also, where they fund your lab and just trust 450 00:21:21,520 --> 00:21:23,600 Speaker 1: that you're doing awesome stuff and you will continue to 451 00:21:23,640 --> 00:21:24,920 Speaker 1: do awesome stuff with that money.
452 00:21:25,320 --> 00:21:26,399 Speaker 2: And that sounds pretty sweet. 453 00:21:26,520 --> 00:21:28,640 Speaker 3: Yeah, it's pretty nice. Yeah. I think you're talking about 454 00:21:28,640 --> 00:21:30,359 Speaker 3: like the Howard Hughes Awards, for example. 455 00:21:30,560 --> 00:21:33,119 Speaker 2: That sounds very... yeah. Yeah yeah. All right. Well, my 456 00:21:33,320 --> 00:21:34,159 Speaker 2: jealousy aside. 457 00:21:34,240 --> 00:21:37,000 Speaker 1: Let's all take a break, and when we get back, 458 00:21:37,040 --> 00:21:39,120 Speaker 1: we're going to talk about the next phase when peer 459 00:21:39,200 --> 00:21:40,040 Speaker 1: review is done. 460 00:21:40,119 --> 00:21:42,440 Speaker 2: And that is when you're about to start your experiments. 461 00:22:03,080 --> 00:22:05,280 Speaker 3: All right, we are back and we are talking about 462 00:22:05,400 --> 00:22:08,320 Speaker 3: peer review, thanks to a question from a listener, and 463 00:22:08,400 --> 00:22:11,160 Speaker 3: we dug into the process of peer review for grant 464 00:22:11,240 --> 00:22:15,000 Speaker 3: proposals and grant writing. And now let's talk about what 465 00:22:15,200 --> 00:22:18,600 Speaker 3: it's like to review an experiment while it's running, before 466 00:22:18,680 --> 00:22:22,040 Speaker 3: the paper is even sent to the journal. Kelly, I 467 00:22:22,200 --> 00:22:25,320 Speaker 3: mostly do research on particles that don't have rights and 468 00:22:25,920 --> 00:22:29,399 Speaker 3: don't have institutional review boards protecting them. What's it like 469 00:22:29,520 --> 00:22:32,639 Speaker 3: to do an experiment on living creatures with emotions that 470 00:22:32,760 --> 00:22:35,159 Speaker 3: can feel pain and have people looking over your shoulder? 471 00:22:35,440 --> 00:22:36,920 Speaker 2: Oh, it can be pretty stressful.
472 00:22:37,080 --> 00:22:40,359 Speaker 1: So say you get that National Science Foundation grant, they 473 00:22:40,359 --> 00:22:43,160 Speaker 1: won't actually release the funds to you until you've shown 474 00:22:43,359 --> 00:22:46,880 Speaker 1: that you've acquired certain permits and protocols so that your 475 00:22:47,000 --> 00:22:50,000 Speaker 1: institution is giving you permission to do the research. So 476 00:22:50,480 --> 00:22:53,880 Speaker 1: my PhD work required collecting some fish out of estuaries 477 00:22:53,880 --> 00:22:56,600 Speaker 1: in California. So first I had to talk to the 478 00:22:56,680 --> 00:22:59,320 Speaker 1: state government and fill out permits to get permission to 479 00:22:59,359 --> 00:23:02,240 Speaker 1: go collect those fish. So I had to convince the 480 00:23:02,320 --> 00:23:04,520 Speaker 1: California government that I wasn't going to take too many, 481 00:23:05,000 --> 00:23:07,520 Speaker 1: that there were plenty of these fish out there, that 482 00:23:07,640 --> 00:23:09,760 Speaker 1: what I was going to do to them was asking 483 00:23:09,840 --> 00:23:11,320 Speaker 1: a worthwhile scientific question. 484 00:23:11,000 --> 00:23:12,919 Speaker 3: Kelly, what are you going to do to them? 485 00:23:13,040 --> 00:23:17,520 Speaker 1: I'm going to... well, the questions I'm going 486 00:23:17,600 --> 00:23:20,560 Speaker 1: to ask of them, that they will be contributing in 487 00:23:20,640 --> 00:23:23,720 Speaker 1: a meaningful way to science. And so first you have 488 00:23:23,800 --> 00:23:25,680 Speaker 1: to get that permission to take the animals out of 489 00:23:25,680 --> 00:23:27,680 Speaker 1: the wild, and then you need to fill out a 490 00:23:27,800 --> 00:23:31,520 Speaker 1: protocol through the Institutional Animal Care and Use Committee, which 491 00:23:31,560 --> 00:23:35,120 Speaker 1: is IACUC.
It includes five people, and they include 492 00:23:35,160 --> 00:23:38,920 Speaker 1: folks like veterinarians, an outside member, so like somebody who's 493 00:23:38,960 --> 00:23:41,400 Speaker 1: part of the community who is just sort of going 494 00:23:41,480 --> 00:23:43,760 Speaker 1: to weigh in on what the general public feels about 495 00:23:43,760 --> 00:23:46,280 Speaker 1: the work that's being done. Sometimes you'll also have like 496 00:23:46,280 --> 00:23:49,520 Speaker 1: an ethicist or a lawyer in there, and often you'll 497 00:23:49,560 --> 00:23:51,440 Speaker 1: have like a faculty member on there also. 498 00:23:51,800 --> 00:23:53,639 Speaker 3: And what do these people want? Do they want to 499 00:23:53,680 --> 00:23:55,760 Speaker 3: say yes? Do they want to say no? Are they 500 00:23:56,040 --> 00:23:59,000 Speaker 3: just totally disinterested? Like, what are their motivations? Why is 501 00:23:59,040 --> 00:24:01,359 Speaker 3: like a rando from the public doing this anyway? 502 00:24:01,720 --> 00:24:04,720 Speaker 1: So, if I'm being completely honest, my field in the 503 00:24:04,920 --> 00:24:10,639 Speaker 1: past did some uncomfortable things to animals, and you know, 504 00:24:10,760 --> 00:24:14,080 Speaker 1: they didn't use, for example, anesthetic when they were doing 505 00:24:14,160 --> 00:24:16,320 Speaker 1: some surgeries, and they should have, things like that. And 506 00:24:16,440 --> 00:24:19,520 Speaker 1: so this committee is trying to make sure that you 507 00:24:19,680 --> 00:24:22,639 Speaker 1: are treating the animals as nicely as possible and using 508 00:24:22,720 --> 00:24:25,359 Speaker 1: the fewest animals that you need to get answers to 509 00:24:25,400 --> 00:24:28,200 Speaker 1: your questions.
So you need to convince them that you 510 00:24:28,440 --> 00:24:32,119 Speaker 1: have read up on the most effective anesthetics for whatever 511 00:24:32,200 --> 00:24:35,600 Speaker 1: animal it is that you're using. You also often have 512 00:24:35,760 --> 00:24:38,359 Speaker 1: to take a bunch of online training courses where you 513 00:24:38,520 --> 00:24:41,240 Speaker 1: show I am well trained in the best way to 514 00:24:41,280 --> 00:24:43,680 Speaker 1: treat these animals and I know how these anesthetics work. 515 00:24:44,080 --> 00:24:46,000 Speaker 1: You often have to prove to them that you have 516 00:24:46,200 --> 00:24:47,840 Speaker 1: like teamed up with someone who can make sure you're 517 00:24:47,880 --> 00:24:50,679 Speaker 1: doing the process correctly, and you need to convince them 518 00:24:50,720 --> 00:24:53,240 Speaker 1: that you've thought really hard about the exact number of 519 00:24:53,320 --> 00:24:56,080 Speaker 1: animals that you're going to use for these experiments. You know, 520 00:24:56,160 --> 00:24:59,240 Speaker 1: their job isn't to stop science, and they're not necessarily 521 00:24:59,400 --> 00:25:01,320 Speaker 1: trying to figure out whether the question that's being 522 00:25:01,440 --> 00:25:04,080 Speaker 1: asked is good or not. They assume that if you 523 00:25:04,160 --> 00:25:06,919 Speaker 1: have funding from the National Science Foundation, that has already happened. 524 00:25:07,320 --> 00:25:09,520 Speaker 1: But their goal is to just make sure that nothing 525 00:25:09,640 --> 00:25:12,399 Speaker 1: unethical happens and that the animals who are contributing their 526 00:25:12,440 --> 00:25:14,400 Speaker 1: lives to science have the best life possible. 527 00:25:14,600 --> 00:25:16,879 Speaker 3: Oh, that sounds nice.
It is, and it's nice to 528 00:25:16,960 --> 00:25:19,840 Speaker 3: see the process sort of self-correcting, like, yeah, okay, 529 00:25:19,880 --> 00:25:23,360 Speaker 3: we trust scientists. Actually, maybe they need some more eyeballs 530 00:25:23,400 --> 00:25:26,520 Speaker 3: on them because they have conflicts of interest, and so 531 00:25:26,640 --> 00:25:28,520 Speaker 3: it's good to have other people who don't have those 532 00:25:28,560 --> 00:25:31,520 Speaker 3: conflicts looking over your shoulder and making sure you're doing 533 00:25:31,560 --> 00:25:32,280 Speaker 3: things the right way. 534 00:25:32,400 --> 00:25:34,840 Speaker 1: That's nice, it is. And I also appreciate that there's 535 00:25:34,840 --> 00:25:37,200 Speaker 1: a veterinarian on there. The veterinarian will like check in 536 00:25:37,280 --> 00:25:41,200 Speaker 1: pretty regularly. I have training in parasitology, but I don't 537 00:25:41,200 --> 00:25:44,760 Speaker 1: necessarily have training in care of fish or something. 538 00:25:44,800 --> 00:25:45,760 Speaker 2: And I've done a bunch of reading. 539 00:25:45,800 --> 00:25:47,560 Speaker 1: But it's nice to have a veterinarian check in every 540 00:25:47,600 --> 00:25:50,040 Speaker 1: once in a while and offer their expertise to make 541 00:25:50,040 --> 00:25:52,120 Speaker 1: sure that these animals really are being treated as well 542 00:25:52,400 --> 00:25:54,639 Speaker 1: as you can in a lab setting. And I have 543 00:25:54,800 --> 00:25:57,640 Speaker 1: never worked on humans, but if you are doing things 544 00:25:57,680 --> 00:26:01,080 Speaker 1: with humans, including just like sending out surveys to humans 545 00:26:01,320 --> 00:26:03,720 Speaker 1: that might ask questions that would make people feel uncomfortable, 546 00:26:04,080 --> 00:26:07,440 Speaker 1: you have to pass those procedures through an institutional review board.
547 00:26:07,800 --> 00:26:10,960 Speaker 1: So there's a similar procedure for working with humans. So 548 00:26:11,000 --> 00:26:14,240 Speaker 1: I guess you don't have anything like that for particle 549 00:26:14,280 --> 00:26:16,600 Speaker 1: physics, because we don't care what you do to the particles. 550 00:26:17,000 --> 00:26:19,320 Speaker 3: No, we don't. But I did one time come home 551 00:26:19,400 --> 00:26:22,680 Speaker 3: from work to find Katrina collecting samples for an experiment 552 00:26:22,800 --> 00:26:26,160 Speaker 3: from our children. Oh, and I was like, hmm, shouldn't 553 00:26:26,160 --> 00:26:30,280 Speaker 3: you be asking the other parent before experimenting on your children, 554 00:26:30,400 --> 00:26:33,000 Speaker 3: like you have a conflict of interest here? And she 555 00:26:33,200 --> 00:26:35,720 Speaker 3: was like, it's just a saliva sample. I'm like, the 556 00:26:35,840 --> 00:26:38,440 Speaker 3: children can't consent to it. So we have a lot of 557 00:26:38,440 --> 00:26:39,280 Speaker 3: fun joking about that. 558 00:26:39,480 --> 00:26:42,760 Speaker 2: Did you give permission? That's good. That's good. 559 00:26:43,640 --> 00:26:45,760 Speaker 1: So, Daniel, there's this new thing that's getting sort of 560 00:26:45,840 --> 00:26:48,160 Speaker 1: big in science that my friends have been talking about. 561 00:26:48,200 --> 00:26:49,800 Speaker 1: I haven't had a chance to do this yet, but 562 00:26:50,000 --> 00:26:52,800 Speaker 1: pre-registering a scientific study. Have you done this yet? 563 00:26:53,040 --> 00:26:55,600 Speaker 3: We don't actually do this in particle physics, because this 564 00:26:55,720 --> 00:26:58,000 Speaker 3: is another way that particle physics is weird, is that 565 00:26:58,480 --> 00:27:01,520 Speaker 3: we publish our studies even when we get a negative result.
566 00:27:02,240 --> 00:27:04,600 Speaker 3: In lots of fields of science, you might say I 567 00:27:04,640 --> 00:27:06,800 Speaker 3: have an idea for how we might learn something, and, 568 00:27:06,880 --> 00:27:08,760 Speaker 3: you know, you do this study and then it didn't 569 00:27:08,800 --> 00:27:12,639 Speaker 3: work, or you learned that there's nothing new there, like okay, 570 00:27:13,040 --> 00:27:15,359 Speaker 3: bears eat salmon, we already knew that, you know, like, 571 00:27:15,640 --> 00:27:18,760 Speaker 3: or whatever. And often those studies don't get published if 572 00:27:18,800 --> 00:27:21,960 Speaker 3: the result isn't statistically significant or you didn't learn something new, 573 00:27:22,800 --> 00:27:24,879 Speaker 3: and there's a statistical issue there, which is that this 574 00:27:25,000 --> 00:27:27,879 Speaker 3: can lead to a bias in our data and our understanding. 575 00:27:28,280 --> 00:27:30,920 Speaker 3: The effect is called p-hacking, and it means that 576 00:27:31,080 --> 00:27:34,040 Speaker 3: sometimes things that are not real can appear to be 577 00:27:34,119 --> 00:27:38,160 Speaker 3: significant just due to random fluctuation. Say, for example, I'm 578 00:27:38,400 --> 00:27:40,640 Speaker 3: testing a bunch of coins to see if they're fair. 579 00:27:41,160 --> 00:27:44,560 Speaker 3: I flip each one one hundred times. Then like most 580 00:27:44,600 --> 00:27:46,840 Speaker 3: of the time, I'm going to get fifty heads, but occasionally, 581 00:27:47,119 --> 00:27:49,360 Speaker 3: just due to random fluctuations, I'm going to get one 582 00:27:49,440 --> 00:27:52,639 Speaker 3: that gets like seventy heads, right. And the more I 583 00:27:52,760 --> 00:27:54,960 Speaker 3: do this experiment, the more likely I am to have 584 00:27:55,160 --> 00:27:58,080 Speaker 3: one that's a fluctuation.
And so if I only publish 585 00:27:58,520 --> 00:28:00,680 Speaker 3: the ones that seem to be significant, that 586 00:28:00,800 --> 00:28:03,840 Speaker 3: crossed some statistical threshold to be weird, it's going to 587 00:28:03,920 --> 00:28:06,080 Speaker 3: look like these coins are not fair, when in reality 588 00:28:06,200 --> 00:28:09,000 Speaker 3: they are. That's what p-hacking is. P refers to 589 00:28:09,440 --> 00:28:12,440 Speaker 3: the probability for the null hypothesis, that the coin is 590 00:28:12,520 --> 00:28:16,600 Speaker 3: fair, to fluctuate randomly to look like it's not fair. 591 00:28:17,400 --> 00:28:20,719 Speaker 3: And so the way to counteract this is to publish 592 00:28:20,800 --> 00:28:23,480 Speaker 3: negative results, to say, look, I did all these experiments 593 00:28:23,880 --> 00:28:26,399 Speaker 3: and the coin came out fair. We already knew that, yawn. 594 00:28:26,560 --> 00:28:29,359 Speaker 3: But it's important that we include this context so that 595 00:28:29,480 --> 00:28:31,560 Speaker 3: if one in a thousand studies says, look, I saw 596 00:28:31,640 --> 00:28:34,480 Speaker 3: a one-in-a-thousand-times effect, you know what it means. 597 00:28:35,160 --> 00:28:38,360 Speaker 3: And often we do this by pre-registering studies, by 598 00:28:38,400 --> 00:28:40,040 Speaker 3: saying I'm going to go out and do this study 599 00:28:40,160 --> 00:28:42,720 Speaker 3: and I don't know the results yet, but I'm going 600 00:28:42,800 --> 00:28:45,560 Speaker 3: to publish it either way, right. And so that's a 601 00:28:45,640 --> 00:28:46,920 Speaker 3: way to counteract p-hacking. 602 00:28:47,120 --> 00:28:50,200 Speaker 1: So in my field, what you've just described, I think 603 00:28:50,200 --> 00:28:52,680 Speaker 1: we would call the file drawer effect.
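Daniel's coin-flip picture is easy to check numerically. Here's a minimal Python sketch (purely illustrative; the function names and the 60-heads cutoff are made up for this example, not anything from the show):

```python
import random

def flip_heads(n_flips: int = 100) -> int:
    """Count heads in n_flips tosses of a perfectly fair coin."""
    return sum(random.random() < 0.5 for _ in range(n_flips))

def looks_unfair(heads: int, n_flips: int = 100, cutoff: int = 60) -> bool:
    """Crude 'weirdness' test: at least 60 or at most 40 heads out of 100."""
    return abs(heads - n_flips / 2) >= cutoff - n_flips / 2

random.seed(0)
n_coins = 10_000
flagged = sum(looks_unfair(flip_heads()) for _ in range(n_coins))

# Every coin here is fair, yet a few hundred of the 10,000 runs land
# outside 40-60 heads purely by chance. Publishing only those flagged
# runs would make fair coins look biased -- p-hacking and the file
# drawer effect in miniature.
print(f"{flagged} of {n_coins} fair coins looked 'unfair'")
```

Publishing all 10,000 runs (the negative results included) is what restores the honest picture.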
And so the 604 00:28:53,240 --> 00:28:55,320 Speaker 1: idea here is that, yeah, when you get a null 605 00:28:55,480 --> 00:28:58,720 Speaker 1: result or a result that's less interesting, you might try 606 00:28:58,760 --> 00:29:00,800 Speaker 1: to publish it somewhere, but you're less likely to put 607 00:29:00,800 --> 00:29:02,240 Speaker 1: in the effort to try to publish it in a 608 00:29:02,280 --> 00:29:04,240 Speaker 1: lower-tier journal that you're going to get less credit 609 00:29:04,320 --> 00:29:07,600 Speaker 1: from your institution for having published in. And that could 610 00:29:07,640 --> 00:29:09,280 Speaker 1: make it look like an effect is there, because when 611 00:29:09,280 --> 00:29:11,400 Speaker 1: you randomly get a positive effect, it gets published and 612 00:29:11,480 --> 00:29:14,600 Speaker 1: otherwise it doesn't. For us, p-hacking is when you 613 00:29:14,840 --> 00:29:17,520 Speaker 1: didn't get the result that you wanted, but you're looking 614 00:29:17,560 --> 00:29:19,760 Speaker 1: at your data and you're like, oh, you know, I 615 00:29:19,880 --> 00:29:23,080 Speaker 1: wasn't actually asking a question about this other thing, but 616 00:29:23,320 --> 00:29:25,440 Speaker 1: if I look at my data, there's actually a statistically 617 00:29:25,480 --> 00:29:28,920 Speaker 1: significant result there, and so I'm going to publish on that, 618 00:29:29,760 --> 00:29:31,400 Speaker 1: and you know, maybe I'll mention that this wasn't the 619 00:29:31,480 --> 00:29:34,840 Speaker 1: initial thing, but I found this, you know, significant relationship 620 00:29:35,080 --> 00:29:37,240 Speaker 1: that's positive, so I'll publish on that. And the deal 621 00:29:37,320 --> 00:29:39,920 Speaker 1: there is, you know, like you said, randomly, you would 622 00:29:39,920 --> 00:29:42,600 Speaker 1: expect to get results that look like something is really 623 00:29:42,680 --> 00:29:45,080 Speaker 1: going on.
And if you're just searching through your 624 00:29:45,160 --> 00:29:48,400 Speaker 1: data set, you're more likely to find something that's significant, 625 00:29:48,400 --> 00:29:51,040 Speaker 1: and that wasn't the question you were originally asking. And so 626 00:29:51,160 --> 00:29:54,160 Speaker 1: the reason we do pre-registering a scientific study is 627 00:29:54,240 --> 00:29:57,040 Speaker 1: you'll say, ahead of time, I am specifically looking at 628 00:29:57,080 --> 00:30:01,720 Speaker 1: the relationship between, I don't know, a parasite and how often 629 00:30:01,760 --> 00:30:04,240 Speaker 1: a fish darts. And so if I go on and 630 00:30:04,320 --> 00:30:07,840 Speaker 1: I publish a paper about parasites and how active a 631 00:30:07,960 --> 00:30:11,560 Speaker 1: fish is, you could say, hey, did you just find 632 00:30:11,680 --> 00:30:13,520 Speaker 1: that result, and then you changed your paper to be 633 00:30:13,560 --> 00:30:15,880 Speaker 1: about that because it looked interesting? And so anyway, this 634 00:30:15,960 --> 00:30:18,000 Speaker 1: is how we make sure that you're not looking for 635 00:30:18,120 --> 00:30:20,760 Speaker 1: significant results and just publishing whatever you find. 636 00:30:21,000 --> 00:30:24,120 Speaker 3: And in particle physics we're lucky enough that we always 637 00:30:24,160 --> 00:30:28,200 Speaker 3: publish, because a negative result is still interesting. Like if 638 00:30:28,240 --> 00:30:30,400 Speaker 3: you look for a particle because you think it's there 639 00:30:30,680 --> 00:30:33,640 Speaker 3: and the universe would make more sense for it to exist, 640 00:30:33,720 --> 00:30:36,000 Speaker 3: like the Higgs boson, then if you see it, great, 641 00:30:36,080 --> 00:30:38,960 Speaker 3: you publish it. If you don't, that's still interesting.
You 642 00:30:39,040 --> 00:30:41,400 Speaker 3: still want to know, oh, there isn't a Higgs boson, 643 00:30:42,200 --> 00:30:44,960 Speaker 3: because if we smash particles together often enough and we 644 00:30:45,040 --> 00:30:47,600 Speaker 3: don't see it, we can say something about the likelihood 645 00:30:47,600 --> 00:30:50,200 Speaker 3: of it not existing, because if it existed, we would 646 00:30:50,320 --> 00:30:52,840 Speaker 3: have seen it. Now, we still are susceptible to 647 00:30:53,000 --> 00:30:56,400 Speaker 3: p-hacking, because we will sometimes see fluctuations, like data will 648 00:30:56,520 --> 00:30:59,640 Speaker 3: just like look like a new particle sometimes, and for 649 00:31:00,440 --> 00:31:02,840 Speaker 3: that we have a very stringent criterion. It's called the 650 00:31:03,000 --> 00:31:05,880 Speaker 3: five-sigma threshold, which means we only claim a 651 00:31:05,960 --> 00:31:10,040 Speaker 3: particle is discovered if it's very, very, very unlikely to 652 00:31:10,360 --> 00:31:12,880 Speaker 3: come from a fluctuation. But we still in principle could 653 00:31:12,920 --> 00:31:15,600 Speaker 3: get fooled, and that's why we always have duplicate experiments, 654 00:31:15,680 --> 00:31:17,400 Speaker 3: like at the Large Hadron Collider we have two of 655 00:31:17,480 --> 00:31:21,000 Speaker 3: these big detectors and they're totally independent, and that's why 656 00:31:21,120 --> 00:31:23,280 Speaker 3: you expect to see the same physics in both. And 657 00:31:23,360 --> 00:31:25,320 Speaker 3: if you don't, you know something is squirrely.
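To put a number on the five-sigma convention Daniel mentions: assuming the fluctuations are Gaussian, the one-sided tail probability at a given sigma level follows from the standard error function. This is a back-of-the-envelope sketch, not the collaborations' actual statistical machinery:

```python
import math

def tail_probability(sigma: float) -> float:
    """One-sided Gaussian tail probability at a given sigma level."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

# Three sigma ("evidence") is roughly a 1-in-740 chance, and five sigma
# ("discovery") roughly 1 in 3.5 million, that pure background fluctuates
# at least this far by accident.
for s in (3, 5):
    print(f"{s} sigma -> p = {tail_probability(s):.2e}")
```

That 1-in-3.5-million bar is exactly the guard against the coin-flip problem: with thousands of analyses running, a looser threshold would get crossed by chance all the time.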
658 00:31:25,600 --> 00:31:28,160 Speaker 1: Yeah. And in my field, we're getting better, where if 659 00:31:28,240 --> 00:31:31,880 Speaker 1: you get no result or negative results, as long as 660 00:31:31,920 --> 00:31:34,560 Speaker 1: you can show that you did a sound experiment and 661 00:31:34,680 --> 00:31:38,120 Speaker 1: the answer is no, then you can still publish it somewhere. 662 00:31:38,200 --> 00:31:40,800 Speaker 1: And so Public Library of Science, or PLOS, is a 663 00:31:40,880 --> 00:31:44,320 Speaker 1: journal that encourages folks to submit their null results as 664 00:31:44,400 --> 00:31:46,160 Speaker 1: long as the science was good and you can defend 665 00:31:46,240 --> 00:31:47,960 Speaker 1: what you did, to try to make sure that you're 666 00:31:47,960 --> 00:31:49,560 Speaker 1: not getting this file drawer effect stuff. 667 00:31:49,640 --> 00:31:51,479 Speaker 2: So you get the no answers just as often as 668 00:31:51,480 --> 00:31:52,480 Speaker 2: the yes answers. 669 00:31:52,560 --> 00:31:55,360 Speaker 3: And that's helpful, right. It saves people time, because if Kelly 670 00:31:55,400 --> 00:31:56,880 Speaker 3: had a great idea and it turned out the answer 671 00:31:57,000 --> 00:31:58,760 Speaker 3: is no, and then she just throws it in the 672 00:31:58,800 --> 00:32:01,440 Speaker 3: file drawer, then Sam across the country is going to 673 00:32:01,520 --> 00:32:04,360 Speaker 3: try the same thing someday and waste their time. So 674 00:32:04,560 --> 00:32:06,440 Speaker 3: it's good to get this stuff out there. 675 00:32:06,720 --> 00:32:09,760 Speaker 1: Right. And you know, in my field, wasting time also 676 00:32:09,960 --> 00:32:12,760 Speaker 1: often means that some animals may have died in the process.
677 00:32:12,960 --> 00:32:15,120 Speaker 1: And so to me, I feel like there's this additional 678 00:32:15,240 --> 00:32:18,600 Speaker 1: ethical requirement that you get anything that you learned from 679 00:32:18,640 --> 00:32:20,760 Speaker 1: these animals out, so that nobody has to go and 680 00:32:20,840 --> 00:32:22,960 Speaker 1: repeat it and, you know, more animals might have 681 00:32:23,080 --> 00:32:26,120 Speaker 1: to pass away for science. So anyway, good to get 682 00:32:26,160 --> 00:32:26,520 Speaker 1: it out there. 683 00:32:26,600 --> 00:32:29,200 Speaker 3: All right. So then let's talk more about what happens 684 00:32:29,280 --> 00:32:31,120 Speaker 3: when you think you have an interesting result and you've 685 00:32:31,160 --> 00:32:33,760 Speaker 3: written it up and you want to publish it. 686 00:32:33,920 --> 00:32:35,840 Speaker 1: And we're going to leave everybody on a ledge for 687 00:32:36,000 --> 00:32:37,800 Speaker 1: just a second, and when we get back from the break, 688 00:32:37,880 --> 00:32:40,600 Speaker 1: we'll talk more about the fascinating world of peer review. 689 00:33:00,560 --> 00:33:03,040 Speaker 1: All right, Daniel, you have a significant result. You've got 690 00:33:03,080 --> 00:33:04,800 Speaker 1: a result that looks good, or at least even if 691 00:33:04,840 --> 00:33:07,040 Speaker 1: it's no, you're sure it was no because the experiment 692 00:33:07,200 --> 00:33:07,480 Speaker 1: was done. 693 00:33:07,520 --> 00:33:09,120 Speaker 2: Well, what do you do next? 694 00:33:09,440 --> 00:33:12,479 Speaker 3: So we write it up, and when all of our 695 00:33:12,560 --> 00:33:14,880 Speaker 3: co-authors are happy with it, the first thing we 696 00:33:15,000 --> 00:33:18,280 Speaker 3: do is we post it online.
So particle physics is 697 00:33:18,320 --> 00:33:22,000 Speaker 3: a very, very online science, and we invented this thing 698 00:33:22,120 --> 00:33:26,000 Speaker 3: called the arXiv, where scientists post their papers, and it's 699 00:33:26,000 --> 00:33:29,680 Speaker 3: called a preprint because we post it before we submit it to 700 00:33:29,720 --> 00:33:32,520 Speaker 3: the journal. So for example, I finished a paper this 701 00:33:32,640 --> 00:33:36,560 Speaker 3: week and yesterday it appeared on the arXiv, and nobody's 702 00:33:36,600 --> 00:33:39,000 Speaker 3: reviewed it, nobody's confirmed it. It's just out there for 703 00:33:39,120 --> 00:33:41,880 Speaker 3: other scientists to look at. And in a week or so, 704 00:33:41,960 --> 00:33:44,880 Speaker 3: if we don't get like screaming fits from somebody saying 705 00:33:44,960 --> 00:33:47,440 Speaker 3: you stole my idea or this is deeply wrong, then 706 00:33:47,520 --> 00:33:50,080 Speaker 3: we'll submit it to a journal. So the process for 707 00:33:50,200 --> 00:33:53,480 Speaker 3: particle physics is first post it online, then submit it to 708 00:33:53,520 --> 00:33:55,400 Speaker 3: a journal where it's going to get reviewed, and that's 709 00:33:55,440 --> 00:33:57,800 Speaker 3: going to take months and months. But it's so slow 710 00:33:57,880 --> 00:34:00,360 Speaker 3: that particle physicists just read the preprint server and 711 00:34:00,560 --> 00:34:03,920 Speaker 3: read those papers, and nobody really cares when it comes 712 00:34:04,000 --> 00:34:07,040 Speaker 3: out in the journal later, because we've already seen 713 00:34:07,080 --> 00:34:07,960 Speaker 3: the paper months earlier. 714 00:34:08,239 --> 00:34:11,640 Speaker 1: But what if the peer reviewed version catches an error? 715 00:34:11,760 --> 00:34:14,120 Speaker 1: Do you go back and update the arXiv versions? 716 00:34:14,200 --> 00:34:16,520 Speaker 3: Okay, you can update it.
And usually papers in the 717 00:34:16,600 --> 00:34:18,840 Speaker 3: arXiv are like version three, version four, and there's comments 718 00:34:18,880 --> 00:34:20,759 Speaker 3: in between. You can compare them. You can see the 719 00:34:20,840 --> 00:34:24,000 Speaker 3: mistakes getting fixed, absolutely. And some papers appear in the 720 00:34:24,120 --> 00:34:27,399 Speaker 3: arXiv and then are never published, and that's fine. I 721 00:34:27,480 --> 00:34:29,320 Speaker 3: actually had one paper like ten years ago put on 722 00:34:29,400 --> 00:34:32,560 Speaker 3: the arXiv. It took like more than a year to review, 723 00:34:33,239 --> 00:34:36,480 Speaker 3: and then they rejected it because the paper already had 724 00:34:36,760 --> 00:34:39,320 Speaker 3: like thirty citations, and so they were like, why publish this? 725 00:34:39,680 --> 00:34:41,680 Speaker 3: It's already out there, people are reading it and using it. 726 00:34:41,800 --> 00:34:44,040 Speaker 2: Huh, that's such a stupid reason. 727 00:34:44,160 --> 00:34:46,840 Speaker 3: Yeah, it was really dumb. But then we just gave up, 728 00:34:46,920 --> 00:34:48,880 Speaker 3: and now it's like one of my more cited papers 729 00:34:48,920 --> 00:34:52,560 Speaker 3: and it was never published. The system is silly. It 730 00:34:52,680 --> 00:34:54,880 Speaker 3: is silly, but I know that in other fields it's different. 731 00:34:55,040 --> 00:34:57,840 Speaker 3: Like my friends in astronomy, they never put papers on 732 00:34:57,920 --> 00:35:01,839 Speaker 3: the arXiv until after they've been reviewed, and they think 733 00:35:01,880 --> 00:35:03,440 Speaker 3: that if they send it to a journal when it's already 734 00:35:03,440 --> 00:35:05,759 Speaker 3: out on the arXiv, those journal referees are going to 735 00:35:05,800 --> 00:35:08,920 Speaker 3: find that disrespectful, like they didn't wait for them 736 00:35:09,000 --> 00:35:11,520 Speaker 3: to chime in.
So it's definitely a field by field 737 00:35:11,560 --> 00:35:13,759 Speaker 3: culture kind of thing. How does it work for you guys? 738 00:35:14,040 --> 00:35:17,880 Speaker 1: Yeah, so y'all were definitely the trailblazers. We subsequently came 739 00:35:17,960 --> 00:35:22,239 Speaker 1: up with bioRxiv, and I remember when I was 740 00:35:22,320 --> 00:35:25,000 Speaker 1: initially wanting to submit to bioRxiv, I was hesitant 741 00:35:25,040 --> 00:35:27,600 Speaker 1: to do it, because some of our journals did say 742 00:35:28,760 --> 00:35:31,160 Speaker 1: your paper should not be available anywhere else, because we 743 00:35:31,239 --> 00:35:33,080 Speaker 1: want to be the only place where you can find it. 744 00:35:33,320 --> 00:35:36,440 Speaker 1: And that's because journals make money off of people who 745 00:35:36,640 --> 00:35:39,560 Speaker 1: download their articles, and so they didn't want someone else 746 00:35:39,560 --> 00:35:42,319 Speaker 1: to be able to get another version for free somewhere else. 747 00:35:42,520 --> 00:35:45,080 Speaker 1: And on bioRxiv or the arXiv, all of the articles 748 00:35:45,120 --> 00:35:48,680 Speaker 1: are free. But it seems like it's now become acceptable 749 00:35:48,960 --> 00:35:51,320 Speaker 1: that you can put papers in these preprint servers 750 00:35:51,480 --> 00:35:53,919 Speaker 1: like bioRxiv, and so now a lot of people 751 00:35:53,960 --> 00:35:55,840 Speaker 1: will do that, and they do it for a variety 752 00:35:55,880 --> 00:35:57,680 Speaker 1: of reasons. Some of it is they want to get 753 00:35:57,719 --> 00:36:00,120 Speaker 1: feedback early, they want to get the results out 754 00:36:00,120 --> 00:36:03,239 Speaker 1: there early. But sometimes it's also like if you've got 755 00:36:03,360 --> 00:36:06,560 Speaker 1: a PhD student who wants to go looking for a job.
756 00:36:07,320 --> 00:36:10,200 Speaker 1: On their resume, if they finished a study, they'll write 757 00:36:10,320 --> 00:36:13,600 Speaker 1: "in preparation" next to the manuscript. But "in preparation" could 758 00:36:13,680 --> 00:36:15,680 Speaker 1: mean I've got the data and I still need to 759 00:36:15,719 --> 00:36:17,680 Speaker 1: do all the stats and I haven't even started writing. 760 00:36:18,239 --> 00:36:20,200 Speaker 1: Or it could mean I'm about to submit it to 761 00:36:20,239 --> 00:36:22,160 Speaker 1: a journal. And if you've put it on bioRxiv, 762 00:36:22,239 --> 00:36:25,080 Speaker 1: you're saying, look, it's done. I just need to start 763 00:36:25,120 --> 00:36:27,239 Speaker 1: the peer review process, or even, it's in peer review, 764 00:36:27,239 --> 00:36:29,800 Speaker 1: because that process can take a long time. So anyway, 765 00:36:29,840 --> 00:36:32,080 Speaker 1: it's becoming a lot more common to put your papers 766 00:36:32,120 --> 00:36:34,920 Speaker 1: on bioRxiv, just to make them easier to access 767 00:36:34,960 --> 00:36:37,080 Speaker 1: for other people, and just to show that you really 768 00:36:37,160 --> 00:36:41,440 Speaker 1: have actually finished that study finally, after a decade or 769 00:36:41,520 --> 00:36:43,239 Speaker 1: however long it took. 770 00:36:43,320 --> 00:36:46,080 Speaker 3: The clock is still ticking on your thesis there, Kelly. We'll see. 771 00:36:46,040 --> 00:36:47,320 Speaker 2: I'll get it into preparation. 772 00:36:48,280 --> 00:36:50,800 Speaker 3: In preparation. So then you send it to a journal, 773 00:36:51,160 --> 00:36:52,880 Speaker 3: and you pick a journal. Like, if you think it's 774 00:36:52,880 --> 00:36:55,520 Speaker 3: a really exciting paper, you send it to Nature or Science. 775 00:36:55,719 --> 00:36:57,360 Speaker 3: If you think it's less exciting, you send it to 776 00:36:57,400 --> 00:37:00,880 Speaker 3: a journal with a lower impact factor.
So in particle 777 00:37:00,920 --> 00:37:03,920 Speaker 3: physics we often publish in, like, Physical Review or the 778 00:37:04,040 --> 00:37:06,960 Speaker 3: Journal of High Energy Physics or these kinds of journals, and 779 00:37:07,280 --> 00:37:09,960 Speaker 3: it goes to an editor, and the editor finds people 780 00:37:10,040 --> 00:37:12,279 Speaker 3: to review it. These are the peers, and they reach 781 00:37:12,320 --> 00:37:15,200 Speaker 3: out to folks they know who are experts in the field, 782 00:37:15,400 --> 00:37:18,520 Speaker 3: who aren't your supervisor or your brother or this kind 783 00:37:18,560 --> 00:37:21,080 Speaker 3: of thing, and ask them to review these things. And 784 00:37:21,160 --> 00:37:23,200 Speaker 3: Katrina is a senior editor on a journal, so I 785 00:37:23,239 --> 00:37:24,840 Speaker 3: see this process all the time, and she has to 786 00:37:24,880 --> 00:37:27,120 Speaker 3: read the paper, think about who might know about it, 787 00:37:27,680 --> 00:37:30,600 Speaker 3: and then find people to review it. And usually you 788 00:37:30,680 --> 00:37:33,919 Speaker 3: have one to three people read this paper and give 789 00:37:34,040 --> 00:37:37,200 Speaker 3: comments on it. But finding reviewers can be tough. 790 00:37:37,760 --> 00:37:39,800 Speaker 1: So for our grants that we talked about earlier, you 791 00:37:39,880 --> 00:37:41,640 Speaker 1: have to make sure that the person reviewing the grant 792 00:37:41,719 --> 00:37:43,960 Speaker 1: doesn't have a conflict of interest. They're not your brother 793 00:37:44,120 --> 00:37:47,000 Speaker 1: or your supervisor or something like that. And in my field, 794 00:37:47,160 --> 00:37:50,160 Speaker 1: the same holds true for when you're reviewing manuscripts. Is 795 00:37:50,200 --> 00:37:51,359 Speaker 1: that true for your field as well? 796 00:37:51,600 --> 00:37:53,919 Speaker 3: Yeah, absolutely.
You have to say, oh, I can't 797 00:37:53,960 --> 00:37:57,240 Speaker 3: read this one, because that was my student or something 798 00:37:57,360 --> 00:37:57,560 Speaker 3: like that. 799 00:37:58,239 --> 00:38:00,040 Speaker 2: Yeah. How long does review usually take for you? 800 00:38:00,680 --> 00:38:03,320 Speaker 3: I'd say it's, you know, like four to eight weeks 801 00:38:03,440 --> 00:38:04,040 Speaker 3: or something like that. 802 00:38:04,440 --> 00:38:06,920 Speaker 1: Wow. How about you guys? I'd be happy with four to 803 00:38:07,000 --> 00:38:09,200 Speaker 1: eight weeks. That sounds good, but it can be, you know, 804 00:38:09,320 --> 00:38:11,800 Speaker 1: sometimes six months, and at six months you start writing 805 00:38:11,880 --> 00:38:12,759 Speaker 1: the editor, being 806 00:38:12,719 --> 00:38:16,279 Speaker 2: like, come on man, this is crazy. But yeah, it 807 00:38:16,320 --> 00:38:18,080 Speaker 2: can be hard to find reviewers, 808 00:38:18,040 --> 00:38:21,279 Speaker 3: because reviewers are just other scientists, busy writing their own 809 00:38:21,320 --> 00:38:23,600 Speaker 3: grants and writing their own papers and doing all their 810 00:38:23,640 --> 00:38:25,800 Speaker 3: work and picking up their kids from daycare and stuff 811 00:38:25,840 --> 00:38:28,400 Speaker 3: like this and trying to get through that mountain of laundry, 812 00:38:28,800 --> 00:38:30,800 Speaker 3: you know. I think that people forget that, like, science 813 00:38:30,920 --> 00:38:33,279 Speaker 3: is just people, right, and people are busy. Yeah. And 814 00:38:33,400 --> 00:38:35,400 Speaker 3: if you're out there and you've written some treatise on 815 00:38:35,560 --> 00:38:38,279 Speaker 3: the fundamental physics of the universe and nobody has read it, 816 00:38:38,600 --> 00:38:41,320 Speaker 3: it's not because we're gatekeeping. It's just because we're busy, 817 00:38:41,560 --> 00:38:44,439 Speaker 3: you know. And there's lots of those.
And I often 818 00:38:44,480 --> 00:38:46,320 Speaker 3: say no to editors who ask me to review stuff, 819 00:38:46,320 --> 00:38:47,680 Speaker 3: because I'm just too busy to do it in a 820 00:38:47,760 --> 00:38:48,359 Speaker 3: timely manner. 821 00:38:48,719 --> 00:38:49,600 Speaker 2: Yeah, no, me as well. 822 00:38:49,640 --> 00:38:51,839 Speaker 1: And I think it's worth noting that while we're doing 823 00:38:51,920 --> 00:38:54,279 Speaker 1: this review work, we're doing it for free. Like, 824 00:38:54,360 --> 00:38:57,720 Speaker 1: when you're on a grant review panel, you can lose 825 00:38:58,160 --> 00:39:00,600 Speaker 1: weeks of time to reviewing these things, and you don't 826 00:39:00,600 --> 00:39:03,160 Speaker 1: get reimbursed for that time. And the same thing goes for manuscripts. 827 00:39:03,200 --> 00:39:05,480 Speaker 1: So when I review a manuscript, I read it 828 00:39:05,600 --> 00:39:08,600 Speaker 1: one day, and it takes me a couple hours to read it carefully, 829 00:39:09,000 --> 00:39:10,520 Speaker 1: and then I sit on it for a few days, 830 00:39:10,560 --> 00:39:12,759 Speaker 1: and that's the main thing that is occupying my brain 831 00:39:12,840 --> 00:39:14,759 Speaker 1: in the background for those days. And that could be, 832 00:39:14,920 --> 00:39:17,160 Speaker 1: like, I could have been thinking about the introduction to 833 00:39:17,200 --> 00:39:19,239 Speaker 1: my next book or something, but instead I'm thinking about 834 00:39:19,239 --> 00:39:19,799 Speaker 1: the methods to 835 00:39:19,800 --> 00:39:20,640 Speaker 2: make sure it made sense. 836 00:39:21,000 --> 00:39:23,479 Speaker 1: And then I spend another three to four hours reading 837 00:39:23,520 --> 00:39:26,600 Speaker 1: it again and writing my comments in detail. And if 838 00:39:26,600 --> 00:39:28,320 Speaker 1: I think there's some literature they missed, I go and 839 00:39:28,400 --> 00:39:31,560 Speaker 1: I find it, and, like, it's a multi day process.
840 00:39:31,560 --> 00:39:33,960 Speaker 1: When I commit to reviewing a paper, and reviewers don't 841 00:39:34,000 --> 00:39:35,960 Speaker 1: get paid for that, nor do the editors. 842 00:39:36,160 --> 00:39:37,040 Speaker 2: How long does it take you? 843 00:39:37,480 --> 00:39:40,000 Speaker 3: Well, I wish that all of my papers were reviewed 844 00:39:40,040 --> 00:39:44,200 Speaker 3: by somebody as thorough as you. But my process is similar, 845 00:39:44,360 --> 00:39:47,680 Speaker 3: like an initial read, some thoughts, and then let it sit. 846 00:39:48,000 --> 00:39:50,399 Speaker 3: Sometimes I'll discuss it with people in my group. We'll 847 00:39:50,440 --> 00:39:53,400 Speaker 3: read it together, and then I read it again in detail. 848 00:39:53,560 --> 00:39:55,960 Speaker 3: And then, especially if the review is negative, I wait 849 00:39:56,560 --> 00:39:58,040 Speaker 3: and I sit on it, and I come back later 850 00:39:58,080 --> 00:40:01,560 Speaker 3: and I'm like, was this fair? And also, most importantly, was 851 00:40:01,600 --> 00:40:06,560 Speaker 3: I nice enough in my comments? Constructive and helpful, and 852 00:40:06,680 --> 00:40:09,359 Speaker 3: not just negative or harsh in any way? I 853 00:40:09,400 --> 00:40:12,440 Speaker 3: like to try to take sandpaper to anything that's negative 854 00:40:12,760 --> 00:40:15,480 Speaker 3: and smooth it over, because I've been on the other 855 00:40:15,680 --> 00:40:19,160 Speaker 3: end of harsh reviews, and it doesn't help anybody 856 00:40:19,640 --> 00:40:21,759 Speaker 3: to get zingers in there, especially because a 857 00:40:21,800 --> 00:40:23,520 Speaker 3: lot of my papers are written by young people and 858 00:40:23,520 --> 00:40:24,960 Speaker 3: it's often their first paper, and then I have to 859 00:40:24,960 --> 00:40:27,479 Speaker 3: show them this review that's like, this guy's a jerk. 860 00:40:28,200 --> 00:40:30,279 Speaker 3: This isn't necessary.
I have to explain to them that 861 00:40:30,400 --> 00:40:33,680 Speaker 3: the anonymity here is protecting them. But you know, most 862 00:40:33,719 --> 00:40:37,239 Speaker 3: reviewers are fair and are thoughtful and are courteous, and 863 00:40:37,360 --> 00:40:40,120 Speaker 3: so often we get, like, helpful feedback, like, what about this, 864 00:40:40,360 --> 00:40:42,520 Speaker 3: or have you thought about this question? Or what would 865 00:40:42,520 --> 00:40:45,320 Speaker 3: you say to this concern? Do you find your feedback 866 00:40:45,360 --> 00:40:46,200 Speaker 3: to be mostly useful? 867 00:40:46,360 --> 00:40:48,600 Speaker 1: Yeah, most of the time the feedback I get 868 00:40:48,719 --> 00:40:51,120 Speaker 1: improves the paper. I did have one reviewer, when it 869 00:40:51,200 --> 00:40:52,759 Speaker 1: was one of my first papers, who told me they 870 00:40:52,800 --> 00:40:53,960 Speaker 1: were very disappointed in me. 871 00:40:54,280 --> 00:41:01,040 Speaker 3: It's like, you're a jerk. Disappointed? Thanks, Dad. 872 00:41:01,920 --> 00:41:04,799 Speaker 2: And that was because my supplemental materials were too long. 873 00:41:05,000 --> 00:41:06,319 Speaker 2: I was like, God, give me a break. 874 00:41:06,400 --> 00:41:09,320 Speaker 1: But anyway, so in my field, we're moving towards double 875 00:41:09,360 --> 00:41:12,800 Speaker 1: blind review, so the reviewers are not supposed to know 876 00:41:12,920 --> 00:41:15,440 Speaker 1: the names of the authors, and the authors are not 877 00:41:15,480 --> 00:41:18,279 Speaker 1: supposed to know who reviewed their paper. And that's a 878 00:41:18,360 --> 00:41:20,040 Speaker 1: really great idea, so that you feel like you can 879 00:41:20,120 --> 00:41:22,200 Speaker 1: honestly tell someone if their paper was good or not.
880 00:41:22,280 --> 00:41:24,200 Speaker 1: You don't have to worry about someone getting mad at 881 00:41:24,200 --> 00:41:28,279 Speaker 1: you in particular. Though, in practice, there's usually like four 882 00:41:28,400 --> 00:41:31,160 Speaker 1: people who could review your paper, so, you know, like, 883 00:41:31,400 --> 00:41:34,959 Speaker 1: you know it was Joe or Francine, and Francine tends 884 00:41:35,000 --> 00:41:36,080 Speaker 1: to be meaner, and so. 885 00:41:36,760 --> 00:41:39,200 Speaker 2: But anyway, so yeah, what about your field? Is it 886 00:41:39,480 --> 00:41:40,600 Speaker 2: blind or double blind? 887 00:41:40,840 --> 00:41:43,360 Speaker 3: It's only blind in one direction. The review is anonymous, 888 00:41:43,520 --> 00:41:46,279 Speaker 3: but you see who the authors are. I know in 889 00:41:46,360 --> 00:41:49,880 Speaker 3: some other fields, like in top computer science conferences, for example, 890 00:41:49,960 --> 00:41:52,840 Speaker 3: it's double blind, and that's helpful. But also, it's not 891 00:41:52,920 --> 00:41:54,279 Speaker 3: that hard to figure it out. If you wanted to 892 00:41:54,360 --> 00:41:56,439 Speaker 3: know who these people are, you could figure it out. 893 00:41:57,120 --> 00:41:58,800 Speaker 3: But you might also be wondering, like, what is the 894 00:41:58,920 --> 00:42:01,399 Speaker 3: job of the reviewer exactly? Does the reviewer have to make 895 00:42:01,480 --> 00:42:05,160 Speaker 3: sure the experiment was right? Like, if I'm checking Kelly's work 896 00:42:05,160 --> 00:42:06,360 Speaker 3: and I'm the reviewer, do I have to go out and 897 00:42:06,440 --> 00:42:08,600 Speaker 3: do the experiment and make sure she was right? And 898 00:42:08,680 --> 00:42:11,719 Speaker 3: that's the thing: peer review is not replication.
The 899 00:42:11,880 --> 00:42:15,080 Speaker 3: job of the reviewer is to ask: are the claims 900 00:42:15,080 --> 00:42:18,040 Speaker 3: of the paper supported by the evidence provided? 901 00:42:18,680 --> 00:42:22,120 Speaker 3: Is the logic there? Is there mathematical tissue between the 902 00:42:22,200 --> 00:42:25,120 Speaker 3: work that's been done and the conclusions that are being claimed? 903 00:42:25,560 --> 00:42:28,440 Speaker 3: And also, is it well written, with the right citations? And finally, 904 00:42:28,800 --> 00:42:31,759 Speaker 3: is it interesting? Is it important? Does it play a 905 00:42:31,920 --> 00:42:35,040 Speaker 3: role in the scientific conversation? Which is a little bit 906 00:42:35,120 --> 00:42:37,759 Speaker 3: subjective, of course, but science is by the people and 907 00:42:37,880 --> 00:42:40,960 Speaker 3: for the people. But if you're working on something nobody's 908 00:42:41,040 --> 00:42:43,120 Speaker 3: interested in, then nobody's going to read your paper, and 909 00:42:43,160 --> 00:42:46,680 Speaker 3: a journal might say, this is solid work, but nobody cares. 910 00:42:46,880 --> 00:42:49,760 Speaker 3: It's a boring question, or you didn't learn anything interesting. 911 00:42:50,760 --> 00:42:53,560 Speaker 3: The reviewer's job is not to say, hey, I did 912 00:42:53,640 --> 00:42:55,520 Speaker 3: this experiment also, and I know that this is a 913 00:42:55,600 --> 00:42:57,880 Speaker 3: feature of the universe. That's not the task of the 914 00:42:57,920 --> 00:43:00,799 Speaker 3: reviewer; it's not their job to do that for you. It's 915 00:43:00,800 --> 00:43:03,160 Speaker 3: also not the task of the reviewer to say, like, hey, 916 00:43:03,200 --> 00:43:05,080 Speaker 3: this is a cool idea.
I would have done this differently, 917 00:43:05,160 --> 00:43:06,680 Speaker 3: and so you now need to go back and do 918 00:43:06,840 --> 00:43:09,960 Speaker 3: all these additional studies that I think are important. And 919 00:43:10,040 --> 00:43:12,240 Speaker 3: you see this a lot in peer review, that reviews 920 00:43:12,280 --> 00:43:14,560 Speaker 3: come back and say, this is cool, but also do 921 00:43:14,960 --> 00:43:18,440 Speaker 3: XYZ, and, like, that's frustrating to me, because that's not 922 00:43:18,560 --> 00:43:20,000 Speaker 3: the job of the reviewer. 923 00:43:19,960 --> 00:43:21,759 Speaker 2: Yes, yep. Nope, that's super frustrating. 924 00:43:22,600 --> 00:43:26,640 Speaker 1: And I think it's worth noting that sometimes bad papers 925 00:43:26,680 --> 00:43:29,399 Speaker 1: get through this procedure, and one of the ways 926 00:43:29,440 --> 00:43:33,520 Speaker 1: that bad papers get through is that reviewers aren't required, 927 00:43:33,800 --> 00:43:36,279 Speaker 1: and this makes sense, but they're not responsible, for 928 00:43:36,400 --> 00:43:39,400 Speaker 1: going through, for example, the code for the statistical models 929 00:43:39,400 --> 00:43:41,759 Speaker 1: that you ran. And so if somebody, you know, checked 930 00:43:41,800 --> 00:43:43,840 Speaker 1: their code a bunch of times but forgot a 931 00:43:43,960 --> 00:43:47,719 Speaker 1: minus sign somewhere, the result could be wrong and maybe 932 00:43:47,760 --> 00:43:49,879 Speaker 1: nobody knows. And I have known a couple of people 933 00:43:50,000 --> 00:43:53,680 Speaker 1: who afterwards have gone to use those models again for 934 00:43:53,800 --> 00:43:56,320 Speaker 1: some other question and caught their mistake.
And then a 935 00:43:56,440 --> 00:43:59,880 Speaker 1: good scientist will contact that journal and say, I have 936 00:44:00,000 --> 00:44:01,800 Speaker 1: to retract my paper, or I have to 937 00:44:02,120 --> 00:44:05,080 Speaker 1: submit errata and let everybody know I forgot the minus 938 00:44:05,160 --> 00:44:08,280 Speaker 1: sign and it's a different result. And that's really painful, 939 00:44:08,440 --> 00:44:11,200 Speaker 1: but at least they're being honest, and that's great, and 940 00:44:11,280 --> 00:44:15,280 Speaker 1: I think that's really important. Or some people are dishonest, 941 00:44:15,440 --> 00:44:17,560 Speaker 1: and there's no way for the reviewer to know that. 942 00:44:17,800 --> 00:44:20,800 Speaker 1: My field had somebody who was one of the 943 00:44:20,920 --> 00:44:25,239 Speaker 1: biggest names in the field. And there's this new requirement 944 00:44:25,320 --> 00:44:25,799 Speaker 1: in our field. 945 00:44:25,840 --> 00:44:27,479 Speaker 2: That's a couple decades old now. I'm old. 946 00:44:27,560 --> 00:44:31,120 Speaker 1: But where you have to put the data that you 947 00:44:31,440 --> 00:44:34,080 Speaker 1: collected as part of the experiment online, somewhere 948 00:44:34,200 --> 00:44:36,399 Speaker 1: in a public place where people can download your data. 949 00:44:36,960 --> 00:44:39,440 Speaker 1: And again, you know, in a field where animals are 950 00:44:39,640 --> 00:44:41,719 Speaker 1: being used to collect data, it's great to know that 951 00:44:42,320 --> 00:44:44,640 Speaker 1: those data could be used by other people to ask questions. 952 00:44:44,719 --> 00:44:46,840 Speaker 1: You can get more information out of them. But it 953 00:44:46,960 --> 00:44:50,200 Speaker 1: also means that if someone gets suspect results, you can 954 00:44:50,480 --> 00:44:53,320 Speaker 1: pull their data and rerun the models.
And when somebody 955 00:44:53,440 --> 00:44:56,080 Speaker 1: was looking at the data, it was clear that the 956 00:44:56,200 --> 00:44:59,200 Speaker 1: numbers just didn't make sense, like one column was always 957 00:44:59,719 --> 00:45:04,640 Speaker 1: the prior column times three. And anyway, after more scrutiny, 958 00:45:04,680 --> 00:45:06,799 Speaker 1: it became clear that this person was making up their data. 959 00:45:07,239 --> 00:45:09,120 Speaker 1: But you know, we have this new check where your 960 00:45:09,200 --> 00:45:11,080 Speaker 1: data have to be available to everyone else, and that 961 00:45:11,200 --> 00:45:13,319 Speaker 1: has helped us sort of tease out people who are 962 00:45:13,400 --> 00:45:16,520 Speaker 1: being dishonest. So the system is evolving and getting better 963 00:45:16,600 --> 00:45:19,000 Speaker 1: over time. But still, sometimes stuff gets through. 964 00:45:19,239 --> 00:45:21,759 Speaker 3: It certainly does, and there are folks out there who 965 00:45:21,840 --> 00:45:25,360 Speaker 3: are, like, combing through papers to find this stuff, and 966 00:45:25,920 --> 00:45:28,560 Speaker 3: there's a scientist, for example, her name is Elisabeth Bik, 967 00:45:28,600 --> 00:45:31,720 Speaker 3: and this is her passion. She combs through old papers 968 00:45:31,760 --> 00:45:36,320 Speaker 3: and finds evidence of, like, photoshopping in biology. You know, 969 00:45:36,400 --> 00:45:38,360 Speaker 3: you take a picture of your gel and it's supposed 970 00:45:38,360 --> 00:45:40,080 Speaker 3: to have this blob, and she finds, oh, this blob 971 00:45:40,160 --> 00:45:41,960 Speaker 3: is the same as that blob, or it's been reversed 972 00:45:42,400 --> 00:45:44,960 Speaker 3: or whatever. And so there are definitely lots of ways 973 00:45:45,000 --> 00:45:46,840 Speaker 3: that you could find this stuff now that we couldn't 974 00:45:46,840 --> 00:45:50,880 Speaker 3: have done beforehand.
But peer review is not a guarantee 975 00:45:51,320 --> 00:45:53,520 Speaker 3: that this hasn't been done. Like, reviewers should look for 976 00:45:53,600 --> 00:45:55,520 Speaker 3: this stuff and call it out if they see it, 977 00:45:55,600 --> 00:45:58,640 Speaker 3: but stuff definitely can get through. It's not a perfect system, 978 00:45:59,080 --> 00:46:01,400 Speaker 3: you know. It's one of these things: it's a terrible system, 979 00:46:01,440 --> 00:46:03,080 Speaker 3: but it's also the best that we have so far. 980 00:46:03,400 --> 00:46:05,520 Speaker 2: Just an aside here: I am a massive wimp. 981 00:46:05,640 --> 00:46:10,520 Speaker 1: If I, like, purposefully fabricated data in this era, where 982 00:46:10,520 --> 00:46:13,440 Speaker 1: everything is, like, public and people can peek behind the curtain, 983 00:46:13,200 --> 00:46:15,919 Speaker 2: I would never sleep again. I'd be like, someone's gonna 984 00:46:15,920 --> 00:46:18,279 Speaker 2: find me out, and I would 985 00:46:18,320 --> 00:46:20,520 Speaker 2: be miserable the rest of my life. 986 00:46:20,800 --> 00:46:24,480 Speaker 1: I'd rather have a bunch of low impact papers than 987 00:46:24,800 --> 00:46:26,719 Speaker 1: worry that I was gonna get called out for lying. 988 00:46:26,880 --> 00:46:28,600 Speaker 1: But anyway, I am a wimp. 989 00:46:28,600 --> 00:46:31,000 Speaker 3: Yeah. And so the highest standard really is not just 990 00:46:31,120 --> 00:46:33,239 Speaker 3: that something has been peer reviewed, but that something has 991 00:46:33,320 --> 00:46:37,000 Speaker 3: been independently replicated.
Like, you might have a scientist who's 992 00:46:37,080 --> 00:46:40,279 Speaker 3: doing totally solid work and sees some effect in their lab, 993 00:46:41,040 --> 00:46:43,759 Speaker 3: but doesn't realize that it's an artifact due to some 994 00:46:43,960 --> 00:46:48,480 Speaker 3: condition, the humidity or the local gravity or the trains 995 00:46:48,600 --> 00:46:51,040 Speaker 3: or something, causing some influence on their experiments. And 996 00:46:51,120 --> 00:46:52,719 Speaker 3: so you want people on the other side of the 997 00:46:52,800 --> 00:46:55,680 Speaker 3: world, who built it differently, who made different assumptions, who 998 00:46:55,760 --> 00:46:59,239 Speaker 3: are sensitive to different stuff, to reproduce it. You might remember the 999 00:46:59,400 --> 00:47:02,520 Speaker 3: excitement about high temperature superconductors a couple of years 1000 00:47:02,560 --> 00:47:06,200 Speaker 3: ago, LK-99. Korean scientists claimed to have created 1001 00:47:06,760 --> 00:47:10,920 Speaker 3: this room temperature, cheap superconductor, which would revolutionize the industry, 1002 00:47:11,600 --> 00:47:13,920 Speaker 3: and so very quickly people were out there trying to 1003 00:47:14,120 --> 00:47:17,440 Speaker 3: replicate it, and people were excited. But until another independent 1004 00:47:17,480 --> 00:47:20,400 Speaker 3: group built it and showed that it was real, nobody 1005 00:47:20,480 --> 00:47:22,600 Speaker 3: really accepted it and thought we were in a new era. 1006 00:47:23,239 --> 00:47:26,680 Speaker 3: And personally, for example, when we were discovering the Higgs boson, 1007 00:47:26,840 --> 00:47:29,880 Speaker 3: I saw the data accumulating around a mass of one 1008 00:47:29,960 --> 00:47:32,160 Speaker 3: hundred and twenty five. We were looking at it constantly. 1009 00:47:32,200 --> 00:47:34,239 Speaker 3: It was building up and up and up.
But until 1010 00:47:34,320 --> 00:47:37,359 Speaker 3: I heard that my colleagues around the ring also saw 1011 00:47:37,400 --> 00:47:41,000 Speaker 3: an effect at the same place, I didn't personally think, okay, yeah, 1012 00:47:41,160 --> 00:47:44,160 Speaker 3: we found this thing. And so this sort of independent 1013 00:47:44,320 --> 00:47:46,759 Speaker 3: replication is really sort of the highest standard. What do 1014 00:47:46,840 --> 00:47:47,840 Speaker 3: you think, Kelly? 1015 00:47:47,880 --> 00:47:50,960 Speaker 1: I think so. And to me, this is one of the 1016 00:47:51,239 --> 00:47:54,719 Speaker 1: current weaknesses in science, at least in my field. So 1017 00:47:55,400 --> 00:47:58,200 Speaker 1: replicating data is absolutely critical. But if you write to 1018 00:47:58,239 --> 00:48:00,560 Speaker 1: the National Science Foundation, for example, and you're like, oh, 1019 00:48:00,640 --> 00:48:03,200 Speaker 1: I just want to do the same experiment that Maria did, 1020 00:48:03,280 --> 00:48:05,640 Speaker 1: but I'm going to do it on another continent, it's 1021 00:48:05,680 --> 00:48:07,320 Speaker 1: going to be hard to get money for that, because 1022 00:48:07,320 --> 00:48:10,880 Speaker 1: it's not a new idea, and it's also probably not 1023 00:48:10,960 --> 00:48:13,840 Speaker 1: going to get published in a top tier journal. And 1024 00:48:14,000 --> 00:48:16,920 Speaker 1: so if you are training PhD students who are going 1025 00:48:16,960 --> 00:48:19,400 Speaker 1: to want to get jobs and they're quote unquote 1026 00:48:19,480 --> 00:48:22,279 Speaker 1: just replicating somebody else's work, that's not going to help 1027 00:48:22,320 --> 00:48:24,239 Speaker 1: them get a job as much as following up on 1028 00:48:24,320 --> 00:48:27,040 Speaker 1: their own exciting new idea. And so we've got this 1029 00:48:27,160 --> 00:48:31,520 Speaker 1: incentive structure.
It's actually not super great for encouraging people 1030 00:48:31,600 --> 00:48:34,799 Speaker 1: to replicate each other's studies, but I do think it's 1031 00:48:34,840 --> 00:48:37,000 Speaker 1: absolutely critical, and I would love to see us sort 1032 00:48:37,000 --> 00:48:40,440 Speaker 1: of work on that incentive structure to make replication just 1033 00:48:40,520 --> 00:48:42,960 Speaker 1: as important as the initial thing that got done. 1034 00:48:43,640 --> 00:48:46,960 Speaker 3: That's interesting. In particle physics, there's maybe a slightly healthier environment. 1035 00:48:47,320 --> 00:48:50,640 Speaker 3: We have a few signals of new physics that we've 1036 00:48:50,680 --> 00:48:53,600 Speaker 3: seen in experiments that we're all curious about, but nobody 1037 00:48:53,719 --> 00:48:56,799 Speaker 3: really accepts because it doesn't quite make sense. And then 1038 00:48:56,880 --> 00:48:59,520 Speaker 3: there's been another generation of experiments to follow up on those. 1039 00:49:00,000 --> 00:49:03,040 Speaker 3: For example, there's a very significant signal of dark matter 1040 00:49:03,120 --> 00:49:06,400 Speaker 3: in an experiment in Italy called DAMA, and nobody really believes 1041 00:49:06,440 --> 00:49:08,680 Speaker 3: it because we've never seen it in another experiment. And 1042 00:49:08,760 --> 00:49:11,839 Speaker 3: people have set up other experiments very similar to DAMA, 1043 00:49:11,920 --> 00:49:14,400 Speaker 3: but in another part of the world with different conditions, 1044 00:49:14,480 --> 00:49:17,120 Speaker 3: in a different cave, for example, and have not seen the 1045 00:49:17,160 --> 00:49:20,640 Speaker 3: same effects. And so those were experiments definitely inspired by 1046 00:49:20,760 --> 00:49:24,160 Speaker 3: this signal, to test this in other conditions.
And in 1047 00:49:24,239 --> 00:49:27,120 Speaker 3: the last few years there was this quote unquote discovery 1048 00:49:27,200 --> 00:49:30,840 Speaker 3: of a fifth force in this experiment in Hungary, and 1049 00:49:31,120 --> 00:49:33,560 Speaker 3: folks in Berkeley are trying to replicate it, and there's 1050 00:49:33,560 --> 00:49:36,239 Speaker 3: an experiment in Italy trying to probe it as well. 1051 00:49:36,600 --> 00:49:39,120 Speaker 3: And so there's definitely like follow up work. But usually 1052 00:49:39,160 --> 00:49:41,040 Speaker 3: those follow up experiments are a little bit broader, and 1053 00:49:41,080 --> 00:49:43,160 Speaker 3: they try to not just check this one new result, 1054 00:49:43,239 --> 00:49:47,400 Speaker 3: but also learn something else along the way, 1055 00:49:47,480 --> 00:49:50,320 Speaker 3: so they're also covering new ground. But I agree we 1056 00:49:50,360 --> 00:49:53,400 Speaker 3: should definitely have replication. But it comes back to funding, right? 1057 00:49:53,560 --> 00:49:56,120 Speaker 3: Like, if you're a reviewer, what would you rather fund: 1058 00:49:56,200 --> 00:49:58,600 Speaker 3: let's check Kelly's study, or let's do this brand new 1059 00:49:58,719 --> 00:50:00,880 Speaker 3: thing that could tell us something new about the universe?
1060 00:50:01,160 --> 00:50:03,040 Speaker 1: Yeah. And when I write a grant, I try to 1061 00:50:03,160 --> 00:50:06,360 Speaker 1: work my own replication in there, like as part of 1062 00:50:06,520 --> 00:50:09,680 Speaker 1: asking a new question, I'm going to repeat the experiment, 1063 00:50:09,760 --> 00:50:11,279 Speaker 1: but maybe put a little tweak on it. But then I can 1064 00:50:11,320 --> 00:50:13,359 Speaker 1: at least make sure that I'm still getting the same 1065 00:50:13,440 --> 00:50:16,160 Speaker 1: results in, you know, a different space or with slightly 1066 00:50:16,200 --> 00:50:18,440 Speaker 1: different lighting or something like that. And, you know, 1067 00:50:18,560 --> 00:50:21,520 Speaker 1: to defend my field a little bit: especially 1068 00:50:21,600 --> 00:50:24,200 Speaker 1: when animals need to be euthanized as part of the experiment, 1069 00:50:25,080 --> 00:50:27,080 Speaker 1: you might be a little bit hesitant to 1070 00:50:27,160 --> 00:50:28,719 Speaker 1: be like, well, this is just for replication. Well, if 1071 00:50:28,719 --> 00:50:31,920 Speaker 1: you already got an answer, do animals need to get...? So anyway, 1072 00:50:31,960 --> 00:50:34,480 Speaker 1: we do also try to ask additional questions while 1073 00:50:34,560 --> 00:50:36,560 Speaker 1: trying to replicate, but it would be nice if we 1074 00:50:36,600 --> 00:50:37,480 Speaker 1: had more incentive for that. 1075 00:50:37,800 --> 00:50:40,040 Speaker 3: And there's folks out there just doing this. Like, you 1076 00:50:40,160 --> 00:50:42,960 Speaker 3: might have heard of the replication crisis, which comes out 1077 00:50:43,000 --> 00:50:45,600 Speaker 3: of people finding papers in the literature and saying like, 1078 00:50:45,680 --> 00:50:47,920 Speaker 3: all right, let's reproduce this, let's see if it holds up.
1079 00:50:48,440 --> 00:50:50,800 Speaker 3: And this is one reason that like p hacking is 1080 00:50:50,960 --> 00:50:53,320 Speaker 3: a thing people talk about, because we discovered that some 1081 00:50:53,440 --> 00:50:57,400 Speaker 3: of the results in the literature are just statistical artifacts that 1082 00:50:57,480 --> 00:51:00,520 Speaker 3: were selected in order to get a paper out there. 1083 00:51:01,080 --> 00:51:03,520 Speaker 3: And so I think what you're seeing, in a broader 1084 00:51:03,600 --> 00:51:06,560 Speaker 3: sense, is that science is self-correcting. Just like we saw 1085 00:51:06,600 --> 00:51:09,440 Speaker 3: that we needed to add external reviewers and members of 1086 00:51:09,480 --> 00:51:12,080 Speaker 3: the public when we're talking about the experiments you do 1087 00:51:12,200 --> 00:51:14,840 Speaker 3: on animals, now we see like, okay, we need some 1088 00:51:15,080 --> 00:51:17,640 Speaker 3: sort of way to protect against this kind of abuse 1089 00:51:17,680 --> 00:51:20,239 Speaker 3: as well. And so you know, the process is a 1090 00:51:20,320 --> 00:51:23,680 Speaker 3: living thing. It's not like science is a crisp philosophy. 1091 00:51:23,760 --> 00:51:26,359 Speaker 3: What we call science has changed over the last ten years, 1092 00:51:26,400 --> 00:51:29,480 Speaker 3: fifty years, one hundred years, and it will continue to 1093 00:51:29,560 --> 00:51:33,080 Speaker 3: evolve and, I hope, keep delivering great truths about the universe. 1094 00:51:33,360 --> 00:51:36,600 Speaker 1: Yeah, me too. And so Steven asked, how can you 1095 00:51:36,800 --> 00:51:39,399 Speaker 1: know if something is good science or not? I feel 1096 00:51:39,400 --> 00:51:43,279 Speaker 1: like that was the big push behind the question. And 1097 00:51:43,360 --> 00:51:45,200 Speaker 1: so what do you think when you read a paper 1098 00:51:45,239 --> 00:51:45,919 Speaker 1: for the first time?
1099 00:51:46,360 --> 00:51:47,160 Speaker 2: What do you look for? 1100 00:51:47,640 --> 00:51:49,440 Speaker 3: Well, you know, science has to stand on its own. 1101 00:51:49,520 --> 00:51:51,400 Speaker 3: So the thing I look for is like, does the 1102 00:51:51,480 --> 00:51:54,000 Speaker 3: paper make sense? Does it lay out an argument? Do 1103 00:51:54,200 --> 00:51:57,560 Speaker 3: the conclusions follow from the evidence presented? That's the most 1104 00:51:57,600 --> 00:52:00,000 Speaker 3: important thing to me. But before I personally 1105 00:52:00,080 --> 00:52:02,680 Speaker 3: believe that something is a real part of the universe, yeah, 1106 00:52:02,680 --> 00:52:05,239 Speaker 3: I need to see it reproduced by another group. I've 1107 00:52:05,280 --> 00:52:08,560 Speaker 3: often reproduced papers myself, especially if they're like statistical, just 1108 00:52:08,560 --> 00:52:10,560 Speaker 3: to make sure I understand, like, how is this calculation 1109 00:52:10,719 --> 00:52:12,759 Speaker 3: being done, exactly? How do you get from step three 1110 00:52:12,880 --> 00:52:15,960 Speaker 3: to four? Because I really want to understand what are 1111 00:52:16,000 --> 00:52:19,120 Speaker 3: the assumptions being made and think about whether those are 1112 00:52:19,200 --> 00:52:22,439 Speaker 3: broad enough. So I would say that, yes, peer-reviewed 1113 00:52:22,480 --> 00:52:25,840 Speaker 3: papers can be wrong, but that's part of science, and 1114 00:52:26,120 --> 00:52:28,520 Speaker 3: the highest standard I think is independent replication. 1115 00:52:28,760 --> 00:52:30,920 Speaker 1: What do you think, Kelly? Yeah, no, I agree. I mean, 1116 00:52:30,960 --> 00:52:33,360 Speaker 1: when I read a paper, I'll look to see, you know, 1117 00:52:33,440 --> 00:52:36,520 Speaker 1: what journal it's in.
So in this 1118 00:52:36,680 --> 00:52:38,359 Speaker 1: day and age, there are some new, what we call, 1119 00:52:38,440 --> 00:52:41,759 Speaker 1: predatory journals, where they will encourage people to submit but 1120 00:52:41,880 --> 00:52:44,960 Speaker 1: publish just about anything they get. And peer review 1121 00:52:45,040 --> 00:52:47,839 Speaker 1: isn't so much about peers having a chance to turn 1122 00:52:47,960 --> 00:52:50,000 Speaker 1: down a study, but they'll, like, give a 1123 00:52:49,960 --> 00:52:51,960 Speaker 2: little bit of input. But the paper gets published one 1124 00:52:52,000 --> 00:52:52,479 Speaker 2: way or another. 1125 00:52:52,600 --> 00:52:54,640 Speaker 1: So you need to look to see what kind of 1126 00:52:54,760 --> 00:52:57,120 Speaker 1: journal it was published in, and then, you know, if 1127 00:52:57,160 --> 00:52:58,879 Speaker 1: it's a field I know about, I'll look to see: 1128 00:52:59,000 --> 00:53:01,000 Speaker 1: did you cite the paper that I would expect you 1129 00:53:01,080 --> 00:53:04,560 Speaker 1: to be citing? Was your literature search deep enough? And 1130 00:53:04,800 --> 00:53:08,400 Speaker 1: were your experiments designed well? What else could those results 1131 00:53:08,480 --> 00:53:11,560 Speaker 1: have meant? And then who funded the study, and stuff 1132 00:53:11,600 --> 00:53:12,879 Speaker 1: like that. But you know, if you are a member 1133 00:53:12,880 --> 00:53:14,520 Speaker 1: of the general public and you can't do all of 1134 00:53:14,600 --> 00:53:16,960 Speaker 1: that and you're just reading a pop-sci summary, I 1135 00:53:16,960 --> 00:53:19,560 Speaker 1: would look again for, like, where is this being published? 1136 00:53:19,640 --> 00:53:22,000 Speaker 1: You know, like if it's The Atlantic by Ed Yong, 1137 00:53:22,320 --> 00:53:25,839 Speaker 1: it's probably great.
And if the science journalist had other 1138 00:53:26,239 --> 00:53:29,600 Speaker 1: scientists weigh in on what might have been wrong about 1139 00:53:29,640 --> 00:53:32,279 Speaker 1: the study, that's a good sign. And so you look 1140 00:53:32,360 --> 00:53:35,279 Speaker 1: for where you're getting the information, how critical they seem 1141 00:53:35,360 --> 00:53:37,560 Speaker 1: to be, and stuff like that. What do you look 1142 00:53:37,600 --> 00:53:38,720 Speaker 1: for in a pop-sci article? 1143 00:53:39,360 --> 00:53:41,680 Speaker 3: Yeah, in a pop-sci article, I definitely look to 1144 00:53:41,760 --> 00:53:45,680 Speaker 3: see whether they have talked to people who are not authors, 1145 00:53:46,160 --> 00:53:48,440 Speaker 3: you know, other people in the field who know this stuff, 1146 00:53:49,120 --> 00:53:52,480 Speaker 3: and is it just a press release from the university, 1147 00:53:52,640 --> 00:53:54,680 Speaker 3: or is it a journalist who's actually thought about this stuff 1148 00:53:54,800 --> 00:53:57,200 Speaker 3: and written something up? Those are definitely the things I 1149 00:53:57,280 --> 00:54:00,400 Speaker 3: look for in pop-sci pieces, because there's the temptation, 1150 00:54:00,560 --> 00:54:03,920 Speaker 3: in the sort of marketplace of ideas, to overinflate the 1151 00:54:04,040 --> 00:54:05,320 Speaker 3: meaning of an incremental study. 1152 00:54:05,680 --> 00:54:07,719 Speaker 1: Yep, yep, all right. So in general I would say, 1153 00:54:07,760 --> 00:54:10,080 Speaker 1: you know, we have this process in place to 1154 00:54:10,160 --> 00:54:11,920 Speaker 1: try to make sure that the best ideas are the 1155 00:54:11,960 --> 00:54:14,239 Speaker 1: ones that move forward, and that we're all checking to 1156 00:54:14,280 --> 00:54:16,440 Speaker 1: make sure no one missed anything important along the way.
1157 00:54:17,239 --> 00:54:20,680 Speaker 1: Some stuff gets through, either by mistake or because some 1158 00:54:20,840 --> 00:54:24,480 Speaker 1: people are unscrupulous, but hopefully that doesn't happen that often. 1159 00:54:25,360 --> 00:54:27,600 Speaker 1: But you know, the system is evolving at all times. 1160 00:54:27,640 --> 00:54:30,200 Speaker 1: And if you have a study that you're interested in, 1161 00:54:30,440 --> 00:54:32,400 Speaker 1: but you don't know if you can trust the press 1162 00:54:32,440 --> 00:54:34,560 Speaker 1: release or whatever, you can send it to us, and 1163 00:54:34,640 --> 00:54:36,600 Speaker 1: if it happens to be in our wheelhouse, we're happy 1164 00:54:36,680 --> 00:54:38,960 Speaker 1: to weigh in and tell you what, you know, 1165 00:54:39,160 --> 00:54:42,480 Speaker 1: set off the alarm bells in our heads, or what 1166 00:54:42,600 --> 00:54:45,239 Speaker 1: we liked about the study. And we're happy to help 1167 00:54:45,239 --> 00:54:47,920 Speaker 1: people figure out what was done well and what was 1168 00:54:48,440 --> 00:54:49,439 Speaker 1: just squeaking through. 1169 00:54:50,920 --> 00:54:53,200 Speaker 3: All right, Thanks very much, Steven, for asking this question 1170 00:54:53,320 --> 00:54:56,440 Speaker 3: and for shining a light on the inner workings of science. And 1171 00:54:56,440 --> 00:54:58,279 Speaker 1: if you'd like to reach out to us, send us 1172 00:54:58,280 --> 00:55:01,279 Speaker 1: an email at questions at Daniel and Kelly dot org, 1173 00:55:01,440 --> 00:55:04,280 Speaker 1: and let's see what Steven had to say about our answer. 1174 00:55:04,640 --> 00:55:07,160 Speaker 6: Hi, Daniel and Kelly, it's Steve here. Thanks so much 1175 00:55:07,200 --> 00:55:09,520 Speaker 6: for answering my question.
I think you shed a lot 1176 00:55:09,560 --> 00:55:13,400 Speaker 6: of light on some topics that most of the general 1177 00:55:13,520 --> 00:55:16,719 Speaker 6: public is not aware of, and I really appreciate that. 1178 00:55:17,360 --> 00:55:21,239 Speaker 6: It's actually really fascinating to learn that just because something 1179 00:55:21,320 --> 00:55:24,440 Speaker 6: is peer reviewed doesn't mean it's one hundred percent fact. 1180 00:55:25,280 --> 00:55:29,160 Speaker 6: There's definitely a lot to take away here, so 1181 00:55:29,239 --> 00:55:32,719 Speaker 6: I appreciate you guys diving into the topic, and I'm looking 1182 00:55:32,760 --> 00:55:33,840 Speaker 6: forward to the next episode. 1183 00:55:40,640 --> 00:55:44,160 Speaker 1: Daniel and Kelly's Extraordinary Universe is produced by iHeartRadio. 1184 00:55:44,400 --> 00:55:47,000 Speaker 2: We would love to hear from you. We really would. 1185 00:55:47,239 --> 00:55:49,920 Speaker 3: We want to know what questions you have about this 1186 00:55:50,200 --> 00:55:51,800 Speaker 3: extraordinary universe. 1187 00:55:51,960 --> 00:55:54,839 Speaker 1: We want to know your thoughts on recent shows, suggestions 1188 00:55:54,880 --> 00:55:55,800 Speaker 1: for future shows. 1189 00:55:56,000 --> 00:55:58,279 Speaker 2: If you contact us, we will get back to you. 1190 00:55:58,600 --> 00:56:02,279 Speaker 3: We really mean it. We answer every message. Email us at 1191 00:56:02,440 --> 00:56:05,280 Speaker 3: questions at Daniel and Kelly 1192 00:56:04,440 --> 00:56:06,479 Speaker 1: dot org, or you can find us on social media. 1193 00:56:06,600 --> 00:56:10,320 Speaker 1: We have accounts on X, Instagram, and Blue Sky, and on 1194 00:56:10,480 --> 00:56:12,400 Speaker 1: all of those platforms you can find us at D 1195 00:56:12,880 --> 00:56:14,400 Speaker 1: and K Universe. 1196 00:56:14,600 --> 00:56:16,120 Speaker 3: Don't be shy, write to us.