1 00:00:07,760 --> 00:00:11,920 Speaker 1: Hollywood tells us what it's like to make a scientific discovery. Okay, 2 00:00:12,080 --> 00:00:15,280 Speaker 1: set the scene. A lone scientist wearing a lab coat 3 00:00:15,360 --> 00:00:17,480 Speaker 1: because they're always wearing a lab coat for some reason, 4 00:00:17,840 --> 00:00:21,440 Speaker 1: has a flash of inspiration, sometimes during a musical montage, 5 00:00:21,760 --> 00:00:25,000 Speaker 1: and that's when the ideas come together. He (and it's 6 00:00:25,040 --> 00:00:27,800 Speaker 1: almost always a he) rushes out to tell the world 7 00:00:27,840 --> 00:00:31,400 Speaker 1: and everyone greets the news with enthusiasm. That's a fun 8 00:00:31,440 --> 00:00:34,880 Speaker 1: bit of storytelling. But what is it really like? Does 9 00:00:35,040 --> 00:00:39,440 Speaker 1: that scenario ever happen? Or are scientists working slowly for 10 00:00:39,520 --> 00:00:43,080 Speaker 1: decades, pushing the fuzzy bits of the puzzle together until 11 00:00:43,120 --> 00:00:45,880 Speaker 1: people are finally convinced? And yes, I have to admit 12 00:00:45,920 --> 00:00:49,240 Speaker 1: that wouldn't make quite as good of a movie. But anyway, 13 00:00:49,280 --> 00:00:51,440 Speaker 1: today we're going to pull back the curtain on the 14 00:00:51,479 --> 00:00:55,280 Speaker 1: process of scientific discovery and tell you stories of dramatic 15 00:00:55,480 --> 00:01:00,760 Speaker 1: as well as frustratingly slow discoveries. 
You'll hear the actual 16 00:01:00,920 --> 00:01:04,440 Speaker 1: historical audio of scientists being shocked at a discovery that 17 00:01:04,480 --> 00:01:07,600 Speaker 1: they were making in real time, a conversation with a 18 00:01:07,720 --> 00:01:10,520 Speaker 1: historian of science, and an interview with a man who 19 00:01:10,600 --> 00:01:14,200 Speaker 1: has spoken to more Nobel Prize winners than maybe anyone 20 00:01:14,200 --> 00:01:16,720 Speaker 1: else on the planet, and we'll try to learn what 21 00:01:16,880 --> 00:01:21,839 Speaker 1: led to moments of understanding and discovery. Welcome to Daniel 22 00:01:21,840 --> 00:01:37,880 Speaker 1: and Kelly's Extraordinary Universe. Hello. I'm Kelly Weinersmith. I 23 00:01:37,959 --> 00:01:40,640 Speaker 1: study parasites and space, and today we're going to talk 24 00:01:40,680 --> 00:01:44,040 Speaker 1: about how many times I have not discovered things. 25 00:01:45,360 --> 00:01:45,679 Speaker 2: Hello. 26 00:01:45,920 --> 00:01:48,920 Speaker 3: I'm Daniel Whiteson. I'm a particle physicist, and I got 27 00:01:48,920 --> 00:01:52,120 Speaker 3: into particle physics to reveal the fundamental nature of the 28 00:01:52,320 --> 00:01:56,640 Speaker 3: universe and make earth-shattering discoveries. But in thirty years 29 00:01:56,720 --> 00:01:57,920 Speaker 3: I've made exactly zero. 30 00:01:58,240 --> 00:02:00,240 Speaker 1: You've made exactly zero. Okay, well that's a nice lead 31 00:02:00,280 --> 00:02:02,680 Speaker 1: into the question I have for you today. So you know, 32 00:02:02,720 --> 00:02:04,120 Speaker 1: at least in my field, and I assume this is 33 00:02:04,160 --> 00:02:06,680 Speaker 1: the same in your field. 
Before you start an experiment, 34 00:02:06,720 --> 00:02:09,560 Speaker 1: you have a prediction, you have an expectation for how 35 00:02:09,600 --> 00:02:11,280 Speaker 1: the results are going to go, and then you design 36 00:02:11,320 --> 00:02:14,320 Speaker 1: your experiment well so that if you're wrong, you can 37 00:02:14,360 --> 00:02:16,880 Speaker 1: be sure that you're wrong. That's a good experiment. So 38 00:02:17,800 --> 00:02:21,800 Speaker 1: what percent of the time roughly does the work that 39 00:02:21,840 --> 00:02:25,000 Speaker 1: you do match the predictions that you made initially? 40 00:02:25,480 --> 00:02:27,760 Speaker 3: Wow, way to put your finger on a sore spot, Kelly. 41 00:02:28,160 --> 00:02:33,200 Speaker 3: So far, every single experiment we've done matches our expectations, 42 00:02:33,560 --> 00:02:37,040 Speaker 3: and we've even analyzed the statistics of that. Like, you 43 00:02:37,040 --> 00:02:39,480 Speaker 3: don't expect when you flip a coin to get exactly 44 00:02:39,560 --> 00:02:42,240 Speaker 3: fifty percent heads and tails on a fair coin. You 45 00:02:42,280 --> 00:02:46,560 Speaker 3: expect some fluctuations, and we see exactly those kinds of fluctuations. 46 00:02:46,760 --> 00:02:49,519 Speaker 3: Sometimes the data is a little bit weird, rarely it's 47 00:02:49,680 --> 00:02:52,720 Speaker 3: very weird, and almost never is it super duper weird. 48 00:02:53,240 --> 00:02:56,680 Speaker 3: So we have a beautiful Gaussian curve of all of 49 00:02:56,680 --> 00:02:59,480 Speaker 3: our weirdness and really no surprises so far. 50 00:03:00,080 --> 00:03:03,280 Speaker 1: So every paper you've done, the prediction you were testing, 51 00:03:03,760 --> 00:03:05,440 Speaker 1: you found exactly what you expected. 52 00:03:05,760 --> 00:03:07,480 Speaker 3: I mean, I work in a field where if we 53 00:03:07,560 --> 00:03:11,000 Speaker 3: find something unexpected, it's a Nobel prize. Right. 
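[Editor's note] Daniel's coin-flip point can be made concrete: flip a fair coin N times and the heads count wanders around N/2 with a standard deviation of sqrt(N)/2, so mild "weirdness" is common, big weirdness is rare, and the sizes of the misses trace out the Gaussian curve he describes. A minimal sketch (the function name and the specific numbers are illustrative, not from the episode):

```python
import random

def average_deviation(n_flips, n_trials, seed=0):
    """Simulate fair-coin experiments and report how far the heads
    count typically strays from the expected n_flips / 2."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    total = 0.0
    for _ in range(n_trials):
        # Count heads in one experiment of n_flips fair flips.
        heads = sum(rng.random() < 0.5 for _ in range(n_flips))
        total += abs(heads - n_flips / 2)
    return total / n_trials

# For 1000 flips the standard deviation is sqrt(1000)/2, about 16 heads,
# and the average absolute miss comes out a bit below that: noticeably
# off fifty-fifty, yet nothing like "super duper weird".
typical = average_deviation(n_flips=1000, n_trials=200)
```

The point of the sketch is Daniel's: a perfect 50/50 split would itself be suspicious, and the expected fluctuation grows only like the square root of the number of trials.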
If you 54 00:03:11,040 --> 00:03:13,520 Speaker 3: find a new particle, if you find a new force, 55 00:03:14,080 --> 00:03:18,320 Speaker 3: that's a huge revelation. So we are constantly searching for stuff. 56 00:03:18,800 --> 00:03:21,200 Speaker 3: No we didn't find dark matter, No we didn't find this, 57 00:03:21,360 --> 00:03:23,720 Speaker 3: No we didn't find that. Ninety nine point ninety nine 58 00:03:23,760 --> 00:03:27,000 Speaker 3: percent of our papers are negative results. We looked for 59 00:03:27,200 --> 00:03:30,040 Speaker 3: X and we didn't see it. The standard model wins again. 60 00:03:31,200 --> 00:03:34,960 Speaker 1: Well, congratulations for being right on thousands of papers, Like 61 00:03:35,000 --> 00:03:38,520 Speaker 1: you told us the other day you have thousands of publications. 62 00:03:39,080 --> 00:03:42,200 Speaker 3: No, no, it's a great disappointment. I wish that we 63 00:03:42,200 --> 00:03:44,600 Speaker 3: were wrong. I got into this field to prove the 64 00:03:44,640 --> 00:03:47,680 Speaker 3: standard model wrong, to find situations where we see something 65 00:03:47,720 --> 00:03:51,320 Speaker 3: we don't expect, and not just to discover something new. Right, 66 00:03:51,360 --> 00:03:53,800 Speaker 3: some theorists could come up with a model of supersymmetry 67 00:03:53,800 --> 00:03:55,800 Speaker 3: and predict the selectron or whatever, and we could go 68 00:03:55,840 --> 00:03:58,000 Speaker 3: off and see that. That's sort of what happened with 69 00:03:58,000 --> 00:04:00,920 Speaker 3: the Higgs boson and with the top quark twenty years earlier. 70 00:04:01,480 --> 00:04:03,720 Speaker 3: But it's been a long time since we had a surprise, 71 00:04:03,920 --> 00:04:06,560 Speaker 3: a moment when the data told us something about the 72 00:04:06,640 --> 00:04:08,440 Speaker 3: universe we weren't expecting. 
73 00:04:08,680 --> 00:04:10,640 Speaker 1: Well, Daniel, I'm excited that I can say to you 74 00:04:10,720 --> 00:04:12,000 Speaker 1: that I wish you failure. 75 00:04:12,440 --> 00:04:15,520 Speaker 3: There you go exactly. What about you? How about your 76 00:04:15,600 --> 00:04:18,719 Speaker 3: great moments of discovery: were they unexpected or expected? 77 00:04:19,320 --> 00:04:21,279 Speaker 1: I think I'm at like fifty to fifty on my 78 00:04:21,320 --> 00:04:25,320 Speaker 1: predictions panning out. Like, I mean, it's not 79 00:04:25,400 --> 00:04:27,440 Speaker 1: like a coin flip. I actually, you know, I do 80 00:04:27,560 --> 00:04:30,640 Speaker 1: serious literature searches, but you know, when you study animal behaviors, 81 00:04:30,720 --> 00:04:33,320 Speaker 1: so often it's like, oh, you know, we know that 82 00:04:33,320 --> 00:04:36,120 Speaker 1: the neurotransmitters are doing this, so we should predict that 83 00:04:36,160 --> 00:04:38,640 Speaker 1: the animal will do this, and then they don't do 84 00:04:38,839 --> 00:04:42,640 Speaker 1: what you expected. And that's pretty typical as we sort 85 00:04:42,640 --> 00:04:44,880 Speaker 1: of muddle through neuroscience and whatnot. 86 00:04:45,000 --> 00:04:47,760 Speaker 3: Well, when that happens, when they don't do what you expect, 87 00:04:48,040 --> 00:04:51,440 Speaker 3: have you learned something about the animal, something fundamental that 88 00:04:51,680 --> 00:04:54,239 Speaker 3: is scientific, or have you learned oops, I made a mistake, 89 00:04:54,440 --> 00:04:56,520 Speaker 3: or I don't know how to do neuroscience or something. 90 00:04:56,680 --> 00:04:59,400 Speaker 1: No, we always learn something. I do spend 91 00:04:59,400 --> 00:05:01,919 Speaker 1: a lot of time. 
I carefully design my experiments so 92 00:05:01,960 --> 00:05:04,840 Speaker 1: that even if the answer is things didn't go the 93 00:05:04,880 --> 00:05:08,200 Speaker 1: way you thought they would, there's still something interesting to say. 94 00:05:08,160 --> 00:05:12,159 Speaker 3: Well, that's a well-designed experiment. Congratulations. Oh thanks, 95 00:05:12,320 --> 00:05:13,440 Speaker 3: we publish no matter what. 96 00:05:14,240 --> 00:05:17,039 Speaker 1: That's right. Well, I've gotten very comfortable with this form 97 00:05:17,080 --> 00:05:19,840 Speaker 1: of failure. 98 00:05:19,880 --> 00:05:22,640 Speaker 3: And the success and failure of scientists is part of our 99 00:05:22,680 --> 00:05:25,400 Speaker 3: topic today. We are doing a deep dive into the 100 00:05:25,480 --> 00:05:28,000 Speaker 3: nature of the scientific process, pulling back the curtain on 101 00:05:28,120 --> 00:05:31,919 Speaker 3: how science actually works, what scientists do all day. We 102 00:05:31,960 --> 00:05:34,839 Speaker 3: don't just take naps and wake up with great moments 103 00:05:34,839 --> 00:05:37,240 Speaker 3: of inspiration, though I guess that does happen for some 104 00:05:37,320 --> 00:05:40,960 Speaker 3: of us. We work slowly and carefully through the process 105 00:05:41,040 --> 00:05:43,960 Speaker 3: of science, figuring out how the universe works and convincing 106 00:05:44,040 --> 00:05:46,520 Speaker 3: our peers of what we have learned. We have an 107 00:05:46,560 --> 00:05:49,919 Speaker 3: episode later this week on the scientific review process, but 108 00:05:50,000 --> 00:05:53,760 Speaker 3: today we're digging into the juiciest bit: how scientific discoveries 109 00:05:54,000 --> 00:05:55,080 Speaker 3: actually happen. 110 00:05:54,960 --> 00:05:57,640 Speaker 1: And we are super lucky to have a bunch of 111 00:05:57,640 --> 00:06:00,320 Speaker 1: different inputs for this particular episode. 
We're going to start 112 00:06:00,320 --> 00:06:03,120 Speaker 1: with Daniel walking us through what discovery tends to look 113 00:06:03,200 --> 00:06:06,400 Speaker 1: like in physics. Then we're going to bring an expert 114 00:06:06,440 --> 00:06:09,760 Speaker 1: on to talk to us about the history of scientific discoveries, 115 00:06:10,080 --> 00:06:12,719 Speaker 1: and finally we're talking to someone who interviewed a bunch 116 00:06:12,720 --> 00:06:16,720 Speaker 1: of Nobel Prize winners and asked them about their discoveries 117 00:06:16,720 --> 00:06:19,719 Speaker 1: and how that went. So, Daniel, on this path 118 00:06:19,800 --> 00:06:25,200 Speaker 1: of discovery... Maybe this should be a self-help podcast 119 00:06:25,279 --> 00:06:27,000 Speaker 1: now; it feels self-helpy right now. 120 00:06:27,040 --> 00:06:29,479 Speaker 3: How to win a Nobel Prize, steps one through ten. 121 00:06:29,600 --> 00:06:30,960 Speaker 3: You'll be shocked at step seven. 122 00:06:31,120 --> 00:06:33,280 Speaker 1: Well, let's start with your first discovery that you want 123 00:06:33,360 --> 00:06:34,679 Speaker 1: to tell us about today. 124 00:06:35,080 --> 00:06:37,920 Speaker 3: Yeah, I think it's important to walk through some examples 125 00:06:37,960 --> 00:06:41,120 Speaker 3: of discoveries because I think that the picture people have 126 00:06:41,200 --> 00:06:45,039 Speaker 3: in their minds of how scientific discovery happens is shaped 127 00:06:45,040 --> 00:06:47,040 Speaker 3: by a lot of these stories, most of which are 128 00:06:47,160 --> 00:06:50,520 Speaker 3: apocryphal and give people sort of a cartoonish view of 129 00:06:50,560 --> 00:06:53,680 Speaker 3: the process. And I want to get into the nitty gritty. 
130 00:06:53,600 --> 00:06:56,159 Speaker 1: So I have to make a confession here, which is 131 00:06:56,160 --> 00:06:58,880 Speaker 1: that for a long time I didn't realize that apocryphal 132 00:06:58,960 --> 00:07:03,400 Speaker 1: means a story that isn't true. Oh no. So I'd 133 00:07:03,440 --> 00:07:05,279 Speaker 1: just like to clear that up for anyone who is 134 00:07:05,680 --> 00:07:08,560 Speaker 1: comparably bad at the English language. 135 00:07:09,200 --> 00:07:12,600 Speaker 3: These are great dinner party stories or pop-sci clickbait, but 136 00:07:12,600 --> 00:07:16,119 Speaker 3: they're not always real. And maybe one of the most 137 00:07:16,200 --> 00:07:20,880 Speaker 3: famous moments of discovery is Newton and the Apple. As 138 00:07:20,880 --> 00:07:23,160 Speaker 3: the story goes, Newton is sitting in his garden. He 139 00:07:23,200 --> 00:07:25,480 Speaker 3: sees an apple fall down, and he thinks about it 140 00:07:25,480 --> 00:07:28,200 Speaker 3: deeply and he goes, hmmm, why do apples fall down 141 00:07:28,240 --> 00:07:30,320 Speaker 3: towards the center of the earth? And he comes up 142 00:07:30,360 --> 00:07:33,600 Speaker 3: with this theory of gravity. Is that the story you've heard, Kelly? 143 00:07:33,960 --> 00:07:34,560 Speaker 1: Absolutely. 144 00:07:35,720 --> 00:07:38,400 Speaker 3: And does that story make sense to you? Like, how 145 00:07:38,520 --> 00:07:41,679 Speaker 3: does looking at an apple tell anybody how gravity works? 146 00:07:42,120 --> 00:07:45,200 Speaker 1: I mean, I guess I didn't imagine that he looked 147 00:07:45,240 --> 00:07:48,240 Speaker 1: at the apple and immediately knew how gravity works. I 148 00:07:48,240 --> 00:07:51,640 Speaker 1: imagined that he saw the apple fall and thought to himself, 149 00:07:51,920 --> 00:07:53,960 Speaker 1: why did it go down and not up? And that 150 00:07:54,080 --> 00:07:56,400 Speaker 1: got him sort of thinking about the question more deeply. 
151 00:07:56,920 --> 00:07:59,320 Speaker 3: Well, you know, that's a question that Aristotle worked on 152 00:07:59,440 --> 00:08:02,160 Speaker 3: thousands of years earlier, like why do things fall down? 153 00:08:02,280 --> 00:08:04,800 Speaker 3: It was a deep question, and you know, his answer 154 00:08:05,000 --> 00:08:07,520 Speaker 3: was like, well, there's some things it's in their nature 155 00:08:07,560 --> 00:08:10,720 Speaker 3: to fall down, and an apple and rocks and earth 156 00:08:10,760 --> 00:08:13,400 Speaker 3: are made of the falling-down kind of stuff. And 157 00:08:13,440 --> 00:08:16,720 Speaker 3: that's not really an answer, you know, but this definitely 158 00:08:16,720 --> 00:08:18,840 Speaker 3: has been a question for a long, long time. And 159 00:08:18,880 --> 00:08:21,560 Speaker 3: so it doesn't even really make sense to me, because 160 00:08:21,560 --> 00:08:24,239 Speaker 3: like seeing the apple inspires the question, but the question 161 00:08:24,320 --> 00:08:27,400 Speaker 3: is an ancient and outstanding one. Anyway, this whole story 162 00:08:27,560 --> 00:08:30,560 Speaker 3: doesn't make any sense because it's all made up. It 163 00:08:30,600 --> 00:08:33,600 Speaker 3: didn't really happen that way at all. The real story 164 00:08:33,880 --> 00:08:37,000 Speaker 3: is that Newton took years, decades, to develop his theory 165 00:08:37,000 --> 00:08:40,040 Speaker 3: of gravity. He was writing letters back and forth with 166 00:08:40,080 --> 00:08:43,680 Speaker 3: another scientist, Hooke, for years, and he was thinking about gravity, 167 00:08:43,720 --> 00:08:45,720 Speaker 3: and he was wondering if you could come up with 168 00:08:45,760 --> 00:08:49,000 Speaker 3: a theory of gravity to explain why apples fall down, 169 00:08:49,040 --> 00:08:53,080 Speaker 3: and also why the moon doesn't fall down? Right, like 170 00:08:53,160 --> 00:08:55,600 Speaker 3: why is the moon in orbit around the Earth? 
This 171 00:08:55,679 --> 00:08:57,680 Speaker 3: kind of stuff. And so he was trying to develop that, 172 00:08:57,800 --> 00:09:00,439 Speaker 3: and he was trying to make it mathematical. He didn't 173 00:09:00,440 --> 00:09:04,160 Speaker 3: want just a story like Aristotle provided. He wanted a theory, 174 00:09:04,240 --> 00:09:07,760 Speaker 3: something that would let you calculate the force between two objects, 175 00:09:07,800 --> 00:09:10,840 Speaker 3: for example. And so it took him two decades to 176 00:09:10,840 --> 00:09:13,079 Speaker 3: put this theory together. And, you know, the fundamental 177 00:09:13,120 --> 00:09:16,000 Speaker 3: idea he came up with is that gravity gets weaker 178 00:09:16,040 --> 00:09:18,679 Speaker 3: with distance. And he showed that if you framed it 179 00:09:18,760 --> 00:09:21,320 Speaker 3: in this one-over-r-squared form, with r as the 180 00:09:21,360 --> 00:09:25,679 Speaker 3: distance between objects, so the force drops with the distance squared, 181 00:09:26,080 --> 00:09:28,640 Speaker 3: you could actually calculate not just that an apple 182 00:09:28,679 --> 00:09:31,000 Speaker 3: should fall, but also that the moon should be in 183 00:09:31,080 --> 00:09:33,920 Speaker 3: its orbit. And he was able to reproduce the motion 184 00:09:34,120 --> 00:09:36,400 Speaker 3: of the moon. And this is the great triumph: to 185 00:09:36,400 --> 00:09:39,240 Speaker 3: have a single theory of gravity that describes not just 186 00:09:39,360 --> 00:09:41,720 Speaker 3: what's happening on Earth, but also in the heavens. 187 00:09:41,800 --> 00:09:44,480 Speaker 1: All right. Well, so first I'm wondering about Aristotle 188 00:09:44,520 --> 00:09:46,720 Speaker 1: and the birds. Did he think that birds 189 00:09:46,760 --> 00:09:50,360 Speaker 1: were, like, made of uppy stuff sometimes and downy stuff other times, 190 00:09:50,360 --> 00:09:54,120 Speaker 1: and, like, what did this transmogrification look like? 
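[Editor's note] For anyone who wants the "one over r squared" idea spelled out: in modern notation (which is not how Newton himself wrote it), the law he arrived at is

```latex
F \,=\, G \, \frac{m_1 m_2}{r^2}
```

where $m_1$ and $m_2$ are the two masses, $r$ is the distance between their centers, and $G$ is a universal constant: double the distance and the force drops to a quarter. The Moon check Daniel mentions works out because the Moon sits about sixty Earth radii away, so its acceleration toward Earth should be roughly $g/60^2 = g/3600$, which matches the acceleration needed to keep it on its observed orbit.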
But let's 191 00:09:54,160 --> 00:09:55,400 Speaker 1: not get too far off topic. 192 00:09:55,480 --> 00:09:57,840 Speaker 3: I would love to have Aristotle on the podcast and ask him 193 00:09:57,920 --> 00:10:00,360 Speaker 3: some of these questions, you know, also questions like why 194 00:10:00,360 --> 00:10:02,679 Speaker 3: didn't you ever do an experiment to try out some 195 00:10:02,760 --> 00:10:07,040 Speaker 3: of your ideas? Like in ten minutes, Galileo disproved a 196 00:10:07,040 --> 00:10:10,400 Speaker 3: lot of Aristotle's ideas just by doing experiments. 197 00:10:10,480 --> 00:10:10,600 Speaker 4: Well. 198 00:10:10,600 --> 00:10:13,559 Speaker 1: I was looking at the New York Times bestseller paperback 199 00:10:13,559 --> 00:10:16,080 Speaker 1: list the other day, and a book that beat mine 200 00:10:16,720 --> 00:10:19,760 Speaker 1: is one where a medium is giving you advice from 201 00:10:19,800 --> 00:10:22,000 Speaker 1: the dead. So maybe there's some people we 202 00:10:22,040 --> 00:10:23,360 Speaker 1: can reach out to to make that happen. 203 00:10:23,360 --> 00:10:24,280 Speaker 3: But that's nonfiction. 204 00:10:24,559 --> 00:10:28,160 Speaker 1: That's in the nonfiction. Yeah, I know, so anyway, 205 00:10:28,480 --> 00:10:31,120 Speaker 1: and it beat me. But anyway. 206 00:10:30,880 --> 00:10:33,480 Speaker 3: So, but you guys are on the nonfiction bestseller list. 207 00:10:33,559 --> 00:10:34,160 Speaker 4: We are. 208 00:10:34,320 --> 00:10:34,920 Speaker 3: Congrats! 209 00:10:35,160 --> 00:10:37,720 Speaker 1: Thanks. It won't be top secret by 210 00:10:37,760 --> 00:10:40,480 Speaker 1: the time the episode comes out, but we're on eleven. 
211 00:10:40,520 --> 00:10:42,240 Speaker 1: And Ada will point out that we didn't break the 212 00:10:42,240 --> 00:10:44,400 Speaker 1: top ten, which is what she does every time I 213 00:10:44,520 --> 00:10:47,760 Speaker 1: mention that we're on the New York Times bestseller list. 214 00:10:47,840 --> 00:10:50,240 Speaker 3: Oh my god, that's huge. Wow, wonderful. 215 00:10:50,480 --> 00:10:50,760 Speaker 5: Thanks. 216 00:10:51,160 --> 00:10:53,280 Speaker 1: Anyway, it would be nice if I were above the medium person, 217 00:10:53,280 --> 00:10:53,640 Speaker 1: but I'm not. 218 00:10:53,679 --> 00:10:56,079 Speaker 3: But anyway, okay. Well, that gives you a very 219 00:10:56,160 --> 00:10:58,280 Speaker 3: nice cocktail story to tell the next time you go 220 00:10:58,320 --> 00:11:01,000 Speaker 3: to a fancy party. And this is what happened with Newton. 221 00:11:01,240 --> 00:11:04,160 Speaker 3: Newton developed his theory of gravity, and then he told 222 00:11:04,320 --> 00:11:07,600 Speaker 3: this story at, like, parties, where it's like, I saw 223 00:11:07,640 --> 00:11:10,240 Speaker 3: the apple and I had a moment of inspiration. And 224 00:11:10,320 --> 00:11:13,640 Speaker 3: so this is Newton basically writing clickbait. And then the 225 00:11:13,679 --> 00:11:16,960 Speaker 3: story got around and Voltaire heard about it. He's a 226 00:11:17,000 --> 00:11:19,920 Speaker 3: famous writer, and he wrote about it, and that's what 227 00:11:20,000 --> 00:11:23,520 Speaker 3: popularized it. And so Newton sort of like wrote a 228 00:11:23,520 --> 00:11:26,800 Speaker 3: PR pitch about his moment of inspiration that didn't really happen, 229 00:11:27,080 --> 00:11:29,360 Speaker 3: and then it got propagated by the mainstream media, which 230 00:11:29,440 --> 00:11:30,600 Speaker 3: was Voltaire at the time. 
231 00:11:31,480 --> 00:11:34,079 Speaker 1: Amazing. How do we know that this didn't really happen 232 00:11:34,120 --> 00:11:35,600 Speaker 1: to Newton if he said it did? 233 00:11:35,720 --> 00:11:38,439 Speaker 3: Well, we have records of Newton's work, right? Like, the 234 00:11:38,520 --> 00:11:40,640 Speaker 3: dude kept logbooks and we have his letters, and we 235 00:11:40,720 --> 00:11:44,520 Speaker 3: see him struggling with these concepts together with Hooke over years. 236 00:11:44,559 --> 00:11:46,920 Speaker 3: And then it was twenty years later that his theory 237 00:11:46,960 --> 00:11:49,360 Speaker 3: was fully developed. So we see the development of it 238 00:11:49,440 --> 00:11:50,920 Speaker 3: in his notes and in his letters. 239 00:11:51,080 --> 00:11:54,320 Speaker 1: Huh, all right, fantastic. I didn't expect Voltaire to come 240 00:11:54,320 --> 00:11:56,280 Speaker 1: into our episode today, but there he is. 241 00:11:56,520 --> 00:11:58,360 Speaker 3: Yeah. Well, that's because this is the best of all 242 00:11:58,400 --> 00:11:59,439 Speaker 3: possible podcasts. 243 00:11:59,559 --> 00:12:01,720 Speaker 1: Oh my god, you're so right, you're so right. 244 00:12:02,040 --> 00:12:04,600 Speaker 1: All right. So while people are discovering that this is 245 00:12:04,640 --> 00:12:07,760 Speaker 1: the best podcast ever, let's move on to another amazing 246 00:12:07,800 --> 00:12:10,520 Speaker 1: bombshell of a discovery, about radioactivity. 247 00:12:10,800 --> 00:12:13,559 Speaker 3: So here's a moment of discovery that really is sort 248 00:12:13,600 --> 00:12:16,600 Speaker 3: of like the cartoon. There's an accident which leads to 249 00:12:16,640 --> 00:12:20,319 Speaker 3: a moment of inspiration and then, like, very rapidly, publication 250 00:12:20,559 --> 00:12:21,320 Speaker 3: and awards. 
251 00:12:21,360 --> 00:12:23,240 Speaker 1: Just real quick to say, you usually don't want the 252 00:12:23,240 --> 00:12:27,400 Speaker 1: words accident and radioactivity in the same sentence. So I'm 253 00:12:27,440 --> 00:12:30,199 Speaker 1: hoping this accidental discovery didn't end anyone's life. 254 00:12:30,720 --> 00:12:34,720 Speaker 3: Pretty sure everybody involved in early radiation discoveries got cancer. 255 00:12:34,840 --> 00:12:36,240 Speaker 3: Oh, sometimes several times. 256 00:12:36,360 --> 00:12:37,800 Speaker 1: All right, you're the downer today. 257 00:12:37,920 --> 00:12:43,000 Speaker 3: All right, speaking of X-rays and cancer, some listener 258 00:12:43,040 --> 00:12:47,120 Speaker 3: wrote in recently and told me that until fairly recently, 259 00:12:47,280 --> 00:12:49,040 Speaker 3: you could go to a shoe store and get a 260 00:12:49,160 --> 00:12:52,160 Speaker 3: very intense X-ray of your foot to make sure 261 00:12:52,200 --> 00:12:53,480 Speaker 3: your shoe was sized correctly. 262 00:12:53,840 --> 00:12:56,439 Speaker 1: Huh yeah, do not recommend. 263 00:12:56,160 --> 00:13:00,720 Speaker 3: Do not recommend, absolutely not. Anyway, Becquerel is credited with 264 00:13:00,720 --> 00:13:05,640 Speaker 3: the discovery of radioactivity, specifically in uranium, and this comes 265 00:13:05,840 --> 00:13:08,800 Speaker 3: quick on the heels of Röntgen's discovery of X-rays, 266 00:13:08,800 --> 00:13:10,920 Speaker 3: which was also an accident. We can dig into that 267 00:13:11,000 --> 00:13:13,720 Speaker 3: another time. But X-rays were a new thing. Everybody 268 00:13:13,760 --> 00:13:17,239 Speaker 3: was excited about X-rays, and Becquerel knew that uranium, 269 00:13:17,440 --> 00:13:20,760 Speaker 3: if you left it near photographic plates, would leave an 270 00:13:20,840 --> 00:13:23,520 Speaker 3: imprint on the plate. 
So, for example, a uranium crystal: if 271 00:13:23,520 --> 00:13:25,400 Speaker 3: you put it on top of the plate, it 272 00:13:25,400 --> 00:13:28,040 Speaker 3: would leave the shape of the crystal on the plate. 273 00:13:28,120 --> 00:13:30,400 Speaker 3: And he was wondering how this worked, and he actually 274 00:13:30,440 --> 00:13:33,360 Speaker 3: had the totally wrong theory. He thought that uranium was 275 00:13:33,440 --> 00:13:36,920 Speaker 3: absorbing sunlight and emitting X-rays, because X-rays were 276 00:13:37,000 --> 00:13:40,280 Speaker 3: this new exciting thing, and so he thought, maybe that's 277 00:13:40,320 --> 00:13:42,840 Speaker 3: what's happening, and so he wanted to do this experiment 278 00:13:42,840 --> 00:13:46,840 Speaker 3: where he wrapped photographic plates in paper so that visible 279 00:13:46,880 --> 00:13:48,960 Speaker 3: light didn't hit them, and then he would put a 280 00:13:48,960 --> 00:13:50,760 Speaker 3: block of uranium on top of that, and then put 281 00:13:50,800 --> 00:13:53,080 Speaker 3: the whole thing out in the sunlight. And the idea 282 00:13:53,320 --> 00:13:56,120 Speaker 3: was the uranium would absorb sunlight and emit X-rays, which 283 00:13:56,160 --> 00:13:58,080 Speaker 3: would go through the paper and leave an imprint on 284 00:13:58,160 --> 00:14:01,160 Speaker 3: the plate. That was his big experiment. Okay, but it 285 00:14:01,200 --> 00:14:04,200 Speaker 3: was cloudy in Paris, right, Paris did not cooperate with 286 00:14:04,240 --> 00:14:07,600 Speaker 3: his plans. His experiment needed sunlight, so he put the 287 00:14:07,600 --> 00:14:09,920 Speaker 3: whole thing in a drawer over the weekend, and he 288 00:14:10,000 --> 00:14:13,440 Speaker 3: came back after the weekend and he decided to develop 289 00:14:13,440 --> 00:14:15,320 Speaker 3: the photographic plate, even though he hadn't put it out 290 00:14:15,320 --> 00:14:17,560 Speaker 3: in the sunlight. 
And what he saw was a perfect 291 00:14:17,559 --> 00:14:20,480 Speaker 3: picture of the uranium crystal, even though there hadn't been 292 00:14:20,520 --> 00:14:23,160 Speaker 3: any sunlight. And that's when he realized, oh, the uranium 293 00:14:23,200 --> 00:14:26,520 Speaker 3: is actually just generating radiation on its own. It doesn't 294 00:14:26,520 --> 00:14:30,080 Speaker 3: require sunlight, and it wasn't X-rays, and so this 295 00:14:30,240 --> 00:14:33,680 Speaker 3: is like his moment of discovery. He realized, wow, this 296 00:14:33,880 --> 00:14:37,320 Speaker 3: uranium is generating something on its own. He reported it 297 00:14:37,360 --> 00:14:40,240 Speaker 3: the very next day to, like, the Academy of Sciences and 298 00:14:40,280 --> 00:14:41,720 Speaker 3: then won the Nobel Prize for it. 299 00:14:41,920 --> 00:14:45,480 Speaker 1: WHOA, I wonder why he decided to develop the photographic 300 00:14:45,480 --> 00:14:46,920 Speaker 1: plate anyway. 301 00:14:46,760 --> 00:14:48,600 Speaker 3: Yeah. People asked him this and he was just like, I 302 00:14:48,680 --> 00:14:51,360 Speaker 3: don't know, on a hunch. I just was wondering, you know, 303 00:14:51,520 --> 00:14:52,640 Speaker 3: just, like, curiosity. 304 00:14:53,000 --> 00:14:55,080 Speaker 1: Wow. Yeah, that is amazingly lucky. 305 00:14:55,200 --> 00:14:57,920 Speaker 3: It's very lucky. Yeah, absolutely. And he's also very lucky 306 00:14:57,920 --> 00:15:00,200 Speaker 3: because it turns out that somebody else did the same 307 00:15:00,240 --> 00:15:03,640 Speaker 3: thing forty years earlier and wrote it up and reported 308 00:15:03,680 --> 00:15:05,080 Speaker 3: it and was just totally ignored. 309 00:15:05,200 --> 00:15:09,200 Speaker 1: Oh no. And did Becquerel cite this other guy? 310 00:15:09,480 --> 00:15:12,240 Speaker 3: No, it was just, like, lost in the literature. 
You know, 311 00:15:12,320 --> 00:15:14,320 Speaker 3: how you've done something and then you think it's clever, 312 00:15:14,400 --> 00:15:16,360 Speaker 3: and then you discover that some Soviet dude did it 313 00:15:16,360 --> 00:15:18,720 Speaker 3: in nineteen seventy eight much better than you did. 314 00:15:19,360 --> 00:15:20,280 Speaker 1: Happens to me all the time. 315 00:15:20,400 --> 00:15:22,960 Speaker 3: That's because we have good literature searches and they didn't 316 00:15:23,040 --> 00:15:25,920 Speaker 3: at that time. And so yeah, but this was really 317 00:15:25,960 --> 00:15:30,040 Speaker 3: a moment, right? An accident. Very quickly Becquerel realized what 318 00:15:30,080 --> 00:15:32,960 Speaker 3: it meant, and it changed our understanding of the whole 319 00:15:33,000 --> 00:15:36,400 Speaker 3: microscopic world. And this led to Curie's experiments and the 320 00:15:36,400 --> 00:15:37,880 Speaker 3: foundation of quantum mechanics. 321 00:15:38,040 --> 00:15:38,200 Speaker 4: Yeah. 322 00:15:38,200 --> 00:15:40,920 Speaker 1: So I always thought that Curie discovered radioactivity. Can you 323 00:15:41,000 --> 00:15:43,320 Speaker 1: quickly tell us what it was that she discovered in particular? 324 00:15:43,560 --> 00:15:47,200 Speaker 3: Right? So, Curie discovered two new radioactive elements. Right, Becquerel 325 00:15:47,240 --> 00:15:52,640 Speaker 3: discovered radioactivity from uranium salts. Curie discovered polonium and radium. 326 00:15:52,720 --> 00:15:56,840 Speaker 3: She actually coined the term radioactivity. And Curie's real insight 327 00:15:57,240 --> 00:16:00,720 Speaker 3: is that radioactivity is an atomic property, not a chemical one. 328 00:16:00,760 --> 00:16:04,080 Speaker 3: It's not like you got atoms bumping together and emitting 329 00:16:04,160 --> 00:16:07,440 Speaker 3: something due to some reaction. It's something inside the atom 330 00:16:07,480 --> 00:16:08,080 Speaker 3: that's happening. 
331 00:16:08,360 --> 00:16:11,000 Speaker 1: Okay, awesome. So next I see that we're talking about 332 00:16:11,040 --> 00:16:14,400 Speaker 1: the M-M experiment, which makes me think about M&M's 333 00:16:14,480 --> 00:16:17,840 Speaker 1: and now I'm hungry. Is this experiment delicious? 334 00:16:18,480 --> 00:16:20,920 Speaker 3: Yes, this is the experiment to discover whether different colors 335 00:16:20,920 --> 00:16:22,560 Speaker 3: of M&M's actually have different flavors and... 336 00:16:22,560 --> 00:16:24,320 Speaker 1: why they melt in your mouth but not in your hand, 337 00:16:24,400 --> 00:16:28,160 Speaker 1: which, side note, absolutely. 338 00:16:28,840 --> 00:16:32,080 Speaker 3: No, no, this is the Michelson-Morley experiment, the famous 339 00:16:32,120 --> 00:16:35,960 Speaker 3: experiment that taught us that light travels at the same speed 340 00:16:36,040 --> 00:16:39,560 Speaker 3: for all observers and disproved the existence of the ether. 341 00:16:39,800 --> 00:16:39,920 Speaker 4: Oh. 342 00:16:40,000 --> 00:16:40,640 Speaker 1: That's important. 343 00:16:40,720 --> 00:16:45,080 Speaker 3: It's important, and it's often told as this groundbreaking experiment 344 00:16:45,280 --> 00:16:48,280 Speaker 3: which pivoted our understanding of the universe. And it's true 345 00:16:48,280 --> 00:16:51,440 Speaker 3: that this was an unexpected result and it proves something 346 00:16:51,520 --> 00:16:55,240 Speaker 3: really important about the universe. But contrary to the popular lore, 347 00:16:55,320 --> 00:16:58,840 Speaker 3: it's not something that was widely understood or appreciated at 348 00:16:58,880 --> 00:17:01,960 Speaker 3: the time. It's a little bit revisionist history to go 349 00:17:02,040 --> 00:17:04,600 Speaker 3: back and say, oh, yeah, this experiment happened, and then 350 00:17:04,680 --> 00:17:06,240 Speaker 3: everybody changed their mind. 
351 00:17:06,600 --> 00:17:09,800 Speaker 1: Oh so this experiment happened. Nobody changed their mind because 352 00:17:09,840 --> 00:17:12,880 Speaker 1: they ignored it, just like the last guy who discovered radioactivity, 353 00:17:12,880 --> 00:17:14,720 Speaker 1: whose name I think we managed to not even say. 354 00:17:14,720 --> 00:17:14,800 Speaker 6: So. 355 00:17:14,960 --> 00:17:16,960 Speaker 1: Take that guy who did it first. 356 00:17:17,359 --> 00:17:21,359 Speaker 3: That was Abel Niépce de Saint-Victor, who discovered radioactivity forty 357 00:17:21,440 --> 00:17:24,680 Speaker 3: years before Becquerel, but was ignored by the Nobel Committee. 358 00:17:24,880 --> 00:17:25,920 Speaker 1: But we've just set it straight. 359 00:17:26,119 --> 00:17:29,080 Speaker 3: Yeah, and that's not exactly what happened here. People were 360 00:17:29,119 --> 00:17:33,040 Speaker 3: aware of this experiment. They just really struggled to digest 361 00:17:33,080 --> 00:17:36,520 Speaker 3: this bizarre concept that light could travel without a medium. 362 00:17:37,160 --> 00:17:39,960 Speaker 3: And so let's go back to eighteen eighty seven when 363 00:17:39,960 --> 00:17:43,680 Speaker 3: this experiment happened. Back then, we had interference experiments and 364 00:17:43,720 --> 00:17:46,600 Speaker 3: diffraction studies. We had all this data showing that light 365 00:17:46,720 --> 00:17:50,160 Speaker 3: was a wave, and Maxwell had his equations that described 366 00:17:50,280 --> 00:17:54,119 Speaker 3: light as ripples in electromagnetism. But they were wondering, like, 367 00:17:54,240 --> 00:17:57,119 Speaker 3: what is it a ripple in? You know, like sound 368 00:17:57,200 --> 00:17:59,439 Speaker 3: is a ripple in air, and water waves are obviously 369 00:17:59,520 --> 00:18:03,720 Speaker 3: ripples in water. But what is light propagating through?
370 00:18:03,880 --> 00:18:06,840 Speaker 3: This velocity that we see should be relative to some 371 00:18:07,080 --> 00:18:09,479 Speaker 3: medium if light is the same kind of thing as 372 00:18:09,560 --> 00:18:13,160 Speaker 3: everything else we've studied. And so Michelson and Morley did 373 00:18:13,160 --> 00:18:16,159 Speaker 3: this experiment to try to detect that medium. They said, well, 374 00:18:16,160 --> 00:18:18,280 Speaker 3: if light is moving through some medium, let's call it 375 00:18:18,320 --> 00:18:21,240 Speaker 3: the ether, and it fills the universe, the Earth is 376 00:18:21,280 --> 00:18:23,600 Speaker 3: also moving through it because the Earth goes around the Sun. 377 00:18:24,040 --> 00:18:26,280 Speaker 3: And so as the Earth goes around the Sun, we 378 00:18:26,280 --> 00:18:28,720 Speaker 3: should see different velocities of the speed of light because 379 00:18:28,720 --> 00:18:32,240 Speaker 3: we have a different velocity relative to the ether. And 380 00:18:32,359 --> 00:18:35,399 Speaker 3: so they did this cool experiment with interferometers where they 381 00:18:35,400 --> 00:18:37,480 Speaker 3: had a light beam and they split it and it 382 00:18:37,560 --> 00:18:40,479 Speaker 3: went into perpendicular directions and then came back. And they 383 00:18:40,480 --> 00:18:43,520 Speaker 3: were very sensitive to small differences because of their cool 384 00:18:43,560 --> 00:18:46,960 Speaker 3: optics and interferometry, and they expected when they did the 385 00:18:47,000 --> 00:18:49,439 Speaker 3: experiment in spring and in summer and in fall and 386 00:18:49,480 --> 00:18:51,720 Speaker 3: in winter, they would get different results and they could 387 00:18:51,760 --> 00:18:55,040 Speaker 3: measure our velocity through the ether. But what they found 388 00:18:55,200 --> 00:18:58,800 Speaker 3: was no difference.
That there was never any difference in 389 00:18:58,880 --> 00:19:01,320 Speaker 3: how long it took light to go one direction 390 00:19:01,520 --> 00:19:04,199 Speaker 3: or the other, and this was totally insensitive to the 391 00:19:04,240 --> 00:19:07,280 Speaker 3: time of year, and it was really an amazing experiment, 392 00:19:07,320 --> 00:19:09,200 Speaker 3: like the detail they put in it to make this 393 00:19:09,240 --> 00:19:12,439 Speaker 3: thing super sensitive, and so they found nothing, and that 394 00:19:12,600 --> 00:19:16,120 Speaker 3: was very confusing. Like obviously now in hindsight, the conclusion 395 00:19:16,200 --> 00:19:17,960 Speaker 3: is there is no ether, and light moves at the 396 00:19:18,000 --> 00:19:21,480 Speaker 3: same velocity regardless of the observer, and it's a propagation 397 00:19:21,520 --> 00:19:24,960 Speaker 3: of electromagnetic waves through space itself. We know that now, 398 00:19:25,440 --> 00:19:27,760 Speaker 3: but it's not fair to say, like we thought there 399 00:19:27,800 --> 00:19:29,920 Speaker 3: was an ether. We had the Michelson-Morley experiment. The 400 00:19:29,960 --> 00:19:32,120 Speaker 3: next day it was like, yeah, let's move on. Obviously 401 00:19:32,160 --> 00:19:35,800 Speaker 3: there's no ether. Instead, people clung to the ether hypothesis 402 00:19:35,800 --> 00:19:39,000 Speaker 3: for a long time. They thought, maybe there's a blob 403 00:19:39,040 --> 00:19:42,000 Speaker 3: of ether and the Earth is dragging it along with it, 404 00:19:42,400 --> 00:19:45,040 Speaker 3: so we can't detect our velocity relative to the ether 405 00:19:45,080 --> 00:19:48,120 Speaker 3: because we're like in a little pocket of ether.
And 406 00:19:48,280 --> 00:19:50,080 Speaker 3: we had to do all sorts of other studies to 407 00:19:50,119 --> 00:19:53,280 Speaker 3: disprove that by looking at like the angles of stars 408 00:19:53,320 --> 00:19:56,200 Speaker 3: and how they changed through the year. And so it 409 00:19:56,240 --> 00:20:00,600 Speaker 3: wasn't widely accepted until after Einstein's theory of relativity in nineteen 410 00:20:00,680 --> 00:20:04,280 Speaker 3: oh five, so this is twenty years later. Einstein comes 411 00:20:04,359 --> 00:20:07,600 Speaker 3: up with this theoretical explanation for this experiment, which brings 412 00:20:07,640 --> 00:20:09,399 Speaker 3: it all together and finally makes it all make sense. 413 00:20:09,600 --> 00:20:12,520 Speaker 3: And it wasn't until then that everybody's like, Okay, yeah, 414 00:20:12,560 --> 00:20:14,520 Speaker 3: I can put this together and this is the way 415 00:20:14,560 --> 00:20:18,320 Speaker 3: the universe works. Really, the physics establishment was like, well, 416 00:20:18,320 --> 00:20:20,880 Speaker 3: that was a strange experiment. We don't understand it. Let's 417 00:20:20,880 --> 00:20:23,640 Speaker 3: put it in the hmm category until we figure it out. 418 00:20:23,880 --> 00:20:25,679 Speaker 1: Well, one, I think it's nice that they at least 419 00:20:26,080 --> 00:20:27,919 Speaker 1: paid attention to it, even if they put it in 420 00:20:27,920 --> 00:20:30,760 Speaker 1: the hmm category. It was read, so that's good. But 421 00:20:30,840 --> 00:20:34,840 Speaker 1: did M and M survive until nineteen oh five to see the 422 00:20:34,920 --> 00:20:37,719 Speaker 1: work validated? Oh, good question, because it would have been 423 00:20:37,720 --> 00:20:39,720 Speaker 1: a delicious moment to know that you were right.
424 00:20:40,800 --> 00:20:43,359 Speaker 3: Yes, both of them lived on for decades longer, so 425 00:20:43,400 --> 00:20:46,320 Speaker 3: they definitely saw the theory of relativity become widely accepted. 426 00:20:46,520 --> 00:20:46,840 Speaker 7: Ah. 427 00:20:46,880 --> 00:20:50,200 Speaker 1: I love hearing that scientists get validation within their lifetime. 428 00:20:51,200 --> 00:20:53,760 Speaker 1: That's a good feeling. All right, let's take a break 429 00:20:53,920 --> 00:20:57,199 Speaker 1: and we'll talk about another discovery before bringing on our 430 00:20:57,320 --> 00:21:20,040 Speaker 1: other experts. All right, So in our last experiment, our 431 00:21:20,119 --> 00:21:24,080 Speaker 1: scientists were deliciously validated. Who is the next scientist we're 432 00:21:24,080 --> 00:21:24,760 Speaker 1: going to talk about? 433 00:21:24,880 --> 00:21:27,520 Speaker 3: So we're going to talk about another famous discovery, that 434 00:21:27,680 --> 00:21:31,440 Speaker 3: of pulsars by Jocelyn Bell Burnell. This is a really 435 00:21:31,480 --> 00:21:34,120 Speaker 3: fun story, but there's a nuance here that I think 436 00:21:34,200 --> 00:21:37,480 Speaker 3: is not widely appreciated, which is again, how long it 437 00:21:37,560 --> 00:21:42,000 Speaker 3: took to really accept this sort of surprising result. Jocelyn 438 00:21:42,040 --> 00:21:45,359 Speaker 3: Bell Burnell was a graduate student. She was studying quasars. 439 00:21:45,359 --> 00:21:48,159 Speaker 3: She was not out to look for pulsars. She was 440 00:21:48,200 --> 00:21:52,160 Speaker 3: looking for these huge jets that shoot out of black holes. 441 00:21:52,480 --> 00:21:54,720 Speaker 3: So black holes at the center of galaxies have accretion 442 00:21:54,840 --> 00:21:57,879 Speaker 3: disks, stuff that's swirling around them, but they also shoot 443 00:21:57,920 --> 00:22:01,000 Speaker 3: material up their north and south pole.
And these things 444 00:22:01,040 --> 00:22:04,760 Speaker 3: which are called quasars, are super duper bright and for a 445 00:22:04,760 --> 00:22:07,400 Speaker 3: long time were not really understood because nobody could understand where 446 00:22:07,440 --> 00:22:10,920 Speaker 3: the energy for creating such a bright source was coming from. 447 00:22:11,480 --> 00:22:13,840 Speaker 3: And she was studying these and wanting to understand their 448 00:22:13,920 --> 00:22:17,200 Speaker 3: time variation. Like you know how a star twinkles because 449 00:22:17,200 --> 00:22:20,679 Speaker 3: it goes through the atmosphere, these quasars radiate in the 450 00:22:20,800 --> 00:22:24,680 Speaker 3: radio spectrum, and she was looking for their scintillation due 451 00:22:24,680 --> 00:22:28,040 Speaker 3: to the interaction with the solar wind, like particles in space. 452 00:22:28,720 --> 00:22:32,200 Speaker 3: So she built this huge radio telescope. And a radio 453 00:22:32,200 --> 00:22:34,399 Speaker 3: telescope is not like a telescope you look through with 454 00:22:34,440 --> 00:22:37,960 Speaker 3: your eyeball. It's more like a huge antenna. And she 455 00:22:38,200 --> 00:22:40,960 Speaker 3: rolled out one hundred and twenty miles of wire over 456 00:22:41,040 --> 00:22:43,600 Speaker 3: like four and a half acres to build this big 457 00:22:43,720 --> 00:22:47,080 Speaker 3: radio antenna to capture this information and to try to 458 00:22:47,160 --> 00:22:48,320 Speaker 3: understand quasars. 459 00:22:48,560 --> 00:22:53,240 Speaker 1: You know, sometimes graduate students do just like absolute mind 460 00:22:53,720 --> 00:22:56,639 Speaker 1: blowing quantities of work, like I imagine it took a 461 00:22:56,720 --> 00:22:59,960 Speaker 1: long time to lay all of that out. So oh yeah, 462 00:23:00,040 --> 00:23:01,720 Speaker 1: shout out to the grad students out there. 463 00:23:01,920 --> 00:23:04,320 Speaker 3: I know, she did all the work on this.
She 464 00:23:04,359 --> 00:23:07,920 Speaker 3: spent two years just building this telescope. And the data 465 00:23:07,960 --> 00:23:11,040 Speaker 3: is hilariously old fashioned. You know. You might imagine you're 466 00:23:11,040 --> 00:23:13,359 Speaker 3: sitting at your laptop, the data comes in, you're analyzing 467 00:23:13,400 --> 00:23:16,560 Speaker 3: it with some cool visuals. She had a printer which 468 00:23:16,600 --> 00:23:19,320 Speaker 3: produced one hundred feet of paper per day with the 469 00:23:19,400 --> 00:23:22,320 Speaker 3: data on it, and she like visually analyzed it and 470 00:23:22,359 --> 00:23:23,760 Speaker 3: looked for stuff like this. 471 00:23:24,000 --> 00:23:24,320 Speaker 7: Wow. 472 00:23:24,680 --> 00:23:28,440 Speaker 3: And on November twenty eighth, nineteen sixty seven, while looking 473 00:23:28,440 --> 00:23:33,280 Speaker 3: for quasars, she saw something weird. She saw pulses separated 474 00:23:33,320 --> 00:23:36,919 Speaker 3: by regular time intervals from one location. So it's like 475 00:23:37,320 --> 00:23:41,440 Speaker 3: beep beep, beep beep, and this is really weird, right, 476 00:23:41,640 --> 00:23:43,399 Speaker 3: this is not the kind of thing you expect to 477 00:23:43,440 --> 00:23:46,040 Speaker 3: hear from the universe. You might expect to hear it 478 00:23:46,040 --> 00:23:50,760 Speaker 3: from like satellites or from radios or other artificial sources. 479 00:23:51,480 --> 00:23:53,560 Speaker 3: And so at first she nicknamed it in her notes 480 00:23:53,920 --> 00:23:57,800 Speaker 3: LGM one for Little Green Men. She was like, am 481 00:23:57,880 --> 00:24:01,159 Speaker 3: I getting signals from aliens here? Have I received the 482 00:24:01,200 --> 00:24:03,160 Speaker 3: first interstellar transmission? 483 00:24:03,359 --> 00:24:06,600 Speaker 1: It's interesting that the Little Green Men trope was around 484 00:24:06,640 --> 00:24:08,480 Speaker 1: that early.
I guess I hadn't realized that we've been 485 00:24:08,480 --> 00:24:12,080 Speaker 1: imagining aliens as little green men for that long. 486 00:24:12,720 --> 00:24:15,200 Speaker 3: I think it comes from the history of like badly 487 00:24:15,200 --> 00:24:17,639 Speaker 3: informed fiction about life on Mars, doesn't it? 488 00:24:17,760 --> 00:24:20,639 Speaker 1: Oh, I don't know, Yeah, there's an episode we should do. 489 00:24:20,840 --> 00:24:22,480 Speaker 3: And so she was wondering, like, well, what are the 490 00:24:22,560 --> 00:24:27,560 Speaker 3: possible explanations for this other than aliens, right? And so 491 00:24:27,640 --> 00:24:30,400 Speaker 3: they went through all sorts of cross checks to try 492 00:24:30,440 --> 00:24:32,920 Speaker 3: to understand what this is. So this is not like 493 00:24:33,040 --> 00:24:36,520 Speaker 3: Becquerel, where he discovers this, he understands immediately what it is, 494 00:24:36,600 --> 00:24:39,119 Speaker 3: he goes and publishes it and then wins the Nobel Prize. 495 00:24:39,440 --> 00:24:42,720 Speaker 3: Now instead, she spent months thinking about ways she could 496 00:24:42,720 --> 00:24:46,640 Speaker 3: be fooling herself, like, could this be signals reflected off 497 00:24:46,640 --> 00:24:49,119 Speaker 3: the moon? Right? Could this just be something from an 498 00:24:49,200 --> 00:24:52,399 Speaker 3: orbiting satellite? Could it be like an effect from a big 499 00:24:52,440 --> 00:24:56,360 Speaker 3: building near the telescope, that's like gathering and focusing radio waves? 500 00:24:56,560 --> 00:24:58,680 Speaker 3: She thought about all of these things and like this 501 00:24:58,880 --> 00:24:59,520 Speaker 3: is good. 502 00:24:59,320 --> 00:25:00,719 Speaker 1: Science, right, yeah, good for her.
503 00:25:00,800 --> 00:25:03,280 Speaker 3: Think about all the ways that you could be fooling yourself, 504 00:25:03,320 --> 00:25:05,840 Speaker 3: because she didn't want to embarrass herself and go off and 505 00:25:05,840 --> 00:25:07,880 Speaker 3: publish a paper about aliens. And then it turns out 506 00:25:07,960 --> 00:25:09,320 Speaker 3: it was just a tea kettle. 507 00:25:09,040 --> 00:25:11,879 Speaker 1: In the lounge, right, yeah, yeah, that would be embarrassing. 508 00:25:12,240 --> 00:25:14,960 Speaker 3: But finally it was confirmed with another radio telescope, so 509 00:25:15,040 --> 00:25:18,199 Speaker 3: she knew it wasn't instrumentation. But this took months. It 510 00:25:18,240 --> 00:25:21,480 Speaker 3: took a long time, and then finally people understood also 511 00:25:21,840 --> 00:25:24,560 Speaker 3: that it really was a signal. But it wasn't aliens. 512 00:25:24,840 --> 00:25:28,000 Speaker 3: These were just super fast spinning neutron stars that emit 513 00:25:28,080 --> 00:25:31,240 Speaker 3: beams along their poles, and then their poles sweep across 514 00:25:31,240 --> 00:25:34,080 Speaker 3: the surface of the earth leaving these regular blips in 515 00:25:34,119 --> 00:25:37,399 Speaker 3: the radio. So really an amazing and very important discovery 516 00:25:37,640 --> 00:25:40,520 Speaker 3: for which her advisor won the Nobel Prize. And she 517 00:25:40,680 --> 00:25:41,760 Speaker 3: didn't. 518 00:25:41,960 --> 00:25:44,240 Speaker 1: That's what I was gonna guess. It's one of those 519 00:25:44,240 --> 00:25:47,760 Speaker 1: stories where the woman's advisor gets the credit. Ah, crud. 520 00:25:48,040 --> 00:25:50,679 Speaker 3: Yeah, And she is so classy. She came to UCI 521 00:25:50,840 --> 00:25:53,880 Speaker 3: and talked about this, and she is very classy and 522 00:25:54,200 --> 00:25:55,080 Speaker 3: not bitter at all. 523 00:25:55,240 --> 00:25:55,840 Speaker 1: Good for her.
524 00:25:56,000 --> 00:25:57,600 Speaker 3: Anyway. It's a good story. 525 00:25:57,720 --> 00:26:01,240 Speaker 1: Well, it's an awful story, but a good one. Yeah, she 526 00:26:01,320 --> 00:26:02,080 Speaker 1: did great. 527 00:26:02,080 --> 00:26:04,720 Speaker 3: She's handled it very well. Yeah, okay, exactly, and a 528 00:26:04,800 --> 00:26:07,920 Speaker 3: really fascinating discovery and one that really took a long 529 00:26:08,000 --> 00:26:10,880 Speaker 3: time to verify that this is real, right, And that's 530 00:26:10,920 --> 00:26:13,000 Speaker 3: the thing I want people to understand, is like it's 531 00:26:13,160 --> 00:26:15,760 Speaker 3: very rare to have a moment where it's obvious that 532 00:26:15,800 --> 00:26:18,320 Speaker 3: you've discovered something and you really don't need to do 533 00:26:18,359 --> 00:26:21,360 Speaker 3: any other cross checks. But it does happen, and very 534 00:26:21,440 --> 00:26:24,640 Speaker 3: soon after the discovery of the pulsars, there was exactly 535 00:26:24,680 --> 00:26:28,120 Speaker 3: this kind of discovery. And while these folks were making 536 00:26:28,160 --> 00:26:31,840 Speaker 3: their discovery, they accidentally left a tape recorder on, 537 00:26:32,240 --> 00:26:35,560 Speaker 3: so we have audio you're gonna hear of folks making 538 00:26:35,600 --> 00:26:37,720 Speaker 3: a mind blowing discovery in real time. 539 00:26:37,840 --> 00:26:38,600 Speaker 1: Oh that's awesome. 540 00:26:38,800 --> 00:26:42,440 Speaker 3: So this story starts with Bell's discovery of the pulsars, right, 541 00:26:42,440 --> 00:26:45,320 Speaker 3: but these are in the radio, and people were wondering, like, 542 00:26:45,840 --> 00:26:49,160 Speaker 3: could you also have pulsars that are in the optical 543 00:26:49,160 --> 00:26:52,440 Speaker 3: that you could like see in a telescope. So John 544 00:26:52,520 --> 00:26:56,080 Speaker 3: Cocke and Mike Disney were two theorists. These are not astronomers.
545 00:26:56,119 --> 00:26:58,399 Speaker 3: They didn't like know how to operate a telescope or 546 00:26:58,640 --> 00:27:01,600 Speaker 3: do data analysis. They were like, well, this is possible. 547 00:27:01,680 --> 00:27:03,359 Speaker 3: Let's give this a try. Let's go out there and 548 00:27:03,359 --> 00:27:07,119 Speaker 3: try some experiments. And so they got some time on 549 00:27:07,160 --> 00:27:09,600 Speaker 3: a telescope at Kitt Peak near Tucson, which is a 550 00:27:09,640 --> 00:27:11,600 Speaker 3: gorgeous place and anyone in Arizona should go up to 551 00:27:11,680 --> 00:27:14,600 Speaker 3: Kitt Peak. And they set up this machinery to convert 552 00:27:14,600 --> 00:27:17,720 Speaker 3: the flashes into ticks and then listen to it, which 553 00:27:17,720 --> 00:27:20,119 Speaker 3: is why they had a tape recorder going. But then 554 00:27:20,160 --> 00:27:22,320 Speaker 3: they converted the ticks into frequency, and so they were 555 00:27:22,320 --> 00:27:24,719 Speaker 3: looking for a pulse on their oscilloscope, looking for 556 00:27:24,760 --> 00:27:27,520 Speaker 3: like a little peak on their oscilloscope. And they thought 557 00:27:27,520 --> 00:27:29,000 Speaker 3: they had it all set up, and they went up 558 00:27:29,000 --> 00:27:30,520 Speaker 3: there and they tried it and they saw nothing, and 559 00:27:30,520 --> 00:27:33,320 Speaker 3: they were very disappointed. And they had two more days 560 00:27:33,359 --> 00:27:35,920 Speaker 3: to do observations, and those two days were both cloudy, 561 00:27:36,119 --> 00:27:38,679 Speaker 3: so they lost out. And while it was cloudy, they 562 00:27:38,720 --> 00:27:40,320 Speaker 3: were going for walks and thinking about it, and they 563 00:27:40,359 --> 00:27:42,879 Speaker 3: realized they had a mistake in their calculation. They were 564 00:27:42,880 --> 00:27:45,480 Speaker 3: looking in the wrong place.
But they didn't have any 565 00:27:45,480 --> 00:27:49,000 Speaker 3: more time. Oh no. Fortunately the next guy on the 566 00:27:49,000 --> 00:27:51,000 Speaker 3: telescope got sick and so he had to give up 567 00:27:51,040 --> 00:27:54,000 Speaker 3: his time. So they had one more night, and so 568 00:27:54,040 --> 00:27:56,760 Speaker 3: they went and they tried their new calculations, and they 569 00:27:56,760 --> 00:27:58,879 Speaker 3: plugged the thing in, and you know, they're just like 570 00:27:58,920 --> 00:28:01,399 Speaker 3: getting started. They just plug it in, turn it on. Okay, 571 00:28:01,480 --> 00:28:04,600 Speaker 3: let's get going. They didn't really expect to see something. 572 00:28:04,960 --> 00:28:08,000 Speaker 3: And as you'll hear in this audio, they're really surprised 573 00:28:08,040 --> 00:28:10,560 Speaker 3: to be making this discovery. So here it is the 574 00:28:10,600 --> 00:28:11,960 Speaker 3: audio of their discovery. 575 00:28:12,680 --> 00:28:18,000 Speaker 4: This next observation will be observation number eighteen. You've got 576 00:28:18,000 --> 00:28:27,280 Speaker 4: a bleeding pulse here. Hey, wow, you don't suppose that's 577 00:28:27,280 --> 00:28:27,760 Speaker 4: really working. 578 00:28:27,800 --> 00:28:31,800 Speaker 8: You can do sure bang in the middle of the periods, 579 00:28:31,840 --> 00:28:33,639 Speaker 8: but don't mean right bang the middle of scale. 580 00:28:34,240 --> 00:28:35,840 Speaker 7: I really looks something from the oven. 581 00:28:36,880 --> 00:28:39,040 Speaker 3: Mm, it was growing too. 582 00:28:40,120 --> 00:28:41,360 Speaker 5: Let's been out the side of it too. 583 00:28:41,520 --> 00:28:49,240 Speaker 8: Here bottle, isn't it? Yeah, cookies, look, a bleeding pulse. 584 00:28:51,160 --> 00:28:56,000 Speaker 8: It's growing, John, it is, Look it is. You're right. 585 00:28:56,720 --> 00:28:57,680 Speaker 1: This is so much fun.
586 00:28:57,840 --> 00:29:00,480 Speaker 3: I love also their mid Atlantic accents. It looks like 587 00:29:00,520 --> 00:29:01,440 Speaker 3: a bleeding pulse. 588 00:29:03,000 --> 00:29:05,520 Speaker 1: I'm not sure that's exactly the mid Atlantic accent that 589 00:29:05,560 --> 00:29:08,680 Speaker 1: I grew up with in New Jersey, but yes, it's 590 00:29:08,680 --> 00:29:09,360 Speaker 1: a fun accent. 591 00:29:09,480 --> 00:29:11,520 Speaker 3: I love listening to this. You know where they're trying 592 00:29:11,520 --> 00:29:13,880 Speaker 3: to convince themselves it can't be it, but it really is. 593 00:29:13,960 --> 00:29:16,840 Speaker 3: Oh my gosh, look at it, it's going. It's so exciting. 594 00:29:16,880 --> 00:29:17,640 Speaker 3: Yay, we did it. 595 00:29:17,760 --> 00:29:18,160 Speaker 1: We did it. 596 00:29:18,240 --> 00:29:20,200 Speaker 3: There really was nothing else that this could be. It 597 00:29:20,280 --> 00:29:23,000 Speaker 3: was exactly what they were hoping for and exactly the 598 00:29:23,040 --> 00:29:24,960 Speaker 3: place they thought they might be able to see it, 599 00:29:24,960 --> 00:29:27,600 Speaker 3: and it all worked and boom and they had it. 600 00:29:27,640 --> 00:29:31,480 Speaker 3: So sometimes you really do have those amazing moments of discovery. 601 00:29:31,760 --> 00:29:33,040 Speaker 1: Did they also get Nobels? 602 00:29:33,240 --> 00:29:35,480 Speaker 3: No, it helped us understand the Crab Nebula and it 603 00:29:35,480 --> 00:29:38,200 Speaker 3: was a really important result. But their discovery was in 604 00:29:38,280 --> 00:29:42,160 Speaker 3: sixty nine, and Bell discovered her first pulsar in sixty eight. 605 00:29:42,200 --> 00:29:45,280 Speaker 3: And in seventy four the Nobel Prize in Physics was 606 00:29:45,320 --> 00:29:47,640 Speaker 3: not given to any of these folks, only to the 607 00:29:47,720 --> 00:29:49,000 Speaker 3: advisor of Bell.
608 00:29:49,520 --> 00:29:55,920 Speaker 1: That is not cool, not cool exactly. All right, Well, we've 609 00:29:55,960 --> 00:29:59,000 Speaker 1: now gone through some really exciting examples to give you, 610 00:29:59,040 --> 00:30:01,560 Speaker 1: like a real personal taste for what it's like to 611 00:30:01,560 --> 00:30:02,680 Speaker 1: make these discoveries. 612 00:30:02,840 --> 00:30:06,120 Speaker 3: Let's talk to somebody who's an actual expert in discoveries, a 613 00:30:06,320 --> 00:30:09,720 Speaker 3: historian of science, a philosopher of science, who thinks about 614 00:30:09,720 --> 00:30:13,320 Speaker 3: the nature of discovery. So it's my pleasure to welcome 615 00:30:13,320 --> 00:30:16,080 Speaker 3: to the podcast professor Lydia Patton. She's a professor 616 00:30:16,080 --> 00:30:19,160 Speaker 3: of philosophy at Virginia Tech, where she specializes in philosophy 617 00:30:19,240 --> 00:30:22,320 Speaker 3: of science and history of science, especially on the development 618 00:30:22,320 --> 00:30:25,280 Speaker 3: of experimental and formal methods. Some of her recent work 619 00:30:25,280 --> 00:30:29,360 Speaker 3: focuses on gravitational wave discoveries. Lydia, thank you very much 620 00:30:29,360 --> 00:30:30,720 Speaker 3: for joining us on the podcast. 621 00:30:31,000 --> 00:30:32,800 Speaker 5: Absolutely, it's great to talk to you. 622 00:30:33,640 --> 00:30:36,560 Speaker 3: So we are two scientists on this podcast talking about 623 00:30:36,600 --> 00:30:40,640 Speaker 3: our experience of discovery and our understanding of it. But 624 00:30:40,760 --> 00:30:43,240 Speaker 3: we're not experts in that, right?
We are scientists; that doesn't 625 00:30:43,280 --> 00:30:45,880 Speaker 3: make us experts in like the history of science, philosophy 626 00:30:45,880 --> 00:30:47,880 Speaker 3: of science, which is why I want to invite you 627 00:30:47,920 --> 00:30:50,719 Speaker 3: on the show and ask you about the concept of 628 00:30:50,920 --> 00:30:54,520 Speaker 3: scientific discovery. Most people, I think, have a view of 629 00:30:54,560 --> 00:30:58,080 Speaker 3: discovery as sort of a Eureka moment. You see one thing, 630 00:30:58,320 --> 00:31:00,600 Speaker 3: you understand the universe is different from the way you 631 00:31:00,640 --> 00:31:03,080 Speaker 3: thought it was. It all clicks in your head. You 632 00:31:03,160 --> 00:31:05,040 Speaker 3: run down the street naked, shouting at the top of 633 00:31:05,080 --> 00:31:07,040 Speaker 3: your lungs. Everybody accepts it, and then we sort of 634 00:31:07,080 --> 00:31:10,280 Speaker 3: move on and make the next discovery. Is that real? 635 00:31:10,320 --> 00:31:12,640 Speaker 3: Does that really happen often? Or does that sort of 636 00:31:12,680 --> 00:31:15,160 Speaker 3: clash with the reality of the day to day working 637 00:31:15,240 --> 00:31:17,080 Speaker 3: of science, yeah? 638 00:31:17,160 --> 00:31:21,000 Speaker 6: So, I mean, Archimedes supposedly did happen at least 639 00:31:21,000 --> 00:31:24,840 Speaker 6: once, that someone did that. But I think there I 640 00:31:24,840 --> 00:31:30,720 Speaker 6: think there are moments when everything falls into place. And 641 00:31:30,960 --> 00:31:33,120 Speaker 6: the first thing that I think it's really easy to 642 00:31:33,200 --> 00:31:37,840 Speaker 6: understand is that those moments are often hard-won.
So 643 00:31:38,960 --> 00:31:42,560 Speaker 6: even the Archimedes moment, it wasn't as if he 644 00:31:43,480 --> 00:31:45,520 Speaker 6: just came up with the concept of the lever or came 645 00:31:45,600 --> 00:31:48,280 Speaker 6: up with the concept of displacement like out of nowhere. 646 00:31:48,440 --> 00:31:50,400 Speaker 6: He had been thinking about that for a long time. 647 00:31:50,440 --> 00:31:53,200 Speaker 6: And there are great sort of accounts of that in 648 00:31:53,200 --> 00:31:54,160 Speaker 6: the history of science. 649 00:31:54,600 --> 00:31:56,680 Speaker 3: So are you telling me this really happened, like we 650 00:31:56,800 --> 00:32:00,040 Speaker 3: actually have documented evidence that this happened? 651 00:31:59,760 --> 00:32:04,840 Speaker 5: Oh, probably not, that's a no. 652 00:32:05,040 --> 00:32:08,320 Speaker 6: The myth is that he was in his bath tub, 653 00:32:08,360 --> 00:32:11,200 Speaker 6: which, you know, who knows whether they even had 654 00:32:11,400 --> 00:32:14,479 Speaker 6: bath tubs at the time, right? And that he 655 00:32:14,680 --> 00:32:18,000 Speaker 6: figured out the concept of displacement from something falling in 656 00:32:18,040 --> 00:32:20,600 Speaker 6: the water, and that that's why he was running through 657 00:32:20,600 --> 00:32:25,160 Speaker 6: the streets naked shouting eureka, like, I figured it out. But even 658 00:32:25,240 --> 00:32:29,200 Speaker 6: that story, which is probably apocryphal, he has to know 659 00:32:29,720 --> 00:32:30,840 Speaker 6: what he's looking for 660 00:32:30,720 --> 00:32:33,440 Speaker 5: in the first place. Like he has to know why. 661 00:32:33,480 --> 00:32:35,680 Speaker 6: The average person, if they just see something fall 662 00:32:35,720 --> 00:32:37,560 Speaker 6: in their bath water, is not going to say, oh, 663 00:32:37,600 --> 00:32:40,760 Speaker 6: this is a physics concept.
You know, this is something 664 00:32:40,800 --> 00:32:43,840 Speaker 6: that I can use to solve all these physics problems. Well, 665 00:32:43,880 --> 00:32:45,320 Speaker 6: you had to have a lot of training to even 666 00:32:45,360 --> 00:32:48,400 Speaker 6: recognize that that's the problem, and you had to have 667 00:32:48,400 --> 00:32:50,760 Speaker 6: a lot of background to even figure out like, Okay, 668 00:32:50,760 --> 00:32:52,480 Speaker 6: this is going to help me with mechanics, This is 669 00:32:52,480 --> 00:32:54,200 Speaker 6: going to help me with something with a problem that 670 00:32:54,200 --> 00:32:56,840 Speaker 6: I want to solve. And so I think that the 671 00:32:56,880 --> 00:32:59,640 Speaker 6: first thing is even with those kind of Eureka moments, 672 00:32:59,720 --> 00:33:03,760 Speaker 6: there's often, you know, five, ten years of difficult training 673 00:33:03,800 --> 00:33:06,600 Speaker 6: and preparation in the background of them to even recognize 674 00:33:06,600 --> 00:33:07,800 Speaker 6: what it is when you see it. 675 00:33:07,640 --> 00:33:11,720 Speaker 3: And not just training to recognize, but also lots 676 00:33:11,720 --> 00:33:13,840 Speaker 3: of failures, right, lots of moments where it didn't all 677 00:33:13,840 --> 00:33:15,720 Speaker 3: come together, things you tried that didn't. 678 00:33:15,480 --> 00:33:18,000 Speaker 6: Work, that didn't work out, and things that didn't solve 679 00:33:18,000 --> 00:33:21,320 Speaker 6: the problem. And it's like, I mean, people often 680 00:33:21,440 --> 00:33:24,240 Speaker 6: use the example of solving a puzzle. That's not quite it, 681 00:33:24,400 --> 00:33:28,600 Speaker 6: but the idea is, even if it is 682 00:33:28,640 --> 00:33:32,600 Speaker 6: something as simple as solving a puzzle, and I would 683 00:33:32,640 --> 00:33:35,920 Speaker 6: agree that science is more complicated than that.
But even 684 00:33:35,960 --> 00:33:39,800 Speaker 6: with a puzzle solving metaphor, you have to try and 685 00:33:39,840 --> 00:33:41,400 Speaker 6: fail a whole bunch of times. 686 00:33:41,160 --> 00:33:42,680 Speaker 5: Before you start figuring it out. 687 00:33:42,760 --> 00:33:44,000 Speaker 6: Like if you think about when you were a kid 688 00:33:44,000 --> 00:33:46,280 Speaker 6: and you tried to do the Rubik's Cube, it took 689 00:33:46,280 --> 00:33:48,960 Speaker 6: a long time to try to figure it out until 690 00:33:49,000 --> 00:33:50,160 Speaker 6: you could reliably do it. 691 00:33:51,040 --> 00:33:52,200 Speaker 5: And it's kind of the same thing. 692 00:33:52,400 --> 00:33:55,880 Speaker 3: Fascinating, and so even this canonical story, this Eureka moment, 693 00:33:56,520 --> 00:34:00,360 Speaker 3: is probably a story that's been made up to convey 694 00:34:00,480 --> 00:34:03,880 Speaker 3: to the general public what this is. Like how long 695 00:34:03,880 --> 00:34:05,720 Speaker 3: has there been sort of this disconnect? Why are we 696 00:34:05,840 --> 00:34:09,759 Speaker 3: making up stories about how scientific discoveries happen? Like where 697 00:34:09,800 --> 00:34:12,279 Speaker 3: do these cartoon versions come from? And why do we 698 00:34:12,360 --> 00:34:12,719 Speaker 3: need them? 699 00:34:12,920 --> 00:34:15,720 Speaker 6: That is one of the biggest questions that I think 700 00:34:16,440 --> 00:34:19,480 Speaker 6: historians of science wrestle with, philosophers of science 701 00:34:19,520 --> 00:34:21,960 Speaker 5: maybe a bit less, but philosophers of 702 00:34:21,960 --> 00:34:25,200 Speaker 6: science deal with something where we wonder a lot about 703 00:34:26,200 --> 00:34:28,800 Speaker 6: why we have a need for truth in science. 704 00:34:29,000 --> 00:34:31,239 Speaker 3: That seems like an obvious question, and it's an.
705 00:34:31,120 --> 00:34:35,000 Speaker 6: Obvious question, but it's a tough question to answer. Why 706 00:34:35,000 --> 00:34:37,480 Speaker 6: do we want to think that science describes reality? 707 00:34:37,719 --> 00:34:37,959 Speaker 5: Right? 708 00:34:38,040 --> 00:34:39,840 Speaker 6: And so this is one of the biggest questions in 709 00:34:39,880 --> 00:34:44,279 Speaker 6: contemporary philosophy of science is why what makes us think 710 00:34:44,320 --> 00:34:47,960 Speaker 6: that the claims of science are claims about real things 711 00:34:48,000 --> 00:34:51,200 Speaker 6: that actually exist, or claims about truth are true claims. 712 00:34:52,120 --> 00:34:56,640 Speaker 6: And I think that there's something so seductive about truth 713 00:34:56,719 --> 00:34:58,880 Speaker 6: that the idea is that, for one thing, you can 714 00:34:58,960 --> 00:35:00,879 Speaker 6: use it to win any argument, which to any 715 00:35:00,880 --> 00:35:03,000 Speaker 6: philosopher is going to be super attractive. 716 00:35:03,200 --> 00:35:06,799 Speaker 5: Right. It's like your sort of trump card. 717 00:35:06,880 --> 00:35:09,160 Speaker 6: You lay it down and you win the argument because 718 00:35:09,400 --> 00:35:13,360 Speaker 6: you have made a claim that's just true, and I 719 00:35:13,360 --> 00:35:17,520 Speaker 6: think that that's that kind of feeling, like, oh, now 720 00:35:17,560 --> 00:35:18,600 Speaker 6: I win any argument. 721 00:35:18,760 --> 00:35:20,239 Speaker 5: Now I just come out on top. 722 00:35:21,000 --> 00:35:23,080 Speaker 6: You know, if I end up in an argument on 723 00:35:23,120 --> 00:35:25,520 Speaker 6: social media, like if I just bring this out. 724 00:35:25,400 --> 00:35:27,680 Speaker 5: Everybody will have to agree that I'm right, you know. 725 00:35:28,719 --> 00:35:34,640 Speaker 6: And I think that kind of certainty is what's very attractive.
Certainty, 726 00:35:35,560 --> 00:35:39,160 Speaker 6: winning the argument, being right, These are all very attractive things. 727 00:35:40,160 --> 00:35:44,880 Speaker 6: And I think that what's masked behind that. So, of course, 728 00:35:45,120 --> 00:35:48,080 Speaker 6: if we think of science as being the source of certainty, 729 00:35:48,120 --> 00:35:52,799 Speaker 6: the source of rightness, and a privileged source of being 730 00:35:52,800 --> 00:35:56,360 Speaker 6: able to win any argument, even a political or social argument. 731 00:35:56,760 --> 00:35:59,040 Speaker 6: If we think of science as being in that position, 732 00:36:00,120 --> 00:36:04,719 Speaker 6: then that means that if someone makes a scientific discovery 733 00:36:05,200 --> 00:36:09,360 Speaker 6: that gives us truth and certainty and ways of winning 734 00:36:09,360 --> 00:36:12,400 Speaker 6: the argument, then that sort of fits in with that 735 00:36:12,480 --> 00:36:16,800 Speaker 6: narrative that oh, okay, now we're on top, we're winning, 736 00:36:17,719 --> 00:36:20,439 Speaker 6: and we have this certain, true picture of the way 737 00:36:20,440 --> 00:36:22,920 Speaker 6: the world works, which is also extremely attractive. 738 00:36:23,280 --> 00:36:25,920 Speaker 3: I see. So it's compelling to imagine that truth is 739 00:36:26,000 --> 00:36:29,080 Speaker 3: revealed to us in these moments and that we can 740 00:36:29,120 --> 00:36:31,160 Speaker 3: share with people and be like, see, look, this is 741 00:36:31,200 --> 00:36:33,880 Speaker 3: the way the universe works. It's X and it's not y, 742 00:36:34,520 --> 00:36:38,160 Speaker 3: and the data itself will convince everyone that's the idea. 743 00:36:38,120 --> 00:36:41,319 Speaker 5: That's the idea. And there have been multiple examples of that. 
744 00:36:41,360 --> 00:36:44,480 Speaker 6: One of my favorites is with somebody I've studied some 745 00:36:44,560 --> 00:36:48,399 Speaker 6: in my career, someone named Hermann von Helmholtz, who's 746 00:36:48,440 --> 00:36:55,880 Speaker 6: a German polymath, really: physicist, philosopher, mathematician, many other things. 747 00:36:55,920 --> 00:36:59,520 Speaker 6: And one of the things that he did was in 748 00:36:59,560 --> 00:37:02,920 Speaker 6: the beginning of his career he really went after vitalism 749 00:37:02,960 --> 00:37:05,600 Speaker 6: in medicine, the idea that there's a kind of vital 750 00:37:05,640 --> 00:37:09,719 Speaker 6: force over and above the forces of metabolism and so 751 00:37:09,760 --> 00:37:13,080 Speaker 6: forth in the human body. And the thing is that 752 00:37:13,320 --> 00:37:19,759 Speaker 6: many medical systems, many medical approaches in like ancient Chinese medicine, 753 00:37:20,120 --> 00:37:24,560 Speaker 6: and in many traditional medical sort of paradigms, are based 754 00:37:24,600 --> 00:37:28,400 Speaker 6: on this idea of a life force, that what medicine 755 00:37:28,440 --> 00:37:30,920 Speaker 6: is doing is kind of helping the life force to 756 00:37:31,000 --> 00:37:34,480 Speaker 6: get stronger so that people will survive. And one of 757 00:37:34,560 --> 00:37:36,480 Speaker 6: the first things that Helmholtz did, and he wasn't even 758 00:37:36,480 --> 00:37:38,919 Speaker 6: a medical doctor, you know, was do a whole bunch 759 00:37:38,960 --> 00:37:45,440 Speaker 6: of experiments that disproved in his mind the vitalist hypothesis 760 00:37:46,520 --> 00:37:50,000 Speaker 6: and his achievement, in conjunction with the achievements of a 761 00:37:50,040 --> 00:37:53,760 Speaker 6: bunch of other people in that same vein, basically killed 762 00:37:53,760 --> 00:37:56,279 Speaker 6: the vitalist paradigm and 763 00:37:56,200 --> 00:37:57,880 Speaker 5: it had a huge impact.
764 00:37:58,080 --> 00:38:00,480 Speaker 6: And so what happened was that people have this kind 765 00:38:00,480 --> 00:38:03,680 Speaker 6: of certainty like he had this kind of certainty. I 766 00:38:03,760 --> 00:38:06,800 Speaker 6: honestly look back at Helmholtz, and I think he didn't 767 00:38:06,840 --> 00:38:10,800 Speaker 6: really know. He was involved with a group of people, 768 00:38:11,120 --> 00:38:16,520 Speaker 6: the Berlin Physical Society. They thought that vitalism was 769 00:38:16,560 --> 00:38:20,200 Speaker 6: wrong and that everything should be explained by material processes 770 00:38:20,239 --> 00:38:22,400 Speaker 6: in the body and so forth. But they didn't have 771 00:38:22,400 --> 00:38:25,040 Speaker 6: any absolute proof of it, but they were seeking it, 772 00:38:25,120 --> 00:38:28,319 Speaker 6: and so that was what they wanted to find. And 773 00:38:28,360 --> 00:38:31,120 Speaker 6: so there's this kind of sense that if there's something 774 00:38:31,239 --> 00:38:35,520 Speaker 6: that you want to establish beyond any possible doubt, you 775 00:38:35,680 --> 00:38:37,759 Speaker 6: try to look for this eureka moment. You try to 776 00:38:37,800 --> 00:38:40,719 Speaker 6: look for this certainty in science, and that that's the 777 00:38:40,840 --> 00:38:43,120 Speaker 6: value of science. That's one account of the value of 778 00:38:43,160 --> 00:38:45,840 Speaker 6: science is that it gives you this kind of truth 779 00:38:45,880 --> 00:38:47,279 Speaker 6: and certainty that you're looking for.
780 00:38:47,600 --> 00:38:50,439 Speaker 3: There's just something a little bit troubling that you're more 781 00:38:50,560 --> 00:38:54,239 Speaker 3: likely to be convinced by something you expect to hear right, 782 00:38:54,640 --> 00:38:56,840 Speaker 3: which is maybe why it takes a long time to 783 00:38:56,920 --> 00:39:00,560 Speaker 3: accept some data which counters your understanding, you know, 784 00:39:00,600 --> 00:39:04,560 Speaker 3: tectonic plates and the Michelson-Morley experiment. Why can't we 785 00:39:04,760 --> 00:39:07,400 Speaker 3: just let the data speak? Is there some part of 786 00:39:07,440 --> 00:39:11,000 Speaker 3: our science which is too subjective, which you know, makes 787 00:39:11,080 --> 00:39:15,000 Speaker 3: us skeptical of some discoveries and more accepting of others. 788 00:39:15,120 --> 00:39:18,319 Speaker 3: Is there a way we can upgrade our science to 789 00:39:18,440 --> 00:39:19,719 Speaker 3: make it less subjective? 790 00:39:20,200 --> 00:39:20,359 Speaker 7: Oh? 791 00:39:20,440 --> 00:39:23,000 Speaker 5: Yeah, that is That is a great and huge question. 792 00:39:23,120 --> 00:39:26,080 Speaker 6: I think why can't we I'll tackle why can't we 793 00:39:26,160 --> 00:39:29,480 Speaker 6: just let the data speak? Because to me, that is 794 00:39:29,640 --> 00:39:31,759 Speaker 6: that is one of the biggest questions that I look at. 795 00:39:32,920 --> 00:39:35,280 Speaker 6: Data does not speak in and of itself. 796 00:39:35,320 --> 00:39:35,840 Speaker 5: That's one. 797 00:39:36,640 --> 00:39:38,000 Speaker 6: There are a lot of people who say, well, oh 798 00:39:38,040 --> 00:39:40,279 Speaker 6: I'm evidence based. Oh I just go by what the 799 00:39:40,400 --> 00:39:42,880 Speaker 6: data said. Oh I just go And there's something to 800 00:39:42,880 --> 00:39:44,640 Speaker 6: be said for that. I mean, you do need to 801 00:39:44,680 --> 00:39:48,000 Speaker 6: test your claims against the evidence.
If your claims just 802 00:39:48,120 --> 00:39:52,319 Speaker 6: keep getting refuted by obvious experiments, then either you need 803 00:39:52,360 --> 00:39:54,839 Speaker 6: to adjust your claims somehow, or you know. I mean 804 00:39:54,840 --> 00:39:58,080 Speaker 6: I think everyone knows the scientific method, that 805 00:39:58,120 --> 00:40:01,319 Speaker 6: part of the scientific method is 806 00:40:01,360 --> 00:40:04,560 Speaker 6: that if what you're saying just keeps being refuted by 807 00:40:04,760 --> 00:40:10,839 Speaker 6: experiments or tests, then it's wrong. But to say that 808 00:40:11,239 --> 00:40:14,320 Speaker 6: is not to say that you can gather new data 809 00:40:14,680 --> 00:40:18,440 Speaker 6: and immediately know everything about what it says. And a 810 00:40:18,440 --> 00:40:22,200 Speaker 6: lot of times even very high level scientists will say, look, 811 00:40:22,280 --> 00:40:24,799 Speaker 6: you know, we're running this experiment and one of the 812 00:40:24,800 --> 00:40:27,160 Speaker 6: most exciting things about it is that we're getting data 813 00:40:27,200 --> 00:40:28,759 Speaker 6: that even we don't understand. 814 00:40:29,360 --> 00:40:30,440 Speaker 5: You know, even. 815 00:40:30,200 --> 00:40:33,200 Speaker 6: We need a new paradigm or a new framework to 816 00:40:33,600 --> 00:40:36,920 Speaker 6: fit this in in order to understand what it's telling us. 817 00:40:36,920 --> 00:40:39,719 Speaker 6: It's like learning a new language. You know, you have 818 00:40:39,880 --> 00:40:42,360 Speaker 6: all of the information there, but you need to be 819 00:40:42,360 --> 00:40:45,040 Speaker 6: able to translate it into something to allow us to 820 00:40:45,160 --> 00:40:48,040 Speaker 6: understand what's happened.
And I think there are a lot 821 00:40:48,040 --> 00:40:51,000 Speaker 6: of discoveries in science that worked that way, where a 822 00:40:51,040 --> 00:40:53,839 Speaker 6: lot of experiments, especially in science, that work that way, 823 00:40:54,480 --> 00:40:59,160 Speaker 6: where they were what Friedrich Steinle calls exploratory experiments, where 824 00:40:59,200 --> 00:41:01,960 Speaker 6: people were just trying out different hypotheses, just testing 825 00:41:02,000 --> 00:41:04,560 Speaker 6: out what they might be able to find, and then 826 00:41:04,600 --> 00:41:07,360 Speaker 6: they get this data and it's really interesting data, but 827 00:41:07,400 --> 00:41:09,279 Speaker 6: they're not really sure what it means, and they have 828 00:41:09,320 --> 00:41:12,399 Speaker 6: to come up with a new explanation even to even 829 00:41:12,480 --> 00:41:14,520 Speaker 6: get the kind of the juice out of it, to 830 00:41:14,560 --> 00:41:16,840 Speaker 6: get the real information out of the data. 831 00:41:16,920 --> 00:41:18,840 Speaker 3: Well, so it sounds like you're telling me, and I 832 00:41:18,880 --> 00:41:22,560 Speaker 3: apologize for asking you to summarize or simplify an entire 833 00:41:22,800 --> 00:41:26,200 Speaker 3: like one hundred year long argument among philosophers, but it 834 00:41:26,280 --> 00:41:28,040 Speaker 3: sounds like you're telling me that there's no way to 835 00:41:28,080 --> 00:41:32,440 Speaker 3: be purely objective about science because the process of interpreting 836 00:41:32,520 --> 00:41:38,359 Speaker 3: data is inherently subjective or personal or dependent on your 837 00:41:38,360 --> 00:41:40,279 Speaker 3: point of view and the questions you're asking and the 838 00:41:40,880 --> 00:41:43,000 Speaker 3: explanations you're interested in accepting.
839 00:41:43,360 --> 00:41:46,640 Speaker 6: Okay, so that's a slight, that's a somewhat provocative way 840 00:41:46,640 --> 00:41:51,960 Speaker 6: of putting what I just said, so I would somewhat 841 00:41:52,160 --> 00:41:56,120 Speaker 6: So the whole objectivity subjectivity debate is a big one. 842 00:41:56,239 --> 00:41:59,480 Speaker 6: And so what I would say is it's not so 843 00:41:59,600 --> 00:42:02,600 Speaker 6: much that you have to choose your subjective slant on 844 00:42:02,640 --> 00:42:07,160 Speaker 6: the data, but it is that even objective facts require 845 00:42:07,200 --> 00:42:11,600 Speaker 6: an interpretation of the data. So I think it's actually 846 00:42:11,680 --> 00:42:15,600 Speaker 6: kind of independent of the objective subjective divide. I think 847 00:42:15,600 --> 00:42:18,359 Speaker 6: it's that if a lot of you know, a lot 848 00:42:18,360 --> 00:42:22,920 Speaker 6: of times what you have is just like, this detector 849 00:42:23,120 --> 00:42:26,319 Speaker 6: clicked five times in a minute. Well, what does that mean? 850 00:42:27,640 --> 00:42:29,120 Speaker 5: We only know what that means. 851 00:42:30,040 --> 00:42:31,719 Speaker 6: We don't have It's not like we have to pick 852 00:42:31,800 --> 00:42:34,360 Speaker 6: what we think about it or what we expect or 853 00:42:34,400 --> 00:42:37,040 Speaker 6: what we want out of it. It's that even in 854 00:42:37,160 --> 00:42:39,960 Speaker 6: order to know what that means, why did the detector 855 00:42:39,960 --> 00:42:42,960 Speaker 6: click so many times? What's going on there? You need 856 00:42:43,000 --> 00:42:45,200 Speaker 6: to know what the setup of the experiment is. You 857 00:42:45,239 --> 00:42:46,879 Speaker 6: need to know what the theory is that it's trying 858 00:42:46,920 --> 00:42:49,080 Speaker 6: to test. You need to have some kind of framework 859 00:42:49,080 --> 00:42:51,680 Speaker 6: for interpretation.
And that I think is the part that 860 00:42:52,120 --> 00:42:55,600 Speaker 6: sometimes gets confused is people think, well, that's just your opinion, 861 00:42:55,680 --> 00:42:59,240 Speaker 6: then that's not science, and like, well, no, the science 862 00:42:59,320 --> 00:43:03,640 Speaker 6: is in knowing all of that, like knowing how the 863 00:43:03,719 --> 00:43:07,440 Speaker 6: experiment works, what kind of information we can sort of 864 00:43:08,040 --> 00:43:11,120 Speaker 6: get from the data once we get it, and that 865 00:43:11,200 --> 00:43:15,200 Speaker 6: process doesn't have to be subjective, but it doesn't give 866 00:43:15,239 --> 00:43:19,920 Speaker 6: us objective results without any effort. I think that's really 867 00:43:19,960 --> 00:43:21,200 Speaker 6: where I see a tension. 868 00:43:21,480 --> 00:43:25,000 Speaker 3: So science is sort of a complex and nuanced process. 869 00:43:25,520 --> 00:43:27,279 Speaker 3: But I think that a lot of people have the 870 00:43:27,320 --> 00:43:31,760 Speaker 3: impression that science sort of came into being all very quickly. 871 00:43:31,880 --> 00:43:34,759 Speaker 3: A few hundred years ago, when you know Galileo and 872 00:43:34,840 --> 00:43:40,200 Speaker 3: Bacon understood the importance of empiricism and doing experiments. Is 873 00:43:40,200 --> 00:43:43,160 Speaker 3: that a cartoonish, simplified version of the development of science. 874 00:43:43,320 --> 00:43:45,439 Speaker 3: Can you give us a sort of more nuanced view 875 00:43:45,560 --> 00:43:50,000 Speaker 3: of like how we came to develop this engine for discovery. 876 00:43:50,520 --> 00:43:57,320 Speaker 6: The process of coming to a scientific understanding didn't come 877 00:43:57,480 --> 00:44:04,319 Speaker 6: into being immediately, and even thinking of our understanding of 878 00:44:04,320 --> 00:44:09,680 Speaker 6: the world as scientific is a relatively recent phenomenon.
So 879 00:44:10,920 --> 00:44:12,959 Speaker 6: most of the people who we think of as the 880 00:44:13,040 --> 00:44:15,840 Speaker 6: pioneers of the scientific method would have thought of themselves 881 00:44:15,880 --> 00:44:22,839 Speaker 6: as natural philosophers. The tradition of natural philosophy encompassed philosophy, science, theology, 882 00:44:23,160 --> 00:44:28,600 Speaker 6: just multiple ways of understanding the world. And the publication 883 00:44:28,680 --> 00:44:31,080 Speaker 6: of Newton, as you probably know, the publication of Newton, 884 00:44:31,120 --> 00:44:33,839 Speaker 6: where he introduces the laws of nature and the laws 885 00:44:33,840 --> 00:44:38,320 Speaker 6: of physics and so forth, was called the Mathematical Principles 886 00:44:38,320 --> 00:44:42,600 Speaker 6: of Natural Philosophy, not the Mathematical Principles of Physics. And 887 00:44:42,680 --> 00:44:44,680 Speaker 6: so for a long time the idea was just we're 888 00:44:44,719 --> 00:44:48,520 Speaker 6: trying to understand the world. We're trying to understand things 889 00:44:48,600 --> 00:44:54,480 Speaker 6: from whatever perspective we may have, and the idea of science, however, 890 00:44:54,640 --> 00:44:59,560 Speaker 6: is extremely old. So you have even you know, some 891 00:44:59,640 --> 00:45:04,960 Speaker 6: of the ancient Greek philosophers talking about science, and so 892 00:45:05,239 --> 00:45:07,200 Speaker 6: the idea that it came into being with Bacon and 893 00:45:07,239 --> 00:45:11,239 Speaker 6: Galileo is actually even too recent, right, Like the idea 894 00:45:11,239 --> 00:45:15,280 Speaker 6: of scientific understanding is very old. But at the same time, 895 00:45:15,680 --> 00:45:17,840 Speaker 6: even people who were doing what we would think of 896 00:45:17,880 --> 00:45:22,399 Speaker 6: as pioneering science did so under the banner of another heading, right.
897 00:45:22,520 --> 00:45:26,399 Speaker 6: So it was really, in my view historically, in the 898 00:45:26,600 --> 00:45:31,480 Speaker 6: eighteen hundreds that those two things started blending in an 899 00:45:31,520 --> 00:45:35,880 Speaker 6: institutional context to give us something like the modern idea 900 00:45:36,000 --> 00:45:38,799 Speaker 6: that there is a department or a faculty in the 901 00:45:38,920 --> 00:45:43,720 Speaker 6: university that is specifically devoted to science. And that's really 902 00:45:43,760 --> 00:45:47,200 Speaker 6: more of a professional idea than anything to do with 903 00:45:47,320 --> 00:45:51,480 Speaker 6: the essence of the way that science is carried out. 904 00:45:52,360 --> 00:45:54,399 Speaker 6: This is the briefest thing I can say about it. 905 00:45:54,800 --> 00:45:57,360 Speaker 6: If you spend a lot of time around historians of 906 00:45:57,360 --> 00:46:00,920 Speaker 6: philosophy of science, historians of science, you will realize that 907 00:46:01,000 --> 00:46:04,600 Speaker 6: the further back you get, the more complicated this all is, 908 00:46:05,560 --> 00:46:09,520 Speaker 6: and the more you find people in very different fields 909 00:46:09,640 --> 00:46:13,560 Speaker 6: contributing to science. You find people like Goethe contributing to 910 00:46:13,640 --> 00:46:18,680 Speaker 6: plant science, Schiller in the nineteenth century.
They were cited 911 00:46:18,760 --> 00:46:23,280 Speaker 6: often by like major scientists in the German nineteenth century, 912 00:46:23,760 --> 00:46:25,920 Speaker 6: and we don't really know what to do with that 913 00:46:26,280 --> 00:46:31,239 Speaker 6: because we have a particular idea of 914 00:46:31,280 --> 00:46:34,920 Speaker 6: who a scientist is and who gets to be a scientist, 915 00:46:36,040 --> 00:46:39,000 Speaker 6: and that person works at a certain type of university 916 00:46:39,160 --> 00:46:42,239 Speaker 6: or a research project, that person has certain types of 917 00:46:42,280 --> 00:46:47,200 Speaker 6: professional bona fides that we require, and historically that just 918 00:46:47,239 --> 00:46:50,799 Speaker 6: hasn't been true because that didn't exist. And so I 919 00:46:50,800 --> 00:46:53,440 Speaker 6: think that there's been much more of a broad, sort 920 00:46:53,440 --> 00:46:56,480 Speaker 6: of pluralistic understanding of what science is, the more you 921 00:46:56,520 --> 00:46:59,440 Speaker 6: sort of push things backwards. A lot of people were 922 00:46:59,440 --> 00:47:02,400 Speaker 6: doing research for like private corporations. 923 00:47:02,440 --> 00:47:03,400 Speaker 5: They were doing research. 924 00:47:03,440 --> 00:47:05,239 Speaker 6: I mean, if you look at Michael Faraday, you know 925 00:47:05,280 --> 00:47:07,680 Speaker 6: he was one of the most important people in the 926 00:47:07,719 --> 00:47:10,560 Speaker 6: history of electricity and magnetism, and a lot of his 927 00:47:10,640 --> 00:47:13,640 Speaker 6: work was done privately. It wasn't done at a university 928 00:47:13,640 --> 00:47:17,440 Speaker 6: because he didn't have university training. But that's one point, 929 00:47:17,480 --> 00:47:20,440 Speaker 6: that's the sort of historical point that science is very 930 00:47:20,440 --> 00:47:24,560 Speaker 6: complicated.
The current understanding is actually very historically specific, 931 00:47:25,080 --> 00:47:28,719 Speaker 6: even though we think of it as again searching for 932 00:47:28,800 --> 00:47:32,160 Speaker 6: this kind of certainty and eternal truths. We think of 933 00:47:32,200 --> 00:47:34,080 Speaker 6: it as like the way to be a scientist, but 934 00:47:34,160 --> 00:47:35,960 Speaker 6: it's certainly not in history. 935 00:47:36,239 --> 00:47:39,000 Speaker 3: And Faraday's example should give motivation to all the folks 936 00:47:39,040 --> 00:47:42,319 Speaker 3: out there who are amateur physicists coming up with their 937 00:47:42,360 --> 00:47:44,480 Speaker 3: own theories of everything in the garage, right, it. 938 00:47:44,440 --> 00:47:47,160 Speaker 6: can happen one hundred percent. I mean, this is the 939 00:47:47,160 --> 00:47:51,680 Speaker 6: guy who came up with the motor. Basically, I think 940 00:47:52,440 --> 00:47:53,920 Speaker 6: that should be an example to anyone. 941 00:47:54,200 --> 00:47:57,200 Speaker 3: So you alluded earlier to this deep question in philosophy 942 00:47:57,320 --> 00:48:00,879 Speaker 3: about whether science is discovering truth. Is what we're learning 943 00:48:00,920 --> 00:48:03,879 Speaker 3: about the universe really universal? Does it reflect the way 944 00:48:03,880 --> 00:48:06,640 Speaker 3: we think? Fascinating question. I'd love to dig into it 945 00:48:06,680 --> 00:48:08,319 Speaker 3: in another episode, but I want to ask you a 946 00:48:08,320 --> 00:48:12,600 Speaker 3: related question, which is about the universality of the process 947 00:48:12,640 --> 00:48:16,040 Speaker 3: of science. We have this technique we've been building up 948 00:48:16,080 --> 00:48:19,760 Speaker 3: and evolving and developing to learn about the universe.
949 00:48:20,200 --> 00:48:24,440 Speaker 3: Do you think that it's likely that other intelligent civilized 950 00:48:24,520 --> 00:48:29,359 Speaker 3: races around the galaxy, for example, are doing science. You know, 951 00:48:29,400 --> 00:48:31,759 Speaker 3: I'm not asking is there a person they call a 952 00:48:31,840 --> 00:48:34,560 Speaker 3: scientist and do they have the same cultural institutions I 953 00:48:34,560 --> 00:48:36,880 Speaker 3: think that's very unlikely. But do you think they have 954 00:48:37,040 --> 00:48:42,359 Speaker 3: also stumbled on the process of building hypotheses, doing experiments, 955 00:48:42,400 --> 00:48:44,600 Speaker 3: refining that. Do you think we're likely to find that 956 00:48:45,000 --> 00:48:46,799 Speaker 3: in alien species? 957 00:48:47,040 --> 00:48:51,040 Speaker 5: Oh, that's a great question. So I think one of 958 00:48:51,080 --> 00:48:51,800 Speaker 5: the things I think 959 00:48:51,680 --> 00:48:55,640 Speaker 6: about that is that it's closely related to another question, 960 00:48:56,600 --> 00:49:00,920 Speaker 6: which is, is science inevitable in the way that we've developed it? 961 00:49:01,080 --> 00:49:05,279 Speaker 6: So on any planet, with any species, or even if 962 00:49:05,360 --> 00:49:10,000 Speaker 6: we went back and re-ran the tape of our history, 963 00:49:10,480 --> 00:49:13,959 Speaker 6: would it all happen the same way? And I think 964 00:49:14,320 --> 00:49:18,360 Speaker 6: it wouldn't necessarily, even if the changes were just minor. 965 00:49:19,000 --> 00:49:23,960 Speaker 6: There are people who argue that certain formal features of 966 00:49:24,000 --> 00:49:27,160 Speaker 6: science would always inevitably be the same way.
We would 967 00:49:27,160 --> 00:49:29,720 Speaker 6: always find some way to do experiments, we would always 968 00:49:29,760 --> 00:49:31,880 Speaker 6: find some way to test our claims, we would always 969 00:49:31,880 --> 00:49:34,160 Speaker 6: find some way to incorporate formal reasoning. 970 00:49:34,280 --> 00:49:34,440 Speaker 4: Right. 971 00:49:35,520 --> 00:49:36,560 Speaker 5: I'm not sure that's true. 972 00:49:36,640 --> 00:49:39,200 Speaker 3: It seems awfully flattering, right to say that the way 973 00:49:39,239 --> 00:49:41,040 Speaker 3: we're doing it has got to be the only way. 974 00:49:41,560 --> 00:49:45,840 Speaker 6: We are very triumphalist about our way of doing science. 975 00:49:47,400 --> 00:49:49,040 Speaker 6: We think that we have the way and that this 976 00:49:49,160 --> 00:49:52,520 Speaker 6: is the right way. And I think that sometimes people 977 00:49:52,560 --> 00:49:54,920 Speaker 6: cling to it as a way of solving our problems. 978 00:49:54,920 --> 00:49:55,080 Speaker 3: You know. 979 00:49:55,200 --> 00:49:57,160 Speaker 6: The idea is if we could just all get on board, 980 00:49:57,320 --> 00:50:00,800 Speaker 6: if everyone could just trust the science and trust scientists, 981 00:50:00,800 --> 00:50:02,719 Speaker 6: and we would all get And it's funny how the 982 00:50:02,719 --> 00:50:04,839 Speaker 6: people who get the most skeptical look in their eyes 983 00:50:04,880 --> 00:50:07,160 Speaker 6: when they hear this are scientists, right, They're 984 00:50:06,960 --> 00:50:09,760 Speaker 5: like, us, why are we supposed to save everybody? 985 00:50:09,840 --> 00:50:10,000 Speaker 7: You know? 986 00:50:10,080 --> 00:50:11,600 Speaker 5: Like, what's wait a minute?
987 00:50:11,680 --> 00:50:14,360 Speaker 6: And I think that's one of the one of the 988 00:50:14,400 --> 00:50:17,200 Speaker 6: aspects of science that's kind of funny is that, you know, 989 00:50:17,239 --> 00:50:20,799 Speaker 6: what it shouldn't be required to do is save the world. 990 00:50:21,640 --> 00:50:24,760 Speaker 6: And I think we want it to, but it shouldn't 991 00:50:24,800 --> 00:50:27,160 Speaker 6: be required to It's a means of discovery. It's a 992 00:50:27,160 --> 00:50:30,359 Speaker 6: means of exploration. Now do I think that there would 993 00:50:30,400 --> 00:50:36,719 Speaker 6: be scientific discovery in any curious, intelligent species on other 994 00:50:37,000 --> 00:50:39,799 Speaker 6: planets or wherever they might be. Of course, yeah, right, 995 00:50:39,880 --> 00:50:42,480 Speaker 6: I mean I think in their own way, right, Like 996 00:50:42,560 --> 00:50:46,520 Speaker 6: bacteria explore and you know, this is something that, of 997 00:50:46,520 --> 00:50:49,720 Speaker 6: course a biologist would be better suited to talk about 998 00:50:49,760 --> 00:50:52,920 Speaker 6: in detail. But there are species that in their own 999 00:50:52,960 --> 00:50:56,720 Speaker 6: way are exploring, making experiments, figuring out which environments are better, 1000 00:50:57,520 --> 00:50:59,600 Speaker 6: and we don't have any way of knowing whether they're 1001 00:50:59,640 --> 00:51:03,319 Speaker 6: doing that intentionally or for what purpose. 
But I think 1002 00:51:03,360 --> 00:51:07,239 Speaker 6: that it's a little condescending to assume that because we 1003 00:51:07,320 --> 00:51:11,160 Speaker 6: don't know that they're not doing anything, you know, I 1004 00:51:11,160 --> 00:51:14,080 Speaker 6: think even if we just look at our planet, there 1005 00:51:14,120 --> 00:51:16,720 Speaker 6: are lots more species that are probably doing something closer 1006 00:51:16,719 --> 00:51:19,080 Speaker 6: to the scientific method than we might think. 1007 00:51:19,320 --> 00:51:20,960 Speaker 3: What's a good candidate, do you think? 1008 00:51:21,080 --> 00:51:23,800 Speaker 6: Well, one of my colleagues at Virginia Tech, Ashley, she 1009 00:51:24,080 --> 00:51:27,360 Speaker 6: did her dissertation on New Caledonian crows, and there is 1010 00:51:27,480 --> 00:51:30,160 Speaker 6: other work on New Caledonian crows. I think they're a 1011 00:51:30,160 --> 00:51:34,239 Speaker 6: good example of tool-using creatures in any case, and 1012 00:51:34,560 --> 00:51:35,160 Speaker 6: who have done. 1013 00:51:35,360 --> 00:51:37,000 Speaker 5: If I were better versed in this area, I would 1014 00:51:37,000 --> 00:51:38,040 Speaker 5: have lots more examples. 1015 00:51:38,120 --> 00:51:41,600 Speaker 3: But does just having many examples on Earth make an argument 1016 00:51:41,719 --> 00:51:44,760 Speaker 3: that it's more likely to exist on other planets as well, 1017 00:51:44,800 --> 00:51:46,480 Speaker 3: like other environments. 1018 00:51:46,719 --> 00:51:48,839 Speaker 6: I would like for that to be true, because, as 1019 00:51:48,880 --> 00:51:51,480 Speaker 6: you say, maybe you didn't intend to say this, but 1020 00:51:51,520 --> 00:51:52,640 Speaker 6: it's this interpretation. 1021 00:51:53,239 --> 00:51:55,520 Speaker 5: It kind of throws a mirror up to our own.
1022 00:51:55,480 --> 00:52:00,640 Speaker 6: Practices and says, look again, what we want is this 1023 00:52:00,760 --> 00:52:03,279 Speaker 6: idea of the inevitability of the scientific method in the 1024 00:52:03,280 --> 00:52:07,719 Speaker 6: way that we've discovered it or developed it. The certainty 1025 00:52:07,760 --> 00:52:10,640 Speaker 6: of science, the truth of science, the idea that we've 1026 00:52:10,680 --> 00:52:14,120 Speaker 6: figured out the one right way. And I think the 1027 00:52:14,440 --> 00:52:18,839 Speaker 6: triumphalism is a nice word for that. And I think 1028 00:52:18,880 --> 00:52:21,920 Speaker 6: that thinking about, well, wait, what if they do it 1029 00:52:22,000 --> 00:52:24,879 Speaker 6: differently elsewhere? What if there are other ways of doing this, 1030 00:52:25,920 --> 00:52:29,319 Speaker 6: whether on the Earth or elsewhere in the galaxy. And 1031 00:52:29,400 --> 00:52:35,160 Speaker 6: we're more able to reach out send signals to other 1032 00:52:35,239 --> 00:52:39,319 Speaker 6: places now than we ever have been. And I think 1033 00:52:39,360 --> 00:52:43,360 Speaker 6: that the possibility that there might be another way of 1034 00:52:43,400 --> 00:52:46,200 Speaker 6: doing science, on the one hand, it sort of undermines 1035 00:52:46,280 --> 00:52:49,240 Speaker 6: that idea of certainty and truth, and on the other hand, 1036 00:52:49,280 --> 00:52:51,480 Speaker 6: that could be seen as a good thing. 1037 00:52:51,640 --> 00:52:53,280 Speaker 5: That could be a good thing, wonderful. 1038 00:52:53,360 --> 00:52:55,719 Speaker 3: Well, I look forward to all these developments in the 1039 00:52:55,760 --> 00:52:59,360 Speaker 3: process of science itself and our social relationship with science. 1040 00:53:00,160 --> 00:53:02,279 Speaker 3: To end by asking you one last question, and this 1041 00:53:02,360 --> 00:53:06,279 Speaker 3: is going to be the most controversial, politically charged question. 
1042 00:53:06,320 --> 00:53:08,560 Speaker 3: I'm going to ask you, if you had to choose, 1043 00:53:08,600 --> 00:53:11,360 Speaker 3: would you rather live in Virginia or California? 1044 00:53:11,520 --> 00:53:20,280 Speaker 6: Oh? Oh, oh, my gosh, Okay, I mean but California. 1045 00:53:22,080 --> 00:53:24,680 Speaker 3: Thank you, all right, excellent, you've come down on my 1046 00:53:24,800 --> 00:53:27,520 Speaker 3: side of the argument. I appreciate it from a professor 1047 00:53:27,680 --> 00:53:30,319 Speaker 3: in Virginia. All right, well, thank you very much for 1048 00:53:30,320 --> 00:53:32,160 Speaker 3: coming on the pod and talking to us about the 1049 00:53:32,160 --> 00:53:33,840 Speaker 3: process of science and discovery. 1050 00:53:33,960 --> 00:53:35,919 Speaker 5: Absolutely, thank you, great to talk. 1051 00:53:56,840 --> 00:53:59,160 Speaker 3: We're back and today we're talking about the process of 1052 00:53:59,400 --> 00:54:03,040 Speaker 3: scientific discovery. Up next, we have a fun interview with 1053 00:54:03,120 --> 00:54:06,520 Speaker 3: Professor Brian Keating, who's written a book about his interviews 1054 00:54:06,560 --> 00:54:10,280 Speaker 3: with Nobel Prize laureates. So it's my pleasure to welcome 1055 00:54:10,320 --> 00:54:14,240 Speaker 3: back to the podcast Professor Brian Keating. He's a cosmologist 1056 00:54:14,239 --> 00:54:17,480 Speaker 3: and a distinguished professor of physics at the University of California, 1057 00:54:17,560 --> 00:54:20,879 Speaker 3: San Diego. He's also the co-director of the Arthur C. 1058 00:54:20,880 --> 00:54:24,279 Speaker 3: Clarke Center for the Human Imagination. He's a principal investigator 1059 00:54:24,320 --> 00:54:27,640 Speaker 3: of the Simons Observatory, and he has a side hustle 1060 00:54:27,840 --> 00:54:30,520 Speaker 3: of writing books and doing podcasts.
He's the author of 1061 00:54:30,680 --> 00:54:35,080 Speaker 3: Losing the Nobel Prize and of Into the Impossible, Volume Two: 1062 00:54:35,160 --> 00:54:37,680 Speaker 3: Focused Like a Nobel Prize Winner, which we'll be talking about today, 1063 00:54:38,000 --> 00:54:41,000 Speaker 3: and he's the host of the Into the Impossible podcast. 1064 00:54:41,400 --> 00:54:44,239 Speaker 3: So he's one of those rare unicorns that both does 1065 00:54:44,280 --> 00:54:47,200 Speaker 3: physics research and talks about it to the public. Brian, 1066 00:54:47,360 --> 00:54:48,319 Speaker 3: welcome back to the pod. 1067 00:54:48,600 --> 00:54:50,920 Speaker 2: Ah, it's great to see you, you know, yeah, it's 1068 00:54:50,960 --> 00:54:52,040 Speaker 2: always great to be with you. 1069 00:54:52,160 --> 00:54:54,879 Speaker 3: Wonderful. Well, I really enjoyed reading your book. It's fascinating 1070 00:54:55,040 --> 00:54:58,400 Speaker 3: to hear these thoughts from all of these luminaries. My 1071 00:54:58,440 --> 00:55:00,319 Speaker 3: first question to you is a simple one, though, like, 1072 00:55:00,480 --> 00:55:03,480 Speaker 3: what is your secret for getting access to all these 1073 00:55:03,480 --> 00:55:07,319 Speaker 3: Nobel Prize winners? For young science journalists or aspiring podcasters 1074 00:55:07,360 --> 00:55:10,640 Speaker 3: out there, how do you manage to set up these conversations? 1075 00:55:11,000 --> 00:55:13,800 Speaker 2: Well, I think you know it's it's called the I 1076 00:55:13,840 --> 00:55:16,680 Speaker 2: think it's called the Matthew effect. As Saint Matthew said, 1077 00:55:17,360 --> 00:55:20,960 Speaker 2: the rich get richer effectively. So it started off, as 1078 00:55:21,000 --> 00:55:23,840 Speaker 2: you said, with the Arthur C.
Clarke Center for Human Imagination, 1079 00:55:24,560 --> 00:55:26,680 Speaker 2: and we were blessed here to have people like Freeman 1080 00:55:26,800 --> 00:55:31,080 Speaker 2: Dyson and you know, a local on staff, and so 1081 00:55:31,160 --> 00:55:33,200 Speaker 2: we just got to hang out. And to say he 1082 00:55:33,320 --> 00:55:35,680 Speaker 2: was my first guest on the podcast is pretty awesome. 1083 00:55:36,120 --> 00:55:39,400 Speaker 2: An expression I never thought would, you know, be 1084 00:55:40,000 --> 00:55:42,480 Speaker 2: a thing that I could say. And then after you know, 1085 00:55:42,520 --> 00:55:45,760 Speaker 2: getting people like him, then a Nobel laureate like Roger Penrose, 1086 00:55:45,760 --> 00:55:47,960 Speaker 2: who I knew before he was a Nobel laureate. 1087 00:55:48,080 --> 00:55:52,080 Speaker 7: Some say, you know, responsible for it, but that's just me. 1088 00:55:52,160 --> 00:55:55,280 Speaker 3: People are saying, yeah. 1089 00:55:54,239 --> 00:55:55,520 Speaker 7: The voices in my skull. 1090 00:55:56,160 --> 00:55:59,480 Speaker 2: And then other just great luminaries would come to give 1091 00:55:59,480 --> 00:56:02,799 Speaker 2: a colloquium, Barry Barish or, you know, people like that. 1092 00:56:03,400 --> 00:56:05,040 Speaker 2: And then I thought it was a real shame and 1093 00:56:05,080 --> 00:56:08,759 Speaker 2: a disservice to the University of California, the people that 1094 00:56:08,840 --> 00:56:12,240 Speaker 2: you and I serve so selflessly at such low wages, 1095 00:56:12,640 --> 00:56:16,080 Speaker 2: that we you know, wouldn't share that with the California 1096 00:56:16,120 --> 00:56:18,840 Speaker 2: taxpayers and with the locals that couldn't make it to campus. 1097 00:56:18,840 --> 00:56:22,560 Speaker 2: And so I decided to record audio and then later on 1098 00:56:22,719 --> 00:56:25,480 Speaker 2: made it into videos.
And then every time someone of 1099 00:56:25,520 --> 00:56:28,399 Speaker 2: a great stature, whether a Nobel laureate or not, would 1100 00:56:28,480 --> 00:56:30,919 Speaker 2: come by, I would ask them if they wouldn't mind 1101 00:56:30,920 --> 00:56:33,000 Speaker 2: sitting for an interview. Well, you know, half of them 1102 00:56:33,000 --> 00:56:35,920 Speaker 2: agreed to come on. Unfortunately I didn't have the opportunity. 1103 00:56:35,960 --> 00:56:38,640 Speaker 2: There were only I think there's only four living women 1104 00:56:38,760 --> 00:56:40,919 Speaker 2: who have won the Nobel Prize, and only I think 1105 00:56:40,920 --> 00:56:43,480 Speaker 2: two are American or three are American, and so it 1106 00:56:43,520 --> 00:56:45,520 Speaker 2: was hard to get, you know, them, especially because they 1107 00:56:45,640 --> 00:56:46,600 Speaker 2: they're sick of getting asked. 1108 00:56:46,640 --> 00:56:47,239 Speaker 7: So, what's it like? 1109 00:56:47,280 --> 00:56:49,960 Speaker 2: To be a woman, you know, so I try not 1110 00:56:50,040 --> 00:56:52,560 Speaker 2: to do that. So but for this volume, the second volume, 1111 00:56:53,000 --> 00:56:56,040 Speaker 2: I did get the opportunity to speak to Donna Strickland, 1112 00:56:56,040 --> 00:57:01,600 Speaker 2: who is an amazing experimentalist and hilarious and disarming and 1113 00:57:01,800 --> 00:57:06,680 Speaker 2: ultimately incredibly gracious. I've interviewed twenty-two so far, including 1114 00:57:06,800 --> 00:57:07,279 Speaker 2: I have his 1115 00:57:07,280 --> 00:57:08,160 Speaker 7: books somewhere around 1116 00:57:08,239 --> 00:57:12,600 Speaker 2: here, the guy who invented Viagra, doctor Louis Ignarro, who's 1117 00:57:12,600 --> 00:57:16,280 Speaker 2: at UCLA not far from you. And here's my twenty 1118 00:57:16,320 --> 00:57:19,440 Speaker 2: second interview.
And after the second group of nine, so eighteen, 1119 00:57:19,560 --> 00:57:21,760 Speaker 2: I decided I'd put out another volume. 1120 00:57:21,800 --> 00:57:23,480 Speaker 7: And that's where we're at today. 1121 00:57:23,880 --> 00:57:25,920 Speaker 3: So of all the people on the earth, or at 1122 00:57:25,960 --> 00:57:28,320 Speaker 3: least the people I know, you've probably spoken to more 1123 00:57:28,360 --> 00:57:30,000 Speaker 3: Nobel Prize winners than anybody. 1124 00:57:30,080 --> 00:57:30,760 Speaker 7: I think that's true. 1125 00:57:31,200 --> 00:57:33,120 Speaker 3: You know. I want to dig into in a minute 1126 00:57:33,160 --> 00:57:36,120 Speaker 3: what you think their methods have in common. But what 1127 00:57:36,160 --> 00:57:38,960 Speaker 3: do you think their moments of discovery have in common? 1128 00:57:39,000 --> 00:57:41,960 Speaker 3: Do you think they all share this like Eureka moment, 1129 00:57:42,080 --> 00:57:43,320 Speaker 3: or do you think in each case it was like 1130 00:57:43,320 --> 00:57:47,800 Speaker 3: a gradual understanding of this novel realization about the universe? 1131 00:57:48,080 --> 00:57:49,480 Speaker 3: What do those moments have in common? 1132 00:57:49,880 --> 00:57:52,840 Speaker 2: Yeah, I mean, there's just a cliche from Isaac Asimov 1133 00:57:52,960 --> 00:57:56,120 Speaker 2: that you know, a real scientist doesn't say eureka, because 1134 00:57:56,120 --> 00:57:59,120 Speaker 2: that kind of means "I have found it" in Greek, 1135 00:57:59,200 --> 00:58:01,560 Speaker 2: as we all know, and uh, and that means you 1136 00:58:01,640 --> 00:58:03,520 Speaker 2: found what you're looking for, which is the recipe for 1137 00:58:03,560 --> 00:58:06,160 Speaker 2: confirmation bias, which we're not supposed to fall victim to.
1138 00:58:07,080 --> 00:58:10,280 Speaker 2: So I think the you know, the reaction is more 1139 00:58:10,560 --> 00:58:12,840 Speaker 2: more often than not, you know what I call sheer 1140 00:58:12,960 --> 00:58:17,040 Speaker 2: terror of suspecting that you might be right, but with 1141 00:58:17,200 --> 00:58:20,760 Speaker 2: so little confidence and conviction that you could be wrong. 1142 00:58:21,240 --> 00:58:24,360 Speaker 2: And effectively that leads to this type of paralysis where 1143 00:58:24,360 --> 00:58:26,280 Speaker 2: you're like, not sure, and so what do you do 1144 00:58:26,320 --> 00:58:28,760 Speaker 2: as a good scientist? You just keep collecting data. And 1145 00:58:28,840 --> 00:58:31,560 Speaker 2: I think the thing that separates these individuals from you 1146 00:58:31,560 --> 00:58:35,280 Speaker 2: know me, I'll say, not you, but me, is that 1147 00:58:35,360 --> 00:58:37,880 Speaker 2: they, you know, kind of had this courage 1148 00:58:37,960 --> 00:58:40,840 Speaker 2: to be you know, to lean into the discovery and 1149 00:58:40,920 --> 00:58:43,840 Speaker 2: really you know, kind of reify it and make it, 1150 00:58:44,040 --> 00:58:46,480 Speaker 2: make it whole. And and I think that that kind 1151 00:58:46,520 --> 00:58:49,160 Speaker 2: of courage is rare. It's rare in individuals, let alone 1152 00:58:49,200 --> 00:58:52,240 Speaker 2: in scientists.
So I think that ability to see that 1153 00:58:52,280 --> 00:58:54,760 Speaker 2: they've done enough, that like the perfect is the enemy 1154 00:58:54,760 --> 00:58:57,280 Speaker 2: of the good enough, that you know, once you've established 1155 00:58:57,280 --> 00:58:59,480 Speaker 2: this thing, it's now on you to kind of then 1156 00:58:59,600 --> 00:59:03,520 Speaker 2: convert from scientist to what I call salesman mode, where 1157 00:59:03,680 --> 00:59:06,520 Speaker 2: you really have to convince other scientists that you're right, 1158 00:59:06,520 --> 00:59:08,240 Speaker 2: and it's not enough for you to think you're brilliant. 1159 00:59:08,560 --> 00:59:11,400 Speaker 2: I mean, only one person in this whole collection of 1160 00:59:11,400 --> 00:59:16,520 Speaker 2: twenty-two people has admitted to me that they deserved 1161 00:59:16,520 --> 00:59:18,520 Speaker 2: the Nobel Prize. Like, you know, there was something they 1162 00:59:18,560 --> 00:59:20,560 Speaker 2: were gunning for their whole life. They knew that they 1163 00:59:20,560 --> 00:59:23,200 Speaker 2: were going to win it. It was preternaturally preordained. 1164 00:59:23,320 --> 00:59:25,160 Speaker 3: Yeah, So what do you think these folks have done 1165 00:59:25,280 --> 00:59:28,440 Speaker 3: to prepare themselves for these moments, for these great discoveries? 1166 00:59:28,960 --> 00:59:31,760 Speaker 3: Is it just luck? Or have they sort of made themselves? 1167 00:59:31,760 --> 00:59:33,880 Speaker 3: Have they sort of set themselves up to be lucky? 1168 00:59:34,200 --> 00:59:38,240 Speaker 2: Some say that they are lucky; a lot just never 1169 00:59:38,360 --> 00:59:41,400 Speaker 2: stopped working on stuff, and the fact that they won 1170 00:59:41,520 --> 00:59:45,040 Speaker 2: a Nobel Prize was sort of incidental that it was, 1171 00:59:45,360 --> 00:59:47,920 Speaker 2: you know, it was something that they didn't plan on. 1172 00:59:48,040 --> 00:59:48,200 Speaker 4: You know.
1173 00:59:48,240 --> 00:59:51,480 Speaker 2: For example, I talked to Giorgio Parisi, who you know, 1174 00:59:51,520 --> 00:59:54,760 Speaker 2: won the Nobel Prize in part for, you know, predictions 1175 00:59:54,760 --> 00:59:59,400 Speaker 2: in theoretical physics ranging from spin glasses, what are called 1176 00:59:59,440 --> 01:00:04,640 Speaker 2: spin glasses, to, you know, kind of chaotic invariance and chaos theory. 1177 01:00:04,760 --> 01:00:06,680 Speaker 2: And there's a few people in the book that have 1178 01:00:06,960 --> 01:00:10,360 Speaker 2: relevance to chaos theory. And so it's almost impossible to 1179 01:00:11,120 --> 01:00:13,520 Speaker 2: kind of predict that I'm gonna, you know, go out 1180 01:00:13,520 --> 01:00:15,200 Speaker 2: and solve this thing that has to do with how 1181 01:00:15,200 --> 01:00:18,400 Speaker 2: these birds called starlings migrate, and how they flock and 1182 01:00:18,960 --> 01:00:22,200 Speaker 2: the behavior and the phase transitions that they exhibit after 1183 01:00:22,280 --> 01:00:26,640 Speaker 2: working on the SO(10) symmetry group, after looking at spin 1184 01:00:26,720 --> 01:00:29,760 Speaker 2: glasses and so forth. So a lot of them have 1185 01:00:29,840 --> 01:00:33,600 Speaker 2: these very tortured paths to the Nobel Prize. But their 1186 01:00:33,640 --> 01:00:38,000 Speaker 2: intellects are just of such a magnitude that it's 1187 01:00:38,080 --> 01:00:39,760 Speaker 2: sort of obvious. 1188 01:00:39,440 --> 01:00:41,520 Speaker 7: In hindsight that they would get to this level. 1189 01:00:41,880 --> 01:00:44,680 Speaker 3: And what about their daily habits? Is there anything they 1190 01:00:44,680 --> 01:00:47,160 Speaker 3: have in common?
You know, do they all start with 1191 01:00:47,200 --> 01:00:50,480 Speaker 3: the same super espresso or do they all you know, 1192 01:00:50,640 --> 01:00:54,640 Speaker 3: like block out time for themselves, or is there anything 1193 01:00:54,640 --> 01:00:57,040 Speaker 3: there that's like very concrete that we can extract from 1194 01:00:57,040 --> 01:00:57,640 Speaker 3: their success? 1195 01:00:58,200 --> 01:01:02,080 Speaker 2: Unfortunately, no, there's no like you know, special cereal, you know, 1196 01:01:04,520 --> 01:01:06,200 Speaker 2: Wheaties or sweeties or. 1197 01:01:06,160 --> 01:01:06,800 Speaker 7: Something like that. 1198 01:01:07,920 --> 01:01:11,000 Speaker 2: But there are, you know, kind of traits, I would 1199 01:01:11,040 --> 01:01:15,040 Speaker 2: say, though I wouldn't say necessarily habits, although they all have 1200 01:01:15,200 --> 01:01:19,680 Speaker 2: this you know, kind of chimeric ability to be incredibly 1201 01:01:20,360 --> 01:01:23,200 Speaker 2: joyous when they're working. It's not a drudge. It's not 1202 01:01:23,400 --> 01:01:26,840 Speaker 2: something that they do that's tedious. And I 1203 01:01:26,880 --> 01:01:29,960 Speaker 2: found it, you know, a little bit depressing, because what 1204 01:01:30,000 --> 01:01:33,960 Speaker 2: we do as experimental scientists working on big projects. Almost 1205 01:01:34,000 --> 01:01:36,200 Speaker 2: none of it, at least in my experiences, has to 1206 01:01:36,240 --> 01:01:39,000 Speaker 2: do with physics.
I mean, yesterday, we were on a telecon 1207 01:01:39,120 --> 01:01:41,160 Speaker 2: with my fellow, you know, kind of co-leaders, and 1208 01:01:41,200 --> 01:01:44,800 Speaker 2: we're talking about like how to get these louvers that 1209 01:01:44,960 --> 01:01:49,240 Speaker 2: open on the generators that power the Simons Observatory's telescope 1210 01:01:49,280 --> 01:01:53,000 Speaker 2: motor platforms when they get clogged with snow and you 1211 01:01:53,040 --> 01:01:56,520 Speaker 2: want to you know, ingest the right volume of air 1212 01:01:56,640 --> 01:01:58,919 Speaker 2: to cool the turbines. You have to cool them even 1213 01:01:58,920 --> 01:02:02,000 Speaker 2: though, you know, eight meters of snow fell this year, 1214 01:02:02,040 --> 01:02:04,320 Speaker 2: you know, and so it's just like, oh, the concrete, 1215 01:02:04,400 --> 01:02:08,160 Speaker 2: you know, contractor's on strike in Chile, which happens you know, 1216 01:02:08,560 --> 01:02:10,840 Speaker 2: once a month, it's the season or whatever, and then 1217 01:02:10,920 --> 01:02:13,040 Speaker 2: we have to deal with so it's rare that we 1218 01:02:13,080 --> 01:02:16,440 Speaker 2: get to spend time like thinking about the cosmic microwave background. 1219 01:02:16,840 --> 01:02:20,040 Speaker 2: And so I think the tenacity, the intellectual rigor, and 1220 01:02:20,120 --> 01:02:22,640 Speaker 2: the desire to lean into teaching and service and giving 1221 01:02:22,680 --> 01:02:24,720 Speaker 2: back after the prize, And I'm sure they did before 1222 01:02:24,760 --> 01:02:27,640 Speaker 2: the prize too, But that's a commonality I observe in 1223 01:02:27,680 --> 01:02:30,560 Speaker 2: their current state as I got to observe them, collapsed 1224 01:02:30,560 --> 01:02:31,240 Speaker 2: in that way for me. 1225 01:02:31,800 --> 01:02:34,560 Speaker 3: And you draw another lesson from all of their experiences.
1226 01:02:34,680 --> 01:02:36,160 Speaker 3: I mean, it's in the title of your book and 1227 01:02:36,200 --> 01:02:38,560 Speaker 3: you go into it in great detail. You think that 1228 01:02:38,600 --> 01:02:41,320 Speaker 3: folks should focus on a topic, that we should go 1229 01:02:41,480 --> 01:02:45,440 Speaker 3: deep instead of broad as scientists. That sort of clashes 1230 01:02:45,480 --> 01:02:48,760 Speaker 3: with some historical trends, right, folks like Gauss or Newton, 1231 01:02:49,040 --> 01:02:51,600 Speaker 3: you know, they were extremely broad. Why do you think that 1232 01:02:51,680 --> 01:02:54,840 Speaker 3: today scientists have to focus, have to be deeper? 1233 01:02:55,200 --> 01:02:58,280 Speaker 2: Yeah, I think it's the fields and the amount of 1234 01:02:58,320 --> 01:03:03,280 Speaker 2: knowledge has expanded so much that it's basically impossible 1235 01:03:03,400 --> 01:03:07,720 Speaker 2: even when you focus on one subfield, sub-subfield 1236 01:03:07,760 --> 01:03:10,800 Speaker 2: or, you know, sub-subfield of a subfield, and to do that, 1237 01:03:11,040 --> 01:03:13,080 Speaker 2: you know, it's easier to do that in one field 1238 01:03:13,160 --> 01:03:16,160 Speaker 2: obviously than it is in many, and I think it's 1239 01:03:16,200 --> 01:03:19,520 Speaker 2: incredibly fascinating when you see that they could do so 1240 01:03:19,600 --> 01:03:21,960 Speaker 2: many other things. You know, Reinhard Genzel is a 1241 01:03:22,000 --> 01:03:24,240 Speaker 2: great example, like he could have done anything. He actually 1242 01:03:24,240 --> 01:03:28,520 Speaker 2: could have been an Olympic athlete. He was an incredible athlete. 1243 01:03:28,600 --> 01:03:32,520 Speaker 2: His father was very into physical sports in Germany. And 1244 01:03:32,800 --> 01:03:35,040 Speaker 2: you know, he could have done a lot of things, 1245 01:03:35,200 --> 01:03:39,680 Speaker 2: not just in or outside of science, or in technology and optics.
1246 01:03:39,720 --> 01:03:42,680 Speaker 2: You know, he really pioneered along with our colleague in 1247 01:03:42,680 --> 01:03:46,120 Speaker 2: the University of California, Andrea Ghez, this concept, 1248 01:03:46,320 --> 01:03:50,400 Speaker 2: application rather, of adaptive optics to using the black 1249 01:03:50,400 --> 01:03:53,040 Speaker 2: hole in the Milky Way's center as a laboratory to 1250 01:03:53,040 --> 01:03:54,200 Speaker 2: test general relativity. 1251 01:03:54,440 --> 01:03:55,480 Speaker 7: So there's so many things there. 1252 01:03:55,480 --> 01:03:57,280 Speaker 2: He could have gotten into optics, he could have gotten 1253 01:03:57,320 --> 01:04:02,160 Speaker 2: into general relativity, experimental, he could have done more stuff. 1254 01:04:02,280 --> 01:04:04,240 Speaker 7: But he could have also gone into. 1255 01:04:04,360 --> 01:04:07,400 Speaker 2: You know, DARPA, and you know, where they use these 1256 01:04:07,560 --> 01:04:11,320 Speaker 2: same techniques, for example, in adaptive optics, where 1257 01:04:11,320 --> 01:04:14,480 Speaker 2: we have these deformable mirrors that compensate for the distortion 1258 01:04:14,640 --> 01:04:19,600 Speaker 2: of the Earth's lens-like atmosphere that causes stars to twinkle, twinkle, 1259 01:04:19,840 --> 01:04:20,400 Speaker 2: Little Star. 1260 01:04:21,160 --> 01:04:22,120 Speaker 7: He could have applied that. 1261 01:04:22,080 --> 01:04:25,120 Speaker 2: As they do now to like sniper scopes, you know, 1262 01:04:25,120 --> 01:04:27,880 Speaker 2: which is an application of adaptive optics that you know 1263 01:04:28,000 --> 01:04:31,320 Speaker 2: is for military purposes. But there's many other things, artificial guide stars. 1264 01:04:31,520 --> 01:04:31,960 Speaker 7: A lot of the
1265 01:04:31,880 --> 01:04:35,560 Speaker 2: Technology was classified, you know in the US at least, 1266 01:04:35,920 --> 01:04:37,800 Speaker 2: so he could have done a multitude of things. 1267 01:04:37,840 --> 01:04:39,280 Speaker 7: But that's really what he's done. 1268 01:04:39,440 --> 01:04:42,360 Speaker 2: And I think you know it's but it's he's only 1269 01:04:42,560 --> 01:04:46,640 Speaker 2: he's the literal next generation after Charles Townes, also a 1270 01:04:46,720 --> 01:04:50,040 Speaker 2: UC, you know, professor, a fellow of ours, and 1271 01:04:50,200 --> 01:04:54,800 Speaker 2: he, you know, is known for extremely broad knowledge. I mean, 1272 01:04:54,840 --> 01:04:58,240 Speaker 2: he credits his ability to blow glass. I don't 1273 01:04:58,240 --> 01:05:00,080 Speaker 2: know if you knew that he went to, like, 1274 01:05:00,480 --> 01:05:04,240 Speaker 2: some small school where he had to, like, blow glass for, you know, 1275 01:05:04,640 --> 01:05:07,520 Speaker 2: champagne bottles, and then that became very useful in making 1276 01:05:08,080 --> 01:05:13,440 Speaker 2: vacuum tubes for you know, eventually creating rarefied gas vials 1277 01:05:13,560 --> 01:05:17,560 Speaker 2: that were then used to do maser stimulation. That led 1278 01:05:17,600 --> 01:05:19,920 Speaker 2: to the maser and then the laser, and then he 1279 01:05:19,960 --> 01:05:23,440 Speaker 2: got into like looking for aliens and optical search for 1280 01:05:23,480 --> 01:05:28,320 Speaker 2: extraterrestrial intelligence and adaptive optics, just incredible. So that was 1281 01:05:28,360 --> 01:05:32,440 Speaker 2: one generation between him and his advisee, his student 1282 01:05:33,000 --> 01:05:36,440 Speaker 2: Reinhard Genzel and yet you know he could do it 1283 01:05:36,520 --> 01:05:40,080 Speaker 2: and it was I don't think Reinhard's less intelligent.
So yeah, 1284 01:05:40,120 --> 01:05:42,320 Speaker 2: from my perspective, I think there's just so much to 1285 01:05:42,400 --> 01:05:46,560 Speaker 2: know now, So it's hard to focus because of so many distractions. 1286 01:05:46,720 --> 01:05:48,600 Speaker 2: I'm not talking about outside the lab, I'm talking about 1287 01:05:48,640 --> 01:05:49,680 Speaker 2: inside your own field. 1288 01:05:50,120 --> 01:05:50,960 Speaker 7: How do you focus? 1289 01:05:51,000 --> 01:05:53,960 Speaker 2: And I like this acronym that people you know use 1290 01:05:54,080 --> 01:05:56,000 Speaker 2: that you know, focus should be thought of as an 1291 01:05:56,040 --> 01:06:01,760 Speaker 2: acronym for follow one course until success, and I wish 1292 01:06:01,840 --> 01:06:03,640 Speaker 2: I had done that. You know, I'm glad that I 1293 01:06:03,640 --> 01:06:06,720 Speaker 2: have kind of a broad education, not just within 1294 01:06:06,800 --> 01:06:09,920 Speaker 2: physics but outside of physics. But I think there's a 1295 01:06:09,960 --> 01:06:13,000 Speaker 2: much greater path to success in doing something that only 1296 01:06:13,040 --> 01:06:13,600 Speaker 2: you can do.
1303 01:06:28,680 --> 01:06:30,640 Speaker 3: So you know, if I had focused too early, I'd 1304 01:06:30,640 --> 01:06:33,160 Speaker 3: be doing fusion research right now and promising you know, 1305 01:06:33,160 --> 01:06:36,160 Speaker 3: the tokamak would turn on and ignite next year for 1306 01:06:36,200 --> 01:06:38,600 Speaker 3: the last ten years. So how do you know when 1307 01:06:38,680 --> 01:06:39,240 Speaker 3: to focus? 1308 01:06:39,640 --> 01:06:43,240 Speaker 2: So I think maybe you wouldn't in the sense that 1309 01:06:43,600 --> 01:06:46,840 Speaker 2: you weren't really able to focus at the level of say, 1310 01:06:47,360 --> 01:06:50,520 Speaker 2: you know, Michael Jordan practicing a thousand jump shots after 1311 01:06:50,560 --> 01:06:51,880 Speaker 2: every game or something like that. 1312 01:06:51,960 --> 01:06:52,160 Speaker 7: You know. 1313 01:06:52,760 --> 01:06:56,360 Speaker 2: In other words, you had to find your path and 1314 01:06:56,440 --> 01:06:59,080 Speaker 2: then you followed the course that led to success. In 1315 01:06:59,120 --> 01:07:02,400 Speaker 2: your case, it was particle physics. I also started off 1316 01:07:02,480 --> 01:07:05,360 Speaker 2: wanting to be a condensed matter theorist. God forbid, 1317 01:07:05,760 --> 01:07:09,800 Speaker 2: you know. Now I'm an experimental cosmologist. But I think 1318 01:07:09,960 --> 01:07:12,960 Speaker 2: a lot of my success, or at least my ability 1319 01:07:13,000 --> 01:07:15,920 Speaker 2: to maintain it, is not about the subject at all. I 1320 01:07:15,960 --> 01:07:18,400 Speaker 2: think it has nothing to do with the subject. So from 1321 01:07:18,440 --> 01:07:22,520 Speaker 2: my perspective, the focus of the book is to implement 1322 01:07:22,640 --> 01:07:26,280 Speaker 2: skills and tactics and habits and strategies so that you 1323 01:07:26,320 --> 01:07:27,640 Speaker 2: can become an expert.
1324 01:07:28,000 --> 01:07:31,280 Speaker 3: So there's this lore about big discoveries. I've often heard 1325 01:07:31,320 --> 01:07:35,040 Speaker 3: people say that you can't make paradigm-shifting discoveries after 1326 01:07:35,080 --> 01:07:38,760 Speaker 3: you're thirty or something. So in your experience talking to folks, 1327 01:07:38,800 --> 01:07:41,120 Speaker 3: did they make these discoveries when they were young or 1328 01:07:41,200 --> 01:07:44,200 Speaker 3: is it after like decades of focusing and refining and 1329 01:07:44,240 --> 01:07:46,720 Speaker 3: coming to the edge of the field that they've made 1330 01:07:46,760 --> 01:07:47,600 Speaker 3: their discoveries? 1331 01:07:48,000 --> 01:07:50,240 Speaker 7: Most of them did make it as young people. 1332 01:07:51,760 --> 01:07:52,240 Speaker 3: Yeah. 1333 01:07:52,280 --> 01:07:55,400 Speaker 2: Well, here's the thing though. First, I'd say, 1334 01:07:55,480 --> 01:07:58,320 Speaker 2: I think that also correlates with what I said earlier, 1335 01:07:58,360 --> 01:08:00,200 Speaker 2: that you know, you want to get on course early 1336 01:08:00,760 --> 01:08:04,320 Speaker 2: in life. I don't necessarily correlate it with age as 1337 01:08:04,360 --> 01:08:07,680 Speaker 2: well as I do with thinking of yourself as a professional. 1338 01:08:08,080 --> 01:08:10,720 Speaker 2: So getting on track early is I think, you know, 1339 01:08:10,760 --> 01:08:13,360 Speaker 2: a cornerstone.
So they all got on track. Some 1340 01:08:13,440 --> 01:08:15,800 Speaker 2: of them did, you know, kind of branch out either 1341 01:08:16,120 --> 01:08:19,200 Speaker 2: after or at the same time, you know, most notably, 1342 01:08:19,280 --> 01:08:22,000 Speaker 2: you know, I think Kip Thorne is probably the exception 1343 01:08:22,120 --> 01:08:23,880 Speaker 2: in that he really did the work that won him 1344 01:08:23,880 --> 01:08:26,599 Speaker 2: the Nobel Prize in his fifties, you know, if you'd 1345 01:08:26,640 --> 01:08:29,920 Speaker 2: think about it. But the groundwork was laid in his 1346 01:08:30,120 --> 01:08:32,960 Speaker 2: twenties and thirties, so I think that's important to know. 1347 01:08:33,080 --> 01:08:35,040 Speaker 2: But the way that they get to it, everyone 1348 01:08:35,080 --> 01:08:36,639 Speaker 2: gets to Sweden in a different way. 1349 01:08:37,920 --> 01:08:40,120 Speaker 3: So talking to all these folks, has it changed the 1350 01:08:40,120 --> 01:08:41,360 Speaker 3: way that you do science? 1351 01:08:41,720 --> 01:08:44,680 Speaker 2: Well, first of all, I had an unhealthy obsession with 1352 01:08:44,720 --> 01:08:47,600 Speaker 2: the Nobel Prize as a as a kid, as a 1353 01:08:47,640 --> 01:08:50,639 Speaker 2: young scientist, as I wrote about in my first book, 1354 01:08:50,840 --> 01:08:53,840 Speaker 2: Losing the Nobel Prize, which is a memoir about, you know, 1355 01:08:53,960 --> 01:08:58,440 Speaker 2: the BICEP affair of thinking we discovered cosmic inflationary gravitational 1356 01:08:58,479 --> 01:09:01,240 Speaker 2: waves and then having to retract that, and then 1357 01:09:01,680 --> 01:09:04,320 Speaker 2: you know the aftermath of that, biting the dust as 1358 01:09:04,360 --> 01:09:06,320 Speaker 2: you called it, very painfully. 1359 01:09:06,360 --> 01:09:08,400 Speaker 7: So damn it. I'm still smarting from that.
1360 01:09:08,720 --> 01:09:11,840 Speaker 2: But in reality, yeah, how do you recover? How do 1361 01:09:11,880 --> 01:09:15,360 Speaker 2: you do science? How do you compete with your colleagues, 1362 01:09:15,439 --> 01:09:17,960 Speaker 2: and all sorts of nasty stuff about science that you 1363 01:09:18,000 --> 01:09:20,160 Speaker 2: don't really ever get to see, because science is always 1364 01:09:20,200 --> 01:09:22,719 Speaker 2: presented as, you know, so-and-so had this brilliant idea, 1365 01:09:22,720 --> 01:09:24,479 Speaker 2: and then so-and-so won the Nobel Prize, and 1366 01:09:24,479 --> 01:09:26,479 Speaker 2: then this is now how we teach it. Even our 1367 01:09:26,560 --> 01:09:30,080 Speaker 2: labs at UCSD, I'm sure Irvine's too, you know, we're 1368 01:09:30,120 --> 01:09:32,680 Speaker 2: teaching, here's a Nobel Prize winning experiment. Some of 1369 01:09:32,720 --> 01:09:35,879 Speaker 2: these things took forty years to get to work, and 1370 01:09:35,920 --> 01:09:38,840 Speaker 2: we just do it in an afternoon. So I think 1371 01:09:39,200 --> 01:09:41,720 Speaker 2: it's changed my opinion in that I don't venerate it. I 1372 01:09:41,760 --> 01:09:45,080 Speaker 2: don't venerate the prize. The people are impressive, but they're 1373 01:09:45,120 --> 01:09:47,920 Speaker 2: just people, a lot of them. Barry Barish wrote 1374 01:09:47,920 --> 01:09:51,080 Speaker 2: the foreword to the first volume, Think Like a Nobel Prize Winner, 1375 01:09:51,120 --> 01:09:55,200 Speaker 2: and you know, he said that he had the imposter 1376 01:09:55,360 --> 01:09:58,720 Speaker 2: syndrome even worse after he won the Nobel Prize than 1377 01:09:58,760 --> 01:09:59,760 Speaker 2: he did before it. 1378 01:10:00,120 --> 01:10:01,960 Speaker 7: I said, what are you talking about? He said, well, 1379 01:10:02,120 --> 01:10:02,720 Speaker 7: when you win the prize.
1380 01:10:02,680 --> 01:10:05,880 Speaker 2: Keating, you'll never know this feeling. But you 1381 01:10:05,920 --> 01:10:09,799 Speaker 2: go to Stockholm and you get this huge gold medal, 1382 01:10:09,920 --> 01:10:13,040 Speaker 2: like, you know, Flavor Flav, and they want to make 1383 01:10:13,080 --> 01:10:16,040 Speaker 2: sure that you confirm that you got the prize due 1384 01:10:16,080 --> 01:10:18,519 Speaker 2: to you, and so they make you sign this book, 1385 01:10:18,560 --> 01:10:21,639 Speaker 2: this ledger that has every single Nobel laureate in physics 1386 01:10:21,640 --> 01:10:24,559 Speaker 2: back to the beginning in nineteen o one, with the 1387 01:10:24,640 --> 01:10:28,000 Speaker 2: discovery of the X-ray by Wilhelm Röntgen. And so 1388 01:10:28,600 --> 01:10:32,120 Speaker 2: Barry tells me in, you know, twenty twenty that 1389 01:10:32,200 --> 01:10:34,680 Speaker 2: when he won it in twenty seventeen, he went there, 1390 01:10:34,720 --> 01:10:37,160 Speaker 2: and he's a curious guy, and he turns the pages 1391 01:10:37,200 --> 01:10:39,679 Speaker 2: in the book and he sees, you know, oh my god, 1392 01:10:39,720 --> 01:10:42,920 Speaker 2: there's Feynman. Oh my god, there's, you know, 1393 01:10:43,040 --> 01:10:47,280 Speaker 2: Madame Curie. Oh my god, there's Einstein. And he said 1394 01:10:47,280 --> 01:10:49,400 Speaker 2: that he saw Einstein's signature. He said, I don't deserve 1395 01:10:49,439 --> 01:10:51,880 Speaker 2: to be in the same book as him, let alone, 1396 01:10:51,920 --> 01:10:54,519 Speaker 2: you know, be in the same mention as him. And 1397 01:10:54,560 --> 01:10:56,800 Speaker 2: I said, Barry, I've got good news and more good news 1398 01:10:56,800 --> 01:11:00,000 Speaker 2: for you. First of all, Einstein had the imposter syndrome. 1399 01:11:00,320 --> 01:11:02,759 Speaker 2: And he's like, what are you talking about? Einstein?
1400 01:11:02,840 --> 01:11:06,759 Speaker 2: I told him, Einstein wrote that Isaac Newton contributed more 1401 01:11:07,240 --> 01:11:11,400 Speaker 2: to civilization than even he did to science, and that his 1402 01:11:11,560 --> 01:11:14,920 Speaker 2: contributions to science will never be matched again. And I said, 1403 01:11:14,920 --> 01:11:17,920 Speaker 2: but that's not all, Barry, because Isaac Newton had the 1404 01:11:17,960 --> 01:11:20,679 Speaker 2: imposter syndrome. And now he's like, oh, you've got to be kidding. 1405 01:11:21,840 --> 01:11:24,400 Speaker 2: And then I told him, no, actually, Isaac 1406 01:11:24,439 --> 01:11:26,800 Speaker 2: Newton felt that he utterly failed to live up to 1407 01:11:27,400 --> 01:11:29,240 Speaker 2: the standard set by his hero. 1408 01:11:29,280 --> 01:11:30,799 Speaker 7: Wow. Jesus Christ. 1409 01:11:30,960 --> 01:11:34,479 Speaker 2: Okay. In fact, he reputedly died a virgin in order 1410 01:11:34,520 --> 01:11:36,400 Speaker 2: to emulate him the only way he could. You know, 1411 01:11:36,720 --> 01:11:40,280 Speaker 2: he couldn't turn loaves into fishes, water into wine, but he 1412 01:11:40,320 --> 01:11:42,200 Speaker 2: could die a virgin, and in fact he did. 1413 01:11:42,520 --> 01:11:43,720 Speaker 7: But some say that was because of his 1414 01:11:43,720 --> 01:11:48,920 Speaker 3: personality, like the first incel. That's right. 1415 01:11:48,960 --> 01:11:50,160 Speaker 7: So it's changed me, 1416 01:11:50,280 --> 01:11:53,439 Speaker 2: I think, to not venerate the prize as much as 1417 01:11:53,479 --> 01:11:55,639 Speaker 2: I did. I think it's a type of idolatry. So one
1418 01:11:55,560 --> 01:11:57,559 Speaker 3: of the things I like about your book is that 1419 01:11:57,640 --> 01:12:00,839 Speaker 3: you look forward to the next generations and you imagine 1420 01:12:00,840 --> 01:12:03,880 Speaker 3: that young people are reading it and thinking about their careers. 1421 01:12:04,479 --> 01:12:06,960 Speaker 3: So what is the takeaway for young readers? If you 1422 01:12:06,960 --> 01:12:10,120 Speaker 3: have to give them one piece of advice to aspiring scientists, 1423 01:12:10,240 --> 01:12:13,679 Speaker 3: or you're talking to your graduate students or prospective grad students, 1424 01:12:13,800 --> 01:12:16,000 Speaker 3: what do you advise them about how to chart a 1425 01:12:16,040 --> 01:12:18,840 Speaker 3: path through a changing field, you know, which is different 1426 01:12:18,840 --> 01:12:20,720 Speaker 3: from the field that we grew up in and will 1427 01:12:20,760 --> 01:12:22,800 Speaker 3: be different in twenty years? What is your advice to 1428 01:12:22,840 --> 01:12:23,559 Speaker 3: the next generation? 1429 01:12:23,880 --> 01:12:24,800 Speaker 7: That's exactly right. 1430 01:12:24,960 --> 01:12:27,840 Speaker 2: So for me, it all comes down to conservation laws, 1431 01:12:28,080 --> 01:12:31,479 Speaker 2: in this case, conservation of energy, you know, energy, time, 1432 01:12:31,560 --> 01:12:33,880 Speaker 2: whatever you want to say. And it's very hard, 1433 01:12:34,040 --> 01:12:36,960 Speaker 2: but if you concentrate and you conserve, it's a form 1434 01:12:37,000 --> 01:12:40,200 Speaker 2: of focus, right. I mean, you take a magnifying glass, 1435 01:12:40,200 --> 01:12:42,519 Speaker 2: you take a light, you can concentrate the sunlight and 1436 01:12:43,240 --> 01:12:46,360 Speaker 2: burn up those little worms. No, no, I'm just kidding. 1437 01:12:46,520 --> 01:12:49,559 Speaker 2: I never do that out there.
PETA. But you can 1438 01:12:49,640 --> 01:12:51,559 Speaker 2: melt an army man, right? You ever do that? 1439 01:12:51,960 --> 01:12:55,040 Speaker 7: Yeah. But you can't just hold them up in the sun. 1440 01:12:55,120 --> 01:12:55,280 Speaker 3: Right. 1441 01:12:55,320 --> 01:12:57,240 Speaker 2: So you have to concentrate, you have to focus, you 1442 01:12:57,320 --> 01:13:00,479 Speaker 2: have to conserve it and narrow down. So for me, 1443 01:13:00,560 --> 01:13:04,840 Speaker 2: it's prioritization. What is the most important thing on your plate? 1444 01:13:04,960 --> 01:13:08,800 Speaker 2: Like, do the hardest tasks that are most necessary. You know, 1445 01:13:08,840 --> 01:13:12,599 Speaker 2: there's this Eisenhower matrix framework, important and urgent, you know, 1446 01:13:12,640 --> 01:13:14,640 Speaker 2: and whatever 1447 01:13:14,320 --> 01:13:16,360 Speaker 7: different spectrum of tasks. 1448 01:13:16,400 --> 01:13:18,960 Speaker 2: And you know, for me, it's like the most important 1449 01:13:18,960 --> 01:13:20,800 Speaker 2: thing I think a young person can do is to 1450 01:13:20,840 --> 01:13:24,080 Speaker 2: say no, because the better you are. You know, 1451 01:13:24,320 --> 01:13:26,600 Speaker 2: there's a saying in the business world: if you 1452 01:13:26,640 --> 01:13:29,160 Speaker 2: want something done right, ask someone who's too busy to 1453 01:13:29,240 --> 01:13:31,960 Speaker 2: do it, because they're the ones that are. And you 1454 01:13:32,040 --> 01:13:34,439 Speaker 2: know this, there's like only a handful of people on 1455 01:13:34,479 --> 01:13:38,240 Speaker 2: an experiment that really do, you know, ninety percent of 1456 01:13:38,240 --> 01:13:40,320 Speaker 2: the work.
There might be ten percent that do ninety 1457 01:13:40,320 --> 01:13:44,160 Speaker 2: percent of the work. And those people, they're so oversubscribed that their 1458 01:13:44,280 --> 01:13:47,400 Speaker 2: energy is so drained, or so distracted, and they're so, 1459 01:13:48,040 --> 01:13:51,400 Speaker 2: you know, kind of torn by their eagerness to please 1460 01:13:52,320 --> 01:13:55,519 Speaker 2: that they don't set boundaries. And so I really do 1461 01:13:55,720 --> 01:13:59,360 Speaker 2: tell my students to concentrate, conserve, focus, whatever you want 1462 01:13:59,400 --> 01:14:03,280 Speaker 2: to say, on energy, and do that by having appropriate 1463 01:14:03,280 --> 01:14:05,639 Speaker 2: boundaries in time and in space. 1464 01:14:05,760 --> 01:14:08,320 Speaker 3: All right, well, thanks very much. The book is called 1465 01:14:08,680 --> 01:14:12,679 Speaker 3: Into the Impossible, Volume Two: Focus Like a Nobel Prize Winner. 1466 01:14:12,920 --> 01:14:14,960 Speaker 3: Thanks very much for coming and telling us about all 1467 01:14:14,960 --> 01:14:18,400 Speaker 3: the wisdom you've gleaned from all of these success stories. 1468 01:14:18,600 --> 01:14:19,320 Speaker 7: Thank you, Daniel. 1469 01:14:19,360 --> 01:14:21,280 Speaker 2: I want to give back to your audience too, because 1470 01:14:21,280 --> 01:14:23,400 Speaker 2: I love the audience, and your audience is kind of 1471 01:14:23,400 --> 01:14:26,519 Speaker 2: a key demographic. So for people that do get a 1472 01:14:26,560 --> 01:14:29,599 Speaker 2: copy of this book, if you're in academia, I love 1473 01:14:29,640 --> 01:14:30,760 Speaker 2: to give out these meteorites. 1474 01:14:30,760 --> 01:14:32,120 Speaker 7: I think I've given them to Daniel. 1475 01:14:32,360 --> 01:14:33,439 Speaker 3: I have one here on my shelf.
1476 01:14:33,479 --> 01:14:36,080 Speaker 2: Yes, give them to your kids. And so to get 1477 01:14:36,080 --> 01:14:39,520 Speaker 2: one, if you're in academia, like my ideal target demographic, 1478 01:14:39,960 --> 01:14:43,240 Speaker 2: just go to Brian Keating dot com slash edu and 1479 01:14:43,640 --> 01:14:45,639 Speaker 2: sign up for my mailing list, which I send out 1480 01:14:45,640 --> 01:14:49,000 Speaker 2: every Monday with some cool stuff, including appearances like this 1481 01:14:49,200 --> 01:14:52,080 Speaker 2: and thoughts on academia and life as a scientist, et cetera. 1482 01:14:52,520 --> 01:14:54,960 Speaker 2: So Brian Keating dot com slash edu with your edu 1483 01:14:55,000 --> 01:14:57,519 Speaker 2: email address, and if you live in the USA, you 1484 01:14:57,560 --> 01:15:00,679 Speaker 2: will get one of these beauties that was delivered by gravity, 1485 01:15:00,760 --> 01:15:02,120 Speaker 2: not the US Postal Service. 1486 01:15:02,200 --> 01:15:05,679 Speaker 3: I will deliver it. Amazing. And you can also catch 1487 01:15:05,680 --> 01:15:09,599 Speaker 3: Brian on his podcast, Into the Impossible. All right, thanks 1488 01:15:09,680 --> 01:15:11,080 Speaker 3: very much, Brian. Thanks, Dan. 1489 01:15:18,160 --> 01:15:22,000 Speaker 1: Daniel and Kelly's Extraordinary Universe is produced by iHeartRadio. We 1490 01:15:22,040 --> 01:15:23,439 Speaker 1: would love to hear from you. 1491 01:15:23,560 --> 01:15:26,519 Speaker 3: We really would. We want to know what questions you 1492 01:15:26,720 --> 01:15:29,360 Speaker 3: have about this extraordinary universe. 1493 01:15:29,439 --> 01:15:32,400 Speaker 1: We want to know your thoughts on recent shows, suggestions 1494 01:15:32,400 --> 01:15:35,400 Speaker 1: for future shows. If you contact us, we will get 1495 01:15:35,439 --> 01:15:35,840 Speaker 1: back to you. 1496 01:15:36,120 --> 01:15:39,639 Speaker 3: We really mean it. We answer every message.
Email us 1497 01:15:39,680 --> 01:15:42,880 Speaker 3: at questions at Daniel and Kelly 1498 01:15:41,960 --> 01:15:44,040 Speaker 1: dot org, or you can find us on social media. 1499 01:15:44,120 --> 01:15:47,920 Speaker 1: We have accounts on X, Instagram, Blue Sky, and on 1500 01:15:48,000 --> 01:15:49,960 Speaker 1: all of those platforms you can find us at D 1501 01:15:50,400 --> 01:15:51,920 Speaker 1: and K Universe. 1502 01:15:52,160 --> 01:15:53,960 Speaker 3: Don't be shy. Write to us.