1 00:00:07,320 --> 00:00:10,240 Speaker 1: So you know, the gold standard in science, the absolute 2 00:00:10,280 --> 00:00:12,840 Speaker 1: pinnacle you can reach in your career, of course, is 3 00:00:12,880 --> 00:00:16,720 Speaker 1: the Nobel Prize. Once you get one, people think that 4 00:00:16,760 --> 00:00:18,919 Speaker 1: you can't make a mistake, that you're a genius, that 5 00:00:19,000 --> 00:00:21,200 Speaker 1: everything you do is fantastic. It's like a stamp they 6 00:00:21,320 --> 00:00:24,400 Speaker 1: just put on your forehead that says genius. That's right. But 7 00:00:24,480 --> 00:00:29,360 Speaker 1: here's a question. Okay, so can the Nobel Prize be wrong? 8 00:00:30,400 --> 00:00:33,800 Speaker 1: I'm gasping at even the idea there, hey, that 9 00:00:33,920 --> 00:00:37,120 Speaker 1: the highest prize in science could ever ever have a 10 00:00:37,159 --> 00:00:40,160 Speaker 1: flaw in it. But apparently it has happened a couple 11 00:00:40,159 --> 00:00:43,400 Speaker 1: of times in history. Even this very top level prize 12 00:00:43,400 --> 00:00:47,800 Speaker 1: in science is sometimes awarded erroneously. Can you get the 13 00:00:47,840 --> 00:00:52,320 Speaker 1: Nobel Prize wrong? Nobel Prize errors? Can you get a Nobel 14 00:00:52,320 --> 00:01:16,840 Speaker 1: Prize for podcasting? The Podbel Prize? Hi, I'm Jorge, and 15 00:01:16,880 --> 00:01:19,639 Speaker 1: I'm a cartoonist, and I do not have a Nobel Prize. 16 00:01:19,920 --> 00:01:22,920 Speaker 1: And I'm Daniel. I'm a particle physicist, and I also do not have 17 00:01:22,959 --> 00:01:26,360 Speaker 1: a Nobel Prize. I have fewer than one Nobel prizes. 18 00:01:26,520 --> 00:01:30,880 Speaker 1: So the Nobel Prize is this amazing cultural icon now, right? 19 00:01:30,920 --> 00:01:34,120 Speaker 1: Like, it's kind of the name that means you're a genius. 20 00:01:34,440 --> 00:01:36,800 Speaker 1: That's right. It's covered in the news.
You know, whoever 21 00:01:36,840 --> 00:01:40,520 Speaker 1: wins it is breathlessly described in the New York Times, 22 00:01:40,600 --> 00:01:43,720 Speaker 1: and it's announced on television. You know, there's really no 23 00:01:43,800 --> 00:01:45,440 Speaker 1: other prize like it. Like, if you made a list 24 00:01:45,480 --> 00:01:47,600 Speaker 1: of the most important prizes in science, number one would be 25 00:01:47,760 --> 00:01:51,680 Speaker 1: the Nobel Prize, and number fifty would be the next thing. Right, 26 00:01:51,720 --> 00:01:53,960 Speaker 1: there's just like nothing else even comes close to it. Yeah, it 27 00:01:54,280 --> 00:01:58,200 Speaker 1: has good brand recognition across the board. Sorry, whoever is 28 00:01:58,240 --> 00:02:00,920 Speaker 1: running that PR campaign has really earned their money. I 29 00:02:00,920 --> 00:02:02,680 Speaker 1: don't even know how that's happened. They should get a 30 00:02:02,720 --> 00:02:09,240 Speaker 1: Nobel Prize for their marketing efforts. It's the gold standard 31 00:02:09,320 --> 00:02:11,440 Speaker 1: in science. But today on the program, we're going to 32 00:02:11,520 --> 00:02:19,720 Speaker 1: ask a pretty controversial question. Can the Nobel Prize be wrong? 33 00:02:19,760 --> 00:02:22,560 Speaker 1: Has anybody ever won the Nobel Prize for something which 34 00:02:22,600 --> 00:02:26,240 Speaker 1: turned out to not be good science? Yeah, Nobel oops. 35 00:02:26,680 --> 00:02:29,080 Speaker 1: Nobel oops. And I think this is a really important question. 36 00:02:29,120 --> 00:02:32,280 Speaker 1: We're not here to run down science. Obviously, we're both 37 00:02:32,320 --> 00:02:34,880 Speaker 1: scientists and we love science, and science is a fantastic 38 00:02:34,919 --> 00:02:37,600 Speaker 1: way to explore the universe.
But we also don't want 39 00:02:37,600 --> 00:02:40,080 Speaker 1: to pretend that science is always definitive, right? You do 40 00:02:40,120 --> 00:02:43,400 Speaker 1: some experiments, you make a claim, even if it's backed up, 41 00:02:43,440 --> 00:02:45,600 Speaker 1: and then later you win the biggest prize in all 42 00:02:45,639 --> 00:02:49,000 Speaker 1: of human intellectual achievement. It could still be that your 43 00:02:49,040 --> 00:02:52,000 Speaker 1: results are wrong. Yeah, it's amazing to think that the 44 00:02:52,080 --> 00:02:55,919 Speaker 1: Nobel Prize could be wrong, right? Like, you think that 45 00:02:56,200 --> 00:03:00,160 Speaker 1: it's done very carefully, right? But like everything else, it's 46 00:03:00,160 --> 00:03:02,680 Speaker 1: a human endeavor, right? And anything that's done by humans 47 00:03:02,800 --> 00:03:05,360 Speaker 1: is most likely sometimes going to be wrong, you know. 48 00:03:05,680 --> 00:03:08,840 Speaker 1: And it's important as scientists that we don't just enshrine 49 00:03:08,919 --> 00:03:12,600 Speaker 1: results that are Nobel Prize winning as definite fact. Right. 50 00:03:12,600 --> 00:03:15,280 Speaker 1: We should always keep an open mind, because anything we 51 00:03:15,320 --> 00:03:17,480 Speaker 1: think is true could turn out later to be wrong, 52 00:03:17,639 --> 00:03:19,560 Speaker 1: or could turn out to have been, you know, just 53 00:03:19,600 --> 00:03:22,880 Speaker 1: seen the wrong way, or true in some circumstances but 54 00:03:22,960 --> 00:03:25,760 Speaker 1: not generally true. And so I think it's very important 55 00:03:25,800 --> 00:03:27,760 Speaker 1: that we question these things, that we always 56 00:03:27,960 --> 00:03:31,120 Speaker 1: look back on what our predecessors have discovered and think, 57 00:03:31,480 --> 00:03:34,600 Speaker 1: could that be wrong? Yeah.
And so to kind of 58 00:03:34,639 --> 00:03:38,200 Speaker 1: show you how crazy this idea is, we went out 59 00:03:38,200 --> 00:03:41,080 Speaker 1: there and asked people, do you think the Nobel Prize 60 00:03:41,120 --> 00:03:44,040 Speaker 1: can be wrong? Here's what they had to say. I 61 00:03:44,040 --> 00:03:49,360 Speaker 1: have no clue of that. Yes, I haven't heard of 62 00:03:49,600 --> 00:03:54,320 Speaker 1: any discoveries like that. No, okay, personally, no, I don't. Okay, awesome. 63 00:03:54,800 --> 00:03:56,720 Speaker 1: I know that it's happened before. I don't have any 64 00:03:56,760 --> 00:04:00,920 Speaker 1: specific examples, but I know it's happened. Yeah, I don't. 65 00:04:01,240 --> 00:04:08,280 Speaker 1: Are there many? Not many, but a few. That's all right. 66 00:04:08,320 --> 00:04:11,480 Speaker 1: So overall, still pretty good branding, right? Like, nobody thought 67 00:04:11,640 --> 00:04:13,800 Speaker 1: the Nobel Prize could be wrong. Like, that just 68 00:04:13,840 --> 00:04:16,960 Speaker 1: seemed unthinkable to some people. Yeah, it's shocking, I remember that. 69 00:04:17,040 --> 00:04:18,719 Speaker 1: You should have seen these people's faces when I asked 70 00:04:18,760 --> 00:04:22,080 Speaker 1: them that. Um, yeah, they're like, what? No, come on, 71 00:04:22,120 --> 00:04:25,280 Speaker 1: the Nobel Prize wrong? Did they look at you like they 72 00:04:25,279 --> 00:04:30,359 Speaker 1: think maybe you're one of these science deniers? Yeah, and 73 00:04:30,400 --> 00:04:33,000 Speaker 1: then there's no evolution, and climate change is a hoax, and 74 00:04:33,040 --> 00:04:36,480 Speaker 1: the Earth is flat. That's right. I was kind 75 00:04:36,480 --> 00:04:38,520 Speaker 1: of worried about this episode. It's like, are we gonna 76 00:04:38,640 --> 00:04:41,640 Speaker 1: be bashing on science and the Nobel Prize? No.
But 77 00:04:41,680 --> 00:04:44,920 Speaker 1: there's a huge difference between saying some of the conclusions 78 00:04:44,960 --> 00:04:47,520 Speaker 1: of some scientists could be wrong and saying, like, the 79 00:04:47,640 --> 00:04:51,120 Speaker 1: very foundations of science are flawed. Right. In fact, 80 00:04:51,160 --> 00:04:54,520 Speaker 1: it's the process of science to disprove old results. That 81 00:04:54,640 --> 00:04:58,040 Speaker 1: shows you that science is robust. If no discoveries from 82 00:04:58,040 --> 00:05:00,360 Speaker 1: Nobel Prize winners had ever been proven to be wrong, then 83 00:05:00,400 --> 00:05:03,359 Speaker 1: you might be suspicious. Right. It's like, hmm, do you 84 00:05:03,360 --> 00:05:05,880 Speaker 1: expect them occasionally to make a mistake, or for something 85 00:05:05,960 --> 00:05:09,000 Speaker 1: later to come out? Right? I think my favorite reaction 86 00:05:09,080 --> 00:05:11,359 Speaker 1: was the woman who said, I've never heard of that, 87 00:05:11,400 --> 00:05:12,839 Speaker 1: but I bet if there were, they would be 88 00:05:12,839 --> 00:05:18,080 Speaker 1: in physics. Why do you go straight to physics? If 89 00:05:18,120 --> 00:05:20,920 Speaker 1: anyone's going to make a mistake, it's not the literature people. 90 00:05:21,240 --> 00:05:23,919 Speaker 1: It's not the economics people, who are wrong 91 00:05:24,040 --> 00:05:27,360 Speaker 1: like all the time. It's not medicine, you know. No, I know 92 00:05:27,400 --> 00:05:30,200 Speaker 1: it's got to be the physicists. Why do you think 93 00:05:30,200 --> 00:05:31,840 Speaker 1: she went there? Did she, did she know you were 94 00:05:31,839 --> 00:05:34,280 Speaker 1: a physicist? Like, did she look? Were you dressed like 95 00:05:34,320 --> 00:05:37,920 Speaker 1: a physicist? I think I look... well, I mean, that's 96 00:05:37,920 --> 00:05:39,440 Speaker 1: a question for you, maybe, not for me.
Do you 97 00:05:39,480 --> 00:05:41,320 Speaker 1: think I look like a physicist? Like, if you have 98 00:05:41,400 --> 00:05:43,360 Speaker 1: to pick a physicist out of a crowd, would you 99 00:05:43,400 --> 00:05:45,880 Speaker 1: say, that guy over there, he looks like a physicist? 100 00:05:46,400 --> 00:05:50,160 Speaker 1: I would probably guess poetry or physics. You know, you 101 00:05:50,279 --> 00:05:54,599 Speaker 1: got, you got the hair. That's insightful, because maybe 102 00:05:54,600 --> 00:05:58,800 Speaker 1: you didn't realize, but I actually do write poetry. All right, yeah, yeah, 103 00:05:58,920 --> 00:06:01,239 Speaker 1: you should recite some for us. Yeah, let's, let's 104 00:06:01,240 --> 00:06:04,360 Speaker 1: just do that for the rest of the episode. Okay, 105 00:06:04,400 --> 00:06:07,400 Speaker 1: folks, we're switching gears. We're trashing the whole concept of 106 00:06:07,440 --> 00:06:11,800 Speaker 1: the podcast. We're going into pure poetry. Daniel is in 107 00:06:11,839 --> 00:06:14,000 Speaker 1: trouble with his spouse. He needs to make up some 108 00:06:14,279 --> 00:06:17,760 Speaker 1: romantic points here. That's right. I don't know why you're laughing. 109 00:06:17,760 --> 00:06:20,279 Speaker 1: I'm very serious about my poetry. No, um, I am. 110 00:06:20,400 --> 00:06:22,640 Speaker 1: I am not a poet, and this woman certainly didn't 111 00:06:22,640 --> 00:06:25,000 Speaker 1: know that I'm a physicist, so that's probably why she 112 00:06:25,040 --> 00:06:26,560 Speaker 1: guessed it. I don't think she was trying to impugn 113 00:06:26,600 --> 00:06:29,760 Speaker 1: the entire edifice of modern physics. I think she 114 00:06:29,839 --> 00:06:32,839 Speaker 1: just thought I was leading her on. Oh, all right. Yeah, 115 00:06:36,400 --> 00:06:39,320 Speaker 1: let's start with this question then, um, if 116 00:06:39,320 --> 00:06:41,320 Speaker 1: it's possible that it can be wrong.
Let's talk about 117 00:06:41,320 --> 00:06:44,039 Speaker 1: how Nobel Prizes are awarded. Like, do you know what 118 00:06:44,080 --> 00:06:47,160 Speaker 1: the process is for who gets one, who gets nominated, 119 00:06:47,200 --> 00:06:50,080 Speaker 1: who decides who gets a Nobel Prize? Mm, yeah, I 120 00:06:50,120 --> 00:06:53,360 Speaker 1: think the gates of heaven open and the angels sing 121 00:06:53,880 --> 00:06:56,320 Speaker 1: and it just sort of descends on this glowing cloud. 122 00:06:57,440 --> 00:07:02,280 Speaker 1: It says, you, you have won. The medal itself flies down, that's right, 123 00:07:02,560 --> 00:07:06,200 Speaker 1: little golden wings, you know, it just lands. Um, right, 124 00:07:06,200 --> 00:07:08,080 Speaker 1: I mean, I think people would like to believe that, right? 125 00:07:08,120 --> 00:07:09,960 Speaker 1: You would like to think that there's something greater, 126 00:07:10,160 --> 00:07:14,160 Speaker 1: something superhuman, something above us, that's, like, you know, anointing 127 00:07:14,200 --> 00:07:16,760 Speaker 1: these geniuses. But in the end, it's just people, right? 128 00:07:17,000 --> 00:07:18,960 Speaker 1: It's got to be people, obviously, there's no other way 129 00:07:18,960 --> 00:07:23,400 Speaker 1: to do it. Not just people, it's Swedes, right? I 130 00:07:23,400 --> 00:07:24,920 Speaker 1: mean, they are a kind of... I don't know how 131 00:07:24,920 --> 00:07:28,280 Speaker 1: to take that. Not just people, it's Swedes. They're kind 132 00:07:28,280 --> 00:07:31,000 Speaker 1: of superhuman, is what I mean. They are. 133 00:07:31,080 --> 00:07:34,080 Speaker 1: They are angelic. Oh, I see, you're saying Swedes 134 00:07:34,120 --> 00:07:38,000 Speaker 1: are basically the best of us. Um, no, it's, 135 00:07:38,080 --> 00:07:41,080 Speaker 1: um, it's not just Swedes. Okay. So it's concentrated 136 00:07:41,080 --> 00:07:43,880 Speaker 1: in Scandinavia.
But so this is a two-step process, right? 137 00:07:43,920 --> 00:07:46,480 Speaker 1: So the first step is you gotta get nominated, okay. 138 00:07:46,920 --> 00:07:49,720 Speaker 1: And to be nominated, you have to be either nominated 139 00:07:49,720 --> 00:07:53,240 Speaker 1: by somebody on the Nobel Prize Committee or a member 140 00:07:53,240 --> 00:07:56,560 Speaker 1: of the Swedish Academy of Sciences, or you have to 141 00:07:56,680 --> 00:08:01,360 Speaker 1: have a Nobel Prize, right? Or there's a select group 142 00:08:01,400 --> 00:08:04,240 Speaker 1: of other people who they will take nominations from, basically, 143 00:08:04,280 --> 00:08:08,800 Speaker 1: like, you know, internationally famous physicists from around the world. Right, 144 00:08:08,880 --> 00:08:12,680 Speaker 1: so all those Nobel Prize winning listeners we have out there, 145 00:08:13,240 --> 00:08:15,320 Speaker 1: you should tell them your name and your address, right? 146 00:08:16,200 --> 00:08:18,840 Speaker 1: That's right, that's right. It's a very select group. And 147 00:08:18,880 --> 00:08:20,600 Speaker 1: you know, they've done these studies where they say, like, 148 00:08:20,880 --> 00:08:23,080 Speaker 1: what's the best way to get a Nobel Prize? Well, 149 00:08:23,120 --> 00:08:26,000 Speaker 1: it's to have your advisor win the Nobel Prize. You know, 150 00:08:26,120 --> 00:08:28,559 Speaker 1: to work for somebody who has the Nobel Prize, because 151 00:08:28,600 --> 00:08:31,680 Speaker 1: then they can nominate you. Right. Yeah, it's like the 152 00:08:31,720 --> 00:08:34,880 Speaker 1: oldest of the old boys' clubs in the world. Right, 153 00:08:34,880 --> 00:08:39,640 Speaker 1: it's the only group of self-reinforcing, um, accolades. 154 00:08:40,400 --> 00:08:42,120 Speaker 1: So you have to get nominated.
155 00:08:42,120 --> 00:08:44,280 Speaker 1: Some people have to know your work, and you have 156 00:08:44,360 --> 00:08:47,240 Speaker 1: to sort of know somebody in the know. Yeah, exactly. 157 00:08:47,360 --> 00:08:49,520 Speaker 1: You have to be nominated. So then you're on the list, right, 158 00:08:49,679 --> 00:08:51,880 Speaker 1: and you're in the list of people they consider. And 159 00:08:51,920 --> 00:08:54,280 Speaker 1: then, you know, the Nobel Prize committee, which is a 160 00:08:54,280 --> 00:08:56,800 Speaker 1: bunch of Swedes and, you know, maybe some other folks 161 00:08:56,800 --> 00:09:00,200 Speaker 1: from Scandinavia, they get together and they discuss 162 00:09:00,200 --> 00:09:02,000 Speaker 1: it and they argue about it, and, you know, they 163 00:09:02,080 --> 00:09:05,160 Speaker 1: solicit input from the people they know. But in the end, 164 00:09:05,320 --> 00:09:08,439 Speaker 1: it's subjective. You know, like, how do you compare discovery 165 00:09:08,480 --> 00:09:11,840 Speaker 1: A versus discovery B? Every piece of research is totally different. 166 00:09:11,960 --> 00:09:15,560 Speaker 1: It's not like there's some metric you can use 167 00:09:15,240 --> 00:09:19,000 Speaker 1: to give them points, where it's really fair to establish. 168 00:09:19,040 --> 00:09:21,240 Speaker 1: And even if it were, it's still subjective. It's still, 169 00:09:21,280 --> 00:09:24,360 Speaker 1: like, how important is this, how groundbreaking is it, how 170 00:09:24,440 --> 00:09:27,560 Speaker 1: innovative is it? Right? Well, what's the standard? What's supposed 171 00:09:27,559 --> 00:09:30,600 Speaker 1: to be the standard?
The standard is the greatest contribution 172 00:09:30,760 --> 00:09:34,480 Speaker 1: to humankind, right? Yeah, exactly. And I think that those 173 00:09:34,480 --> 00:09:38,160 Speaker 1: are the words originally in the will, right? But I 174 00:09:38,200 --> 00:09:41,440 Speaker 1: think more recently the committee has focused on things that 175 00:09:41,440 --> 00:09:44,600 Speaker 1: are like deep innovations, or things that would change 176 00:09:44,679 --> 00:09:47,600 Speaker 1: the direction of the field, right? Truly, like, ground-shaking 177 00:09:47,600 --> 00:09:51,480 Speaker 1: discoveries. Um, and in physics at least, it's sort of 178 00:09:51,760 --> 00:09:55,480 Speaker 1: alternated between, like, crazy theoretical ideas that turned out to 179 00:09:55,480 --> 00:09:58,480 Speaker 1: be correct, even if they haven't really had any impact 180 00:09:58,480 --> 00:10:01,600 Speaker 1: on humanity, like the Higgs boson, right? If you don't 181 00:10:01,600 --> 00:10:03,040 Speaker 1: know what the Higgs boson is, you can listen to 182 00:10:03,120 --> 00:10:06,079 Speaker 1: our whole podcast episode about that. But recently people won 183 00:10:06,120 --> 00:10:08,120 Speaker 1: the Nobel Prize for the Higgs boson because it was 184 00:10:08,160 --> 00:10:10,480 Speaker 1: discovered at the Large Hadron Collider. It hadn't really 185 00:10:10,520 --> 00:10:13,559 Speaker 1: had any impact on humanity other than, you know, people 186 00:10:13,600 --> 00:10:16,840 Speaker 1: hearing about it. Um, but it's a big breakthrough 187 00:10:16,960 --> 00:10:19,640 Speaker 1: in theoretical physics. Yeah, it's a big deal. It's a 188 00:10:19,640 --> 00:10:22,480 Speaker 1: big deal. And then sometimes it's something very practical, you know, 189 00:10:22,600 --> 00:10:25,200 Speaker 1: like inventing a new technology like blue LEDs. 190 00:10:25,320 --> 00:10:28,920 Speaker 1: That won the Nobel Prize years ago. Blue LEDs.
Yeah, 191 00:10:30,040 --> 00:10:33,160 Speaker 1: that's a Nobel Prize. No, not the yellow, not the green, 192 00:10:33,240 --> 00:10:36,240 Speaker 1: not the cyan, no, not the turquoise, the blue LEDs. 193 00:10:36,400 --> 00:10:38,440 Speaker 1: Imagine being the discoverer of the red LED 194 00:10:38,520 --> 00:10:43,000 Speaker 1: and being like, what the...? I spent all my time 195 00:10:43,040 --> 00:10:45,040 Speaker 1: on the purple LEDs. What a mistake! 196 00:10:45,120 --> 00:10:50,960 Speaker 1: Oh my god. Big blue. No, blue was particularly difficult 197 00:10:51,160 --> 00:10:53,360 Speaker 1: because of that wavelength, and so there was a lot 198 00:10:53,440 --> 00:10:56,160 Speaker 1: of, you know, interesting innovation along the way. It's not 199 00:10:56,240 --> 00:10:58,839 Speaker 1: just like, hey, we really needed this technology and they 200 00:10:58,880 --> 00:11:01,480 Speaker 1: figured it out. There was some interesting physics going on 201 00:11:01,520 --> 00:11:04,120 Speaker 1: inside the blue LEDs. Um, but in the end, it's 202 00:11:04,120 --> 00:11:06,839 Speaker 1: political. It's chosen by a committee of people. Chosen 203 00:11:06,840 --> 00:11:11,199 Speaker 1: by a committee of people, and people campaign. You know, there are. Yeah, 204 00:11:11,520 --> 00:11:13,600 Speaker 1: you know, this is just 205 00:11:13,640 --> 00:11:16,320 Speaker 1: a community of people, and they talk and they chat, and 206 00:11:16,320 --> 00:11:18,920 Speaker 1: when the prizes are coming up and the decisions are 207 00:11:18,960 --> 00:11:21,560 Speaker 1: being made, you hear people. If you're in the physics community, 208 00:11:21,559 --> 00:11:23,480 Speaker 1: you hear people talking, maybe blah should get it, or 209 00:11:23,520 --> 00:11:25,600 Speaker 1: blah blah should get it, and people go 210 00:11:25,640 --> 00:11:29,600 Speaker 1: around and give presentations.
And I remember when everybody knew 211 00:11:29,960 --> 00:11:32,800 Speaker 1: that the Higgs Boson discovery was going to yield a 212 00:11:32,800 --> 00:11:36,520 Speaker 1: Nobel Prize the following year, and people were jostling for 213 00:11:36,559 --> 00:11:39,040 Speaker 1: position, because we knew also that you could only 214 00:11:39,080 --> 00:11:41,600 Speaker 1: give it to three people. So everybody knew Higgs 215 00:11:41,679 --> 00:11:43,480 Speaker 1: was going to get it, that was for sure. And 216 00:11:43,480 --> 00:11:46,240 Speaker 1: the question was, who were the other two theorists that 217 00:11:46,240 --> 00:11:48,560 Speaker 1: were going to win it? Yeah, because there are rules 218 00:11:48,640 --> 00:11:51,280 Speaker 1: to the Nobel Prize, right? Like, you can only 219 00:11:51,320 --> 00:11:55,040 Speaker 1: pick at most three people for a discovery, and they 220 00:11:55,080 --> 00:11:57,280 Speaker 1: all have to be alive. They all have to be 221 00:11:57,320 --> 00:12:00,439 Speaker 1: alive, exactly. And, um, there was, you know, there was 222 00:12:00,520 --> 00:12:03,240 Speaker 1: Higgs of course, and then there were a couple of others. 223 00:12:03,240 --> 00:12:07,160 Speaker 1: There were two other people who everybody agrees made contributions 224 00:12:07,200 --> 00:12:08,720 Speaker 1: sort of at the same level as Higgs. But then 225 00:12:08,760 --> 00:12:10,960 Speaker 1: one of them died, so there was sort of like 226 00:12:11,000 --> 00:12:13,800 Speaker 1: an empty slot, and a lot 227 00:12:13,840 --> 00:12:15,920 Speaker 1: of the folks in the theoretical physics community started going 228 00:12:15,960 --> 00:12:18,560 Speaker 1: around giving seminars about how important their work was in 229 00:12:18,600 --> 00:12:21,200 Speaker 1: that era and the contributions they made to the Higgs 230 00:12:21,200 --> 00:12:24,439 Speaker 1: boson discovery.
And because it can be kind of fuzzy, right? 231 00:12:24,520 --> 00:12:28,359 Speaker 1: Like, this idea that discoveries are made by one scientist 232 00:12:28,480 --> 00:12:31,400 Speaker 1: in a lab yelling eureka, I mean, that's sort of 233 00:12:31,520 --> 00:12:34,000 Speaker 1: been going away for a while. Discoveries are kind of 234 00:12:34,000 --> 00:12:37,520 Speaker 1: made in groups, and people contribute little things and little 235 00:12:37,520 --> 00:12:40,520 Speaker 1: bits here, some ideas, and then the ideas get shaped, 236 00:12:40,559 --> 00:12:43,680 Speaker 1: and so, like, figuring out who actually gets credit for something is 237 00:12:43,679 --> 00:12:47,800 Speaker 1: getting harder and harder. Exactly. And sometimes somebody with 238 00:12:47,840 --> 00:12:50,880 Speaker 1: a really genius idea comes along and just, like, shatters the field, 239 00:12:51,000 --> 00:12:54,400 Speaker 1: like Einstein, right? There's a crazy new idea. Other times 240 00:12:54,440 --> 00:12:56,680 Speaker 1: it's a bit incremental, and you could say, well, you know, 241 00:12:56,679 --> 00:12:58,800 Speaker 1: if Bob hadn't done it, then Sally would have done 242 00:12:58,800 --> 00:13:00,760 Speaker 1: it two weeks later, and if she hadn't done it, 243 00:13:00,800 --> 00:13:02,800 Speaker 1: then, you know, Samantha would have figured it out or something. 244 00:13:03,080 --> 00:13:06,520 Speaker 1: Sometimes an idea has its time, and it's sort of 245 00:13:06,559 --> 00:13:09,040 Speaker 1: like the field is slowly rolling down the hill towards 246 00:13:09,120 --> 00:13:11,240 Speaker 1: it and eventually somebody's going to figure it out. In 247 00:13:11,240 --> 00:13:13,719 Speaker 1: that case, it's hard to know. Like, you can't give 248 00:13:13,800 --> 00:13:16,439 Speaker 1: the credit to everybody, so they just sort of pick 249 00:13:16,520 --> 00:13:19,640 Speaker 1: somebody, or three.
They pick three people to win it 250 00:13:19,800 --> 00:13:21,920 Speaker 1: for a discovery. So why do you think they need 251 00:13:21,960 --> 00:13:24,880 Speaker 1: to be alive and they can't be dead, you know 252 00:13:24,880 --> 00:13:26,880 Speaker 1: what I mean? Because then they need to go to 253 00:13:26,920 --> 00:13:29,200 Speaker 1: that fancy ceremony in Sweden and, you know, wear a 254 00:13:29,200 --> 00:13:32,000 Speaker 1: tuxedo and give a speech and, uh, all that 255 00:13:32,080 --> 00:13:34,280 Speaker 1: kind of stuff. It's just one of the things in 256 00:13:34,280 --> 00:13:35,959 Speaker 1: the will. But, you know, they need to be alive 257 00:13:36,000 --> 00:13:39,920 Speaker 1: when the prize is, um, determined, not when the 258 00:13:39,960 --> 00:13:42,240 Speaker 1: prize is awarded. And there was a case of a 259 00:13:42,280 --> 00:13:45,199 Speaker 1: guy who was awarded the Nobel Prize 260 00:13:45,600 --> 00:13:49,520 Speaker 1: and then two days later he died, right? Um, and 261 00:13:49,559 --> 00:13:51,400 Speaker 1: so before they could present it to him, he was dead, 262 00:13:51,400 --> 00:13:53,400 Speaker 1: and it was like a big controversy, but they decided, 263 00:13:53,640 --> 00:13:56,960 Speaker 1: he was alive when we made the decision, he gets 264 00:13:57,040 --> 00:14:00,679 Speaker 1: to keep it. But he knew, he knew he had won. Yeah, 265 00:14:00,760 --> 00:14:03,960 Speaker 1: he knew. So maybe he was holding out 266 00:14:04,000 --> 00:14:06,240 Speaker 1: for it. He's like, all right, I'm done. Oh, there's 267 00:14:06,280 --> 00:14:09,000 Speaker 1: definitely cases of that, you know, like people who are 268 00:14:09,760 --> 00:14:15,000 Speaker 1: about to die survive, um, to see birthdays and, like, grandchildren's graduations 269 00:14:15,040 --> 00:14:16,600 Speaker 1: and stuff.
270 00:14:16,600 --> 00:14:19,160 Speaker 1: They hold out for, like, one last event. Wow. 271 00:14:19,400 --> 00:14:21,680 Speaker 1: So if I convince myself that I'm going to win one, 272 00:14:21,800 --> 00:14:23,760 Speaker 1: I just have to hold out for it. I might 273 00:14:23,840 --> 00:14:27,200 Speaker 1: live forever. Yeah. When was the last time? Who won the 274 00:14:27,240 --> 00:14:29,520 Speaker 1: Nobel Prize for Cartooning last week? I don't remember. 275 00:14:30,400 --> 00:14:33,320 Speaker 1: Maybe Bill Watterson. I'll just convince myself 276 00:14:33,360 --> 00:14:36,000 Speaker 1: I'll be the first one, and then, you know, I'll 277 00:14:36,000 --> 00:14:38,280 Speaker 1: live forever. Right, and then you can nominate yourself for 278 00:14:38,320 --> 00:14:40,640 Speaker 1: every subsequent one, right, because you'll be the only winner. 279 00:14:42,000 --> 00:14:47,400 Speaker 1: I'll just start my own committee of Swedish people. Exactly. 280 00:14:47,920 --> 00:14:50,200 Speaker 1: I think the point people should come away with is 281 00:14:50,200 --> 00:14:52,480 Speaker 1: that this is a committee of people, and they're humans, and 282 00:14:52,480 --> 00:14:55,920 Speaker 1: they're influenced by fads and politics, and, you know, they're 283 00:14:55,920 --> 00:14:58,920 Speaker 1: doing their best to try to find something which is pure, 284 00:14:59,080 --> 00:15:02,000 Speaker 1: something which will last. Right. But in the end, you know, 285 00:15:02,040 --> 00:15:05,400 Speaker 1: they're making human decisions, right? But they do take it 286 00:15:05,520 --> 00:15:07,880 Speaker 1: very seriously. You know, they take like a whole year 287 00:15:07,920 --> 00:15:10,080 Speaker 1: or two to make this decision, right? Yeah, I think that.
288 00:15:10,200 --> 00:15:11,680 Speaker 1: You know, they wrote all the names on the wall, 289 00:15:11,800 --> 00:15:13,720 Speaker 1: and they passed out the darts and they threw them 290 00:15:13,800 --> 00:15:18,120 Speaker 1: very carefully. You know, very precise measurements. Yeah. Um, no, 291 00:15:18,280 --> 00:15:20,240 Speaker 1: I would love to be in the room and hear, like, 292 00:15:20,280 --> 00:15:22,120 Speaker 1: what kind of arguments are made. Are they, like, you know, 293 00:15:22,160 --> 00:15:25,440 Speaker 1: appealing to, you know, this is more fundamental than that, 294 00:15:25,560 --> 00:15:27,960 Speaker 1: or are they just like, hey, you know, Bob really deserves 295 00:15:27,960 --> 00:15:30,160 Speaker 1: it because, you know, his wife left him, 296 00:15:30,240 --> 00:15:32,320 Speaker 1: or, like, what kind of arguments are being made? I 297 00:15:32,400 --> 00:15:35,240 Speaker 1: have no idea. Look how cool this blue LED looks. 298 00:15:35,280 --> 00:15:38,320 Speaker 1: I mean, come on, stop shining that blue LED in 299 00:15:38,320 --> 00:15:41,440 Speaker 1: my eyes. We'll give him the prize if he can 300 00:15:41,440 --> 00:15:44,080 Speaker 1: stop shining that in my eyes. Yeah, I've actually been in 301 00:15:44,120 --> 00:15:46,160 Speaker 1: the room. Did I tell you this story? I was, 302 00:15:46,440 --> 00:15:47,960 Speaker 1: I was in the room where they 303 00:15:47,960 --> 00:15:51,040 Speaker 1: decide the Nobel Prize in Medicine. Not while they were deciding, 304 00:15:51,120 --> 00:15:54,320 Speaker 1: I just toured the Karolinska Institute. They showed 305 00:15:54,360 --> 00:15:57,640 Speaker 1: me the room.
Oh wow. Yeah, it's just the room. 306 00:15:57,720 --> 00:16:00,200 Speaker 1: That makes me feel like I'm one step closer 307 00:16:00,240 --> 00:16:03,240 Speaker 1: to the Nobel Prize, because I know you. Right, 308 00:16:03,360 --> 00:16:07,640 Speaker 1: and I've been in the room. So yeah. And hasn't 309 00:16:07,680 --> 00:16:10,720 Speaker 1: your work actually been mentioned specifically by the Nobel Prize Committee? 310 00:16:10,760 --> 00:16:13,400 Speaker 1: This is not a joke. Yeah, that's right. The video 311 00:16:13,440 --> 00:16:15,120 Speaker 1: and the comics that you and I made about the 312 00:16:15,200 --> 00:16:18,280 Speaker 1: Higgs Boson, the Higgs Boson Explained, which is on YouTube, 313 00:16:18,320 --> 00:16:20,640 Speaker 1: and you can watch it if you want, that video was 314 00:16:20,880 --> 00:16:24,800 Speaker 1: quoted in the official Nobel Prize poster. 315 00:16:24,960 --> 00:16:29,200 Speaker 1: So the Nobel Prize folks make a poster every year 316 00:16:29,320 --> 00:16:32,000 Speaker 1: of the discovery, and so we were mentioned in that poster. 317 00:16:32,320 --> 00:16:34,320 Speaker 1: Pretty cool. That's right. It's probably the only time the 318 00:16:34,360 --> 00:16:36,560 Speaker 1: Nobel Prize Committee will ever cite any of my work, 319 00:16:36,640 --> 00:16:40,240 Speaker 1: but hey, I'll take it. Or a cartoon, right? A 320 00:16:40,640 --> 00:16:43,720 Speaker 1: cartoon, exactly. You definitely broke ground there. It's probably the 321 00:16:43,760 --> 00:16:46,200 Speaker 1: only cartoon ever cited by the Nobel Prize Committee. Yeah, 322 00:16:46,200 --> 00:16:49,200 Speaker 1: but that's quite an association. So they know who you are, Daniel. 323 00:16:49,400 --> 00:16:51,720 Speaker 1: Oh my gosh. Yeah. So, you know, I'm 324 00:16:51,720 --> 00:16:53,880 Speaker 1: gonna stay up every night waiting for that 325 00:16:53,920 --> 00:16:56,560 Speaker 1: phone call from Sweden.
Yeah, don't die. Don't die. 326 00:16:58,160 --> 00:17:00,600 Speaker 1: That's now my number one reason to keep living, because 327 00:17:01,280 --> 00:17:04,600 Speaker 1: it's potentially a Nobel Prize. Okay, so that's kind of how 328 00:17:04,640 --> 00:17:07,600 Speaker 1: they decide who gets the Nobel Prize. And so can 329 00:17:07,680 --> 00:17:11,119 Speaker 1: this prize somehow be awarded to somebody for a 330 00:17:11,160 --> 00:17:13,879 Speaker 1: discovery that turned out to be wrong? And so we 331 00:17:13,960 --> 00:17:16,800 Speaker 1: have two stories to tell you today of a Nobel 332 00:17:16,800 --> 00:17:33,000 Speaker 1: Prize gone wrong. But first let's take a quick break. Okay, 333 00:17:33,040 --> 00:17:36,480 Speaker 1: so what's the first instance of a Nobel Prize being wrong? 334 00:17:36,800 --> 00:17:39,200 Speaker 1: One of my favorites is a Nobel Prize which went 335 00:17:39,240 --> 00:17:41,640 Speaker 1: to somebody who was definitely a certified genius and made 336 00:17:41,640 --> 00:17:44,200 Speaker 1: lots of contributions to physics, but won the prize for 337 00:17:44,280 --> 00:17:47,359 Speaker 1: something which later turned out to not be accurate. And 338 00:17:47,400 --> 00:17:50,480 Speaker 1: that's Niels Bohr. And you know, this strikes close to 339 00:17:50,520 --> 00:17:52,919 Speaker 1: home for Scandinavians, because Bohr is Danish and he's like the 340 00:17:53,000 --> 00:17:56,119 Speaker 1: grandfather of Danish science. And you know, the Nobel Prize 341 00:17:56,119 --> 00:18:01,320 Speaker 1: is a cherished Scandinavian tradition. But Bohr 342 00:18:01,440 --> 00:18:03,679 Speaker 1: is famous for coming up with what's known as the 343 00:18:03,760 --> 00:18:07,000 Speaker 1: Bohr model of the atom, right, not the Boring model, 344 00:18:07,240 --> 00:18:11,040 Speaker 1: but the Boring company, right exactly.
He's not the Elon 345 00:18:11,119 --> 00:18:15,800 Speaker 1: Musk of a hundred years ago. No, Niels Bohr, B-o-h-r. Yeah. 346 00:18:15,840 --> 00:18:18,840 Speaker 1: This was in the early part of the twentieth century, 347 00:18:18,920 --> 00:18:22,520 Speaker 1: right when we didn't really know what atoms were made 348 00:18:22,520 --> 00:18:26,440 Speaker 1: out of or what they were structured like. Right, that's right. 349 00:18:26,480 --> 00:18:30,160 Speaker 1: We had recently discovered that electrons exist, right, they're 350 00:18:30,200 --> 00:18:34,160 Speaker 1: particles that have negative charge. That was just like twenty 351 00:18:34,240 --> 00:18:37,840 Speaker 1: years earlier. And then Rutherford discovered that there's a positively 352 00:18:37,960 --> 00:18:41,560 Speaker 1: charged particle also in the atom, and so we 353 00:18:41,640 --> 00:18:44,240 Speaker 1: had this concept like, okay, there's some positive charges, there's 354 00:18:44,280 --> 00:18:46,920 Speaker 1: some negative charges. How does it all work? And with 355 00:18:46,920 --> 00:18:49,600 Speaker 1: the different kinds of atoms, did we know that they 356 00:18:49,600 --> 00:18:52,400 Speaker 1: were all the same atom, just with different numbers 357 00:18:52,440 --> 00:18:55,600 Speaker 1: of protons and electrons and stuff? No, no, not at all. 358 00:18:55,680 --> 00:18:58,400 Speaker 1: We just had this table of different kinds of stuff, right, 359 00:18:58,480 --> 00:19:00,879 Speaker 1: like, you know, these metals and those metals and the 360 00:19:00,920 --> 00:19:02,960 Speaker 1: other metals.
All we knew was they all weigh 361 00:19:03,040 --> 00:19:06,760 Speaker 1: different amounts, and so that was a clue. And 362 00:19:06,800 --> 00:19:10,320 Speaker 1: we knew that somehow electrons were involved and somehow these 363 00:19:10,359 --> 00:19:13,240 Speaker 1: positive particles were involved, but we didn't know what the 364 00:19:13,280 --> 00:19:16,359 Speaker 1: structure was inside that. We didn't know how an atom worked, 365 00:19:16,720 --> 00:19:18,960 Speaker 1: like what did it look like if you zoomed in. 366 00:19:19,720 --> 00:19:22,119 Speaker 1: And back then the technology to see inside these atoms 367 00:19:22,200 --> 00:19:24,040 Speaker 1: was really rudimentary, and so it was a 368 00:19:24,040 --> 00:19:26,480 Speaker 1: big puzzle, like what is it inside 369 00:19:26,480 --> 00:19:28,960 Speaker 1: the atom, and how does that explain all the different elements? 370 00:19:29,000 --> 00:19:31,320 Speaker 1: That was basically the question that was at the top 371 00:19:31,480 --> 00:19:33,680 Speaker 1: of the physics wish list at the time, 372 00:19:34,240 --> 00:19:36,680 Speaker 1: and so Bohr came up with an answer to that question. Yeah, 373 00:19:36,800 --> 00:19:40,400 Speaker 1: Bohr took inspiration from planets, you know. He said, well, 374 00:19:40,440 --> 00:19:43,879 Speaker 1: the Sun and the Earth, right, that's a system, and 375 00:19:43,920 --> 00:19:46,240 Speaker 1: the Earth moves around the Sun and they're attracted to 376 00:19:46,280 --> 00:19:49,479 Speaker 1: each other, but the Earth doesn't fall into the Sun, right, 377 00:19:49,520 --> 00:19:52,080 Speaker 1: because it moves in an orbit. It has enough energy 378 00:19:52,119 --> 00:19:54,159 Speaker 1: that it's whizzing around the Sun. It doesn't fall in. 379 00:19:54,240 --> 00:19:57,240 Speaker 1: And so he said, well, that's cool.
Maybe that explains 380 00:19:57,480 --> 00:20:00,000 Speaker 1: how you can have a positive charge and a negative 381 00:20:00,000 --> 00:20:02,960 Speaker 1: charge inside the atom, right, and have the whole 382 00:20:03,000 --> 00:20:05,720 Speaker 1: atom be neutral, but not have the positive and negative charges 383 00:20:05,840 --> 00:20:08,639 Speaker 1: just, you know, collapse and annihilate each other. So he 384 00:20:08,680 --> 00:20:12,320 Speaker 1: had this idea that the electron is orbiting the positive charge, 385 00:20:12,359 --> 00:20:14,240 Speaker 1: that the positive charge would be the Sun and the 386 00:20:14,280 --> 00:20:18,000 Speaker 1: electron would be the Earth. That's right, because Earth doesn't 387 00:20:18,040 --> 00:20:21,080 Speaker 1: fall into the Sun. We just keep spinning around it. Yeah. 388 00:20:21,119 --> 00:20:23,280 Speaker 1: I haven't looked outside recently, but last I checked, we 389 00:20:23,320 --> 00:20:25,879 Speaker 1: did not fall into the Sun, as of the 390 00:20:25,960 --> 00:20:27,800 Speaker 1: date of this recording. Oh man, you should get a Nobel 391 00:20:27,840 --> 00:20:31,720 Speaker 1: Prize for that, as long as we don't fall into 392 00:20:31,760 --> 00:20:34,760 Speaker 1: the Sun before I can collect it. And 393 00:20:34,800 --> 00:20:37,119 Speaker 1: this is a cool idea. And it's always nice, you know, 394 00:20:37,119 --> 00:20:40,000 Speaker 1: when you get inspired by one area of physics to say, well, look, 395 00:20:40,080 --> 00:20:42,119 Speaker 1: this is the way it works over here. Maybe something 396 00:20:42,160 --> 00:20:44,399 Speaker 1: similar works over there.
And when it clicks 397 00:20:44,440 --> 00:20:47,320 Speaker 1: together like that, it's a wonderful moment in science, when 398 00:20:47,320 --> 00:20:50,760 Speaker 1: you're like, look, there's something universal, something general. It's like, 399 00:20:51,119 --> 00:20:53,960 Speaker 1: this is a structure the universe likes. We can apply 400 00:20:54,080 --> 00:20:57,040 Speaker 1: these ideas in multiple places. That's always a nice feeling. 401 00:20:57,160 --> 00:20:59,320 Speaker 1: And it was a super powerful idea, right, like it 402 00:20:59,400 --> 00:21:03,200 Speaker 1: explains the periodic table of elements. Yeah, and even 403 00:21:03,200 --> 00:21:05,800 Speaker 1: more so, it explains something that people had been puzzling over, 404 00:21:05,880 --> 00:21:09,879 Speaker 1: which is why atoms give light only in certain wavelengths. 405 00:21:10,880 --> 00:21:13,280 Speaker 1: If you get atoms hot, you know, they will give 406 00:21:13,320 --> 00:21:15,760 Speaker 1: off light, they will glow. But if you look at 407 00:21:15,760 --> 00:21:17,840 Speaker 1: the glow, if you look at the wavelengths 408 00:21:17,880 --> 00:21:20,200 Speaker 1: of light that come out, they don't glow in every 409 00:21:20,200 --> 00:21:23,119 Speaker 1: single color, right? It's not like they glow totally white, 410 00:21:23,600 --> 00:21:26,000 Speaker 1: where every single color in the spectrum is represented. 411 00:21:26,400 --> 00:21:29,919 Speaker 1: Different kinds of stuff glow with different colors. So you 412 00:21:29,920 --> 00:21:33,639 Speaker 1: get these lines, where, like, hydrogen has, you know, 413 00:21:33,680 --> 00:21:35,920 Speaker 1: a few different colors it will shine with, and helium 414 00:21:35,960 --> 00:21:38,560 Speaker 1: has a different set of colors. And that was a puzzle. 415 00:21:38,600 --> 00:21:40,600 Speaker 1: People were like, why is that?
What is it about 416 00:21:40,840 --> 00:21:43,000 Speaker 1: hydrogen that makes it only glow with these colors, and 417 00:21:43,040 --> 00:21:46,440 Speaker 1: helium glow only with those colors? The cool thing about 418 00:21:46,440 --> 00:21:49,120 Speaker 1: Bohr's model was that it explained it. And 419 00:21:49,119 --> 00:21:51,920 Speaker 1: basically, he kind of invented the icon 420 00:21:52,200 --> 00:21:55,240 Speaker 1: for science, right? Like whenever you see anyone talking 421 00:21:55,280 --> 00:21:58,880 Speaker 1: about science or, you know, branding science, they always use 422 00:21:58,920 --> 00:22:00,760 Speaker 1: this little model of the atom where it's like a 423 00:22:00,760 --> 00:22:03,199 Speaker 1: little dot in the middle and these hula hoops with 424 00:22:03,280 --> 00:22:06,000 Speaker 1: little electrons in each one. That's right, you 425 00:22:06,040 --> 00:22:09,840 Speaker 1: go straight for the cartooning implications of his discovery. Right, 426 00:22:09,920 --> 00:22:12,679 Speaker 1: no, but you're absolutely right. That is like 427 00:22:12,760 --> 00:22:16,320 Speaker 1: the icon of science, especially in movies, especially when you 428 00:22:16,359 --> 00:22:19,720 Speaker 1: have mad scientists bent on destroying the world; they always have 429 00:22:19,720 --> 00:22:23,080 Speaker 1: a company whose icon is like that Bohr model of 430 00:22:23,080 --> 00:22:26,000 Speaker 1: the atom, right, because that's somehow like powerful and dangerous. 431 00:22:26,920 --> 00:22:29,399 Speaker 1: And the cool thing about this model is that it explains 432 00:22:29,880 --> 00:22:33,880 Speaker 1: why those atoms give off different colors of light.
And that's 433 00:22:33,920 --> 00:22:36,879 Speaker 1: because the electron has these orbitals, right? The electron can 434 00:22:37,000 --> 00:22:39,560 Speaker 1: orbit hydrogen, and it can only do 435 00:22:39,600 --> 00:22:42,919 Speaker 1: it in complete wavelengths, right? So the electron wiggles as 436 00:22:42,960 --> 00:22:46,600 Speaker 1: it goes around, and so it has various energy 437 00:22:46,680 --> 00:22:48,920 Speaker 1: levels, because it needs to wiggle either five times or 438 00:22:48,960 --> 00:22:50,960 Speaker 1: it needs to wiggle six times or seven times. It 439 00:22:50,960 --> 00:22:53,720 Speaker 1: can't wiggle like five and a half times. And so 440 00:22:53,760 --> 00:22:56,000 Speaker 1: it has these specific orbitals that it can go 441 00:22:56,040 --> 00:22:59,320 Speaker 1: around the atom in, and then it can jump 442 00:22:59,400 --> 00:23:02,199 Speaker 1: between one and then the other one. The color of 443 00:23:02,200 --> 00:23:04,800 Speaker 1: the light that it gives off reflects the energy of 444 00:23:04,840 --> 00:23:07,200 Speaker 1: the light that's sent out, which comes from the energy 445 00:23:07,240 --> 00:23:10,879 Speaker 1: difference of those electron orbitals. If the gas gets hot, 446 00:23:11,040 --> 00:23:13,639 Speaker 1: the electrons move up in those orbitals, and then when 447 00:23:13,640 --> 00:23:15,960 Speaker 1: it cools down, they jump down, and they give off 448 00:23:16,000 --> 00:23:19,119 Speaker 1: a certain frequency of light. And so people had measured 449 00:23:19,160 --> 00:23:21,680 Speaker 1: the spectra, and they had all these equations to describe 450 00:23:21,720 --> 00:23:24,760 Speaker 1: the spectra, and there were these patterns that nobody could explain.
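[Editor's note: the jump-between-energy-levels picture described above can be sketched numerically. This is an illustrative script, not from the episode; it uses the standard Rydberg formula for hydrogen, with an approximate value of the Rydberg constant.]

```python
# Illustrative sketch (not from the episode): for hydrogen, the Bohr model
# predicts emission only at wavelengths given by the Rydberg formula,
#   1/lambda = R * (1/n_low^2 - 1/n_high^2),
# when the electron jumps down from level n_high to level n_low.

RYDBERG = 1.097e7  # Rydberg constant for hydrogen, in 1/m (approximate)

def transition_wavelength_nm(n_high, n_low):
    """Wavelength of light emitted when the electron drops from n_high to n_low."""
    inverse_wavelength = RYDBERG * (1 / n_low**2 - 1 / n_high**2)
    return 1e9 / inverse_wavelength  # convert meters to nanometers

# The Balmer series (jumps down to n=2) is the visible part of hydrogen's glow:
# these come out near 656, 486, 434, and 410 nm -- the famous hydrogen lines.
for n in range(3, 7):
    print(f"n={n} -> n=2: {transition_wavelength_nm(n, 2):.1f} nm")
```

Those are exactly the discrete "lines" the hosts describe: only certain colors, and a different set for each element.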
451 00:23:24,800 --> 00:23:27,080 Speaker 1: And so then Bohr came up with this model, and 452 00:23:27,160 --> 00:23:29,800 Speaker 1: with just a few lines of mathematics, he was able 453 00:23:29,840 --> 00:23:33,600 Speaker 1: to predict those equations, and his model perfectly 454 00:23:33,640 --> 00:23:36,320 Speaker 1: described a lot of the spectra that we saw, 455 00:23:36,359 --> 00:23:38,080 Speaker 1: which was amazing, right? It was like a big open 456 00:23:38,119 --> 00:23:40,600 Speaker 1: puzzle in physics, and he came along with this cool 457 00:23:40,680 --> 00:23:43,520 Speaker 1: idea that was beautiful, because it reflected what the planets 458 00:23:43,520 --> 00:23:46,719 Speaker 1: were doing and also explained the experiments we were seeing, 459 00:23:47,160 --> 00:23:48,879 Speaker 1: and it made a lot of sense. And so it 460 00:23:48,960 --> 00:23:52,560 Speaker 1: was just very quickly very popular. Like, for all practical purposes, 461 00:23:53,000 --> 00:23:56,080 Speaker 1: it was true. Like, this model described what was going 462 00:23:56,119 --> 00:23:58,640 Speaker 1: on inside of the atom as far as they needed. Exactly. 463 00:23:58,720 --> 00:24:00,520 Speaker 1: And you know, as far as people had tested at 464 00:24:00,560 --> 00:24:02,880 Speaker 1: the time and as far as it had been probed at 465 00:24:02,880 --> 00:24:06,320 Speaker 1: the time, it was true. Like, it worked, it made sense, 466 00:24:06,680 --> 00:24:09,439 Speaker 1: it predicted experiments. What else could you ask for from 467 00:24:09,440 --> 00:24:11,240 Speaker 1: a theory, right? And so of course he won the 468 00:24:11,280 --> 00:24:14,560 Speaker 1: Nobel Prize for it. Right, but it turned out to be wrong. 469 00:24:15,400 --> 00:24:18,440 Speaker 1: That's right. It turned out to not be what's actually happening. 470 00:24:22,720 --> 00:24:24,960 Speaker 1: So there's a couple of problems with Bohr's theory.
One 471 00:24:25,080 --> 00:24:29,240 Speaker 1: is it's not what's actually happening, right? That's a 472 00:24:29,320 --> 00:24:33,720 Speaker 1: problem. Problem number two is that it violates quantum 473 00:24:33,720 --> 00:24:38,880 Speaker 1: mechanics, because, you know, electrons don't actually have these orbitals. 474 00:24:38,880 --> 00:24:40,760 Speaker 1: And problem number three is that it didn't work for 475 00:24:40,840 --> 00:24:43,960 Speaker 1: more complicated atoms. You know, it worked really well for hydrogen, 476 00:24:44,359 --> 00:24:47,320 Speaker 1: but once you get into multi-electron atoms, it didn't 477 00:24:47,320 --> 00:24:51,400 Speaker 1: really work. So it seemed really promising, but then as 478 00:24:51,400 --> 00:24:53,760 Speaker 1: we dug deeper, it turned out it didn't work. Okay, 479 00:24:53,760 --> 00:24:55,520 Speaker 1: so let's dig a little bit into that 480 00:24:55,600 --> 00:24:59,919 Speaker 1: one. So it violates quantum mechanics. What does that mean? Yeah, well, 481 00:25:00,040 --> 00:25:04,280 Speaker 1: quantum mechanics tells you that electrons don't have classical trajectories. 482 00:25:04,600 --> 00:25:06,960 Speaker 1: I mean, a classical trajectory is something where you 483 00:25:06,960 --> 00:25:09,400 Speaker 1: know the position and the velocity of a particle, right, 484 00:25:09,440 --> 00:25:11,439 Speaker 1: like you throw a baseball up in the air. You 485 00:25:11,480 --> 00:25:13,520 Speaker 1: know where it is and you know its velocity, and 486 00:25:13,520 --> 00:25:15,199 Speaker 1: that allows you to predict where it's going to go. 487 00:25:15,520 --> 00:25:18,440 Speaker 1: And we naturally think of everything as having a location 488 00:25:18,480 --> 00:25:21,439 Speaker 1: and a velocity, right, like everything is somewhere at a 489 00:25:21,440 --> 00:25:23,560 Speaker 1: certain time. That's the way we like to think about stuff.
490 00:25:23,800 --> 00:25:26,760 Speaker 1: But electrons don't act that way, right? They don't have 491 00:25:27,000 --> 00:25:29,440 Speaker 1: a path where they say, I'm here and then I'm here, 492 00:25:29,480 --> 00:25:32,280 Speaker 1: and then I'm here and then I'm here. Instead, as 493 00:25:32,320 --> 00:25:34,440 Speaker 1: you can learn about if you listen to our podcast 494 00:25:34,440 --> 00:25:37,520 Speaker 1: about the universe being random, they just have a probability 495 00:25:37,560 --> 00:25:40,080 Speaker 1: distribution, right? They're like, I might be here and I 496 00:25:40,200 --> 00:25:43,000 Speaker 1: might be there, and when you ask them, where are you, 497 00:25:43,280 --> 00:25:45,400 Speaker 1: then they say, okay, I was here, I was there, 498 00:25:45,520 --> 00:25:48,400 Speaker 1: I was over here. But they don't travel in between 499 00:25:48,480 --> 00:25:52,520 Speaker 1: those locations, right? They just have these probability distributions. So 500 00:25:52,560 --> 00:25:54,560 Speaker 1: the idea that it's going around in, like, a 501 00:25:54,600 --> 00:25:58,359 Speaker 1: clean orbit is just wrong, because nothing at that 502 00:25:58,520 --> 00:26:01,240 Speaker 1: small a level really goes in such a 503 00:26:01,280 --> 00:26:03,960 Speaker 1: perfect little loop. I know. And that's a shame, because 504 00:26:04,000 --> 00:26:06,000 Speaker 1: that was one of the beautiful things about his theory, 505 00:26:06,040 --> 00:26:08,920 Speaker 1: that it mirrored the planetary dynamics, right? But it turns 506 00:26:08,920 --> 00:26:11,840 Speaker 1: out that part's pretty much the most wrong part, right? 507 00:26:11,880 --> 00:26:15,640 Speaker 1: Electrons don't move in these circular orbitals, and 508 00:26:15,680 --> 00:26:17,639 Speaker 1: it's not even just that their motion is 509 00:26:17,640 --> 00:26:20,840 Speaker 1: a little fuzzy, right? Their motion is not circular.
Now 510 00:26:20,880 --> 00:26:23,040 Speaker 1: that we know more about how electrons move, there are all 511 00:26:23,080 --> 00:26:25,520 Speaker 1: these weird shapes that they move in, you know. As 512 00:26:25,600 --> 00:26:27,800 Speaker 1: the atoms get more and more complicated, they have all 513 00:26:27,840 --> 00:26:30,159 Speaker 1: these petals, the s and p and d waves. 514 00:26:30,160 --> 00:26:32,119 Speaker 1: And for those of you who know chemistry, you know 515 00:26:32,160 --> 00:26:34,840 Speaker 1: that electron orbitals can get very complicated. They look more 516 00:26:34,880 --> 00:26:38,720 Speaker 1: like clouds, right, like little balloon clouds. So you can't 517 00:26:38,760 --> 00:26:41,800 Speaker 1: say an electron is moving in an orbit the 518 00:26:41,840 --> 00:26:44,560 Speaker 1: way planets move in orbit, because electrons are fundamentally just 519 00:26:44,680 --> 00:26:48,920 Speaker 1: really different kinds of things than planets or even rocks. Right. 520 00:26:49,320 --> 00:26:51,600 Speaker 1: But could you say that his model was 521 00:26:52,280 --> 00:26:55,520 Speaker 1: conceptually, generally right? You know, like maybe they're not traveling 522 00:26:55,520 --> 00:26:58,639 Speaker 1: in a perfect circle, but it's a 523 00:26:58,640 --> 00:27:01,520 Speaker 1: good analogy for what's actually happening. Yeah, and that's a 524 00:27:01,520 --> 00:27:03,840 Speaker 1: fair defense of him, right? It's like saying, well, 525 00:27:03,840 --> 00:27:08,080 Speaker 1: was Newton wrong? No. I mean, Newton's theory didn't generalize, 526 00:27:08,160 --> 00:27:11,320 Speaker 1: but it predicts the way apples fall from trees, and 527 00:27:11,359 --> 00:27:13,800 Speaker 1: it predicts the way things move through the air. And 528 00:27:13,840 --> 00:27:17,000 Speaker 1: so Einstein came along and generalized it and showed where it 529 00:27:17,000 --> 00:27:19,880 Speaker 1: would break down.
But you know, to a reasonable approximation, 530 00:27:19,880 --> 00:27:22,320 Speaker 1: it was correct. And everything we do in physics is 531 00:27:22,359 --> 00:27:24,960 Speaker 1: only correct up to some point. There will always be something 532 00:27:24,960 --> 00:27:27,719 Speaker 1: we discover which shows us that it's wrong in some regime, 533 00:27:27,760 --> 00:27:31,040 Speaker 1: at very high energy or very high mass or something. 534 00:27:31,640 --> 00:27:33,400 Speaker 1: And so it's a fair point to say, like, look, 535 00:27:33,480 --> 00:27:35,199 Speaker 1: given the data at the time and our understanding at 536 00:27:35,240 --> 00:27:38,400 Speaker 1: the time, it worked and it explained what we observed, 537 00:27:38,400 --> 00:27:41,359 Speaker 1: and so there's nothing wrong about it; it was just improved on later. 538 00:27:42,000 --> 00:27:44,720 Speaker 1: But my problem with Bohr's model is that it fundamentally 539 00:27:44,760 --> 00:27:47,720 Speaker 1: misses the mark, because it imagines that electrons are moving 540 00:27:47,720 --> 00:27:51,280 Speaker 1: in these classical orbitals, and they just really don't. The 541 00:27:51,359 --> 00:27:53,560 Speaker 1: real problem with Bohr's model is that it just doesn't work 542 00:27:53,600 --> 00:27:56,240 Speaker 1: as the atoms get more complicated. With multi-electron atoms, 543 00:27:56,280 --> 00:27:57,720 Speaker 1: once you have more than one electron in there, it just 544 00:27:57,760 --> 00:28:01,040 Speaker 1: doesn't work. It doesn't describe the data. It's amazing, for 545 00:28:01,080 --> 00:28:03,399 Speaker 1: a model which is conceptually and fundamentally wrong, that it 546 00:28:03,440 --> 00:28:07,040 Speaker 1: worked so well. It's like, it's incredible that it predicted 547 00:28:07,160 --> 00:28:10,240 Speaker 1: the results of experiments perfectly even though it was wrong. 548 00:28:10,800 --> 00:28:13,160 Speaker 1: To me, it's like an amazing coincidence.
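[Editor's note: one way to see the "amazing coincidence" numerically. This is an illustrative calculation, not from the episode: in full quantum mechanics the hydrogen ground-state electron is a probability cloud, not an orbit, yet the most probable radius of that cloud is the same Bohr radius that Bohr's circular-orbit picture predicted, which is part of why his model got hydrogen's numbers right.]

```python
# Illustrative sketch (not from the episode): the hydrogen 1s electron has a
# radial probability density P(r) proportional to r^2 * exp(-2r/a0), where a0
# is the Bohr radius. Scanning numerically, the peak of that cloud sits right
# at a0 -- the radius of Bohr's "wrong" circular orbit.

import math

A0 = 5.29e-11  # Bohr radius in meters (approximate)

def radial_probability(r):
    """Unnormalized radial probability density of the hydrogen 1s state."""
    return r**2 * math.exp(-2 * r / A0)

# Scan radii from just above 0 out to 5*a0 and find where the density peaks.
radii = [i * A0 / 1000 for i in range(1, 5001)]
peak_radius = max(radii, key=radial_probability)

print(f"most probable radius = {peak_radius:.3e} m (Bohr radius = {A0:.3e} m)")
```

So the electron has no trajectory at all, but the fuzzy cloud is centered where Bohr's orbit said the electron should be.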
But the funny 549 00:28:13,160 --> 00:28:15,480 Speaker 1: thing to me is that that's what most people think 550 00:28:15,480 --> 00:28:18,120 Speaker 1: of when they think of an atom, is the Bohr model. Yeah, 551 00:28:18,160 --> 00:28:21,080 Speaker 1: and I think that's what most physics cartoonists keep drawing, 552 00:28:21,200 --> 00:28:29,840 Speaker 1: perpetuating this incorrect idea. Yes, that's right. Before we 553 00:28:29,920 --> 00:28:44,920 Speaker 1: keep going, let's take a short break. All right. So 554 00:28:44,960 --> 00:28:47,240 Speaker 1: that's the first story of how someone got the Nobel 555 00:28:47,280 --> 00:28:50,200 Speaker 1: Prize wrong. Tell us about the second story. So the 556 00:28:50,200 --> 00:28:53,800 Speaker 1: second one is Enrico Fermi, also a staggering 557 00:28:53,920 --> 00:28:55,760 Speaker 1: genius of science, and the last thing I want to 558 00:28:55,760 --> 00:28:58,160 Speaker 1: do is besmirch his legacy, because he's made so many 559 00:28:58,200 --> 00:29:02,280 Speaker 1: important discoveries and contributions to physics. But he played 560 00:29:02,280 --> 00:29:05,240 Speaker 1: an important role in the discovery of nuclear fission, but 561 00:29:05,320 --> 00:29:08,120 Speaker 1: he didn't realize it. So this is in the thirties, 562 00:29:08,160 --> 00:29:10,880 Speaker 1: when people were understanding radiation and trying to figure out, 563 00:29:10,920 --> 00:29:13,400 Speaker 1: like, can you break the atom? Or what happens if 564 00:29:13,440 --> 00:29:16,200 Speaker 1: you hit this atom with this particle? And you know, 565 00:29:16,280 --> 00:29:18,760 Speaker 1: this is just a few years before the Manhattan 566 00:29:18,800 --> 00:29:22,040 Speaker 1: Project and the atomic bomb and nuclear energy. And he 567 00:29:22,080 --> 00:29:25,080 Speaker 1: was trying to understand, like, what happens when you shoot 568 00:29:25,120 --> 00:29:29,040 Speaker 1: a slow neutron at uranium.
He was trying to get 569 00:29:29,160 --> 00:29:32,280 Speaker 1: uranium to capture another neutron. He was trying to make 570 00:29:32,320 --> 00:29:35,600 Speaker 1: something heavier than uranium, like to absorb it. Yeah, to 571 00:29:35,680 --> 00:29:37,600 Speaker 1: absorb it, maybe have it turn into a proton. He 572 00:29:37,600 --> 00:29:40,200 Speaker 1: didn't really understand how it worked. And so they had 573 00:29:40,200 --> 00:29:42,400 Speaker 1: this beam of neutrons, and they would slow them down 574 00:29:42,440 --> 00:29:44,840 Speaker 1: by passing them through a lot of wax. The idea is 575 00:29:44,840 --> 00:29:46,960 Speaker 1: that, like, if you slow it down, then maybe as 576 00:29:46,960 --> 00:29:49,000 Speaker 1: it passes the nucleus there's a better chance of sort 577 00:29:49,000 --> 00:29:51,200 Speaker 1: of getting grabbed and held in there. He's trying to 578 00:29:51,200 --> 00:29:54,440 Speaker 1: build a bigger atom. Yeah. Yeah. They call them 579 00:29:54,600 --> 00:29:57,640 Speaker 1: transuranium elements, which is a pretty awesome name for a band. 580 00:29:57,840 --> 00:29:59,880 Speaker 1: Like, just throw stuff at it so it sticks and 581 00:30:00,040 --> 00:30:02,920 Speaker 1: gets bigger. Yeah. And he's like trying to gently toss 582 00:30:03,080 --> 00:30:05,920 Speaker 1: neutrons at the uranium atom so they stick rather than 583 00:30:05,920 --> 00:30:09,200 Speaker 1: flying by really fast. And, you know, he got 584 00:30:09,240 --> 00:30:12,200 Speaker 1: something to work. He shot these slow neutrons 585 00:30:12,240 --> 00:30:15,040 Speaker 1: at uranium, and he got something out which was definitely 586 00:30:15,080 --> 00:30:18,120 Speaker 1: not uranium. And so he thought, oh, look, I've created 587 00:30:18,160 --> 00:30:21,880 Speaker 1: transuranium elements. I've done it. And he 588 00:30:21,920 --> 00:30:24,600 Speaker 1: won the Nobel Prize for that.
It turns out 589 00:30:24,720 --> 00:30:27,280 Speaker 1: he had not created transuranium elements. I mean, he 590 00:30:27,280 --> 00:30:29,800 Speaker 1: even, like, named this new element which he had 591 00:30:30,120 --> 00:30:33,760 Speaker 1: not created, to much fanfare. But it turns 592 00:30:33,760 --> 00:30:35,600 Speaker 1: out what he had done was he had discovered fission. 593 00:30:35,680 --> 00:30:39,960 Speaker 1: He had broken the uranium atom in two. Oh. So 594 00:30:40,000 --> 00:30:43,120 Speaker 1: he shot these things at it and it started behaving differently, 595 00:30:43,160 --> 00:30:44,840 Speaker 1: so he thought, hey, I did what 596 00:30:44,920 --> 00:30:48,120 Speaker 1: I thought I was doing. But actually he broke uranium. Yeah, 597 00:30:48,120 --> 00:30:50,960 Speaker 1: instead of making something heavier, he made things lighter. Right, 598 00:30:50,960 --> 00:30:54,480 Speaker 1: so he just turned uranium into barium and krypton. Right, 599 00:30:54,640 --> 00:30:58,560 Speaker 1: uranium breaks up into two smaller, lighter elements, and that's fission, 600 00:30:58,600 --> 00:31:01,760 Speaker 1: and that's hugely important, and it's like really consequential, and 601 00:31:01,800 --> 00:31:04,320 Speaker 1: that triggered like the nuclear age. And you know, the people 602 00:31:04,320 --> 00:31:07,000 Speaker 1: who came later who won the Nobel Prize for discovering 603 00:31:07,040 --> 00:31:10,280 Speaker 1: fission were definitely standing on the shoulders of Fermi. So, 604 00:31:10,640 --> 00:31:12,920 Speaker 1: I mean, it's really important work. But it's like 605 00:31:13,040 --> 00:31:15,600 Speaker 1: he just thought he discovered A. Turns out he discovered B.
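[Editor's note: the uranium-into-barium-and-krypton split mentioned above can be checked with a back-of-the-envelope calculation. This is an illustrative sketch, not from the episode; the reaction channel and atomic masses below are approximate textbook values, and the answer lands near the commonly quoted ~200 MeV per fission.]

```python
# Illustrative sketch (not from the episode): what Fermi had actually triggered.
# In one common fission channel, U-235 absorbs a slow neutron and splits:
#   U-235 + n -> Ba-141 + Kr-92 + 3 n
# The products weigh slightly less than the inputs; the missing mass comes
# out as energy (E = mc^2). Masses are in atomic mass units (u), approximate.

U235, NEUTRON = 235.0439, 1.0087
BA141, KR92 = 140.9144, 91.9262
U_TO_MEV = 931.5  # energy equivalent of 1 u, in MeV (approximate)

mass_in = U235 + NEUTRON
mass_out = BA141 + KR92 + 3 * NEUTRON
energy_mev = (mass_in - mass_out) * U_TO_MEV  # roughly 170 MeV per fission

print(f"energy released per fission: about {energy_mev:.0f} MeV")
```

Per atom, that's millions of times more energy than any chemical reaction, which is why mistaking fission for neutron capture was such a consequential mix-up.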
606 00:31:16,600 --> 00:31:18,360 Speaker 1: But he took credit for A, and he got the Nobel 607 00:31:18,400 --> 00:31:21,160 Speaker 1: Prize for A, when in fact he had discovered something 608 00:31:21,240 --> 00:31:24,040 Speaker 1: else totally different. But the Nobel Prize committee at 609 00:31:24,080 --> 00:31:26,120 Speaker 1: the time thought that he had done it. That's right. Like, 610 00:31:26,200 --> 00:31:27,880 Speaker 1: they looked at his data, they looked at what he 611 00:31:27,920 --> 00:31:29,920 Speaker 1: had done, and they said, yeah, he made a 612 00:31:29,960 --> 00:31:33,320 Speaker 1: bigger uranium. He should win the prize. Yeah, that's right. 613 00:31:33,320 --> 00:31:35,800 Speaker 1: They gave him the Nobel Prize for discovering 614 00:31:36,160 --> 00:31:40,120 Speaker 1: transuranic elements. But he hadn't. He had of 615 00:31:40,200 --> 00:31:42,479 Speaker 1: course done something really important. But so how did they 616 00:31:42,480 --> 00:31:44,440 Speaker 1: figure out that he had not done it? Well, people 617 00:31:44,520 --> 00:31:48,080 Speaker 1: came along later, you know, Lise Meitner 618 00:31:48,160 --> 00:31:50,080 Speaker 1: and Otto Hahn, and they actually figured it out. They were 619 00:31:50,120 --> 00:31:52,160 Speaker 1: trying to reproduce it, and they had more details, and 620 00:31:52,200 --> 00:31:55,000 Speaker 1: they studied what came out of the experiments and they realized, oh, 621 00:31:55,120 --> 00:31:57,040 Speaker 1: this is not something heavier, this is something lighter. 622 00:31:57,600 --> 00:32:00,240 Speaker 1: And so the follow-up work revealed that what had 623 00:32:00,240 --> 00:32:03,480 Speaker 1: happened was something totally different and even more interesting, even 624 00:32:03,560 --> 00:32:06,440 Speaker 1: more consequential. Right.
So that's the thing about this discovery, 625 00:32:07,040 --> 00:32:09,120 Speaker 1: is that he was wrong, but he was also wrong 626 00:32:09,160 --> 00:32:12,640 Speaker 1: about how important it was. He underestimated how important it was. 627 00:32:13,360 --> 00:32:16,160 Speaker 1: He misunderstood it and still got a Nobel Prize. But 628 00:32:16,280 --> 00:32:19,880 Speaker 1: even though he basically did it first, he didn't get 629 00:32:19,920 --> 00:32:23,600 Speaker 1: the Nobel Prize for fission, for the thing he accidentally did. Like, 630 00:32:23,680 --> 00:32:25,239 Speaker 1: you have to do it on purpose, right, to win 631 00:32:25,320 --> 00:32:29,120 Speaker 1: the prize. That's the secret rule: you have to do 632 00:32:29,240 --> 00:32:31,040 Speaker 1: it on purpose. You don't have to do it on purpose, 633 00:32:31,120 --> 00:32:33,520 Speaker 1: but you have to recognize that you've done it. A 634 00:32:33,600 --> 00:32:36,480 Speaker 1: lot of Nobel Prizes are awarded for things people 635 00:32:36,520 --> 00:32:39,760 Speaker 1: did accidentally and discovered, like, oops, you know. The 636 00:32:40,280 --> 00:32:43,880 Speaker 1: discovery of the cosmic microwave background radiation was totally an accident, but 637 00:32:43,920 --> 00:32:45,640 Speaker 1: then they recognized what they had done and wrote a 638 00:32:45,680 --> 00:32:47,680 Speaker 1: paper about it, and then they were given the Nobel Prize. 639 00:32:47,680 --> 00:32:50,200 Speaker 1: You have to pretend you did it on purpose. There 640 00:32:50,240 --> 00:32:52,280 Speaker 1: you go, that's the rule. I'm sure that's somewhere in 641 00:32:52,320 --> 00:32:55,360 Speaker 1: Alfred Nobel's will. Yeah, I'm sure there's a Swedish translation 642 00:32:55,400 --> 00:33:00,560 Speaker 1: of what I just said. Exactly. Okay, Google, translate that 643 00:33:00,600 --> 00:33:04,840 Speaker 1: into Swedish in real time.
Okay, So can you imagine 644 00:33:04,880 --> 00:33:07,880 Speaker 1: being the scientists who basically are just trying to, like, 645 00:33:08,560 --> 00:33:12,520 Speaker 1: replicate this Nobel Prize project, and then you figure 646 00:33:12,520 --> 00:33:15,000 Speaker 1: out that it's wrong. Yeah, I know. And Fermi was, 647 00:33:15,200 --> 00:33:17,680 Speaker 1: you know, staggering. He was a huge 648 00:33:17,760 --> 00:33:20,080 Speaker 1: figure in physics at the time. He 649 00:33:20,160 --> 00:33:23,600 Speaker 1: was famous and internationally respected, and so, you know, disagreeing 650 00:33:23,640 --> 00:33:25,920 Speaker 1: with Fermi, or going up against Fermi, claiming that Fermi 651 00:33:26,000 --> 00:33:28,440 Speaker 1: was wrong, is a pretty big deal. That 652 00:33:28,520 --> 00:33:31,120 Speaker 1: took some guts. They must have been pretty confident in 653 00:33:31,160 --> 00:33:34,520 Speaker 1: their data. So yeah, that's kind of 654 00:33:34,800 --> 00:33:38,240 Speaker 1: a big error by the Nobel Prize committee. Yeah. Yeah, 655 00:33:38,240 --> 00:33:40,000 Speaker 1: I think looking back, they probably wish they had waited 656 00:33:40,000 --> 00:33:43,160 Speaker 1: a couple of years to give that Nobel Prize. Yeah, exactly, 657 00:33:43,240 --> 00:33:45,160 Speaker 1: although, you know, Fermi certainly should have won a 658 00:33:45,280 --> 00:33:48,200 Speaker 1: Nobel Prize for something or other. Okay, so it's not 659 00:33:48,320 --> 00:33:50,280 Speaker 1: like it's unjust that he won a Nobel Prize. It's 660 00:33:50,280 --> 00:33:52,560 Speaker 1: not like he's undeserving. It's just not really for what 661 00:33:52,760 --> 00:33:55,280 Speaker 1: they actually gave it to him for. 
Do they feel 662 00:33:55,320 --> 00:33:57,480 Speaker 1: bad? No, I don't think they feel too bad 663 00:33:57,920 --> 00:33:59,880 Speaker 1: having Fermi be one of the Nobel Prize winners. 664 00:33:59,920 --> 00:34:01,880 Speaker 1: I think it still feels pretty good. It's sort 665 00:34:01,920 --> 00:34:03,840 Speaker 1: of like Einstein, right? He won the Nobel Prize for 666 00:34:03,880 --> 00:34:06,920 Speaker 1: the photoelectric effect, explaining this tiny, weird little thing. 667 00:34:07,000 --> 00:34:09,160 Speaker 1: He never won the Nobel Prize for relativity. He 668 00:34:09,239 --> 00:34:13,320 Speaker 1: wrote general relativity and special relativity. No Nobel Prize for that. 669 00:34:14,080 --> 00:34:17,120 Speaker 1: I don't know. It's just, like, the idiosyncrasies of 670 00:34:17,760 --> 00:34:20,279 Speaker 1: the Nobel Prize committee. You know, they didn't think that 671 00:34:20,440 --> 00:34:22,360 Speaker 1: was worth it, and then later they gave it to 672 00:34:22,440 --> 00:34:24,399 Speaker 1: him for the photoelectric effect, which is sort 673 00:34:24,440 --> 00:34:26,239 Speaker 1: of like making it up to him, you know. It's 674 00:34:26,239 --> 00:34:29,640 Speaker 1: sort of like Al Pacino getting the Lifetime Achievement Oscar, 675 00:34:29,719 --> 00:34:31,719 Speaker 1: you know, like, well, we should have given you the 676 00:34:31,800 --> 00:34:33,719 Speaker 1: Oscar for this one, and we didn't, and so here, 677 00:34:33,719 --> 00:34:35,520 Speaker 1: you're gonna have this other one. That's why the Oscars 678 00:34:35,560 --> 00:34:38,960 Speaker 1: have that Lifetime Achievement Award, right? Yeah, that's the oops 679 00:34:39,040 --> 00:34:51,200 Speaker 1: Oscar. There should be a Lifetime Achievement Nobel Prize. 
Okay, 680 00:34:51,239 --> 00:34:54,520 Speaker 1: So those are two examples of a way you can 681 00:34:54,560 --> 00:34:56,920 Speaker 1: get the Nobel Prize wrong, meaning that you make a 682 00:34:56,960 --> 00:34:59,000 Speaker 1: discovery but then later it turns out to be not 683 00:34:59,280 --> 00:35:02,480 Speaker 1: quite right. But we were talking earlier about how there are other 684 00:35:02,560 --> 00:35:06,000 Speaker 1: ways in which you can get the Nobel Prize wrong, right? Yeah. 685 00:35:06,120 --> 00:35:08,480 Speaker 1: The other direction, which is you can fail to give 686 00:35:08,520 --> 00:35:11,520 Speaker 1: it to people who are clearly deserving of it, right? 687 00:35:11,880 --> 00:35:14,200 Speaker 1: And that's, you know, almost as big a mistake in 688 00:35:14,280 --> 00:35:17,600 Speaker 1: my view. Yeah, there are lots of examples of people who 689 00:35:17,800 --> 00:35:21,200 Speaker 1: are known to be a crucial part 690 00:35:21,719 --> 00:35:24,080 Speaker 1: of the team and the discovery, but who for some 691 00:35:24,200 --> 00:35:26,480 Speaker 1: reason didn't get it. Yeah. And I think one of 692 00:35:26,520 --> 00:35:30,200 Speaker 1: the biggest examples is Vera Rubin. She's an astronomer who 693 00:35:30,360 --> 00:35:34,879 Speaker 1: everybody credits with making huge contributions and establishing that dark 694 00:35:34,960 --> 00:35:37,400 Speaker 1: matter is a thing. Right, she's the one who went 695 00:35:37,440 --> 00:35:40,240 Speaker 1: out and made careful measurements of how galaxies are rotating 696 00:35:40,280 --> 00:35:43,759 Speaker 1: and really proved that there are huge blobs of invisible matter 697 00:35:43,880 --> 00:35:47,840 Speaker 1: in these galaxies. And so she's a huge figure 698 00:35:47,920 --> 00:35:50,880 Speaker 1: in astronomy, and dark matter is one of 699 00:35:50,880 --> 00:35:53,799 Speaker 1: the biggest discoveries ever. 
But she never won the Nobel 700 00:35:53,840 --> 00:35:56,120 Speaker 1: Prize for it, right? And she's passed away now, so 701 00:35:56,239 --> 00:35:59,440 Speaker 1: she can't. And it's pretty widely acknowledged that that 702 00:35:59,600 --> 00:36:02,040 Speaker 1: was because of, you know, gender bias on the 703 00:36:02,080 --> 00:36:04,120 Speaker 1: committee and in the field in general. You know, this 704 00:36:04,239 --> 00:36:08,160 Speaker 1: is a field of men, and the committee is mostly men, 705 00:36:08,680 --> 00:36:11,719 Speaker 1: and in the history of the Physics Nobel Prize, only 706 00:36:11,760 --> 00:36:14,440 Speaker 1: a few have ever been given to women. It's terrible, 707 00:36:14,719 --> 00:36:17,200 Speaker 1: it's shameful. It's shameful. A lot of these examples you 708 00:36:17,280 --> 00:36:20,000 Speaker 1: sent me of people who should have gotten the Nobel 709 00:36:20,040 --> 00:36:24,240 Speaker 1: Prize were women, right? Rosalind Franklin. Yeah, exactly, 710 00:36:24,360 --> 00:36:26,360 Speaker 1: Rosalind Franklin. You know, she's the one who made those 711 00:36:26,440 --> 00:36:29,600 Speaker 1: amazing X-ray pictures of DNA, which 712 00:36:29,680 --> 00:36:33,040 Speaker 1: Watson and Crick basically stole and then used to claim 713 00:36:33,080 --> 00:36:35,640 Speaker 1: the discovery of the structure of DNA, for which they won the Nobel Prize. 714 00:36:36,200 --> 00:36:39,040 Speaker 1: And then she died a 715 00:36:39,239 --> 00:36:42,120 Speaker 1: few years later, possibly due to radiation exposure. So they really stood on the 716 00:36:42,200 --> 00:36:44,840 Speaker 1: backs of her hard work. They stole her data and 717 00:36:44,880 --> 00:36:48,239 Speaker 1: won a Nobel Prize. That's really embarrassing. 
And there are 718 00:36:48,239 --> 00:36:51,880 Speaker 1: other examples, you know. There's Jocelyn Bell. She 719 00:36:52,120 --> 00:36:55,760 Speaker 1: was a graduate student and she discovered pulsars. The problem 720 00:36:55,840 --> 00:36:59,239 Speaker 1: is that her advisor got the Nobel Prize for it, and 721 00:36:59,440 --> 00:37:01,440 Speaker 1: she was left out, despite doing all the work. I know, 722 00:37:01,880 --> 00:37:04,040 Speaker 1: how did they justify that? You know, at 723 00:37:04,080 --> 00:37:06,400 Speaker 1: the time, I think they just thought it would be 724 00:37:06,600 --> 00:37:08,600 Speaker 1: demeaning to the Nobel Prize to give it to a 725 00:37:08,680 --> 00:37:12,080 Speaker 1: graduate student or something. It makes no sense. To me, 726 00:37:12,200 --> 00:37:17,240 Speaker 1: it's really embarrassing. But in some karmic justice, 727 00:37:17,320 --> 00:37:20,640 Speaker 1: she won the Breakthrough Prize last year, which is 728 00:37:20,719 --> 00:37:23,400 Speaker 1: great, and she donated, like, there's more than 729 00:37:23,480 --> 00:37:26,200 Speaker 1: two million dollars in prize money, she donated all that 730 00:37:26,400 --> 00:37:31,440 Speaker 1: money to supporting women and underrepresented minorities in science. 731 00:37:31,640 --> 00:37:33,399 Speaker 1: So, like, she's really a stand-up kind of person. 732 00:37:33,840 --> 00:37:36,200 Speaker 1: That's awesome, and that's bigger than the money you get 733 00:37:36,239 --> 00:37:39,160 Speaker 1: from the Nobel Prize. Yeah, exactly. I think the Breakthrough 734 00:37:39,200 --> 00:37:41,359 Speaker 1: Prize is trying to buy its way into 735 00:37:41,480 --> 00:37:44,880 Speaker 1: cultural relevance by having more money than the Nobel Prize. 736 00:37:45,920 --> 00:37:48,760 Speaker 1: All right, well, again, we don't want to bash 737 00:37:48,800 --> 00:37:51,680 Speaker 1: the Nobel Prize or science. 
It's still the most awesome 738 00:37:51,760 --> 00:37:54,839 Speaker 1: thing to learn about the universe and to know what's 739 00:37:54,880 --> 00:37:57,560 Speaker 1: true and not. And we know that the committee out 740 00:37:57,560 --> 00:37:59,440 Speaker 1: there is doing its level best, and they've made a 741 00:37:59,520 --> 00:38:01,680 Speaker 1: lot of excellent choices, and the folks they have 742 00:38:01,760 --> 00:38:05,879 Speaker 1: elevated to Nobel Prize winners are almost all entirely deserving. Even 743 00:38:05,960 --> 00:38:09,920 Speaker 1: those for whom the prize was incorrectly awarded, they're still deserving. 744 00:38:10,320 --> 00:38:12,840 Speaker 1: And so we commend them for doing their best. But 745 00:38:12,920 --> 00:38:14,960 Speaker 1: of course it's a human endeavor, and science is a 746 00:38:15,040 --> 00:38:17,080 Speaker 1: human endeavor, and sometimes we make mistakes. And one of 747 00:38:17,080 --> 00:38:19,320 Speaker 1: the best things about science is that it's self-correcting. 748 00:38:19,440 --> 00:38:21,000 Speaker 1: We go back and we look at these things, 749 00:38:21,120 --> 00:38:23,840 Speaker 1: we understand them better, and we make improvements. Just like 750 00:38:23,960 --> 00:38:26,279 Speaker 1: this podcast, we hope to get better and better every time. 751 00:38:27,080 --> 00:38:29,359 Speaker 1: All right, well, thanks everyone for listening. See you next time. 752 00:38:29,640 --> 00:38:39,800 Speaker 1: See you next time. If you still have a question 753 00:38:39,880 --> 00:38:43,319 Speaker 1: after listening to all these explanations, please drop us a line. 754 00:38:43,360 --> 00:38:45,480 Speaker 1: We'd love to hear from you. 
You can find us 755 00:38:45,480 --> 00:38:49,319 Speaker 1: on Facebook, Twitter, and Instagram at Daniel and Jorge, that's 756 00:38:49,360 --> 00:38:52,680 Speaker 1: one word, or email us at Feedback at Daniel and 757 00:38:52,840 --> 00:39:00,200 Speaker 1: Jorge dot com.