Speaker 1: Why do revolutionary ideas so often come from outsiders? Do good scientists sometimes crowd out great scientists? Do we still have room for scientific cowboys? And what is the relationship between national security and modern science? Are scientists participants in a larger game that they barely see? What if the most important ideas are the ones you're not allowed to hear about? From universities to public health to Watson and Crick and nuclear bombs and AI, today we're going to cover it all with physicist, mathematician, and iconoclast Eric Weinstein. So get ready for a great brain stretch.

Speaker 1: Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford, and in these episodes we sail deeply into our three-pound universe to understand how we see the world. And as we'll discuss today, we don't always see what we believe we're seeing.

Speaker 1: We usually think of science as a calm enterprise that's cumulative. You have thousands of people testing hypotheses, you have vast amounts of data getting gathered, and knowledge slowly accretes.
But underneath that veneer, things can sometimes be a little more turbulent, because what happens sometimes is that scientific discoveries can quickly reshape economies, alter what nations can do to each other, redraw political boundaries, and redirect the future long before most of us even notice. In other words, ideas that begin on a chalkboard can end up steering history. But we very rarely pause to ask whether the way science is organized today is matched to the power it now holds. Who decides which questions get asked, or which lines of inquiry get funded? How much of scientific progress depends on conformity, and how much of it depends on rule breakers who are willing to look crazy before they are proven right? These questions sit at the intersection of labs and nations, and they matter more than ever in a world where any single discovery could reverberate globally. My guest today is someone who has spent years thinking about science from that high altitude.
Eric Weinstein is a mathematician, a physicist, and a public intellectual who is unusually thoughtful about the architecture of knowledge: how ideas are filtered by the system, and how originality survives or fails inside universities. Eric hosts a podcast called The Portal, and he has a reputation for asking questions that make people uncomfortable, and for refusing to treat existing structures as inevitable just because they are familiar. Today, Eric and I are going to talk about how discovery really happens, who it serves, and what might be required if we want science to live up to its highest ideals in the decades ahead. Here's my conversation with Eric Weinstein.

Speaker 1: So, Eric, Jim Watson recently passed away, and I know that you were close with him and you really admired him. Tell me about that.

Speaker 2: Well.

Speaker 3: I spent time with Jim, and the time that we were together was very intense. But Jim is in many ways my spirit animal.
And one of the things that's really important is to recognize who Jim was as a scientist, who he was as a writer, and to relegate everything else that he was to a tertiary status that shouldn't distract us from the miracle that was Jim.

Speaker 1: Give me an example of that.

Speaker 3: What's happened is that people who are not close to science have set up a memetic complex, in which "Jim Watson" auto-completes somehow to a bunch of things that are much less relevant.

Speaker 1: So, you know, can you unpack that?

Speaker 3: Sure. Imagine that you knew Archimedes personally, and Archimedes had some sort of, like, personal hygiene problem.

Speaker 2: He didn't brush, you know.

Speaker 3: Okay, so you imagine that everybody in Archimedes' time said, oh, the breath on Archimedes was so terrible. You know, but that's Archimedes. It's important to recognize that that was Jim Watson. Jim Watson was a blinding figure of science, of world science, of American science in particular, and he auto-completes to some stuff that is far less relevant and salient.
And so it's important to me that we discuss Jim Watson and not be captured by when we happen to be having the conversation, because this is a ten-thousand-year story. As long as there are humans, and as long as there will be life as we know it, Jim Watson will be one of the most important people who ever lived.

Speaker 1: And when you say auto-completes, you mean in people's heads they think Jim Watson and then they think X, Y, Z.

Speaker 3: Well, yes, but again we're falling into the trap. So let's briefly go into the trap, and only briefly. Like, it's a pitcher plant. Let's go into the pitcher plant, help ourselves to a couple of deep gulps, and then get the hell out. Okay, great. Okay. So Jim Watson auto-completes right now to Rosalind Franklin. And if you ask me, if you care about Rosalind Franklin, you should be caring about Erwin Chargaff.

Speaker 2: Nobody talks about Chargaff.

Speaker 1: So Crick and Watson discovered the structure of DNA, published that in April of nineteen fifty-three, and they ended up winning the Nobel Prize with Maurice Wilkins for that work.
People then came out and said, look, Rosalind Franklin had taken the first photographs of the structure, and didn't recognize what it was, the double helix structure, but should have gotten credit. That's what we're referring to here. Just so?

Speaker 3: Not really. Okay. Rosalind Franklin had done the X-ray crystallography on nucleic acid and had this famous Maltese cross, which Watson and Crick took to mean that the structure of the nucleic acid was likely to be helical. Not double-helical.

Speaker 2: Just helical.

Speaker 3: They were under suspicion by their colleagues of not knowing anything, of being so enamored of Linus Pauling's great achievement in that discovery, the alpha helix, which ended up as a secondary structure of protein, that they were trying to copy Pauling and to shove nucleic acid into a form that there was no reason to think it had to take. And so part of the problem of this story is that Rosalind Franklin was absolutely correct: there was no reason that DNA had to be helical. That's good science. It's very good science.
And it's an absolute cautionary tale of why you can't let good scientists run science.

Speaker 1: Ah, because you're making a distinction between good scientists and great scientists. Okay, I see.

Speaker 3: Jim Watson, in my opinion, was not a good scientist.

Speaker 1: He was a great scientist.

Speaker 2: He was a great scientist.

Speaker 3: And it's not that you take good science and turn it up to eleven to get great science. It's a different process.

Speaker 1: Okay. And you think of this as cowboys, is that correct? You call it cowboy science? Yippee-ki-yay. Okay, so let's unpack that. What does cowboy science look like? And actually, let's continue this story in that context.

Speaker 3: Well, you see, Jim and Francis were searching a much smaller landscape, because they were convinced that it was going to be helical. Okay, so they didn't have to think about all the possibilities. Like, if you ever spend time with the Protein Data Bank, my God, does nature get up to some fun architecture. Really, I mean, just unbelievably beautiful things.
And they didn't think it was going to be any of those things. So Jim and Francis were in part not reasonable people, and they made a point of telling everybody that if you weren't working on nucleic acid, you were an idiot, that they didn't have time to go to seminars that were mere distractions. This is the difference between good science and great science.

Speaker 1: Exactly what we're...

Speaker 3: What we're doing is we're driving great science out of academics and out of research, so that we have this proliferation of good science done by people who are not a walking HR nightmare. Which is a catastrophic decision.

Speaker 2: So just before we get to that, just for the audience, I want to mention...

Speaker 1: So Eric knew Jim Watson; I, as a postdoc, worked with Francis Crick. Neither of us knew the other one, but we both got to spend time with these two giants of, my God, biology. Yeah. Okay. So you're categorizing that as great scientists as opposed to good scientists. So what's the problem that you see in academia in terms of support of good science and not great science?
Speaker 3: Well, you need both. You absolutely need both. And one of the tasks for people in the great-science model is not to crap all over their good-science colleagues. And in fact, Jim was, in my experience, extremely kind about Rosalind Franklin. I discussed her with him at great length, and he didn't prettify the story.

Speaker 1: And by which you mean making it pretty?

Speaker 3: Yeah. Well, he was an ass.

Speaker 1: He met your daughter and wife, right? And he behaved abysmally towards them.

Speaker 3: But then he behaved in a very kind fashion as well.

Speaker 1: Like what?

Speaker 3: My daughter asked him a question, as I recall, in a crowded room, I don't know, age twelve or thirteen, about the origin of the organelles: how did mitochondria end up in the eukaryotic cell? Was it an infection? And he looked at her, this Nobel laureate, eighty-eight, world famous, and he said, "Oh, I don't have any interest in that." You know, it reminds me of Miles Davis talking to a three-year-old kid who was seeking his autograph. I think his famous line was something like: he looks up expecting to see an adult
and sees this tiny kid tugging at his pant leg. He looks at him and says, "Fuck off, kid."

Speaker 1: Oh God. Okay, all right.

Speaker 3: But then Jim came back, you know, at some point, and he said, "Which one is your wife?" And I said, "The economist who mopped the floor with you intellectually at dinner last night."

Speaker 2: He said, "Oh, she's very good."

Speaker 3: Because he was completely dismissive of her being female and all these things. He said, "You want to know why your children are so intelligent?" I said, "Excuse me?" He said, "Well, your daughter is obviously very intelligent, as is your son," because he knew both of them. So even though he was cruel, he thought very highly of them. And I said, "Why do you think?" He said, "Well, you can't just do it with one set of genes. You should thank your wife."

Speaker 1: Oh, that's lovely.

Speaker 3: Well, but he was a misogynist. But women loved him. But he was very kind and active in promoting female careers. Like, you don't understand the complexity of this particular guy. Was he a horse's ass?

Speaker 1: He was.
Speaker 3: And I'm not saying that's great. I got a chance to tell him to shut up multiple times, and he took it.

Speaker 1: But you had to back it up.

Speaker 3: You know, you can't just mumble something about racism or sexism, because DNA and its implications are absolutely profound, and most of us haven't wrestled with them. And I also believe that Jim came from a place of kindness and goodness that isn't recognized.

Speaker 1: Is there anything that we can learn from the story of Rosalind Franklin and Jim Watson that allows us to do better science?

Speaker 2: I think so.

Speaker 3: I mean, I think that recognizing that there are these different styles of science, and that we need all of these styles of science, would be very helpful. And in general, I don't think a Jim Watson would want to drive a Rosalind Franklin out of science at all. But the problem is that there is absolutely no place for this cowboy science within the standard framework. So if you look, for example, at the interactions with Chargaff: Chargaff is absolutely merciless to Watson and Crick.
He calls them two pitchmen in search of a helix.

Speaker 1: Okay, wait, hold on. So first tell us who Chargaff is, and tell us what a pitchman is. I don't know that term.

Speaker 3: So Erwin Chargaff was a Columbia professor. I believe he came probably from Vienna. I think he spoke five languages; maybe English was his fifth. He wrote one of the most amazing books in the history of biology, called Heraclitean Fire, which nobody's read. And in it he tells the story of figuring out the equimolar relations, where he figured out that the amounts of the nucleotides were exactly paired when you chopped up whatever the nucleic acid was.

Speaker 2: So we had this...

Speaker 1: Meaning you have the same amount of A's and T's, and the same amount of C's and G's.

Speaker 2: That's right. And so...

Speaker 3: But there was no explanation for it. So it was what we would call a fine-tuning mystery. Now, isn't it interesting?

Speaker 1: What is a fine-tuning mystery?

Speaker 3: Well, usually we encounter it in physics. Like, is the universe finely tuned for life?
If the proton were a little bit less heavy or a little bit more heavy, it wouldn't work. And how is the curvature exactly right? Everything is just so, right? And usually what there is is an explanation: in this case, the hydrogen bonds which enforced the pairing. It wasn't an accident. And of course, somebody will always say, maybe it's a coincidence.

Speaker 2: You can't conclude that, right?

Speaker 3: So what Watson and Crick did is they took that information of Chargaff's. And Chargaff, I believe, came up with a Möbius-band theory of nucleic acid. He said if he'd only been able to work with a Rosalind Franklin, he would have gotten it. And Jim, to his credit, said that if Rosalind Franklin had simply spent one day decamping from her adamancy that we didn't have enough information to say it was a helix, she would have gotten it in an afternoon. An amazing claim: that Rosalind Franklin would have gotten the double helix in an afternoon, but for her insistence on being a good scientist.
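Chargaff's equimolar relations mentioned here, equal amounts of A and T and equal amounts of C and G, fall directly out of complementary base pairing. As a minimal illustration, here is a sketch in Python; the sequence is a made-up fragment, not real genomic data:

```python
from collections import Counter

# Watson-Crick pairing: A bonds to T, C bonds to G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def duplex_base_counts(strand: str) -> Counter:
    """Count nucleotides across both strands of a DNA duplex."""
    # The partner strand is the reverse complement of the given strand.
    partner = "".join(COMPLEMENT[b] for b in reversed(strand))
    return Counter(strand + partner)

counts = duplex_base_counts("ATGCGGATTCA")  # hypothetical fragment
# Chargaff's parity holds regardless of which sequence you start from:
assert counts["A"] == counts["T"] and counts["C"] == counts["G"]
```

Whatever single strand you begin with, the duplex totals come out exactly paired, which is the "fine-tuning" that the hydrogen-bonded pairing explains.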
Speaker 1: I see.

Speaker 3: And so you have to understand that what Jim was willing to acknowledge about Rosalind Franklin was in many ways incredibly complimentary. But Chargaff writes very clearly about these pitchmen. He means, you know, think about Silicon Valley, where you now hear "pitch me, bro." Okay. He saw them as a couple of ne'er-do-wells. They didn't know any biochemistry. That's what he says, he says they didn't know anything. You know, after they get the double helix, he writes: you can tell how late in the day it is in biology that such pygmies would throw such long shadows. You have to understand this. You and I... I mean, you knew Francis; I didn't. And to be blunt, I think Francis was the more intellectually deep of the two. And I told you before that even though I thought Francis was smarter, I thought Jim was far more important. And they represented two halves of great science. And Jim was that brash. I think he was taken into the University of Chicago at age fifteen, so shout out to the University of Chicago.
Let's get back to taking twelve-year-olds, thirteen-year-olds, and fourteen-year-olds into our universities. And the University of Chicago in particular is probably our top university, with every other one falling off of some cliff, for God's sakes. We don't know why that brashness that Jim had and that eccentricity that Francis had were essential. In Chargaff's book... Chargaff, oh, I wish I could just read it; he writes so beautifully. But he talks about the fact, he said, the idea that the odds that two geniuses would fall into his orbit, knowing nothing of biochemistry, and solve this problem was so vanishingly small that it didn't even warrant consideration.

Speaker 2: Wow. Wow. This is why peer review doesn't work, right?

Speaker 3: So, in large measure, the history of DNA, and the history of the genetic code, which again was discovered by someone outside the fabled RNA Tie Club. So all the top people were assembled to crack the genetic code, and the guy who was outside of it, named Marshall Nirenberg, was the one who cracks it ten years later, in sixty-three.
293 00:17:27,920 --> 00:17:32,719 Speaker 3: This is the greatest story of recent years that you 294 00:17:32,760 --> 00:17:35,359 Speaker 3: and I are within the living memory of Watson and Crick. 295 00:17:35,600 --> 00:17:38,479 Speaker 3: And by the way, that book, the Double Helix, you know, 296 00:17:38,600 --> 00:17:41,280 Speaker 3: great literature begins with things like it was the best 297 00:17:41,320 --> 00:17:43,080 Speaker 3: of times, it was the worst of times, or in 298 00:17:43,119 --> 00:17:46,000 Speaker 3: the beginning, or some memorable call me Ishmael. 299 00:17:46,640 --> 00:17:50,000 Speaker 2: Watson began and ended that book. 300 00:17:50,680 --> 00:17:52,960 Speaker 1: Saying he's never seen Francis Crick in a humble mood, 301 00:17:53,440 --> 00:17:54,720 Speaker 1: modest mood, modest mood. 302 00:17:54,760 --> 00:17:57,199 Speaker 3: Right, yeah, perhaps he is with others, but it's not 303 00:17:57,280 --> 00:17:58,359 Speaker 3: so with me or whatever it is. 304 00:17:59,400 --> 00:18:01,920 Speaker 2: Oh my god, can that guy write? I mean that 305 00:18:02,080 --> 00:18:04,160 Speaker 2: was literature, man, Yeah, agreed. 306 00:18:04,320 --> 00:18:11,880 Speaker 3: So anyway, I just I love Jim unapologetically, and I'm 307 00:18:12,000 --> 00:18:15,159 Speaker 3: well aware of the total of the total nature of 308 00:18:15,160 --> 00:18:16,520 Speaker 3: the man, and I'm not going to sweep any of 309 00:18:16,520 --> 00:18:18,480 Speaker 3: that into the rug. And by the way, I don't 310 00:18:18,520 --> 00:18:21,600 Speaker 3: think we stripped him of his official titles at cold 311 00:18:21,640 --> 00:18:25,120 Speaker 3: Spring Harbor until two thousand and seven, and I don't 312 00:18:25,119 --> 00:18:27,840 Speaker 3: think we stripped him of his honorary titles until twenty 313 00:18:27,920 --> 00:18:31,080 Speaker 3: nineteen or something like that. 
And that, my friends, was the American system that we could be proud of: that a complete horse's ass like Jim Watson would be kept within the system of our institutions and celebrated, and not be seen as a walking HR problem waiting to happen.

Speaker 1: I mean, he was seen as that, a walking HR problem waiting to happen.

Speaker 2: He was that twelve times a day. Yeah, exactly.

Speaker 3: But my point is, we didn't throw him out until very late in the game, when we were determined to lose our mojo.

Speaker 2: Ah.

Speaker 1: Interesting. Okay. And so coming back to this idea of cowboy science: it's the Cricks and Watsons, it's the Nirenbergs. It's these ways of coming in from the outside and proposing something, breaking the rules.

Speaker 3: Yeah, it's based on the middle finger. It's based on being true to science and telling the National Security Complex, the National Interest Complex, your department, your funder to sit down and shut up.
Speaker 1: And by the way, you know, I've talked about peer review before, but one thing we know is that when Crick and Watson came up with this structure and they said, hey, we think it's a double helix, where the A's and T's and C's and G's are bonding to each other, they sent the typed manuscript over to Nature. Crick's wife drew the double helix picture, which is a gorgeous picture...

Speaker 3: Gorgeous picture.

Speaker 1: ...because he was terrible at drawing and she was a good artist. Anyway, they sent it over to Nature, which Crick told me at the time was a man and two boys, and they mailed the manuscript over to the editor, and it got published. There was no peer review at the time.

Speaker 3: There was no peer review until nineteen sixty-five to seventy-five. This is a fabricated story, largely due to Merton, that was backfitted and retconned: outside reviewing and outside refereeing, which did occur, being turned into "peer review." But basically it's a fabricated story of the history of science.

Speaker 1: Sorry, you're saying...
The fabricated story is that peer review 352 00:20:43,359 --> 00:20:45,320 Speaker 1: wasn't introduced until some point, and then it was 353 00:20:45,680 --> 00:20:47,120 Speaker 1: claimed that it was older than that. 354 00:20:47,720 --> 00:20:50,959 Speaker 3: The story is that peer review begins with the founding 355 00:20:51,000 --> 00:20:53,440 Speaker 3: of the Royal Society and has been with science ever since. 356 00:20:53,880 --> 00:20:54,040 Speaker 2: Ah. 357 00:20:54,400 --> 00:20:55,399 Speaker 1: You're saying that's not true. 358 00:20:55,480 --> 00:20:59,280 Speaker 3: No, it's untrue. And the willingness of the Academy to 359 00:20:59,400 --> 00:21:04,280 Speaker 3: lie about that, bald-faced lie about this, it might 360 00:21:04,320 --> 00:21:04,960 Speaker 3: be ignorance. 361 00:21:05,080 --> 00:21:05,679 Speaker 2: No, it's not. 362 00:21:06,359 --> 00:21:09,120 Speaker 1: Well, it depends who you mean by the Academy. Had 363 00:21:09,200 --> 00:21:11,040 Speaker 1: you asked me before you told me about peer review, 364 00:21:11,080 --> 00:21:13,720 Speaker 1: I would have believed it. But if I showed you, yeah, the 365 00:21:13,760 --> 00:21:18,480 Speaker 1: history of peer review. Right, the Academy wants to say 366 00:21:18,480 --> 00:21:22,119 Speaker 1: that outside refereeing is peer review, and it isn't. The 367 00:21:22,200 --> 00:21:26,280 Speaker 1: double helix is a great example of something. I believe 368 00:21:26,400 --> 00:21:28,760 Speaker 1: that it's in Horace Judson's The Eighth Day of Creation, though it could 369 00:21:28,760 --> 00:21:32,919 Speaker 1: be sourced elsewhere. The claim was that Watson and Crick couldn't 370 00:21:32,920 --> 00:21:36,879 Speaker 1: be peer reviewed because there was too much information in 371 00:21:36,920 --> 00:21:39,320 Speaker 1: the one page paper.
You know, as you know, they 372 00:21:39,320 --> 00:21:41,320 Speaker 1: had a fight as to whether to write something very 373 00:21:41,359 --> 00:21:46,359 Speaker 1: complete or something very incomplete. And you will notice that 374 00:21:47,480 --> 00:21:49,800 Speaker 1: there's a phrase, "it has not escaped our notice that," 375 00:21:50,280 --> 00:21:53,320 Speaker 1: you know this? Yes, yes. So more or less that 376 00:21:53,359 --> 00:21:54,520 Speaker 1: was a twenty page paper. 377 00:21:54,560 --> 00:21:56,159 Speaker 2: It has not escaped our notice that. 378 00:21:56,200 --> 00:21:58,399 Speaker 1: Right. By the way, for the audience, they were saying, 379 00:21:58,480 --> 00:22:00,639 Speaker 1: it is giving this double helix structure, it has not 380 00:22:00,760 --> 00:22:03,080 Speaker 1: escaped our notice that, you know, you could unzip this 381 00:22:03,240 --> 00:22:06,440 Speaker 1: and duplicate it this way, where each strand of the 382 00:22:06,560 --> 00:22:09,760 Speaker 1: DNA then gets the complementary nucleotides on it, and you 383 00:22:09,800 --> 00:22:13,040 Speaker 1: can do this amazing thing that way. Right, they just 384 00:22:13,119 --> 00:22:15,120 Speaker 1: mentioned it. Well, have you been to the Eagle Pub? 385 00:22:15,720 --> 00:22:17,240 Speaker 2: I have not. You have to go. 386 00:22:17,359 --> 00:22:19,720 Speaker 1: Oh, I'd love to. Again, for the audience, this is 387 00:22:19,720 --> 00:22:23,159 Speaker 1: where Crick and Watson burst into the Eagle Pub. Was 388 00:22:23,200 --> 00:22:25,040 Speaker 1: it the Eagle and Child or the whatever? 389 00:22:25,160 --> 00:22:25,359 Speaker 2: I think? 390 00:22:26,480 --> 00:22:29,960 Speaker 1: Oh, maybe. They burst into the pub in early 391 00:22:30,040 --> 00:22:32,200 Speaker 1: nineteen fifty three and said we have discovered the secret 392 00:22:32,240 --> 00:22:32,600 Speaker 1: of life.
393 00:22:33,480 --> 00:22:36,440 Speaker 2: Yeah, I'd love to go see that. Goosebumps. 394 00:22:36,520 --> 00:22:39,600 Speaker 1: Yeah, yeah, okay. So, by the way, one other thing 395 00:22:39,600 --> 00:22:41,400 Speaker 1: I just want to make clear is that peer 396 00:22:41,480 --> 00:22:44,159 Speaker 1: review is: you write up a science paper and you 397 00:22:44,280 --> 00:22:47,960 Speaker 1: submit it to a journal, and then the journal sends it 398 00:22:48,000 --> 00:22:50,800 Speaker 1: to several of your colleagues in the field, something like 399 00:22:51,160 --> 00:22:53,639 Speaker 1: the jury system that we have in court, something like that, 400 00:22:53,680 --> 00:22:56,600 Speaker 1: where your colleagues who are expert in the field also 401 00:22:57,200 --> 00:23:00,600 Speaker 1: review your thing. The thing that stings scientists about 402 00:23:00,640 --> 00:23:03,919 Speaker 1: this is that oftentimes peers are incentivized to stand in the 403 00:23:03,920 --> 00:23:07,600 Speaker 1: way of something getting published. And also, I would ask, 404 00:23:07,640 --> 00:23:09,160 Speaker 1: you tell me if I'm right, but I would guess 405 00:23:09,240 --> 00:23:13,280 Speaker 1: that you would think oftentimes your peers are good scientists, 406 00:23:13,280 --> 00:23:18,560 Speaker 1: not great scientists, and so they might block something for 407 00:23:19,200 --> 00:23:22,440 Speaker 1: six months or a year or longer because they want 408 00:23:22,440 --> 00:23:26,520 Speaker 1: to see more of this or that. But the important... yeah, 409 00:23:26,800 --> 00:23:28,320 Speaker 1: I would assume you'd say, and I'd agree with you, 410 00:23:28,359 --> 00:23:31,639 Speaker 1: that the important part sometimes is just to get something 411 00:23:31,680 --> 00:23:33,800 Speaker 1: out there. If Crick and Watson had been wrong about 412 00:23:33,840 --> 00:23:36,400 Speaker 1: the structure, fine, it's in the public eye.
413 00:23:36,240 --> 00:23:40,720 Speaker 3: And the real review just happens when the world of your 414 00:23:40,720 --> 00:23:43,879 Speaker 3: colleagues sees something. This is peer injunction. 415 00:23:45,000 --> 00:23:47,639 Speaker 2: Yeah. And it's competitor injunction. Yeah. 416 00:23:47,680 --> 00:23:51,400 Speaker 3: And it's one sided, where you, generally speaking, can't see them, 417 00:23:51,440 --> 00:23:54,840 Speaker 3: but they can see you. What if somebody doesn't like you? 418 00:23:55,680 --> 00:23:58,479 Speaker 3: What if Jim Watson has pissed off everybody? 419 00:23:59,160 --> 00:23:59,720 Speaker 2: Yeah? 420 00:24:00,040 --> 00:24:01,919 Speaker 3: So my feeling is, who the hell are these peers? 421 00:24:01,960 --> 00:24:03,440 Speaker 3: Get them the hell out of my way. Let's go 422 00:24:03,520 --> 00:24:04,720 Speaker 3: back to the system that works. 423 00:24:05,280 --> 00:24:06,439 Speaker 1: So, by the way, what do you think of the 424 00:24:06,480 --> 00:24:09,880 Speaker 1: preprint system now, where people submit things to, let's 425 00:24:09,880 --> 00:24:12,120 Speaker 1: say, the arXiv or other preprint servers? 426 00:24:12,160 --> 00:24:17,480 Speaker 3: I think it's fascinating, because, I don't know, let's just 427 00:24:17,680 --> 00:24:19,800 Speaker 3: talk about how funny the system is. First of all, 428 00:24:19,840 --> 00:24:21,840 Speaker 3: it comes out of... where does the arXiv come out of? 429 00:24:22,720 --> 00:24:26,399 Speaker 2: I don't know. I believe it's Los Alamos National Lab. Fascinating.
430 00:24:26,520 --> 00:24:29,040 Speaker 3: Now there's a whole question going back to the founding 431 00:24:29,040 --> 00:24:31,720 Speaker 3: of Los Alamos about security review, and I think, 432 00:24:32,280 --> 00:24:37,200 Speaker 3: was it Breit who ran the voluntary board that said, look, 433 00:24:39,520 --> 00:24:44,000 Speaker 3: papers on neutrons and chain reactions are so dangerous that 434 00:24:44,080 --> 00:24:46,600 Speaker 3: everything will be submitted and we will figure out what's 435 00:24:46,600 --> 00:24:49,080 Speaker 3: safe and what isn't. So in part, you have to 436 00:24:49,080 --> 00:24:51,879 Speaker 3: ask the question about whether or not this layer is 437 00:24:51,960 --> 00:24:56,080 Speaker 3: there as part of security review, because what if somebody 438 00:24:56,080 --> 00:24:58,560 Speaker 3: wants to publish weaponized anthrax, or what if they want 439 00:24:58,600 --> 00:25:02,359 Speaker 3: to publish something that's relevant to nuclear weapons, or... but. 440 00:25:02,640 --> 00:25:05,120 Speaker 1: The advent of the internet allows that anyway. 441 00:25:05,160 --> 00:25:07,920 Speaker 3: Now, yes, yes, but as you'll notice, many of your 442 00:25:08,119 --> 00:25:12,960 Speaker 3: colleagues won't take something seriously unless it's gone through channels. Okay, yeah, 443 00:25:13,000 --> 00:25:15,919 Speaker 3: that's right. And then the question is who is entitled 444 00:25:15,960 --> 00:25:19,400 Speaker 3: to post there? Do you need a dot edu address? 445 00:25:20,480 --> 00:25:23,640 Speaker 3: There's a moderator group that turns down papers from people 446 00:25:23,680 --> 00:25:27,639 Speaker 3: they don't recognize, so there is a review aspect. Yeah, and 447 00:25:28,359 --> 00:25:32,080 Speaker 3: let me keep going. Our taxpayer dollars pay for research, 448 00:25:32,160 --> 00:25:37,160 Speaker 3: which is then put under a paywall by, let's say, Elsevier.
449 00:25:38,560 --> 00:25:42,200 Speaker 3: Then you're forced to cite research, and if you don't 450 00:25:42,200 --> 00:25:44,920 Speaker 3: have a subscription, you can't get behind the paywalls. You 451 00:25:45,000 --> 00:25:46,800 Speaker 3: have to pay forty dollars per paper just to see if 452 00:25:46,800 --> 00:25:51,439 Speaker 3: it's relevant. This is nonsense. This is an old style 453 00:25:52,720 --> 00:26:00,000 Speaker 3: control mechanism. And even the arXiv is refereed at some level. 454 00:26:00,119 --> 00:26:04,119 Speaker 3: It has an endorsement system, and it appears probably to 455 00:26:04,240 --> 00:26:09,040 Speaker 3: catch certain things and to relegate them to, if you 456 00:26:09,080 --> 00:26:11,960 Speaker 3: know it, this horrible thing called viXra. I don't 457 00:26:12,000 --> 00:26:16,200 Speaker 3: know if you know this: it's arXiv spelled backwards, and that's where crazy 458 00:26:16,240 --> 00:26:17,280 Speaker 3: people are sent. 459 00:26:17,680 --> 00:26:19,440 Speaker 1: What do you mean they're sent? As in, if someone 460 00:26:19,520 --> 00:26:21,080 Speaker 1: has an idea that's... 461 00:26:21,040 --> 00:26:25,360 Speaker 3: Oh, my friend Garrett Lisi, for example, submitted something. He's 462 00:26:25,400 --> 00:26:28,679 Speaker 3: a PhD in physics from UC San Diego, and 463 00:26:28,720 --> 00:26:30,359 Speaker 3: he's told this is not right for the arXiv. 464 00:26:31,040 --> 00:26:34,040 Speaker 1: Wow. Just for the audience, the arXiv is supposed to publish 465 00:26:34,080 --> 00:26:37,439 Speaker 1: anything that's reasonable. I don't know, in theory they're supposed to. 466 00:26:37,800 --> 00:26:40,440 Speaker 1: That's the idea. That's the idea, that it's an 467 00:26:40,480 --> 00:26:41,440 Speaker 1: open preprint server. 468 00:26:41,520 --> 00:26:43,560 Speaker 2: Ask Paul Ginsparg about that.
469 00:26:43,680 --> 00:26:47,600 Speaker 3: Paul Ginsparg set this up, and he used a device 470 00:26:47,960 --> 00:26:51,800 Speaker 3: like "I come to bury Caesar, not to praise him," 471 00:26:51,840 --> 00:26:53,960 Speaker 3: and he said, no, we're not trying to 472 00:26:54,000 --> 00:26:57,480 Speaker 3: go around peer review. This is a holding tank for 473 00:26:57,560 --> 00:27:00,199 Speaker 3: only those things that are seeking peer 474 00:27:00,320 --> 00:27:05,480 Speaker 3: review. But the biggest question is, what do we 475 00:27:05,520 --> 00:27:11,760 Speaker 3: do with the science that is powerful, that goes against 476 00:27:11,800 --> 00:27:14,359 Speaker 3: the narratives which have policy implications? 477 00:27:14,640 --> 00:27:16,560 Speaker 2: The science has policy implications. Yeah. 478 00:27:16,800 --> 00:27:21,200 Speaker 3: For example, Jim and Francis, together with Marshall Nirenberg, unlocked 479 00:27:21,240 --> 00:27:25,880 Speaker 3: something which allows you, with CRISPR-Cas9 and other tools, 480 00:27:26,480 --> 00:27:31,760 Speaker 3: to write code directly into nucleic acid. Now, most of 481 00:27:31,800 --> 00:27:36,160 Speaker 3: us got locked down for two years because somehow twelve 482 00:27:36,280 --> 00:27:43,880 Speaker 3: nucleotides coded for four codons assembling four amino acids into 483 00:27:43,880 --> 00:27:47,000 Speaker 3: a furin cleavage site that got spliced into this spike 484 00:27:47,080 --> 00:27:54,960 Speaker 3: protein in coronavirus, which made this virus very human transmissible. Now, 485 00:27:55,000 --> 00:28:00,240 Speaker 3: my claim is, what does that tell you about the leverage 486 00:28:01,080 --> 00:28:04,880 Speaker 2: of this code? Who is allowed to do 487 00:28:04,920 --> 00:28:08,720 Speaker 3: what? If you can shut down planet Earth for two 488 00:28:08,800 --> 00:28:14,160 Speaker 3: years with twelve nucleotides, what are we talking about here?
489 00:28:14,840 --> 00:28:16,560 Speaker 3: Why are we doing this out in the open? Why 490 00:28:16,560 --> 00:28:19,119 Speaker 3: are we pretending that there aren't military implications? Why are 491 00:28:19,160 --> 00:28:21,879 Speaker 3: we pretending that there aren't national interest implications and national 492 00:28:21,880 --> 00:28:27,679 Speaker 3: security implications? This is the unforgivable sin of modern university science. 493 00:28:27,760 --> 00:28:30,280 Speaker 3: Pretending that what we're doing is something that's gee whiz 494 00:28:30,320 --> 00:28:34,240 Speaker 3: interesting. Everybody should do science. Science is fun. You 495 00:28:34,280 --> 00:28:38,560 Speaker 3: could be a scientist too. Well, shit, first of all, 496 00:28:38,600 --> 00:28:41,720 Speaker 3: it's incredibly hard. It's incredibly demanding. It's like telling a 497 00:28:41,760 --> 00:28:44,320 Speaker 3: person who's five foot ten that they can join the NBA. Yeah, 498 00:28:44,400 --> 00:28:47,120 Speaker 3: maybe there are people who are five foot ten in 499 00:28:47,160 --> 00:28:50,200 Speaker 3: the NBA, but your odds are not good. Almost nobody 500 00:28:50,200 --> 00:28:54,080 Speaker 3: belongs here. It's super dangerous, it's super powerful. It's 501 00:28:54,080 --> 00:28:57,520 Speaker 3: boring as hell, it's exciting as hell. We're just not 502 00:28:57,600 --> 00:29:02,000 Speaker 3: honest about what science is. We've got to break the 503 00:29:02,040 --> 00:29:07,080 Speaker 3: dependence on the university system and the federal granting agencies 504 00:29:07,120 --> 00:29:08,520 Speaker 3: if science is to continue.
505 00:29:08,840 --> 00:29:11,160 Speaker 1: So this is fascinating, because what you draw 506 00:29:11,200 --> 00:29:13,440 Speaker 1: attention to, that I really don't know anyone else drawing 507 00:29:13,480 --> 00:29:17,680 Speaker 1: attention to, is this point that science is extraordinarily powerful, 508 00:29:17,720 --> 00:29:21,240 Speaker 1: and therefore it has the attention of, and maybe even 509 00:29:21,280 --> 00:29:25,040 Speaker 1: the control of, in some ways, organizations that are much 510 00:29:25,080 --> 00:29:27,960 Speaker 1: bigger than what's happening in the lab or in the university. 511 00:29:28,280 --> 00:29:30,240 Speaker 1: And I think your point, tell me if I'm correct 512 00:29:30,240 --> 00:29:33,040 Speaker 1: about this, is that most scientists don't realize that, most 513 00:29:33,080 --> 00:29:36,080 Speaker 1: academicians don't know that. They think they're just doing a thing, 514 00:29:36,120 --> 00:29:39,280 Speaker 1: but in fact there's so much leverage going 515 00:29:39,320 --> 00:29:42,680 Speaker 1: on that they're playing in a bigger game without realizing it. 516 00:29:42,760 --> 00:29:45,719 Speaker 3: Yeah. Like, you know, let's imagine you were a computer programmer. 517 00:29:46,200 --> 00:29:50,160 Speaker 3: You could program tic tac toe or checkers or something, 518 00:29:50,560 --> 00:29:51,480 Speaker 3: and that would be amusing. 519 00:29:52,040 --> 00:29:52,680 Speaker 2: Or you could 520 00:29:52,440 --> 00:29:56,440 Speaker 3: program Wireshark. What's Wireshark? That's exactly the point. 521 00:29:56,440 --> 00:29:57,560 Speaker 3: You don't know what Wireshark is. 522 00:29:57,560 --> 00:29:57,840 Speaker 2: Correct.
523 00:29:58,560 --> 00:30:01,920 Speaker 3: Wireshark sits on your network and sniffs packets, which 524 00:30:02,000 --> 00:30:04,400 Speaker 3: means that any message that you're sending around your network 525 00:30:04,480 --> 00:30:08,320 Speaker 3: might be unencrypted, and we can just read whatever emails 526 00:30:08,320 --> 00:30:11,360 Speaker 3: and messages you're sending. But because you don't know that 527 00:30:11,400 --> 00:30:14,680 Speaker 3: there's an application that allows you to read it, you 528 00:30:14,760 --> 00:30:17,200 Speaker 3: just sort of imagine, oh, I don't know, I send 529 00:30:17,280 --> 00:30:20,520 Speaker 3: an email. You don't know about SMTP. The problem is 530 00:30:20,520 --> 00:30:22,920 Speaker 3: that mostly, your 531 00:30:22,960 --> 00:30:25,200 Speaker 3: computer is not a computer to you, it's just an 532 00:30:25,240 --> 00:30:28,480 Speaker 3: application serving device. But to somebody who lives on the 533 00:30:28,480 --> 00:30:30,880 Speaker 3: command line, just the way somebody could live on the 534 00:30:30,880 --> 00:30:35,400 Speaker 3: command line of DNA, they see an entirely different world. 535 00:30:35,720 --> 00:30:37,760 Speaker 3: So my claim is that if you're writing something 536 00:30:37,800 --> 00:30:41,400 Speaker 3: like Wireshark, you're very well aware of what you could 537 00:30:41,400 --> 00:30:45,120 Speaker 3: be doing.
And if you're studying coronavirus, like the EcoHealth 538 00:30:45,120 --> 00:30:49,600 Speaker 3: Alliance was studying coronavirus, or Ralph Baric in North Carolina 539 00:30:49,680 --> 00:30:54,480 Speaker 3: was studying viruses, those people are very well aware of 540 00:30:54,520 --> 00:30:59,200 Speaker 3: what it takes to humanize and weaponize a virus platform. 541 00:30:59,240 --> 00:31:04,400 Speaker 3: And tell us who Baric is. Baric is a very talented scientist at 542 00:31:04,440 --> 00:31:07,800 Speaker 3: the University of North Carolina. And we signed a bioweapons 543 00:31:07,840 --> 00:31:10,840 Speaker 3: convention, and I think we signed two treaties in 544 00:31:10,880 --> 00:31:15,720 Speaker 3: the nineteen seventies which prohibit us from exploring offensive weapons. 545 00:31:15,760 --> 00:31:17,320 Speaker 3: But a lot of what you hear is that we 546 00:31:17,400 --> 00:31:20,560 Speaker 3: have to explore defensive weapons. In order to explore defensive 547 00:31:21,480 --> 00:31:24,160 Speaker 3: weapons against the weapons, we have to create the weapons 548 00:31:24,760 --> 00:31:27,120 Speaker 3: to begin with, so that we know what we're defending against. 549 00:31:27,360 --> 00:31:34,959 Speaker 3: So we're engaged in high level bullshit in order to 550 00:31:35,080 --> 00:31:40,000 Speaker 3: explain what we're doing messing around in a place like Wuhan, China. 551 00:31:40,360 --> 00:31:42,959 Speaker 1: Oh, you're saying we developed the bullshit to cover our 552 00:31:43,040 --> 00:31:45,040 Speaker 1: tracks at Wuhan, is what you mean? 553 00:31:45,840 --> 00:31:50,240 Speaker 3: We don't want to be caught off guard in a 554 00:31:50,240 --> 00:31:52,600 Speaker 3: prisoner's dilemma, where we agree not to do something and 555 00:31:52,640 --> 00:31:56,760 Speaker 3: the other party decides to cheat on their agreement.
So 556 00:31:56,920 --> 00:31:59,200 Speaker 3: we are cheating on the agreement that we signed, the 557 00:31:59,240 --> 00:32:02,800 Speaker 3: agreement being: don't develop offensive weapons. Yeah, that's the spirit of 558 00:32:02,840 --> 00:32:07,160 Speaker 3: the agreement. There's a word that is very rarely used in 559 00:32:07,200 --> 00:32:13,080 Speaker 3: this capacity, called pettifogging, where you can talk about arbitraging 560 00:32:13,840 --> 00:32:17,960 Speaker 3: the letter against the spirit. We are engaged in arbitraging 561 00:32:18,000 --> 00:32:21,040 Speaker 3: the letter of the Bioweapons Convention against the spirit of 562 00:32:21,080 --> 00:32:25,840 Speaker 3: the Bioweapons Convention. But we are not doing this at 563 00:32:25,880 --> 00:32:29,240 Speaker 3: a credible level. We are telling tall tales that are 564 00:32:29,240 --> 00:32:32,800 Speaker 3: not befitting of adults, let alone scientists. And so what 565 00:32:32,840 --> 00:32:36,400 Speaker 3: we have is these sort of storytellers in chief. 566 00:32:36,840 --> 00:32:40,320 Speaker 3: So Francis Collins was a storyteller in chief, Anthony Fauci 567 00:32:40,400 --> 00:32:43,280 Speaker 3: was a storyteller in chief. That allows people like Peter 568 00:32:43,400 --> 00:32:46,840 Speaker 3: Daszak at the EcoHealth Alliance and Ralph Baric in North 569 00:32:46,840 --> 00:32:51,520 Speaker 3: Carolina to do the scientific and administrative work that is 570 00:32:51,640 --> 00:32:55,320 Speaker 3: engaged in our bioweapons program. See, if 571 00:32:55,360 --> 00:32:58,440 Speaker 3: you just had something called NIID, the National Institute of 572 00:32:58,440 --> 00:33:00,760 Speaker 3: Infectious Diseases, it would be too clear that it was 573 00:33:00,760 --> 00:33:02,320 Speaker 3: probably something military related. 574 00:33:02,320 --> 00:33:03,320 Speaker 2: So if you throw allergies.
575 00:33:03,320 --> 00:33:05,560 Speaker 3: In the middle of it, you get the National Institute 576 00:33:05,560 --> 00:33:07,160 Speaker 3: of Allergy and Infectious Diseases. 577 00:33:07,440 --> 00:33:09,240 Speaker 2: Oh, these people are going to cure me of hay fever. 578 00:33:09,720 --> 00:33:12,640 Speaker 1: And so your view is? My view is 579 00:33:12,560 --> 00:33:18,000 Speaker 3: that we just went through a national security, national interest 580 00:33:18,080 --> 00:33:23,520 Speaker 3: exercise that was catastrophic for science. That there was a 581 00:33:23,600 --> 00:33:29,520 Speaker 3: world science experiment, and the world looked to us as 582 00:33:29,600 --> 00:33:31,760 Speaker 3: scientists to say, what the hell is going on? 583 00:33:31,920 --> 00:33:33,040 Speaker 2: We let the world down. 584 00:33:33,200 --> 00:33:35,239 Speaker 1: So let's unpack that. In what ways did we let 585 00:33:35,320 --> 00:33:35,960 Speaker 1: the world down? 586 00:33:37,440 --> 00:33:39,680 Speaker 2: They wanted to know where this virus came from. 587 00:33:39,960 --> 00:33:44,479 Speaker 1: So there's a tension between national security interests and what 588 00:33:44,520 --> 00:33:47,680 Speaker 1: they're able to tell the public and what academic scientists 589 00:33:47,760 --> 00:33:48,200 Speaker 1: even know. 590 00:33:48,920 --> 00:33:50,680 Speaker 3: Well, do you remember when OJ was going to look 591 00:33:50,680 --> 00:33:54,120 Speaker 3: for the real killers? Yeah. So we're going to find 592 00:33:54,200 --> 00:33:56,160 Speaker 3: the origin of this virus if it kills us. 593 00:33:56,440 --> 00:34:00,760 Speaker 1: But do you think the people who knew about national 594 00:34:00,760 --> 00:34:03,200 Speaker 1: security bioweapons were the people doing that, or other 595 00:34:03,240 --> 00:34:06,760 Speaker 1: academics who really didn't know? We have two groups.
596 00:34:06,840 --> 00:34:08,239 Speaker 3: The first thing I want to know is, I want 597 00:34:08,280 --> 00:34:11,400 Speaker 3: to put Ralph Baric and Peter Daszak on the stand, 598 00:34:11,400 --> 00:34:13,080 Speaker 3: and I want to ask them a ton of questions, 599 00:34:13,840 --> 00:34:15,440 Speaker 3: and I want to do that to Francis Collins, and 600 00:34:15,440 --> 00:34:16,920 Speaker 3: I want to do that to Anthony Fauci. 601 00:34:17,360 --> 00:34:19,920 Speaker 2: And I want those questions to be asked not by 602 00:34:20,040 --> 00:34:25,480 Speaker 3: random senators and congressmen. I want our best heterodox, pro 603 00:34:25,560 --> 00:34:30,480 Speaker 3: science thinkers coming up with exactly the right questions. And 604 00:34:30,520 --> 00:34:32,560 Speaker 3: I don't want it time limited by, oh, you know, 605 00:34:32,880 --> 00:34:37,439 Speaker 3: the five minutes allotted are up. No. If this turns 606 00:34:37,480 --> 00:34:39,320 Speaker 3: out to be something that came out of a lab, 607 00:34:40,680 --> 00:34:42,520 Speaker 3: how many people did we just kill? I want to know. 608 00:34:43,120 --> 00:34:47,680 Speaker 3: Did we kill zero? Good news. Did we kill millions? 609 00:34:48,200 --> 00:34:50,600 Speaker 2: I want to know. This is so bad. 610 00:34:51,600 --> 00:34:57,880 Speaker 3: And I don't think people understand within the academy that, 611 00:34:58,000 --> 00:35:02,120 Speaker 3: you know, let's imagine that you're doing development in zebrafish. 612 00:35:02,320 --> 00:35:04,839 Speaker 3: You say, well, science is basically working. Of course people 613 00:35:04,880 --> 00:35:08,000 Speaker 3: are going to have issues, because it involved the Wuhan 614 00:35:08,080 --> 00:35:10,440 Speaker 3: Institute of Virology, and of course there's stuff going on 615 00:35:10,520 --> 00:35:14,560 Speaker 3: with bioweapons. You'd be naive to think that there's not. Okay.
Well, 616 00:35:14,680 --> 00:35:17,760 Speaker 3: you allowed people to say that public health was science. 617 00:35:17,840 --> 00:35:21,880 Speaker 3: Public health is not science. Public health involves noble lies, 618 00:35:22,000 --> 00:35:27,360 Speaker 3: it involves coercive activities, nudging, to use the Cass Sunstein concept. 619 00:35:29,640 --> 00:35:33,600 Speaker 3: It's important that science cancel the credit card that it's 620 00:35:33,640 --> 00:35:37,960 Speaker 3: given to public health. No, that's not us. You're on 621 00:35:38,000 --> 00:35:41,279 Speaker 3: your own. You screwed up. We didn't screw up. What's more, 622 00:35:41,320 --> 00:35:43,160 Speaker 3: we know how to get to the bottom of these things. 623 00:35:43,200 --> 00:35:47,439 Speaker 3: We can figure this out. And the public looked to us, 624 00:35:49,640 --> 00:35:52,640 Speaker 3: and we sat there with Anthony Fauci's and Francis Collins's 625 00:35:52,719 --> 00:35:58,840 Speaker 3: hands around our throats for funding, and we said, yes, boss. 626 00:36:00,440 --> 00:36:03,480 Speaker 3: Well, we're not supposed to have a boss. This is 627 00:36:03,480 --> 00:36:06,000 Speaker 3: what academic freedom is about. This is what public spirited 628 00:36:06,040 --> 00:36:08,840 Speaker 3: science is about.
And yes, you can do some stuff 629 00:36:08,880 --> 00:36:11,280 Speaker 3: in terms of national interest, but when you allow something 630 00:36:11,320 --> 00:36:13,560 Speaker 3: potentially to get out of a lab and infect an 631 00:36:13,719 --> 00:36:17,800 Speaker 3: entire planet and kill millions, and then you force people, 632 00:36:17,840 --> 00:36:20,440 Speaker 3: more or less through coercion, to inject themselves with something 633 00:36:20,440 --> 00:36:23,520 Speaker 3: that you're not explaining, well, you need to answer an 634 00:36:23,600 --> 00:36:27,160 Speaker 3: infinite number of questions from the world's smartest people, and 635 00:36:27,200 --> 00:36:29,440 Speaker 3: they need to know that they'll have a job on Monday 636 00:36:29,920 --> 00:36:31,480 Speaker 3: if they do their job on a Friday. 637 00:36:31,760 --> 00:36:32,879 Speaker 2: You and I know 638 00:36:32,960 --> 00:36:36,000 Speaker 3: people inside of the institutions who would have been very 639 00:36:36,040 --> 00:36:38,719 Speaker 3: capable of shouldering that burden. There was a lot of 640 00:36:38,760 --> 00:36:42,759 Speaker 3: fear inside of the universities that this was a bioweapon, 641 00:36:42,880 --> 00:36:45,080 Speaker 3: and then we pretended that people who said that on 642 00:36:45,160 --> 00:36:48,880 Speaker 3: the internet were stupid, they were crazy, they were conspiracy theorists. 643 00:36:48,880 --> 00:36:50,920 Speaker 3: How could you possibly imagine that this came out of 644 00:36:50,960 --> 00:36:53,400 Speaker 3: the Wuhan Institute of Virology? My god, are you 645 00:36:53,480 --> 00:36:58,480 Speaker 3: a racist? And The Lancet, you know, fell in line, and 646 00:36:58,560 --> 00:37:01,600 Speaker 3: we had, like, you know, seventy-seven Nobel laureates saying, 647 00:37:01,680 --> 00:37:05,040 Speaker 3: please don't shut off the grant of poor Peter Daszak.
648 00:37:05,840 --> 00:37:10,879 Speaker 2: Come on, this is just not adult level fiction. Because 649 00:37:10,880 --> 00:37:11,440 Speaker 2: if we could 650 00:37:11,320 --> 00:37:14,000 Speaker 1: do it over again, you're saying we should look at 651 00:37:14,040 --> 00:37:15,719 Speaker 1: all the hypotheses, keep everything on the table. 652 00:37:15,680 --> 00:37:19,160 Speaker 3: I'm saying that scientists have a right to be in 653 00:37:19,239 --> 00:37:24,399 Speaker 3: a more than equal relationship to the national interest complex. Yeah, 654 00:37:24,600 --> 00:37:27,920 Speaker 3: we are not your employees. We're not here to do 655 00:37:27,960 --> 00:37:31,279 Speaker 3: your dirty work. We're not here to cover up your mistakes. 656 00:37:32,719 --> 00:37:36,120 Speaker 3: We are public spirited individuals focused on truth. And don't 657 00:37:36,160 --> 00:37:38,320 Speaker 3: ever ask us to lie like this ever again. 658 00:37:39,600 --> 00:37:43,000 Speaker 1: Ever. Who was asked to lie, though? You're saying, in 659 00:37:43,120 --> 00:37:46,040 Speaker 1: terms of shutting down conversation about did this come out... 660 00:37:45,880 --> 00:37:50,040 Speaker 2: Jay Bhattacharya was asked to lie. Let's just start there. 661 00:37:50,120 --> 00:37:51,279 Speaker 2: Great, unpack that. 662 00:37:51,760 --> 00:37:57,400 Speaker 3: Okay, we have three fringe epidemiologists from fringe schools: Stanford, 663 00:37:57,560 --> 00:38:06,000 Speaker 3: Harvard and Oxford. These crazy fringe epidemiologists who required, I 664 00:38:06,000 --> 00:38:09,719 Speaker 3: think in Francis Collins's words in an email, a swift and devastating 665 00:38:09,800 --> 00:38:15,080 Speaker 3: takedown of their ideas. A swift and devastating takedown of their ideas. David, 666 00:38:16,000 --> 00:38:18,760 Speaker 3: this is madness. I talked to Jay. Jay's a friend.
667 00:38:20,520 --> 00:38:23,040 Speaker 3: Jay asked me, how did you learn how the universities 668 00:38:23,080 --> 00:38:26,640 Speaker 3: actually work so early in your career? I said, I 669 00:38:26,680 --> 00:38:30,400 Speaker 3: stumbled on it by making discoveries. He said, what do 670 00:38:30,480 --> 00:38:33,640 Speaker 3: you mean? He said, well, I was at Stanford, I think 671 00:38:33,680 --> 00:38:35,640 Speaker 3: he said, for thirty five years of my adult life, 672 00:38:35,680 --> 00:38:38,959 Speaker 3: something like that. And he said, I never had any 673 00:38:39,000 --> 00:38:41,960 Speaker 3: idea how this worked. And I said, what do you mean? 674 00:38:42,040 --> 00:38:44,680 Speaker 3: He said, it wasn't until I said, you know, we 675 00:38:44,719 --> 00:38:48,480 Speaker 3: don't do this for any pandemic. This is not the 676 00:38:48,480 --> 00:38:52,320 Speaker 3: standard operating procedure for any pandemic. What's not the standard 677 00:38:52,320 --> 00:38:54,560 Speaker 3: operating procedure? Whatever we did in the face of COVID: 678 00:38:54,800 --> 00:38:57,840 Speaker 3: two weeks to flatten the curve, masks yes, masks no. 679 00:39:00,239 --> 00:39:01,120 Speaker 1: This... you know what? 680 00:39:02,280 --> 00:39:04,560 Speaker 3: I had the Hong Kong flu at the end of 681 00:39:04,560 --> 00:39:08,480 Speaker 3: the sixties. It was a full on pandemic. That's what 682 00:39:08,520 --> 00:39:12,799 Speaker 3: happened during Woodstock. Woodstock took place during a pandemic. Right, 683 00:39:12,840 --> 00:39:15,680 Speaker 3: we don't remember the Hong Kong flu. Right, we're not 684 00:39:15,719 --> 00:39:17,840 Speaker 3: even allowed to call this the Wuhan flu or the 685 00:39:17,840 --> 00:39:21,760 Speaker 3: Wuhan virus. So their point was, what are we doing 686 00:39:23,200 --> 00:39:26,440 Speaker 3: as epidemiologists and virologists?
We know that we have protocols 687 00:39:26,440 --> 00:39:31,120 Speaker 3: and we're not following them. What's happening? This is where 688 00:39:31,120 --> 00:39:33,760 Speaker 3: Francis Collins says, we need a swift and devastating takedown 689 00:39:33,800 --> 00:39:37,600 Speaker 3: of the ideas of these fringe epidemiologists. Suddenly, Jay Bhattacharya 690 00:39:38,920 --> 00:39:41,960 Speaker 3: goes from being a darling of Stanford University, I think, 691 00:39:41,960 --> 00:39:44,600 Speaker 3: with an MD and a PhD in economics, some guy 692 00:39:44,600 --> 00:39:50,280 Speaker 3: who's like totally unassailable, to some fringe lunatic with an email. 693 00:39:51,239 --> 00:39:54,640 Speaker 3: You've never had this, David. You've never had this treatment, 694 00:39:55,280 --> 00:39:59,560 Speaker 3: and if you've ever had this treatment, you'll never forget it. 695 00:39:59,560 --> 00:40:03,160 Speaker 3: It's like you say something and it's treated like farting 696 00:40:03,200 --> 00:40:03,680 Speaker 3: in church. 697 00:40:03,960 --> 00:40:07,759 Speaker 1: The point is that there are national security interests. Jay went 698 00:40:07,880 --> 00:40:10,280 Speaker 1: against that. He did it without 699 00:40:10,360 --> 00:40:13,239 Speaker 1: knowing it, and found himself shut down. 700 00:40:13,440 --> 00:40:13,920 Speaker 2: Exactly. 701 00:40:13,960 --> 00:40:16,279 Speaker 3: The point was he was just trying to do what 702 00:40:16,320 --> 00:40:19,200 Speaker 3: he knew how to do. He's saying, I'm an expert, 703 00:40:20,000 --> 00:40:24,759 Speaker 3: let me contribute my expertise. We don't do this. This 704 00:40:24,880 --> 00:40:29,400 Speaker 3: is not standard operating protocol. So nobody pulled Jay aside 705 00:40:29,440 --> 00:40:34,359 Speaker 3: and said, hey, we may have created this.
We have 706 00:40:34,440 --> 00:40:36,560 Speaker 3: a little bit of an issue of sensitivity with our 707 00:40:36,640 --> 00:40:39,319 Speaker 3: Chinese partners. I can't tell you everything. It's a need 708 00:40:39,360 --> 00:40:45,360 Speaker 3: to know basis. You see, in general, you have people 709 00:40:45,520 --> 00:40:50,000 Speaker 3: who know what a special access program is or an 710 00:40:50,080 --> 00:40:52,839 Speaker 3: unacknowledged special access program is, and you have people who 711 00:40:53,000 --> 00:40:54,560 Speaker 3: complain about conspiracy theories. 712 00:40:55,800 --> 00:40:57,279 Speaker 2: So tell us what that program is. 713 00:40:57,320 --> 00:41:00,440 Speaker 3: Well, I'm just saying it's a category of secrets stuff. 714 00:41:01,120 --> 00:41:05,399 Speaker 3: An unacknowledged special access program is some black budget thing 715 00:41:05,560 --> 00:41:08,080 Speaker 3: that we don't even talk about, and we don't even acknowledge. 716 00:41:08,160 --> 00:41:12,000 Speaker 3: It's a covert operation. It's deniable if it's ever discovered. 717 00:41:12,520 --> 00:41:18,200 Speaker 3: The right question about Wuhan and COVID is did this 718 00:41:18,920 --> 00:41:23,960 Speaker 3: involve a covert operation, did this involve a special access program? 719 00:41:24,120 --> 00:41:29,600 Speaker 3: And did it in particular involve an unacknowledged special access program? 720 00:41:29,760 --> 00:41:33,080 Speaker 3: And when you ask that question, you're clearly indicating that 721 00:41:33,160 --> 00:41:37,960 Speaker 3: you have knowledge of the architecture of how we keep secrets. 722 00:41:38,000 --> 00:41:40,560 Speaker 2: As a nation. We are entitled to keep secrets. We 723 00:41:40,640 --> 00:41:41,600 Speaker 2: have to keep secrets. 
724 00:41:42,000 --> 00:41:46,640 Speaker 3: But somehow science and something called SSP, or State Secrets Privilege, 725 00:41:46,840 --> 00:41:51,360 Speaker 3: have collided, and now the world thinks that we're not 726 00:41:51,480 --> 00:41:54,000 Speaker 3: very good at our job. And my feeling is 727 00:41:54,040 --> 00:41:57,240 Speaker 3: that we should say, hold my beer, and we should 728 00:41:57,480 --> 00:42:01,000 Speaker 3: let our friends at geospatial intelligence, the CIA, the 729 00:42:01,120 --> 00:42:04,520 Speaker 3: NSA know that we are not in the business of 730 00:42:04,680 --> 00:42:21,279 Speaker 3: lying about science at this level. 731 00:42:21,239 --> 00:42:23,719 Speaker 1: So let me just zoom out to the big picture. The difficulty 732 00:42:23,800 --> 00:42:28,120 Speaker 1: is that you have scientists, academic scientists like me, for example, 733 00:42:28,480 --> 00:42:33,759 Speaker 1: who, I would say... I've been very naive to this. 734 00:42:33,840 --> 00:42:37,480 Speaker 1: It's at the interface where there's all these problems, because 735 00:42:38,120 --> 00:42:41,840 Speaker 1: it's not that I or my colleagues, to my knowledge, 736 00:42:42,160 --> 00:42:46,560 Speaker 1: ever said, hey, we'll do your bidding. But you're saying 737 00:42:47,520 --> 00:42:49,560 Speaker 1: just being naive is enough of a problem. 738 00:42:49,760 --> 00:42:50,360 Speaker 2: Great point. 739 00:42:50,480 --> 00:42:54,359 Speaker 3: Okay. So for example, we've known each other a long 740 00:42:54,400 --> 00:42:57,040 Speaker 3: time, and one of the things that I loved was 741 00:42:57,120 --> 00:43:01,360 Speaker 3: mister potato head, thank you. Now, mister potato head is a 742 00:43:01,360 --> 00:43:03,600 Speaker 3: great idea: that the brain is an all purpose computer 743 00:43:03,680 --> 00:43:04,920 Speaker 3: with default peripherals. 744 00:43:05,360 --> 00:43:06,760 Speaker 2: Correct me if I'm wrong. That's perfect.
745 00:43:07,560 --> 00:43:11,000 Speaker 3: So now I've got my olfactory default peripherals in my nose, 746 00:43:11,040 --> 00:43:14,120 Speaker 3: in my mouth, I've got my visual default peripherals in 747 00:43:14,200 --> 00:43:17,840 Speaker 3: my eyes and ears that pick up frequencies either of 748 00:43:17,920 --> 00:43:20,920 Speaker 3: light or of sound waves. And I've got my skin. 749 00:43:21,360 --> 00:43:23,520 Speaker 3: But the question is, what if I want to start 750 00:43:23,600 --> 00:43:26,400 Speaker 3: umwelt hacking? So I take the things that I can't perceive, 751 00:43:26,480 --> 00:43:30,600 Speaker 3: like ultraviolet or infrared light or polarization, let's say, and 752 00:43:30,640 --> 00:43:32,879 Speaker 3: I start coming up with new peripherals, and I jack 753 00:43:32,960 --> 00:43:35,640 Speaker 3: into the general all purpose computer that is my brain. 754 00:43:36,680 --> 00:43:38,000 Speaker 2: Okay, that is. 755 00:43:37,960 --> 00:43:42,120 Speaker 3: Such a cool idea, which I've just loved. Right, one day 756 00:43:42,480 --> 00:43:46,480 Speaker 3: somebody shows up and says, your lab is locked. Why? 757 00:43:47,120 --> 00:43:49,960 Speaker 3: We're concerned that what you're 758 00:43:50,000 --> 00:43:53,279 Speaker 3: doing is developing something that equips a soldier to be able to 759 00:43:53,320 --> 00:43:57,000 Speaker 3: perceive aspects of the battlefield that are currently not available 760 00:43:57,040 --> 00:43:59,920 Speaker 3: to our adversaries. We believe that what you're doing is 761 00:44:00,160 --> 00:44:04,680 Speaker 3: creating a technology that allows for total situational awareness of 762 00:44:04,719 --> 00:44:07,919 Speaker 3: a soldier on a battlefield to be able to see 763 00:44:07,960 --> 00:44:09,879 Speaker 3: the battlefield in a way that no one else can. 764 00:44:10,480 --> 00:44:13,600 Speaker 3: And therefore we are going to restrict your technology.
You 765 00:44:13,640 --> 00:44:16,600 Speaker 3: didn't think mister potato head, it's a goofy name, right? 766 00:44:17,200 --> 00:44:18,880 Speaker 3: It's just, as you talk about this stuff and you 767 00:44:18,920 --> 00:44:24,120 Speaker 3: do science, this is how you get into trouble. Mister 768 00:44:24,160 --> 00:44:27,160 Speaker 3: potato head is an amazing military concept. 769 00:44:28,760 --> 00:44:29,120 Speaker 2: I see. 770 00:44:29,160 --> 00:44:32,920 Speaker 1: So this is how scientists accidentally bump up against. 771 00:44:32,640 --> 00:44:34,360 Speaker 2: This sort of thing at some point in their career. 772 00:44:34,800 --> 00:44:35,280 Speaker 1: Possibly. 773 00:44:35,400 --> 00:44:39,640 Speaker 3: Let's imagine you're not part of the Manhattan Project, and 774 00:44:39,800 --> 00:44:43,040 Speaker 3: during the Manhattan Project, we tried to create disinformation that 775 00:44:43,080 --> 00:44:46,080 Speaker 3: didn't call attention to the fact that uranium and plutonium 776 00:44:46,120 --> 00:44:49,360 Speaker 3: were particularly promising for fissile material. 777 00:44:49,760 --> 00:44:53,000 Speaker 1: The Manhattan Project being where the world's great physicists all 778 00:44:53,040 --> 00:44:55,520 Speaker 1: gathered in the middle of New Mexico at Los Alamos, well, 779 00:44:55,520 --> 00:44:57,400 Speaker 1: they were actually more distributed. They were at the University 780 00:44:57,400 --> 00:45:00,239 Speaker 1: of Chicago and Oak Ridge. 781 00:45:00,320 --> 00:45:01,840 Speaker 2: Yes, the majority of them. 782 00:45:02,600 --> 00:45:04,839 Speaker 3: Were in this group of white badges, I believe, at 783 00:45:04,920 --> 00:45:07,640 Speaker 3: Los Alamos who had access to the super secret information.
784 00:45:08,719 --> 00:45:11,280 Speaker 3: And one of the things we did is we engaged 785 00:45:11,360 --> 00:45:14,960 Speaker 3: in haystacking, which is that we talked about many more 786 00:45:15,000 --> 00:45:18,760 Speaker 3: elements than we thought were relevant in order to allow our... 787 00:45:19,400 --> 00:45:21,400 Speaker 2: Haystacking means you throw out more information. 788 00:45:21,520 --> 00:45:23,239 Speaker 3: You have a needle. I want to come up with 789 00:45:23,280 --> 00:45:26,799 Speaker 3: a haystack in order to hide it. So imagine that 790 00:45:26,840 --> 00:45:29,760 Speaker 3: you're like some guy who's not contacted by the Manhattan 791 00:45:29,760 --> 00:45:32,520 Speaker 3: Project and you say, actually, you know, it's really just 792 00:45:33,480 --> 00:45:35,960 Speaker 3: uranium and plutonium that we should be focused on. 793 00:45:36,239 --> 00:45:38,960 Speaker 2: You're doing science as far as you know. Ah. 794 00:45:39,000 --> 00:45:41,240 Speaker 1: But if I were that guy, I might receive a phone. 795 00:45:41,040 --> 00:45:43,160 Speaker 3: Call, or you might find that none of your work 796 00:45:43,640 --> 00:45:49,719 Speaker 3: is published. I see, suddenly the referees keep sending things back. 797 00:45:51,000 --> 00:45:56,080 Speaker 3: I see: it requires further data, promising but incomplete. 798 00:45:56,920 --> 00:45:57,399 Speaker 1: Got it. 799 00:45:57,960 --> 00:46:00,480 Speaker 3: Yeah. So my point is that you haven't bumped up 800 00:46:00,520 --> 00:46:04,800 Speaker 3: against this yet. It's just you haven't thought enough about 801 00:46:05,080 --> 00:46:09,840 Speaker 3: how powerful you are and how powerful national interest is 802 00:46:09,960 --> 00:46:12,600 Speaker 3: and the way in which science and national interests interact. 803 00:46:13,000 --> 00:46:15,400 Speaker 1: So let's get back to cowboy science then. So what 804 00:46:15,440 --> 00:46:17,000 Speaker 1: does that look like?
What does that mean to you 805 00:46:17,040 --> 00:46:20,800 Speaker 1: to be able to do something outside of the standard channels? 806 00:46:21,239 --> 00:46:23,600 Speaker 3: Well, one thing it means is that if a cowboy 807 00:46:23,640 --> 00:46:26,520 Speaker 3: bumps up into the national interest complex, the national interest 808 00:46:26,560 --> 00:46:29,080 Speaker 3: complex comes and tells you, hey, you're riding on the 809 00:46:29,160 --> 00:46:33,839 Speaker 3: range here in New Mexico or Nevada, and we've got 810 00:46:33,880 --> 00:46:37,280 Speaker 3: some aerospace stuff going on, and maybe some nuclear stuff. 811 00:46:38,360 --> 00:46:42,680 Speaker 3: We need your cooperation. You should not train any of 812 00:46:42,760 --> 00:46:46,080 Speaker 3: us if you can't talk to us later, us being scientists; 813 00:46:46,080 --> 00:46:49,960 Speaker 3: you should not train somebody at my level if you 814 00:46:50,000 --> 00:46:54,160 Speaker 3: can't have a conversation about national interest. If you think 815 00:46:54,200 --> 00:46:56,560 Speaker 3: that you can't trust me with a secret, then don't 816 00:46:56,560 --> 00:46:56,960 Speaker 3: train me. 817 00:46:59,400 --> 00:47:02,560 Speaker 1: You're saying the problem is scientists get trained and then 818 00:47:02,920 --> 00:47:04,760 Speaker 1: they might find something. 819 00:47:05,200 --> 00:47:07,759 Speaker 3: A mentally retarded eight year old child would not be 820 00:47:07,840 --> 00:47:10,440 Speaker 3: able to believe some of the lies told by Tony Fauci. 821 00:47:10,719 --> 00:47:13,080 Speaker 3: Like what? Give me an example. Two weeks to flatten 822 00:47:13,120 --> 00:47:13,520 Speaker 3: the curve? 823 00:47:13,640 --> 00:47:18,040 Speaker 2: What was that? Or we don't need masks?
824 00:47:18,040 --> 00:47:21,560 Speaker 3: We do, we don't, we do, we don't, clearly based 825 00:47:21,600 --> 00:47:24,200 Speaker 3: on whether or not there was a failure to replenish 826 00:47:24,239 --> 00:47:29,279 Speaker 3: PPE after it was drawn down, I believe during the 827 00:47:29,280 --> 00:47:34,600 Speaker 3: Bush administration, like we didn't follow surge protocols. Or the 828 00:47:34,680 --> 00:47:37,160 Speaker 3: idea that it was racism to ask whether or not 829 00:47:37,200 --> 00:47:39,800 Speaker 3: something emerged from a lab, and emerged from a 830 00:47:39,880 --> 00:47:43,120 Speaker 3: lab in Wuhan. All of this stuff is nonsense and 831 00:47:43,160 --> 00:47:48,120 Speaker 3: it's absolutely insulting. Or vaccines are safe, full stop. No 832 00:47:48,200 --> 00:47:52,560 Speaker 3: they're not. Water isn't safe, full stop. 833 00:47:53,040 --> 00:47:55,520 Speaker 1: So let me understand this. What do you see 834 00:47:55,520 --> 00:47:56,680 Speaker 1: as a solution to this? 835 00:47:57,280 --> 00:48:00,880 Speaker 3: First thing is, think twice, three times, before you 836 00:48:01,000 --> 00:48:02,840 Speaker 3: train somebody at public expense. 837 00:48:03,080 --> 00:48:07,759 Speaker 1: But you don't want to not train brilliant young physicists. 838 00:48:07,960 --> 00:48:09,040 Speaker 1: So what's a better solution? 839 00:48:09,160 --> 00:48:12,080 Speaker 3: No, no, I do want to not train. If we 840 00:48:12,080 --> 00:48:15,680 Speaker 3: have a Klaus Fuchs, I don't want them trained. I'm sorry, 841 00:48:15,719 --> 00:48:18,480 Speaker 3: who's Klaus Fuchs? The spy at Los Alamos. If we 842 00:48:18,560 --> 00:48:21,440 Speaker 3: have somebody who's not patriotic enough to understand that in 843 00:48:21,520 --> 00:48:25,080 Speaker 3: the wake of Los Alamos, the Manhattan Project and the 844 00:48:25,120 --> 00:48:28,880 Speaker 3: Teller-Ulam design, that physics is not kidding around.
845 00:48:29,800 --> 00:48:30,880 Speaker 2: Don't train that person. 846 00:48:31,360 --> 00:48:34,320 Speaker 1: How do you determine that? I don't know, especially with a 847 00:48:34,400 --> 00:48:40,479 Speaker 1: sixteen year old kid. Give them an interview? Okay, you know, okay, yeah: 848 00:48:40,560 --> 00:48:42,520 Speaker 1: I have a different view of science than anyone else 849 00:48:42,560 --> 00:48:44,799 Speaker 1: on planet Earth. So you happen to be foolish enough 850 00:48:44,800 --> 00:48:47,160 Speaker 1: to invite me to sit down. So here's... you're getting something: 851 00:48:47,200 --> 00:48:49,400 Speaker 1: I think that we are intellectual ninjas. 852 00:48:49,560 --> 00:48:53,480 Speaker 3: We are dangerous. What we do is important. It's not cute, 853 00:48:53,520 --> 00:48:57,760 Speaker 3: it's not fun, it's not interesting. It's life and death. 854 00:48:59,320 --> 00:49:02,800 Speaker 3: Particularly within six months, between nineteen fifty two and nineteen 855 00:49:02,840 --> 00:49:07,040 Speaker 3: fifty three, from November to April, everything changed, and it 856 00:49:07,160 --> 00:49:11,000 Speaker 3: changed in physics, and it changed in biology: because of 857 00:49:11,040 --> 00:49:15,400 Speaker 3: the Teller-Ulam device in nineteen fifty two in 858 00:49:15,440 --> 00:49:17,920 Speaker 3: November and the explosion of Ivy Mike, which was a 859 00:49:17,960 --> 00:49:22,680 Speaker 3: successful thermonuclear test in the Pacific with a three stage weapon, 860 00:49:23,680 --> 00:49:27,640 Speaker 3: and because of the discovery of the repeating structure of 861 00:49:27,719 --> 00:49:33,120 Speaker 3: nucleic acid, perfectly suited to being a data store, translated 862 00:49:33,120 --> 00:49:36,440 Speaker 3: by ribosomes into proteins, which are the machines that determine 863 00:49:36,440 --> 00:49:39,759 Speaker 3: everything in the world that matters.
Right. This is the 864 00:49:39,800 --> 00:49:44,439 Speaker 3: structure of DNA, the structure of DNA leading to the 865 00:49:44,440 --> 00:49:49,920 Speaker 3: central dogma of translation of DNA into RNA and RNA 866 00:49:50,040 --> 00:49:56,960 Speaker 3: into protein, and the genetic code. We have power that 867 00:49:57,160 --> 00:50:02,560 Speaker 3: is inconceivable, and if we are going to have national 868 00:50:02,560 --> 00:50:07,000 Speaker 3: interest issues, we need to have those national interest issues 869 00:50:07,040 --> 00:50:10,600 Speaker 3: out early, not late. I don't know why we're inviting 870 00:50:10,600 --> 00:50:14,600 Speaker 3: the Chinese to staff our labs. Is that because we 871 00:50:14,640 --> 00:50:16,799 Speaker 3: have an agreement with China that we are somehow going 872 00:50:16,840 --> 00:50:21,520 Speaker 3: to avoid war? But our graduate students are not graduate students, 873 00:50:21,560 --> 00:50:26,120 Speaker 3: they are workers. It is a cryptic labor program for 874 00:50:26,160 --> 00:50:30,319 Speaker 3: the universities, and the best and the brightest is not that, 875 00:50:30,400 --> 00:50:33,120 Speaker 3: because we compete in a labor market. It's the best value, 876 00:50:33,200 --> 00:50:38,280 Speaker 3: not the best minds. Furthermore, the American product, the cowboy 877 00:50:38,400 --> 00:50:41,520 Speaker 3: scientist, think Bruce Willis in a lab coat. That 878 00:50:41,640 --> 00:50:44,840 Speaker 3: product has high variance, but a much, much higher 879 00:50:44,920 --> 00:50:45,120 Speaker 2: Mean. 880 00:50:46,440 --> 00:50:49,440 Speaker 3: We are the best in the world at science and engineering, 881 00:50:49,560 --> 00:50:53,839 Speaker 3: full stop. We being America, we being America and our 882 00:50:53,960 --> 00:50:59,920 Speaker 3: friends the French, some of the world's greatest mathematicians. You know, 883 00:51:00,880 --> 00:51:05,560 Speaker 3: some of our friends are the Russians. The West.
Something 884 00:51:05,640 --> 00:51:09,400 Speaker 3: happened in the West and in Japan. It just didn't 885 00:51:09,400 --> 00:51:10,560 Speaker 3: happen in the rest of the world. 886 00:51:10,800 --> 00:51:11,319 Speaker 1: What is that? 887 00:51:11,520 --> 00:51:16,160 Speaker 3: I don't know, the Enlightenment, the scientific method, some compounding 888 00:51:16,200 --> 00:51:20,000 Speaker 3: effect from colonization. Maybe it had to do with the 889 00:51:20,040 --> 00:51:23,600 Speaker 3: exploitation of the Third World. I don't know. But something 890 00:51:23,719 --> 00:51:33,320 Speaker 3: happened where the West got insanely powerful, and the US, 891 00:51:34,200 --> 00:51:36,440 Speaker 3: in part because of World War two and the mismanagement 892 00:51:36,480 --> 00:51:43,440 Speaker 3: of Europe by Adolf Hitler, Mussolini and others, became the 893 00:51:43,480 --> 00:51:47,120 Speaker 3: most dominant scientific power the world has ever seen. And we're 894 00:51:47,200 --> 00:51:50,000 Speaker 3: great at what we do. We've got all these scientific 895 00:51:50,000 --> 00:51:52,680 Speaker 3: employers who just lie, lie, lie, as long as the 896 00:51:52,800 --> 00:51:55,919 Speaker 3: day is long, about how Americans are lazy and they're 897 00:51:55,960 --> 00:51:58,719 Speaker 3: stupid and they can't do work. And we have to 898 00:51:58,719 --> 00:52:00,840 Speaker 3: look at the fact that we're being beaten by India 899 00:52:00,880 --> 00:52:03,200 Speaker 3: and China? We're not being beaten by India and China. We 900 00:52:03,960 --> 00:52:07,160 Speaker 3: have to worry about England and France. And by the 901 00:52:07,200 --> 00:52:10,680 Speaker 3: way, India and China are going to get there, particularly India's 902 00:52:11,200 --> 00:52:15,040 Speaker 3: choice of the IITs and the Tata Institute of Fundamental Research. 903 00:52:16,400 --> 00:52:19,359 Speaker 3: China is buying up our talent left and right.
We're 904 00:52:19,440 --> 00:52:23,319 Speaker 3: laying down on the job. My old officemate, Michael Kratsios, 905 00:52:23,480 --> 00:52:26,279 Speaker 3: I think, is the head of the President's Council of Advisors on 906 00:52:26,360 --> 00:52:29,000 Speaker 3: Science and Technology as well as the Office of Science 907 00:52:29,040 --> 00:52:32,560 Speaker 3: and Technology Policy, so PCAST and OSTP. I believe 908 00:52:32,560 --> 00:52:36,239 Speaker 3: that was a job previously held by Isidor Rabi, the 909 00:52:36,239 --> 00:52:40,239 Speaker 3: Nobel Laureate in physics. I love Michael Kratsios, he's a friend, 910 00:52:40,360 --> 00:52:45,320 Speaker 3: he's a great guy. But we are not taking science 911 00:52:45,920 --> 00:52:50,160 Speaker 3: and the destruction of science in the US as the 912 00:52:50,200 --> 00:52:53,480 Speaker 3: seventeen-alarm fire that it is. We need to get 913 00:52:53,560 --> 00:52:57,560 Speaker 3: money and our own people, and we need to shove 914 00:52:57,600 --> 00:53:00,360 Speaker 3: them down the throats of our employers with government help. 915 00:53:01,200 --> 00:53:05,239 Speaker 3: And we need absolute scientific dominance, and it needs to 916 00:53:05,280 --> 00:53:07,759 Speaker 3: be much more public spirited, much less under the thumb 917 00:53:07,840 --> 00:53:13,400 Speaker 3: of the national security community. And it needs to be 918 00:53:13,480 --> 00:53:16,960 Speaker 3: friendly to the military. We cannot pretend that we are 919 00:53:16,960 --> 00:53:18,480 Speaker 3: not military adjacent. 920 00:53:19,760 --> 00:53:21,719 Speaker 1: How could it be friendly to the military and not 921 00:53:22,239 --> 00:53:24,120 Speaker 1: aligned with the national security interests? 922 00:53:25,040 --> 00:53:28,239 Speaker 3: We have to align with the national security interest. The 923 00:53:28,360 --> 00:53:31,759 Speaker 3: national security community is not as good as we are.
924 00:53:31,960 --> 00:53:34,439 Speaker 1: There are two things that I'm trying to understand, which 925 00:53:34,480 --> 00:53:39,920 Speaker 1: is... so one issue is that science, I've always loved 926 00:53:40,040 --> 00:53:41,680 Speaker 1: viewing it as an international fellowship. 927 00:53:41,719 --> 00:53:42,839 Speaker 2: I can travel anywhere in the world. 928 00:53:42,920 --> 00:53:45,520 Speaker 1: If I meet someone who studies science like I do, 929 00:53:45,680 --> 00:53:49,879 Speaker 1: we can talk, sometimes just with equations, whatever it is. 930 00:53:50,000 --> 00:53:53,200 Speaker 1: We get each other so deeply and fundamentally. But it 931 00:53:53,320 --> 00:53:56,680 Speaker 1: sounds like, on the other hand, you're saying it 932 00:53:56,760 --> 00:53:57,680 Speaker 1: shouldn't actually be. 933 00:53:57,760 --> 00:53:58,640 Speaker 2: I had the same feeling. 934 00:53:58,840 --> 00:54:03,239 Speaker 3: Yeah. And then suddenly all these physicists in Iran met 935 00:54:03,280 --> 00:54:06,600 Speaker 3: an end during the recent war. When I go for 936 00:54:06,640 --> 00:54:10,600 Speaker 3: a talk at the Tata Institute of Fundamental Research, which, 937 00:54:10,600 --> 00:54:12,759 Speaker 3: by the way, is the nicest part of Bombay, really 938 00:54:12,960 --> 00:54:16,040 Speaker 3: just beautiful, I have to go through a military checkpoint 939 00:54:16,280 --> 00:54:19,279 Speaker 3: to go to my string theory talk because it's in 940 00:54:19,400 --> 00:54:23,760 Speaker 3: Navy Nagar. What is that? It's in the naval base. Okay, 941 00:54:25,160 --> 00:54:27,680 Speaker 3: science is not what you're trying to make it out 942 00:54:27,680 --> 00:54:31,360 Speaker 3: to be. We've got this naive singsong view of science, 943 00:54:31,400 --> 00:54:34,760 Speaker 3: which we love. Yeah, right, because when you're doing science, 944 00:54:34,840 --> 00:54:36,439 Speaker 3: you don't care where somebody was born.
945 00:54:36,600 --> 00:54:39,279 Speaker 1: You know what I think? It's so much of the 946 00:54:39,440 --> 00:54:43,040 Speaker 1: territory of science. If I were going to make up 947 00:54:43,040 --> 00:54:45,520 Speaker 1: a number, let me just make up: ninety percent of 948 00:54:45,560 --> 00:54:50,440 Speaker 1: the territory really is the singsong international fellowship stuff, kumbaya, 949 00:54:50,600 --> 00:54:53,080 Speaker 1: my friend? Yeah, exactly. You could go anywhere and talk 950 00:54:53,080 --> 00:54:55,759 Speaker 1: to people about zebrafish and the neurons and what's going on. 951 00:54:56,160 --> 00:54:58,719 Speaker 1: But I see where you're coming from. There is this 952 00:54:58,840 --> 00:55:01,480 Speaker 1: ten percent, where I make up the number, where it 953 00:55:01,600 --> 00:55:04,919 Speaker 1: actually really matters. And suddenly it's serious stuff, as you said, 954 00:55:04,960 --> 00:55:09,040 Speaker 1: life and death stuff. And that's where we can't put 955 00:55:09,040 --> 00:55:12,239 Speaker 1: that all under the same umbrella, because... 956 00:55:12,000 --> 00:55:13,760 Speaker 2: It's a giant problem, David. 957 00:55:14,640 --> 00:55:21,960 Speaker 3: Let's imagine that you care about four-manifold topology, no 958 00:55:22,000 --> 00:55:26,680 Speaker 3: known problem. Let's imagine you care about elliptic curves. Suddenly 959 00:55:26,719 --> 00:55:29,960 Speaker 3: you have to do it at Fort Meade because it's 960 00:55:30,000 --> 00:55:34,440 Speaker 3: involved in cryptography. The naive singsong thing. None of 961 00:55:34,520 --> 00:55:35,920 Speaker 3: us should hold that perspective. 962 00:55:36,160 --> 00:55:39,160 Speaker 1: Could we hold that perspective by saying, look, I mean, 963 00:55:39,200 --> 00:55:41,880 Speaker 1: I think I would say most of the stuff I do, 964 00:55:42,000 --> 00:55:45,840 Speaker 1: maybe not the potato head, and 'you didn't think', you're correct.
965 00:55:45,840 --> 00:55:48,160 Speaker 1: I did not think about that. But let's say plenty 966 00:55:48,200 --> 00:55:51,400 Speaker 1: of other stuff that I've done about sleeping and dreaming 967 00:55:51,520 --> 00:55:53,840 Speaker 1: and brain plasticity and vision and visual illusions. 968 00:55:53,880 --> 00:55:58,880 Speaker 3: How important is sleep to our Tier One operators in 969 00:55:58,960 --> 00:56:04,200 Speaker 3: Delta Force and Ground Branch at the CIA? We don't... 970 00:56:04,360 --> 00:56:05,160 Speaker 3: you don't know. 971 00:56:06,160 --> 00:56:08,520 Speaker 1: I'm just saying I agree, I don't know, with the things 972 00:56:08,560 --> 00:56:10,759 Speaker 1: that I do, when I'm stepping on something. That 973 00:56:10,880 --> 00:56:11,920 Speaker 1: is my point. 974 00:56:11,800 --> 00:56:18,000 Speaker 3: My point is we can't afford this extended childhood as scientists. 975 00:56:18,960 --> 00:56:21,279 Speaker 1: I think that's an excellent point. Here's the part I'm 976 00:56:21,280 --> 00:56:24,920 Speaker 1: trying to understand, though. It sounds like you're saying 977 00:56:25,400 --> 00:56:29,440 Speaker 1: we need to mature as scientists to understand. 978 00:56:29,480 --> 00:56:31,800 Speaker 2: Wow, there's real national security interests here. 979 00:56:32,680 --> 00:56:35,040 Speaker 1: But I think I'm also hearing you saying we don't 980 00:56:35,080 --> 00:56:37,400 Speaker 1: want to be bossed around by national security interests. 981 00:56:37,480 --> 00:56:41,640 Speaker 3: No, but we need to be. I'm really glad we're 982 00:56:41,640 --> 00:56:44,000 Speaker 3: having this. This is a very difficult conversation, and there's nobody 983 00:56:44,040 --> 00:56:48,000 Speaker 3: I'd rather be having it with than you. People don't understand 984 00:56:48,000 --> 00:56:51,719 Speaker 3: my perspective on universities, on science and national interest.
Great 985 00:56:51,760 --> 00:56:54,840 Speaker 3: science and good science are continuing to happen inside of universities. 986 00:56:54,880 --> 00:56:57,360 Speaker 3: There's much less great science. There's much more good science. 987 00:56:58,120 --> 00:57:01,360 Speaker 3: But I attend talks regularly at Caltech and UCLA, 988 00:57:02,280 --> 00:57:04,680 Speaker 3: probably should be going to USC. I don't see any 989 00:57:04,760 --> 00:57:07,400 Speaker 3: of the tech leaders who are opining about science at 990 00:57:07,480 --> 00:57:10,560 Speaker 3: any talks that I ever go to, it's just academicians. 991 00:57:10,560 --> 00:57:13,440 Speaker 3: There's nobody from outside. So I am a huge defender 992 00:57:13,600 --> 00:57:17,000 Speaker 3: that the universities are not over. The standard thing in my 993 00:57:17,280 --> 00:57:20,680 Speaker 3: tech circles is, yep, science is over, universities are over. 994 00:57:20,880 --> 00:57:24,080 Speaker 3: Not true, far from true. I am also a major 995 00:57:24,120 --> 00:57:29,000 Speaker 3: critic of science, saying the public can see that we 996 00:57:29,080 --> 00:57:33,480 Speaker 3: blew it on COVID multiple ways. We're not honest 997 00:57:33,520 --> 00:57:36,160 Speaker 3: about things like the measurement of inflation. I can promise 998 00:57:36,200 --> 00:57:41,280 Speaker 3: you that. And they are detecting that there's a hidden 999 00:57:41,360 --> 00:57:44,480 Speaker 3: hand and that scientists are somehow not acting in the 1000 00:57:44,480 --> 00:57:47,040 Speaker 3: public interest. And I believe that there's really something to 1001 00:57:47,120 --> 00:57:49,320 Speaker 3: that and that we scientists have to talk about that.
1002 00:57:50,080 --> 00:57:52,800 Speaker 3: I believe that the national interest community and the national 1003 00:57:52,800 --> 00:57:58,000 Speaker 3: security community are extremely important. I believe in national interest, 1004 00:57:58,040 --> 00:58:01,000 Speaker 3: and I believe in national security. I believe that many 1005 00:58:01,000 --> 00:58:03,240 Speaker 3: people in that community are not good enough to be 1006 00:58:03,280 --> 00:58:06,920 Speaker 3: our bosses. Ah, and I believe that we are not 1007 00:58:06,960 --> 00:58:09,520 Speaker 3: good enough to be our bosses, because part of being 1008 00:58:09,560 --> 00:58:12,040 Speaker 3: a grown up in that idiom is to say, I 1009 00:58:12,080 --> 00:58:15,560 Speaker 3: think about quarks. Quarks make up nucleons, and nucleons make 1010 00:58:15,640 --> 00:58:19,640 Speaker 3: up nuclear weapons. So yes, I don't know whether something 1011 00:58:19,680 --> 00:58:22,440 Speaker 3: I'm going to discover might have a security implication. 1012 00:58:23,560 --> 00:58:26,280 Speaker 3: So more or less, we've got all of these contradictions. 1013 00:58:26,800 --> 00:58:30,440 Speaker 3: We're not playing at an adult level, and I 1014 00:58:30,480 --> 00:58:33,600 Speaker 3: want our national security community to get better. I want 1015 00:58:33,640 --> 00:58:37,040 Speaker 3: scientists to be full partners. I want us to be 1016 00:58:37,080 --> 00:58:41,680 Speaker 3: pushing back on particularly bad national interest people and saying, 1017 00:58:41,920 --> 00:58:45,160 Speaker 3: don't ever force me to repeat these lies to the public. 1018 00:58:45,640 --> 00:58:48,760 Speaker 3: And nobody's even having anything remotely like this conversation so 1019 00:58:48,800 --> 00:58:50,840 Speaker 3: far as I know. I'm basically having it with 1020 00:58:50,920 --> 00:58:52,520 Speaker 3: myself on podcasts.
1021 00:58:52,560 --> 00:58:56,280 Speaker 1: Would you see this as being a self-maturation among 1022 00:58:56,320 --> 00:58:59,800 Speaker 1: scientists and among national security people, or would you see 1023 00:58:59,800 --> 00:59:02,880 Speaker 1: somebody in charge of that, the president or whoever, 1024 00:59:02,960 --> 00:59:05,080 Speaker 1: saying, okay, guys, everyone get to the table? 1025 00:59:05,600 --> 00:59:09,760 Speaker 3: I think Michael Kratsios should be relocated to some terrific 1026 00:59:09,880 --> 00:59:12,920 Speaker 3: office because I think he's an able and capable person. 1027 00:59:13,320 --> 00:59:14,240 Speaker 1: Tell us about him. 1028 00:59:14,360 --> 00:59:18,160 Speaker 3: It doesn't matter. He's not a leading scientist. That office, 1029 00:59:18,560 --> 00:59:21,840 Speaker 3: that team that advises the president, should not be selected 1030 00:59:21,920 --> 00:59:26,720 Speaker 3: based on loyalty to Donald Trump, full stop. I'm sorry. 1031 00:59:27,120 --> 00:59:29,440 Speaker 3: I understand that Donald Trump has been treated in some 1032 00:59:29,520 --> 00:59:32,120 Speaker 3: ways unfairly by the outside world, and that he has 1033 00:59:32,120 --> 00:59:36,760 Speaker 3: a reason to surround himself with loyalists. Science is not loyal. 1034 00:59:37,280 --> 00:59:37,520 Speaker 1: Okay. 1035 00:59:37,600 --> 00:59:42,200 Speaker 3: You can ask scientists minorly to hold off on something 1036 00:59:42,920 --> 00:59:46,440 Speaker 3: or to play ball, but you cannot ask them to 1037 00:59:46,800 --> 00:59:48,240 Speaker 3: just lie. 1038 00:59:48,480 --> 00:59:48,640 Speaker 2: Right. 1039 00:59:49,440 --> 00:59:56,680 Speaker 3: We need somebody with universal respect.
The JASONs, PCAST, 1040 00:59:58,600 --> 01:00:03,040 Speaker 3: and OSTP and the National Academy of Sciences and the 1041 01:00:03,120 --> 01:00:09,240 Speaker 3: National Science Board need to be top people and team players, 1042 01:00:09,760 --> 01:00:14,480 Speaker 3: team players with one another, team players 1043 01:00:14,520 --> 01:00:19,400 Speaker 3: with the national interest community. Okay, and this is gonna 1044 01:00:19,400 --> 01:00:26,560 Speaker 3: sound contradictory, but it isn't: massive individualists. It's a very 1045 01:00:26,600 --> 01:00:27,240 Speaker 3: tricky thing. 1046 01:00:28,400 --> 01:00:28,560 Speaker 2: You know. 1047 01:00:28,680 --> 01:00:32,240 Speaker 3: Leo Szilard, who wrote the original letter to start off 1048 01:00:32,240 --> 01:00:36,479 Speaker 3: the Manhattan Project, wasn't allowed into the Manhattan Project because 1049 01:00:36,520 --> 01:00:37,320 Speaker 3: he was too independent. 1050 01:00:37,480 --> 01:00:41,760 Speaker 1: So you're saying the solution is, what we need is 1051 01:00:42,360 --> 01:00:48,440 Speaker 1: a maturation across both communities. But I want to make 1052 01:00:48,440 --> 01:00:50,680 Speaker 1: sure I understand what would be the path there that 1053 01:00:50,720 --> 01:00:52,840 Speaker 1: you might see. I mean, maybe you think 1054 01:00:52,920 --> 01:00:55,479 Speaker 1: it's just a difficult, thorny problem, but do you see 1055 01:00:55,800 --> 01:00:56,480 Speaker 1: a path if you squint? 1056 01:00:56,520 --> 01:01:03,160 Speaker 3: Suddenly Donald Trump invites a list of the top American scientists, 1057 01:01:03,440 --> 01:01:06,760 Speaker 3: not good scientists, but very often great scientists.
Tomorrow, to Mar-a-Lago, 1058 01:01:07,200 --> 01:01:10,440 Speaker 3: the same way all our friends in the entertainment world, 1059 01:01:10,480 --> 01:01:14,280 Speaker 3: the business world, the finance world, the tech world have gone, 1060 01:01:14,520 --> 01:01:16,560 Speaker 3: and he says, we are going to have a scientific 1061 01:01:16,600 --> 01:01:19,960 Speaker 3: renaissance, full stop. There's no way we are going to 1062 01:01:20,000 --> 01:01:24,400 Speaker 3: continue to destroy our seed corn for technology. We need 1063 01:01:24,440 --> 01:01:27,040 Speaker 3: to know. I hear you guys are suffering. I hear 1064 01:01:27,080 --> 01:01:29,680 Speaker 3: you guys are precarious. I want to know why nobody 1065 01:01:29,720 --> 01:01:32,600 Speaker 3: stood up to Fauci and Collins the way we needed them to. 1066 01:01:33,400 --> 01:01:35,800 Speaker 3: What does it take to get academic freedom? We understand 1067 01:01:35,800 --> 01:01:38,600 Speaker 3: that we've tasked you with protecting the nation and making 1068 01:01:38,680 --> 01:01:42,040 Speaker 3: us rich and powerful, and you're not participating. How do 1069 01:01:42,120 --> 01:01:45,400 Speaker 3: we get you back to second homes? How do we 1070 01:01:45,440 --> 01:01:47,680 Speaker 3: get you retirements? How do we get you raising three 1071 01:01:47,720 --> 01:01:50,800 Speaker 3: to four kids on one income, with help in the house? 1072 01:01:51,880 --> 01:01:55,600 Speaker 3: You're our A-team, man. So my feeling is, suck 1073 01:01:55,680 --> 01:01:58,480 Speaker 3: it up, open your pocketbook, shut your mouth, learn to 1074 01:01:58,520 --> 01:02:02,360 Speaker 3: deal with science as what it is, and learn to 1075 01:02:02,400 --> 01:02:08,200 Speaker 3: deal with scientists as equals and team players.
And don't ever, ever, ever, ever, 1076 01:02:08,320 --> 01:02:11,160 Speaker 3: take the world's smartest people and feed them a B 1077 01:02:11,320 --> 01:02:15,760 Speaker 3: minus lie and expect them to shut up or repeat 1078 01:02:15,760 --> 01:02:18,680 Speaker 3: it because you control whether or not they can function. 1079 01:02:19,120 --> 01:02:21,880 Speaker 3: It is time for the scientists to mutiny, not against 1080 01:02:21,880 --> 01:02:24,760 Speaker 3: the United States of America, but to mutiny against our 1081 01:02:24,800 --> 01:02:29,439 Speaker 3: agreement with the National Interest Complex. You guys broke the deal. 1082 01:02:29,600 --> 01:02:34,000 Speaker 3: We had something called the Endless Frontier. You passed something 1083 01:02:34,000 --> 01:02:37,280 Speaker 3: called the Mansfield Amendment around nineteen seventy, which removed military 1084 01:02:37,320 --> 01:02:41,600 Speaker 3: funding from blue-sky research. You've been eroding us ever since. 1085 01:02:41,640 --> 01:02:44,400 Speaker 3: You passed the Immigration Act of nineteen ninety based on 1086 01:02:44,440 --> 01:02:47,880 Speaker 3: a fraud that you perpetrated through the National Science Foundation 1087 01:02:49,240 --> 01:02:54,720 Speaker 3: in the Reagan era under Erich Bloch. Enough, enough. You 1088 01:02:54,760 --> 01:02:58,160 Speaker 3: were going to treat scientists properly, and if you don't, 1089 01:02:58,360 --> 01:03:00,560 Speaker 3: expect them to move to China, and then you're going 1090 01:03:00,640 --> 01:03:05,040 Speaker 3: to deal with American scientists helping the Chinese. I just 1091 01:03:05,120 --> 01:03:07,720 Speaker 3: really don't know: is there no one in the national 1092 01:03:10,720 --> 01:03:15,520 Speaker 3: intelligence complex, the national interest complex, national security complex, who 1093 01:03:15,520 --> 01:03:19,080 Speaker 3: has thought about the fact that we are destroying ourselves?
1094 01:03:20,040 --> 01:03:22,720 Speaker 1: I just want to grasp it: destroying ourselves in terms of 1095 01:03:22,760 --> 01:03:26,840 Speaker 1: not helping scientists blossom and thrive. What do you think 1096 01:03:26,880 --> 01:03:32,200 Speaker 1: about a million-dollar salary per year 1097 01:03:32,280 --> 01:03:35,680 Speaker 1: for a scientist? What do you think about bonuses that look 1098 01:03:35,800 --> 01:03:41,200 Speaker 1: like bonuses granted to investment bankers? Right now, I think 1099 01:03:41,240 --> 01:03:44,160 Speaker 1: there's a million-dollar prize for solving P equals NP 1100 01:03:44,520 --> 01:03:50,000 Speaker 1: and a million-dollar prize for the Riemann hypothesis. Like, 1101 01:03:50,080 --> 01:03:51,640 Speaker 1: I think you're missing a few zeros on that. 1102 01:03:51,920 --> 01:03:53,440 Speaker 2: You're saying it should be much more. 1103 01:03:53,640 --> 01:03:56,360 Speaker 3: Reach deep into your heart and into your checkbook, and 1104 01:03:56,400 --> 01:03:58,760 Speaker 3: when you get serious, come back to me with a number. 1105 01:03:59,520 --> 01:04:03,640 Speaker 3: How, how dare you? I mean, who are these people? 1106 01:04:04,200 --> 01:04:06,480 Speaker 3: A million dollars for the Riemann hypothesis? 1107 01:04:06,520 --> 01:04:07,040 Speaker 2: Wow. 1108 01:04:07,960 --> 01:04:10,000 Speaker 1: And your point is, if they offered way more than that, 1109 01:04:10,080 --> 01:04:12,440 Speaker 1: let's say it was fifty million dollars for doing that, you'd 1110 01:04:12,480 --> 01:04:15,560 Speaker 1: attract more people there and it would be more reflective 1111 01:04:15,600 --> 01:04:16,840 Speaker 1: of what the value is. 1112 01:04:17,000 --> 01:04:19,520 Speaker 3: Tell me something. You're interested in AI.
When do you 1113 01:04:19,520 --> 01:04:22,280 Speaker 3: think the large language model thing reached its point of 1114 01:04:22,480 --> 01:04:28,800 Speaker 3: just unbelievable discontinuity with respect to the intellectual underpinnings? What 1115 01:04:28,840 --> 01:04:30,520 Speaker 3: paper would you associate with it? 1116 01:04:31,800 --> 01:04:34,640 Speaker 1: Twenty seventeen, Google Brain publishes the Transformer model. 1117 01:04:34,800 --> 01:04:38,160 Speaker 3: Yeah. Now, that paper is called Attention Is All 1118 01:04:38,200 --> 01:04:38,560 Speaker 3: You Need? 1119 01:04:38,760 --> 01:04:41,920 Speaker 2: Yes? How many authors are there on it? 1120 01:04:42,080 --> 01:04:42,480 Speaker 1: Three? 1121 01:04:42,960 --> 01:04:44,200 Speaker 2: If I remember, it's eight. 1122 01:04:44,440 --> 01:04:45,080 Speaker 1: Oh, okay? 1123 01:04:45,480 --> 01:04:50,520 Speaker 3: What are their names? Any of them? Oops. Tell me something. 1124 01:04:51,200 --> 01:04:53,160 Speaker 3: What are some names that you associate with Google? 1125 01:04:53,560 --> 01:04:55,320 Speaker 1: Probably all the ones that come to mind are the 1126 01:04:55,360 --> 01:04:58,120 Speaker 1: executives. Give me, who are you thinking? Well, you know, 1127 01:04:58,240 --> 01:05:02,200 Speaker 1: Larry and Sergey and Eric Schmidt and so on. Anyone else? 1128 01:05:03,120 --> 01:05:08,680 Speaker 1: The people at Google X, maybe. Jeff. Okay, I mean 1129 01:05:08,680 --> 01:05:11,800 Speaker 1: I would name Jack Cadari and Adam Brown and other 1130 01:05:11,840 --> 01:05:12,800 Speaker 1: friends of mine who had. 1131 01:05:12,640 --> 01:05:15,800 Speaker 2: An expert. Shout out to Adam Brown. Yeah, quantum gravity 1132 01:05:16,000 --> 01:05:17,080 Speaker 2: going on at Google? 1133 01:05:17,240 --> 01:05:20,080 Speaker 1: Yeah, okay, but of course Demis. 1134 01:05:19,680 --> 01:05:24,120 Speaker 3: Okay. But my point is, this is what we're doing.
1135 01:05:25,320 --> 01:05:32,640 Speaker 3: People create value, and you know, I want to hear, okay, 1136 01:05:32,680 --> 01:05:37,240 Speaker 3: he got plane-rich from that paper. Hmm, by which 1137 01:05:37,280 --> 01:05:39,320 Speaker 3: you mean he could afford a private plane? Yeah, okay, 1138 01:05:39,480 --> 01:05:42,880 Speaker 3: you know, he used to fly from LAX to JFK. 1139 01:05:43,120 --> 01:05:46,800 Speaker 3: Now he flies from Van Nuys to Teterboro. I used 1140 01:05:46,800 --> 01:05:50,440 Speaker 3: to swim naked with Raoul Bott off of his place in 1141 01:05:50,520 --> 01:05:54,560 Speaker 3: Martha's Vineyard. He was a Harvard professor, the Graustein Professor, 1142 01:05:55,720 --> 01:05:58,080 Speaker 3: and he had a second home on Martha's Vineyard. That's 1143 01:05:58,120 --> 01:06:02,120 Speaker 3: normal: professors on Professors Row. It was named Professors Row 1144 01:06:02,160 --> 01:06:05,440 Speaker 3: because professors could afford the houses there. We've had a 1145 01:06:05,600 --> 01:06:09,360 Speaker 3: massive blowout of the Gini coefficient. I want scientists to 1146 01:06:09,520 --> 01:06:12,320 Speaker 3: participate in the world they created for everyone else. 1147 01:06:12,600 --> 01:06:15,280 Speaker 1: One question is, in the same way that you mentioned 1148 01:06:15,320 --> 01:06:20,959 Speaker 1: Fuchs before, it's very difficult to determine during somebody's career, 1149 01:06:21,040 --> 01:06:24,160 Speaker 1: first of all, whether they're a patriot or they have other interests. 1150 01:06:24,200 --> 01:06:27,080 Speaker 1: And it's also difficult to determine who is going to 1151 01:06:27,200 --> 01:06:30,960 Speaker 1: make contributions and who is not. It really is, because 1152 01:06:31,000 --> 01:06:34,360 Speaker 1: science is such a complex road.
I know so many 1153 01:06:34,360 --> 01:06:36,880 Speaker 1: smart people, surely you do too, who spent their lives 1154 01:06:36,960 --> 01:06:41,360 Speaker 1: doing hard work on things that happened to never yield something. 1155 01:06:41,640 --> 01:06:44,000 Speaker 1: And other people who are playing the, what was that 1156 01:06:44,080 --> 01:06:48,320 Speaker 1: game where you uncover squares, Minecraft? Not Minecraft, 1157 01:06:48,320 --> 01:06:52,080 Speaker 1: it's Minesweeper. I'm sorry. Yeah, where you 1158 01:06:52,120 --> 01:06:54,400 Speaker 1: know you happen to click on a square and something 1159 01:06:54,480 --> 01:06:58,280 Speaker 1: huge opens up, and that, unfortunately, is a matter of luck 1160 01:06:58,320 --> 01:07:00,520 Speaker 1: sometimes. I have a totally different view on that. Okay, 1161 01:07:00,520 --> 01:07:01,360 Speaker 1: tell me yours. 1162 01:07:01,640 --> 01:07:05,080 Speaker 3: When somebody gets really lucky, really early, it often changes 1163 01:07:05,120 --> 01:07:08,280 Speaker 3: their brain chemistry, it changes how they swagger, how they 1164 01:07:08,520 --> 01:07:13,520 Speaker 3: approach the world. So in part, really good fortune, really 1165 01:07:13,560 --> 01:07:16,640 Speaker 3: early in life is a good thing. Can be. Yeah, 1166 01:07:16,680 --> 01:07:19,080 Speaker 3: that's what happened with Crick and Watson. Well, it didn't 1167 01:07:19,080 --> 01:07:20,840 Speaker 3: happen with Crick, because Crick was in his thirties. It 1168 01:07:20,840 --> 01:07:22,000 Speaker 3: happened with Watson because he was. 1169 01:07:22,000 --> 01:07:26,320 Speaker 1: Like, thirties is still pretty good. And come on, yeah, well, 1170 01:07:26,400 --> 01:07:29,840 Speaker 1: Crick was one of the great scientists of the second 1171 01:07:29,840 --> 01:07:31,080 Speaker 1: half of the twentieth and the beginning of the 1172 01:07:31,120 --> 01:07:34,000 Speaker 1: twenty-first century. Not such an old man, my friend.
No, he wasn't. 1173 01:07:34,040 --> 01:07:36,120 Speaker 3: He was in his thirties. It's different in biology, the 1174 01:07:36,120 --> 01:07:39,000 Speaker 3: whole thing has shifted. Well, but you don't know why 1175 01:07:39,040 --> 01:07:40,320 Speaker 3: it's different in biology. 1176 01:07:40,680 --> 01:07:41,640 Speaker 1: See, it's because it 1177 01:07:41,560 --> 01:07:43,880 Speaker 2: takes years to get a feeling for the organism. No, 1178 01:07:44,000 --> 01:07:46,120 Speaker 2: it doesn't. Let me, let me explain why it happens. 1179 01:07:46,360 --> 01:07:48,800 Speaker 2: I'll tell you my opinion. Yes, well, tell me. 1180 01:07:48,920 --> 01:07:54,000 Speaker 3: The American Society for Cell Biology, ASCB, worked with me, 1181 01:07:54,320 --> 01:07:57,400 Speaker 3: and I went around and I got to ask, I 1182 01:07:57,400 --> 01:08:01,400 Speaker 3: don't know, twenty, twenty-five of the world's top principal 1183 01:08:01,440 --> 01:08:04,160 Speaker 3: investigators why things were the way they are. 1184 01:08:05,800 --> 01:08:09,640 Speaker 2: Oh boy. We were not allowed to publish our findings 1185 01:08:09,640 --> 01:08:12,440 Speaker 2: in Science until we took out our findings from the paper. 1186 01:08:13,680 --> 01:08:15,080 Speaker 1: Tell me what kind of stuff. 1187 01:08:15,120 --> 01:08:16,960 Speaker 3: One of the things that we collected in interviews with 1188 01:08:17,000 --> 01:08:23,520 Speaker 3: principal investigators is a discovery that when they did an analysis, 1189 01:08:23,560 --> 01:08:27,120 Speaker 3: they found that an unusually high number of female principal 1190 01:08:27,160 --> 01:08:31,599 Speaker 3: investigators changed their research patterns after the birth of their 1191 01:08:31,600 --> 01:08:35,880 Speaker 3: first child, that they found motherhood so fulfilling that it 1192 01:08:35,920 --> 01:08:39,440 Speaker 3: competed with what they were doing previously to run their labs.
1193 01:08:40,960 --> 01:08:46,879 Speaker 3: The claim was, we then decided to push academic freedom 1194 01:08:46,960 --> 01:08:50,720 Speaker 3: closer and closer to the point of geriatric pregnancy, so 1195 01:08:50,760 --> 01:08:54,919 Speaker 3: that we would not be surprised, and that female principal 1196 01:08:54,960 --> 01:08:58,840 Speaker 3: investigators who had previously been all out in terms of 1197 01:08:58,880 --> 01:09:05,800 Speaker 3: their research would have zero or one children, but not 1198 01:09:06,439 --> 01:09:08,840 Speaker 3: multiple children, and not get bogged down. And if you 1199 01:09:08,880 --> 01:09:10,720 Speaker 3: actually go back to the fifties and you look at 1200 01:09:10,720 --> 01:09:15,200 Speaker 3: some of the most successful female biologists before women's liberation, 1201 01:09:16,200 --> 01:09:19,479 Speaker 3: they were often fairly well-to-do and had help 1202 01:09:19,479 --> 01:09:25,240 Speaker 3: in the house. And we interviewed some people, for example, 1203 01:09:25,240 --> 01:09:28,400 Speaker 3: who had a child and the next day brought the 1204 01:09:28,479 --> 01:09:32,679 Speaker 3: child into a playpen in the lab, despite the presence 1205 01:09:32,680 --> 01:09:37,519 Speaker 3: of radioactive markers and mutagens, to show how serious they were. 1206 01:09:39,360 --> 01:09:45,760 Speaker 3: So my claim is that we've developed an entire rationale 1207 01:09:45,840 --> 01:09:49,760 Speaker 3: for why it now takes so long to become a 1208 01:09:49,920 --> 01:09:54,839 Speaker 3: full professor in biology. But it actually has different reasons.
1209 01:09:54,840 --> 01:09:57,800 Speaker 3: And when we put these quotes, because I recorded these 1210 01:09:57,840 --> 01:10:01,799 Speaker 3: on microcassettes, when we put these quotes into the paper, 1211 01:10:02,720 --> 01:10:06,639 Speaker 3: Science magazine, which is one of the top journals in biology, 1212 01:10:06,840 --> 01:10:09,480 Speaker 3: said, you're going to have to take out these conclusions 1213 01:10:09,520 --> 01:10:11,320 Speaker 3: or we can't publish it. So I have a publication 1214 01:10:11,439 --> 01:10:14,600 Speaker 3: in Science, and the only reason that I have a 1215 01:10:14,600 --> 01:10:18,080 Speaker 3: publication in a top journal is that my co-authors 1216 01:10:18,240 --> 01:10:24,760 Speaker 3: agreed to take the findings out of the paper. This 1217 01:10:24,800 --> 01:10:28,760 Speaker 3: is how the game is really played. People say, well, Eric, 1218 01:10:29,400 --> 01:10:32,920 Speaker 3: you know, you're against peer review because you can't pass it. 1219 01:10:33,000 --> 01:10:36,160 Speaker 3: I have peer-reviewed papers. The issue is that 1220 01:10:36,360 --> 01:10:40,000 Speaker 3: I know what it is. They won't let you publish 1221 01:10:40,640 --> 01:10:52,400 Speaker 3: the most interesting stuff if it disrupts the field. 1222 01:10:56,960 --> 01:10:59,120 Speaker 1: So this brings us back to the main theme of 1223 01:10:59,200 --> 01:11:03,840 Speaker 1: cowboy science. So what should that look like, or what 1224 01:11:03,880 --> 01:11:06,640 Speaker 1: could that look like? First of all, we agree that 1225 01:11:06,680 --> 01:11:09,640 Speaker 1: there are great scientists and good scientists. This is your 1226 01:11:09,680 --> 01:11:12,840 Speaker 1: framing of it, which I agree with.
The great scientists, 1227 01:11:13,760 --> 01:11:15,439 Speaker 1: those are the ones that you want to do the 1228 01:11:15,439 --> 01:11:18,240 Speaker 1: cowboy science, whereas the good scientists are the ones, sort of, 1229 01:11:18,520 --> 01:11:21,639 Speaker 1: you know, doing the hard, good work, but not coming 1230 01:11:21,680 --> 01:11:24,360 Speaker 1: up with the giant new frameworks on stuff. They could. 1231 01:11:24,840 --> 01:11:27,799 Speaker 1: They could. But I'm just trying to frame your framework 1232 01:11:27,840 --> 01:11:28,960 Speaker 1: in a hopefully accurate way. 1233 01:11:29,360 --> 01:11:31,080 Speaker 2: But again, I'm not against. 1234 01:11:31,840 --> 01:11:32,559 Speaker 1: You have nothing against. 1235 01:11:32,600 --> 01:11:35,280 Speaker 2: The good scientists might come up with something you don't expect. 1236 01:11:35,360 --> 01:11:37,080 Speaker 1: They might click on the square in Minesweeper that 1237 01:11:37,120 --> 01:11:40,280 Speaker 1: opens up something. Exactly, exactly. Okay. So, but how do 1238 01:11:40,360 --> 01:11:43,280 Speaker 1: we encourage cowboy science? How do we make sure that 1239 01:11:43,360 --> 01:11:46,080 Speaker 1: the Jim Watsons and the so many others... 1240 01:11:46,120 --> 01:11:50,520 Speaker 3: We need national interest exemptions from the Civil Rights Act, 1241 01:11:50,920 --> 01:11:54,320 Speaker 3: we need slush funds, we need a lack of oversight. 1242 01:11:54,880 --> 01:11:56,920 Speaker 3: We need to be able to determine who the smart 1243 01:11:56,920 --> 01:12:00,840 Speaker 3: people are based on our own determinations, and screw off 1244 01:12:00,840 --> 01:12:02,560 Speaker 3: if you don't agree with us. 1245 01:12:03,720 --> 01:12:06,040 Speaker 2: How is that consistent with national security interests? 1246 01:12:06,120 --> 01:12:08,840 Speaker 3: This is what we did before. We had a bunch 1247 01:12:08,840 --> 01:12:10,040 Speaker 3: of really crazy
1248 01:12:09,680 --> 01:12:14,599 Speaker 2: people. Good crazy? I see, I mean good crazy mostly, okay, 1249 01:12:14,640 --> 01:12:18,040 Speaker 2: but these are strong spices, right? 1250 01:12:18,080 --> 01:12:22,479 Speaker 3: If you look at Elon Musk, he makes his own 1251 01:12:22,520 --> 01:12:26,320 Speaker 3: rules every chance he gets. Our guys made their own rules. 1252 01:12:27,640 --> 01:12:29,479 Speaker 3: We had Elon Musks in science. 1253 01:12:29,840 --> 01:12:32,400 Speaker 1: Who are you thinking of when you say that? Alexander 1254 01:12:32,520 --> 01:12:38,360 Speaker 1: Grothendieck, Isadore Singer, Sidney Brenner. See, you got 1255 01:12:38,400 --> 01:12:41,160 Speaker 1: Sidney Brenner. Yeah, I knew him as well. 1256 01:12:41,160 --> 01:12:42,800 Speaker 1: Amazing, amazing guy. 1257 01:12:42,880 --> 01:12:45,599 Speaker 3: Yeah, I'm not supposed to know who that is. But like, 1258 01:12:45,880 --> 01:12:50,519 Speaker 3: these are my heroes. These people in general, they need 1259 01:12:50,560 --> 01:12:53,080 Speaker 3: to not ask mommy and daddy whether they can fund somebody. 1260 01:12:53,120 --> 01:12:56,080 Speaker 3: They need to not worry about their students being able 1261 01:12:56,080 --> 01:12:57,560 Speaker 3: to get a job. They pick up the phone and 1262 01:12:57,600 --> 01:12:59,840 Speaker 3: they say, my student needs a job, and they 1263 01:13:00,040 --> 01:13:03,520 Speaker 3: put it down, and there's not some process. 1264 01:13:02,680 --> 01:13:05,880 Speaker 1: Hmm, I wonder about that. Let me give you a 1265 01:13:05,920 --> 01:13:09,120 Speaker 1: quick analogy. In Silicon Valley, what I see are startups, 1266 01:13:09,200 --> 01:13:11,920 Speaker 1: young startups.
You've got three people in a garage. 1267 01:13:12,560 --> 01:13:15,080 Speaker 1: And then you see what happens as companies mature, when 1268 01:13:15,120 --> 01:13:18,240 Speaker 1: they become Google and Facebook and Amazon and so on, 1269 01:13:18,600 --> 01:13:20,800 Speaker 1: and they have to, or they feel that they have 1270 01:13:20,840 --> 01:13:22,599 Speaker 1: to, put the rule books in place and make these 1271 01:13:22,680 --> 01:13:23,479 Speaker 1: judgments. 1272 01:13:23,520 --> 01:13:26,200 Speaker 3: There's a number, I think it's like fifteen employees, and suddenly the rules 1273 01:13:26,280 --> 01:13:30,040 Speaker 3: change. Yeah, that's right. That's what I'm trying to say, right? So, okay, 1274 01:13:30,040 --> 01:13:30,639 Speaker 3: so how do we... 1275 01:13:30,640 --> 01:13:33,040 Speaker 2: We need national interest exemptions for science. 1276 01:13:33,320 --> 01:13:38,120 Speaker 1: I see. You're saying the national interest people, they're the ones 1277 01:13:38,200 --> 01:13:41,840 Speaker 1: who, in this future, in this utopia that we're 1278 01:13:41,840 --> 01:13:45,320 Speaker 1: trying to get to, what it could look like: they watch, 1279 01:13:45,439 --> 01:13:47,439 Speaker 1: they see who the young scientists are who are doing 1280 01:13:47,479 --> 01:13:50,519 Speaker 1: something very bold, and they say, you have an exemption. 1281 01:13:50,600 --> 01:13:56,400 Speaker 3: We're going to... Yeah, I see. Okay, 1282 01:13:56,520 --> 01:14:00,760 Speaker 3: if there's some C. elegans researcher who also wants to 1283 01:14:00,800 --> 01:14:04,679 Speaker 3: have her own OnlyFans account, I may not be thrilled, 1284 01:14:04,680 --> 01:14:07,760 Speaker 3: but that is a life choice. But I don't want 1285 01:14:07,800 --> 01:14:10,520 Speaker 3: to tell her that she has to leave the academy 1286 01:14:10,560 --> 01:14:15,280 Speaker 3: because she's behaving inappropriately.
If people want to take drugs 1287 01:14:15,320 --> 01:14:17,320 Speaker 3: and go to Burning Man, they need to take drugs 1288 01:14:17,360 --> 01:14:20,479 Speaker 3: and they need to go to Burning Man. If people 1289 01:14:20,479 --> 01:14:22,599 Speaker 3: want to tell a joke about two imams go into 1290 01:14:22,640 --> 01:14:24,080 Speaker 3: a bar, they need to be able to tell a 1291 01:14:24,160 --> 01:14:28,599 Speaker 3: joke about two imams go into a bar. Get out 1292 01:14:28,640 --> 01:14:31,160 Speaker 3: of the way of great science. We know how to 1293 01:14:31,200 --> 01:14:33,479 Speaker 3: do this. We need to go back to being the 1294 01:14:33,560 --> 01:14:36,200 Speaker 3: United States of America. We need to fire Claudine Gay 1295 01:14:36,280 --> 01:14:39,400 Speaker 3: from her professorship at Harvard to send a message: we don't 1296 01:14:39,240 --> 01:14:40,000 Speaker 1: do that anymore. 1297 01:14:40,640 --> 01:14:44,479 Speaker 3: We don't do that. We don't protect plagiarists and pretend 1298 01:14:44,520 --> 01:14:48,599 Speaker 3: to people. I remember getting my Harvard Alumni magazine, and 1299 01:14:48,640 --> 01:14:51,040 Speaker 3: I think that the cover article when Claudine Gay was 1300 01:14:51,080 --> 01:14:55,040 Speaker 3: being announced was "A Scholar's Scholar," and I knew from that. 1301 01:14:55,720 --> 01:14:58,920 Speaker 3: It's like, you're trying too hard. We know who's good, 1302 01:14:59,200 --> 01:15:02,600 Speaker 3: more or less. We don't always get it right, but 1303 01:15:02,680 --> 01:15:04,800 Speaker 3: imagine fifty percent of the time we get it right. 1304 01:15:05,800 --> 01:15:06,960 Speaker 3: Let us do our work. 1305 01:15:07,760 --> 01:15:09,439 Speaker 1: Let me make sure I understand one thing, though.
So 1306 01:15:09,920 --> 01:15:13,679 Speaker 1: if you say, hey, these scientists over here are really 1307 01:15:14,120 --> 01:15:16,519 Speaker 1: brave and bold and smart, and we're going to give 1308 01:15:16,560 --> 01:15:21,439 Speaker 1: them a national security exemption, what does that exemption mean? Does 1309 01:15:21,520 --> 01:15:22,160 Speaker 1: that mean that, I don't know... 1310 01:15:22,200 --> 01:15:24,719 Speaker 2: It might mean that they don't have to hire according to some... 1311 01:15:26,040 --> 01:15:29,719 Speaker 1: Here's the question. If, let's say, one of these brilliant 1312 01:15:29,760 --> 01:15:32,960 Speaker 1: young scientists comes up with the next thing, the next 1313 01:15:32,960 --> 01:15:35,400 Speaker 1: thing that can be turned into a great, big weapon, 1314 01:15:36,080 --> 01:15:39,040 Speaker 1: the exemption means that they're free to publish that, or 1315 01:15:39,040 --> 01:15:40,160 Speaker 1: they're not free to? 1316 01:15:40,240 --> 01:15:40,680 Speaker 2: What does it mean? 1317 01:15:40,760 --> 01:15:43,280 Speaker 3: Look, I think it's important to understand what I'm advocating. 1318 01:15:43,960 --> 01:15:47,920 Speaker 3: I'm advocating that we begin a new relationship based on 1319 01:15:47,960 --> 01:15:51,600 Speaker 3: the fact that the national interest complex welched on the 1320 01:15:51,680 --> 01:15:55,920 Speaker 3: last one. We had an agreement, you broke it. The 1321 01:15:55,960 --> 01:16:00,080 Speaker 3: agreement is called the Endless Frontier of Vannevar Bush. So 1322 01:16:00,120 --> 01:16:02,880 Speaker 3: you welched on a series of tacit agreements. 1323 01:16:02,920 --> 01:16:06,320 Speaker 2: Fine. And the agreement was scientists can do anything they want? 1324 01:16:06,439 --> 01:16:11,120 Speaker 3: The agreement was that, more or less, science was a rebellious, 1325 01:16:12,280 --> 01:16:16,360 Speaker 3: fiercely independent part of the national interest complex.
The national 1326 01:16:16,400 --> 01:16:19,559 Speaker 3: interest complex agreed to fund us through universities, that that 1327 01:16:19,560 --> 01:16:22,760 Speaker 3: would be where we would do the lion's share 1328 01:16:22,760 --> 01:16:26,519 Speaker 3: of our research. That we would have academic freedom, that 1329 01:16:26,560 --> 01:16:28,559 Speaker 3: they could call on us in times of war, that 1330 01:16:28,600 --> 01:16:30,320 Speaker 3: they would not call on us frivolously. 1331 01:16:30,560 --> 01:16:31,320 Speaker 2: We have an agreement. 1332 01:16:31,439 --> 01:16:35,879 Speaker 3: It's a series of interlocking tacit understandings, some of them 1333 01:16:36,120 --> 01:16:37,600 Speaker 3: made explicit, most of them not. 1334 01:16:38,280 --> 01:16:40,960 Speaker 1: I see, and that has changed. But you're saying, let's 1335 01:16:41,000 --> 01:16:43,200 Speaker 1: get back to that. Let's get back to the spirit 1336 01:16:43,240 --> 01:16:45,000 Speaker 1: that we want. We want to get back to the 1337 01:16:45,080 --> 01:16:48,400 Speaker 1: spirit of the Endless Frontier. So, Endless Frontier, take two. 1338 01:16:48,439 --> 01:16:51,000 Speaker 3: Given that you welched on our last agreement, I 1339 01:16:51,040 --> 01:16:54,439 Speaker 3: think we need to reassert ourselves and say we're not 1340 01:16:54,560 --> 01:16:55,560 Speaker 3: playing ball. 1341 01:16:55,960 --> 01:16:57,800 Speaker 1: And Endless Frontier Part Two looks like what? 1342 01:16:58,000 --> 01:17:00,439 Speaker 1: It looks like, hey, here's a brilliant scientist. I'm going 1343 01:17:00,479 --> 01:17:03,000 Speaker 1: to make sure that they have what they need to 1344 01:17:03,000 --> 01:17:04,000 Speaker 1: do their science. 1345 01:17:04,360 --> 01:17:08,360 Speaker 3: We train people we plan to employ.
We stop pretending 1346 01:17:08,439 --> 01:17:11,960 Speaker 3: that we are going to train you up only to 1347 01:17:12,000 --> 01:17:14,040 Speaker 3: abandon you, to say, I can't believe you ever got 1348 01:17:14,080 --> 01:17:16,200 Speaker 3: the idea you were going to have a research career. 1349 01:17:16,439 --> 01:17:19,800 Speaker 3: See, if you look at Norman Steenrod, who was a 1350 01:17:19,840 --> 01:17:22,959 Speaker 3: mathematician active in, I don't know, the forties and fifties, 1351 01:17:25,600 --> 01:17:29,599 Speaker 3: all of his students survived to become professors. He had 1352 01:17:29,640 --> 01:17:33,040 Speaker 3: twenty-three or something, except for the last one, around 1353 01:17:33,080 --> 01:17:37,679 Speaker 3: nineteen seventy-two, seventy-three. And if you just take 1354 01:17:37,800 --> 01:17:43,160 Speaker 3: that and you raise it to higher and higher powers, 1355 01:17:43,479 --> 01:17:46,080 Speaker 3: you can't keep having twenty-three offspring and expecting them 1356 01:17:46,080 --> 01:17:49,200 Speaker 3: all to become professors. So you need something like 1357 01:17:49,320 --> 01:17:53,400 Speaker 3: eusocial employment, where most of us don't end up producing anyone. 1358 01:17:54,400 --> 01:17:56,800 Speaker 3: But you need to employ them, and you need to 1359 01:17:56,840 --> 01:17:59,840 Speaker 3: stop pretending that the graduate students are. 1360 01:18:00,360 --> 01:18:02,799 Speaker 2: Students, particularly foreign exchange students. 1361 01:18:03,200 --> 01:18:06,439 Speaker 3: They're not. They're workers. They're workers who do not have 1362 01:18:06,560 --> 01:18:09,920 Speaker 3: the protections of workers because you've classified them as students. 1363 01:18:11,200 --> 01:18:13,000 Speaker 1: But the idea is, the situation we're in now is those 1364 01:18:13,000 --> 01:18:16,280 Speaker 1: students are very unlikely to become professors someday because there's 1365 01:18:16,320 --> 01:18:16,960 Speaker 1: just too many
1366 01:18:16,800 --> 01:18:22,439 Speaker 3: Of them, and we're training our rivals. Look, nobody 1367 01:18:22,520 --> 01:18:24,320 Speaker 3: has thought this through. This is a lot like what 1368 01:18:24,400 --> 01:18:28,840 Speaker 3: happened with USAID. USAID was a slush fund for the 1369 01:18:28,880 --> 01:18:33,280 Speaker 3: CIA and the State Department. So you look at it, 1370 01:18:33,360 --> 01:18:35,680 Speaker 3: it doesn't make any sense. Why are we funding a 1371 01:18:35,800 --> 01:18:39,759 Speaker 3: transgender opera in Bolivia? I would imagine we are trying 1372 01:18:39,800 --> 01:18:47,120 Speaker 3: to topple the Bolivian regime, and trans issues are fabulously divisive. Okay, 1373 01:18:47,240 --> 01:18:48,880 Speaker 3: now you can decide whether that's a good thing or 1374 01:18:48,920 --> 01:18:56,880 Speaker 3: a bad thing. But we didn't just fund that stupidly, right, 1375 01:18:57,360 --> 01:19:00,920 Speaker 3: that was an issue of statecraft. Well, science is 1376 01:19:00,960 --> 01:19:03,439 Speaker 3: also an instrument of statecraft, but science is also 1377 01:19:03,560 --> 01:19:06,000 Speaker 3: just science. So what I'm trying to say is that 1378 01:19:06,120 --> 01:19:08,360 Speaker 3: rather than having these colleagues who put their finger in 1379 01:19:08,400 --> 01:19:11,400 Speaker 3: their cheek and say gosh, gee willikers, you know, gee whiz, 1380 01:19:11,640 --> 01:19:14,639 Speaker 3: you sound like a conspiracy theorist. For God's sakes, man, 1381 01:19:15,640 --> 01:19:18,120 Speaker 3: you're part of the national interest complex, act like it. 1382 01:19:18,360 --> 01:19:21,439 Speaker 3: The Department of Energy is the Department of Physics. It's 1383 01:19:21,479 --> 01:19:24,680 Speaker 3: the old AEC turned into the Department of Energy by 1384 01:19:24,800 --> 01:19:28,320 Speaker 3: Jimmy Carter. What was the Atomic Energy Commission? 1385 01:19:30,200 --> 01:19:30,400 Speaker 2: You know?
1386 01:19:30,600 --> 01:19:34,120 Speaker 3: So the idea is, well, maybe it's oil and gas. Really? 1387 01:19:34,800 --> 01:19:38,040 Speaker 3: Is that what they told you? We have a fake 1388 01:19:38,240 --> 01:19:40,640 Speaker 3: cryptic system, and it used to work when there were 1389 01:19:40,640 --> 01:19:43,559 Speaker 3: smart people who could do the crypsis, and then around 1390 01:19:43,600 --> 01:19:47,080 Speaker 3: about nineteen seventy it all broke. So we're now like 1391 01:19:47,200 --> 01:19:50,240 Speaker 3: fifty five years into a cryptic system that's getting weaker 1392 01:19:50,240 --> 01:19:52,760 Speaker 3: and weaker all the time. We need a new 1393 01:19:52,800 --> 01:19:54,439 Speaker 3: Vannevar Bush, and it needs to come through the 1394 01:19:54,439 --> 01:19:57,760 Speaker 3: Office of Science and Technology Policy, OSTP, and it needs 1395 01:19:57,800 --> 01:20:00,160 Speaker 3: to go through PCAST, and it needs to go 1396 01:20:00,200 --> 01:20:05,639 Speaker 3: through the National Academies complex. And it needs to be 1397 01:20:06,160 --> 01:20:09,040 Speaker 3: highly elite. And the elite have a terrible name, 1398 01:20:09,080 --> 01:20:14,639 Speaker 3: because the people we've been calling elite aren't. So elite 1399 01:20:14,680 --> 01:20:18,640 Speaker 3: surgeons still have a great name, or elite athletes have 1400 01:20:18,640 --> 01:20:21,759 Speaker 3: a great name, or elite Tier One operators in special 1401 01:20:21,760 --> 01:20:24,360 Speaker 3: forces have a great name, but "the elite" has a 1402 01:20:24,439 --> 01:20:27,040 Speaker 3: terrible name, because that's what we associate with Davos, people 1403 01:20:27,120 --> 01:20:30,120 Speaker 3: gathering on their private jets to tell us to lower 1404 01:20:30,120 --> 01:20:33,640 Speaker 3: our carbon footprints. Right? So basically we need to go 1405 01:20:33,720 --> 01:20:37,480 Speaker 3: back to public spirited elite scientists.
Who are very well compensated, 1406 01:20:37,600 --> 01:20:41,400 Speaker 3: very well protected, who are public spirited, who do not allow 1407 01:20:41,439 --> 01:20:44,160 Speaker 3: themselves to get pushed around trivially, but in a pinch, 1408 01:20:44,560 --> 01:20:48,200 Speaker 3: know how to behave as patriotic Americans. And we need 1409 01:20:48,240 --> 01:20:50,280 Speaker 3: to lead the West, and we need to help out 1410 01:20:50,320 --> 01:20:55,360 Speaker 3: our European friends and our Japanese allies. This just isn't 1411 01:20:55,360 --> 01:20:58,000 Speaker 3: that hard. It's just nobody gets it. 1412 01:20:59,280 --> 01:21:01,840 Speaker 1: So what would you say is the path, the best 1413 01:21:01,880 --> 01:21:02,920 Speaker 1: path for us to get there? 1414 01:21:03,960 --> 01:21:09,800 Speaker 2: Hold a conference of smart people, do it quietly. 1415 01:21:09,479 --> 01:21:13,200 Speaker 1: Like an Asilomar, where you're inviting who exactly? 1416 01:21:13,400 --> 01:21:15,599 Speaker 1: You'd invite the national interest people as well as the scientists? 1417 01:21:15,640 --> 01:21:18,800 Speaker 3: You figure out who knows how to play, okay, and 1418 01:21:18,840 --> 01:21:21,479 Speaker 3: you do it as a closed conference, and everybody checks 1419 01:21:21,520 --> 01:21:25,040 Speaker 3: their phone at the door, and you have the people 1420 01:21:25,040 --> 01:21:28,840 Speaker 3: in the fields that really matter. I want cryptography people, 1421 01:21:29,120 --> 01:21:32,800 Speaker 3: I want people who do fundamental physics, I want people 1422 01:21:32,880 --> 01:21:40,679 Speaker 3: in molecular biology, and you say, look, we had a deal, 1423 01:21:42,200 --> 01:21:45,599 Speaker 3: what is our new version of that deal? Because it's 1424 01:21:45,600 --> 01:21:48,880 Speaker 3: not poverty, it's not being precarious, it's not begging for 1425 01:21:48,920 --> 01:21:53,120 Speaker 3: a grant: please, Tony Fauci, continue my funding.
We need 1426 01:21:53,240 --> 01:21:55,840 Speaker 3: to tell Fauci and Collins to take a hike when 1427 01:21:55,840 --> 01:21:58,439 Speaker 3: they are behaving counter to the interests of the United 1428 01:21:58,479 --> 01:22:02,599 Speaker 3: States of America, to the interests of science. The public 1429 01:22:02,680 --> 01:22:05,280 Speaker 3: needs to be able to rely on us for ground 1430 01:22:05,320 --> 01:22:08,559 Speaker 3: truth ninety eight percent of the time, and the two 1431 01:22:08,600 --> 01:22:10,799 Speaker 3: percent that they can't, we've got to be very careful. 1432 01:22:12,720 --> 01:22:15,200 Speaker 3: I understand that you have, you know, weaponized answers. I'm 1433 01:22:15,200 --> 01:22:20,680 Speaker 3: not saying be naive, publish everything, science, science, science. But 1434 01:22:20,760 --> 01:22:23,400 Speaker 3: for the most part, we have to be absolutely reliable. 1435 01:22:24,439 --> 01:22:26,520 Speaker 3: And we need to save theoretical physics. 1436 01:22:27,120 --> 01:22:28,640 Speaker 2: First. 1437 01:22:29,320 --> 01:22:33,240 Speaker 3: Fundamental physics is in such bad shape, and you can't 1438 01:22:33,280 --> 01:22:36,880 Speaker 3: peer review your way out of it, because all the 1439 01:22:36,920 --> 01:22:42,000 Speaker 3: peers are infected with the same mind virus. So you're 1440 01:22:42,000 --> 01:22:43,439 Speaker 3: just going to get more of the same if you 1441 01:22:43,520 --> 01:22:46,640 Speaker 3: keep saying peer, peer, peer. You need to ask, okay. 1442 01:22:46,600 --> 01:22:48,280 Speaker 1: You're referring, for example, to string theory? 1443 01:22:48,320 --> 01:22:52,040 Speaker 3: Here I'm referring to the particle theory of the standard model 1444 01:22:52,960 --> 01:22:58,960 Speaker 3: and the basis of general relativity. They need to be advanced.
Neither 1445 01:22:59,000 --> 01:23:02,160 Speaker 3: of these theories, as measured by their fundamental constituent, 1446 01:23:02,240 --> 01:23:09,880 Speaker 3: called the Lagrangian or the action, has really moved in 1447 01:23:10,120 --> 01:23:11,680 Speaker 3: fifty two years. 1448 01:23:12,120 --> 01:23:14,400 Speaker 1: And has it not moved because it's so successful, or 1449 01:23:14,439 --> 01:23:15,240 Speaker 1: you're saying it... 1450 01:23:15,040 --> 01:23:19,680 Speaker 2: It's very successful. Okay. But imagine that you knew 1451 01:23:22,040 --> 01:23:26,680 Speaker 3: twenty three of the sixty four codons in biology, and 1452 01:23:26,720 --> 01:23:29,679 Speaker 3: fifty two years later you still knew exactly that number 1453 01:23:29,680 --> 01:23:31,360 Speaker 3: of codons, and you didn't have the rest of the 1454 01:23:31,400 --> 01:23:32,600 Speaker 3: genetic code. 1455 01:23:34,040 --> 01:23:38,120 Speaker 1: But in that analogy, those twenty three are still correct; 1456 01:23:38,160 --> 01:23:40,920 Speaker 1: we're just looking for the remainder. Is that how 1457 01:23:40,960 --> 01:23:42,160 Speaker 1: you see it? Pretty much? 1458 01:23:42,200 --> 01:23:45,880 Speaker 3: The standard model is amazing. But you know, again, Crocodile 1459 01:23:46,000 --> 01:23:48,679 Speaker 3: Rock is amazing, and that came from the same exact era, 1460 01:23:49,880 --> 01:23:53,320 Speaker 3: and I cannot stand listening to Crocodile Rock on repeat 1461 01:23:54,240 --> 01:23:57,640 Speaker 3: for fifty two years. 1462 01:23:58,400 --> 01:24:00,519 Speaker 1: But is it that it's a hard problem and that's 1463 01:24:00,520 --> 01:24:03,840 Speaker 1: why the other codons haven't been figured out? 1464 01:24:04,360 --> 01:24:06,559 Speaker 2: Yeah, it's a very hard problem.
1465 01:24:06,720 --> 01:24:09,160 Speaker 3: But it gets a lot harder when you only listen 1466 01:24:09,280 --> 01:24:15,160 Speaker 3: to, suspiciously, ten leaders of the field, who are all 1467 01:24:16,680 --> 01:24:20,960 Speaker 3: interchangeably convinced of the same wrong things. Quantum 1468 01:24:21,000 --> 01:24:23,080 Speaker 3: gravity was not what we were... well, it's not even 1469 01:24:23,280 --> 01:24:27,360 Speaker 3: string theory. Who said quantum gravity was the task that 1470 01:24:27,400 --> 01:24:30,160 Speaker 3: we all needed to get on in nineteen eighty three, 1471 01:24:30,200 --> 01:24:35,000 Speaker 3: eighty four? Who said that the standard model is ugly? 1472 01:24:35,760 --> 01:24:38,840 Speaker 3: It's the most beautiful thing I've ever seen. Who said 1473 01:24:38,840 --> 01:24:41,599 Speaker 3: that? We stopped listening to our colleagues and we call 1474 01:24:41,680 --> 01:24:44,160 Speaker 3: them names, as opposed to saying, well, do you have 1475 01:24:44,200 --> 01:24:49,400 Speaker 3: a different idea? There is no world in which our 1476 01:24:49,479 --> 01:24:52,720 Speaker 3: physics thing makes sense except in a security context. So 1477 01:24:52,880 --> 01:25:00,559 Speaker 3: either we now have the world's worst accidental culture in 1478 01:25:00,720 --> 01:25:04,559 Speaker 3: fundamental physics, or we have a security regime in which 1479 01:25:04,600 --> 01:25:06,920 Speaker 3: we're not supposed to actually advance, and what we're doing 1480 01:25:07,040 --> 01:25:07,519 Speaker 3: is safe. 1481 01:25:09,000 --> 01:25:12,799 Speaker 1: Does that mean, in that model, that somebody has 1482 01:25:13,479 --> 01:25:17,240 Speaker 1: broken through that wall and they... Well, we don't know. 1483 01:25:17,400 --> 01:25:19,880 Speaker 3: I mean, I'm almost positive that we don't have a 1484 01:25:19,920 --> 01:25:27,240 Speaker 3: fundamental theory.
But you know, we know that Marc Andreessen 1485 01:25:27,680 --> 01:25:30,280 Speaker 3: and Ben Horowitz were in the White House during the 1486 01:25:30,280 --> 01:25:34,479 Speaker 3: Biden administration, and they said they were told don't invest 1487 01:25:34,520 --> 01:25:37,400 Speaker 3: in AI startups because we're not going to allow them 1488 01:25:37,800 --> 01:25:41,599 Speaker 3: to continue. And these guys said, well, what do you mean, 1489 01:25:41,680 --> 01:25:42,840 Speaker 3: it's just based on math. 1490 01:25:42,920 --> 01:25:43,960 Speaker 2: You can't outlaw math. 1491 01:25:44,800 --> 01:25:46,880 Speaker 3: And they said, we will classify math just the way 1492 01:25:46,880 --> 01:25:56,000 Speaker 3: we classified fields of theoretical physics. Hmm. Okay, so nobody 1493 01:25:56,000 --> 01:25:59,120 Speaker 3: knows what that means. Didn't sound like it was just 1494 01:25:59,200 --> 01:26:02,280 Speaker 3: nuclear physics. I highly recommend everyone go to a website 1495 01:26:02,360 --> 01:26:05,839 Speaker 3: run by Alex Wellerstein. He's at the Stevens Institute 1496 01:26:05,880 --> 01:26:07,560 Speaker 3: of Technology in New Jersey. 1497 01:26:08,360 --> 01:26:09,240 Speaker 2: And he is. 1498 01:26:10,680 --> 01:26:15,960 Speaker 3: I think, the world expert on atomic secrecy. And 1499 01:26:16,000 --> 01:26:18,759 Speaker 3: you can learn all the things we did around science, 1500 01:26:18,960 --> 01:26:21,160 Speaker 3: around nuclear weapons, from his site. 1501 01:26:21,680 --> 01:26:23,160 Speaker 2: So we can just know. 1502 01:26:23,320 --> 01:26:25,479 Speaker 3: We don't have to guess or speculate as to what 1503 01:26:25,520 --> 01:26:27,840 Speaker 3: it looks like when national interest and science run into 1504 01:26:27,880 --> 01:26:28,200 Speaker 3: each other. 1505 01:26:28,240 --> 01:26:31,400 Speaker 2: We can say, oh, here's what happened. 1506 01:26:31,320 --> 01:26:31,479 Speaker 3: Like.
1507 01:26:33,200 --> 01:26:36,120 Speaker 1: Scientists' ideas were kept from the public, that kind 1508 01:26:36,120 --> 01:26:37,599 Speaker 1: of thing? Or what are we talking about? 1509 01:26:39,920 --> 01:26:47,719 Speaker 3: Newspaper stories were spiked, disinformation was distributed in the scientific literature. 1510 01:26:49,000 --> 01:26:51,840 Speaker 3: Volumes were taken off of the shelves that were 1511 01:26:51,840 --> 01:26:55,320 Speaker 3: thought to be advantageous to the enemy. Review boards were 1512 01:26:55,360 --> 01:27:01,479 Speaker 3: set up that would intermediate between journals and researchers in 1513 01:27:01,479 --> 01:27:06,639 Speaker 3: case somebody submitted something that might have security implications. So 1514 01:27:06,800 --> 01:27:11,000 Speaker 3: we know exactly what it looks like when the national 1515 01:27:11,000 --> 01:27:15,000 Speaker 3: interest complex decides that something like neutrons is too dangerous 1516 01:27:15,040 --> 01:27:16,919 Speaker 3: to simply do as gee-whiz science. 1517 01:27:18,920 --> 01:27:19,200 Speaker 2: Right. 1518 01:27:19,280 --> 01:27:23,000 Speaker 3: And so as a result of this, if you tell me, well, Eric, 1519 01:27:23,160 --> 01:27:25,960 Speaker 3: that's conspiracy theorizing, I'm just going to tell you, go 1520 01:27:25,960 --> 01:27:29,720 Speaker 3: find a blog called Restricted Data. Knock yourself out and 1521 01:27:29,760 --> 01:27:34,440 Speaker 3: tell me why you imagine that secret facilities, secret protocols, 1522 01:27:34,479 --> 01:27:41,360 Speaker 3: secret agreements aren't all through the national interest's interfaces with the 1523 01:27:41,400 --> 01:27:44,720 Speaker 3: scientific community, because clearly we know that it has 1524 01:27:44,840 --> 01:27:47,439 Speaker 3: been the case. We have it documented eight ways to Sunday.
1525 01:27:48,000 --> 01:27:50,400 Speaker 3: Why are you claiming that this is somehow the product 1526 01:27:50,400 --> 01:27:53,559 Speaker 3: of an overactive imagination? You're just not well read. 1527 01:27:53,960 --> 01:27:56,000 Speaker 1: But in some sense you're in favor of that, right? 1528 01:27:56,240 --> 01:27:59,120 Speaker 1: In the sense of the, call it, two percent 1529 01:27:59,200 --> 01:28:03,040 Speaker 1: of science that you think for national security interests should 1530 01:28:03,040 --> 01:28:07,280 Speaker 1: be masked, you would or wouldn't be in favor of? 1531 01:28:07,600 --> 01:28:10,960 Speaker 3: I'm in favor of getting rid of the gee-willikers 1532 01:28:11,000 --> 01:28:12,240 Speaker 3: attitude towards science. 1533 01:28:15,000 --> 01:28:21,960 Speaker 2: It's garish. Science is fun enough. Science is too important. 1534 01:28:23,000 --> 01:28:25,439 Speaker 3: We have the right to national security interests, and we 1535 01:28:25,520 --> 01:28:29,000 Speaker 3: have the right to not have national security interests completely 1536 01:28:29,760 --> 01:28:33,040 Speaker 3: destroy our credibility with each other, the public, and the world. 1537 01:28:33,600 --> 01:28:34,080 Speaker 2: And so the 1538 01:28:34,080 --> 01:28:40,040 Speaker 3: issue is, I want a high resolution relationship where we 1539 01:28:40,080 --> 01:28:43,040 Speaker 3: think a great deal about these issues. We keep 1540 01:28:43,080 --> 01:28:45,560 Speaker 3: going back: if it can't be done in an open environment, 1541 01:28:46,200 --> 01:28:48,640 Speaker 3: take it into the national labs. If it can't be 1542 01:28:48,680 --> 01:28:52,680 Speaker 3: done in the national labs, do it in an unannounced facility.
1543 01:28:53,280 --> 01:28:55,320 Speaker 3: But what I don't want is, I don't want us 1544 01:28:55,400 --> 01:28:58,639 Speaker 3: training up super smart people who can see through the lies 1545 01:28:58,880 --> 01:29:02,360 Speaker 3: not being told that they're part of some sort of 1546 01:29:02,479 --> 01:29:05,960 Speaker 3: agreement that they never signed up for, and sitting there 1547 01:29:06,000 --> 01:29:10,919 Speaker 3: getting destroyed through, let's say, COVID influence campaigns, where suddenly, 1548 01:29:11,160 --> 01:29:14,559 Speaker 3: you know, every time you say anything in public, there's 1549 01:29:14,600 --> 01:29:18,040 Speaker 3: one hundred and fifty accounts that suspiciously never have people 1550 01:29:18,080 --> 01:29:21,160 Speaker 3: behind them that are constantly snarking at you. I mean, 1551 01:29:22,120 --> 01:29:25,000 Speaker 3: whatever this thing is, it's intolerable, and we 1552 01:29:25,000 --> 01:29:26,840 Speaker 3: should put a bullet through it. We should drive a 1553 01:29:26,880 --> 01:29:30,120 Speaker 3: stake through its heart and kill off whatever this thing 1554 01:29:30,240 --> 01:29:34,400 Speaker 3: is in favor of a reasonable agreement between the scientific world, 1555 01:29:34,520 --> 01:29:36,479 Speaker 3: the tech world, and the national interest world. 1556 01:29:36,560 --> 01:29:40,800 Speaker 1: And your point is that if scientists grow up understanding 1557 01:29:40,800 --> 01:29:43,559 Speaker 1: that that's the situation, that the stuff they do matters. 1558 01:29:44,280 --> 01:29:47,080 Speaker 1: Then if you're okay with this, there are plenty of 1559 01:29:47,160 --> 01:29:50,280 Speaker 1: careers for you. Yeah, if you want to believe that 1560 01:29:50,320 --> 01:29:54,160 Speaker 1: somehow science is just about open inquiry and gee whiz. Look, 1561 01:29:54,400 --> 01:29:57,720 Speaker 1: I'm all for cowboys, I'm all for hyper independence.
I 1562 01:29:58,080 --> 01:30:00,920 Speaker 1: view Ayn Rand as a collectivist. I am off the 1563 01:30:01,040 --> 01:30:04,000 Speaker 1: charts individualist. But I'm not stupid. 1564 01:30:04,120 --> 01:30:07,240 Speaker 3: And so my claim is that an Oppenheimer, a 1565 01:30:07,280 --> 01:30:11,240 Speaker 3: von Neumann, a Feynman worked within a world in which they 1566 01:30:11,320 --> 01:30:16,040 Speaker 3: respected the national security complex. Put somebody at the top, 1567 01:30:16,120 --> 01:30:19,600 Speaker 3: like a Leslie Groves, and you'll get compliance. But you 1568 01:30:19,760 --> 01:30:22,880 Speaker 3: can't have this as some sort of low level administrative thing. 1569 01:30:23,479 --> 01:30:26,240 Speaker 3: This is life and death to the United States. And 1570 01:30:26,400 --> 01:30:29,280 Speaker 3: China is going to eat our lunch. I guarantee it. 1571 01:30:29,320 --> 01:30:32,200 Speaker 3: They will hire our best people away because we are 1572 01:30:32,479 --> 01:30:35,520 Speaker 3: in some sort of a mental spiral. 1573 01:30:36,280 --> 01:30:37,679 Speaker 2: We're going straight down the drain. 1574 01:30:37,760 --> 01:30:41,080 Speaker 3: Because we don't think that a scientist should be able 1575 01:30:41,120 --> 01:30:44,000 Speaker 3: to check in at the Four Seasons without thinking about it, 1576 01:30:44,680 --> 01:30:47,519 Speaker 3: that they shouldn't be able to fly business class, that 1577 01:30:47,560 --> 01:30:50,320 Speaker 3: they shouldn't have a retirement and a second home. And 1578 01:30:50,400 --> 01:30:53,200 Speaker 3: I just, I want the people who think that that's 1579 01:30:53,720 --> 01:30:57,200 Speaker 3: garish and that this is, as the phrase goes, 1580 01:30:57,439 --> 01:31:00,360 Speaker 3: welfare queens in white lab coats. I want them to 1581 01:31:00,439 --> 01:31:04,040 Speaker 3: choke on this particular furball, and I want them 1582 01:31:04,080 --> 01:31:06,800 Speaker 3: to recognize.
Now, what you did is you took the 1583 01:31:06,800 --> 01:31:09,880 Speaker 3: world's greatest scientific community, you asked them to keep you safe, 1584 01:31:09,960 --> 01:31:12,360 Speaker 3: you asked them to keep you rich, and you told 1585 01:31:12,360 --> 01:31:14,360 Speaker 3: them to stay outside while you held a party. 1586 01:31:14,680 --> 01:31:17,040 Speaker 2: Enough. It's over. 1587 01:31:20,240 --> 01:31:25,240 Speaker 1: That was my conversation with Eric Weinstein. So, wrapping this up: generally, 1588 01:31:25,280 --> 01:31:28,639 Speaker 1: when we talk about science, we think of discoveries as 1589 01:31:28,680 --> 01:31:32,560 Speaker 1: having a pretty simple structure. A paper gets published, or 1590 01:31:32,600 --> 01:31:36,479 Speaker 1: a breakthrough gets announced, maybe a prize is awarded. But 1591 01:31:36,560 --> 01:31:40,360 Speaker 1: the truth is that underneath every insight is a huge 1592 01:31:40,400 --> 01:31:47,200 Speaker 1: scaffolding of institutions and incentives and traditions and also unspoken rules. 1593 01:31:47,840 --> 01:31:54,200 Speaker 1: That scaffolding shapes which questions feel askable, which ideas 1594 01:31:54,280 --> 01:31:59,879 Speaker 1: feel risky, and which paths seem invisible until someone insists 1595 01:31:59,920 --> 01:32:04,040 Speaker 1: on walking them anyway. Conversations like today's remind us that 1596 01:32:04,080 --> 01:32:07,320 Speaker 1: science is not just a method for understanding the world. 1597 01:32:07,400 --> 01:32:11,800 Speaker 1: It is also a human system, with pressures and power 1598 01:32:11,880 --> 01:32:17,240 Speaker 1: dynamics that influence its trajectory. Paying attention to that system, 1599 01:32:17,800 --> 01:32:22,679 Speaker 1: questioning how it works and who it serves and how 1600 01:32:22,720 --> 01:32:28,760 Speaker 1: it evolves, might be as important as any single experiment.
1601 01:32:34,640 --> 01:32:37,479 Speaker 1: Go to eagleman dot com slash podcast for more information 1602 01:32:37,840 --> 01:32:41,080 Speaker 1: and to find further reading. Join the weekly discussions on 1603 01:32:41,080 --> 01:32:44,280 Speaker 1: my Substack, and check out and subscribe to Inner Cosmos on 1604 01:32:44,320 --> 01:32:47,480 Speaker 1: YouTube for videos of each episode and to leave comments. 1605 01:32:47,800 --> 01:32:51,680 Speaker 1: Until next time, I'm David Eagleman, and this is Inner Cosmos.