Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Speaker 2: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

Speaker 3: And I'm Joe McCormick, and we're back with the follow-up to last week's episode, called "The Doomsday Water," which was about a totally nonexistent but historically very interesting hypothetical substance called polywater. Now, if you haven't heard or watched the last episode yet, this is one where I'd really recommend you do the series in order, so you should go back and check that one out. Today, I think we're going to add a few more details on the history of polywater and then have some discussion about ideas that kind of bloom out of the ashes of this failed scientific project.

Speaker 2: All right, let's do it.

Speaker 3: Yeah. So, to start with a condensed refresher on the timeline of polywater: the story begins with some isolated chemistry research taking place in the Soviet Union in the early nineteen sixties, carried out by a scientist named Nikolai Fedyakin. Fedyakin discovers that by condensing samples of what he believes to be pure water, pure H2O, inside extremely tiny glass capillary tubes under just the right conditions, he can somehow cause the water to appear to separate into two different substances. You've got regular water, and then something else, this other anomalous form of water, which seems to be denser than normal water and to have weird properties like an extremely high boiling point and a very low freezing point. So this is kind of hard to imagine, because we normally think of water as just water. But maybe the easiest way to do it is to imagine another phase of water that you've never seen before. So you can think of liquid water, ice, and steam, and now imagine the same exact substance also has a form that is something like the consistency of wax or Vaseline.
Rob, I was actually thinking about it in between when we recorded the last episode and this one: why is it so hard to imagine this hypothetical alternative form of water? Because we watch phase transitions of water all the time. It's totally normal to observe that liquid water becomes ice, and then you boil it and it becomes steam. These different phases don't look or feel like each other at all, and we just watch them shift back and forth and nothing is strange about that. But trying to imagine this other phase is just kind of impossible. It feels like, well, if it was this other way, if it had this other consistency and feeling and appearance, it just wouldn't be water.

Speaker 2: Yeah, that is interesting to think about. I mean, a big part of it is just how mundane the realities of the phases of water are. Like, I think already today I've encountered water in all three of its forms. You know, I've put ice in my thermos, I've heated up water for coffee and tea and produced steam, and of course I've had liquid water and am mostly liquid water. So yeah, I guess part of it is just the world we live in and the world we are.

Speaker 3: I think maybe that's right. And maybe it's because water is so fundamental and central to our lives, and we encounter it so much, that trying to imagine this other form of it doesn't feel like it makes sense, in a way that it would be much easier to imagine alternate forms of chemicals that we have less day-to-day interaction with. But so anyway, this initial result by Nikolai Fedyakin gets the attention of an esteemed Soviet chemist named Boris Deryagin, who replicates the procedure for making this anomalous water and then starts publishing papers on it in Russian-language journals.
Deryagin believes this alternative, this alternate form of water, could be immensely important, of immense scientific and technological significance. And then, in roughly the years nineteen sixty-six to sixty-eight, Deryagin gives presentations on the anomalous water at international conferences and gradually starts to get the attention of Western scientists, especially in Great Britain and the United States. Several of these scientists begin their own anomalous water research programs, copying the initial methods for making it from the Soviet Union. And then, in the year nineteen sixty-nine, the anomalous water has a breakout moment. There is one paper by a group of American scientists showing infrared spectroscopy results from the substance that appear to show a different signature than that of regular water, which is interpreted to mean that, well, it is water, but it's got a different molecular structure than normal water, which is what gives it this different spectrum. It is hypothesized to be something like a polymer form of water. So a polymer is a long or large molecule made out of repeating units, and they hypothesize that maybe there is a hexagonal arrangement of water molecules forming these kind of like sheets or long structures. And here's where you get the name polywater: it's like a polymer of water. We also talked a bit last time about how important it might actually have been, in the reception of this hypothetical substance, that it got a cool name, that it was no longer being called just, you know, "anomalous behavior of water" or various descriptive phrases. It got a name, and the name sounded interesting: polywater. You know, of course, poly means many, so it almost implies a kind of many-splendored water, water that can do many things, or maybe can do anything. And in fact, lots of people in the media kind of ended up treating it like it could do anything.
There was a frenzy of attention in the popular media, so people start imagining all kinds of wild ways that polywater could be applied. Maybe it's going to be the next steam engine; it'll just, you know, change everything in technology. It'll do machine lubrication and nuclear power. It'll unlock the secret of eternal youth. There is one article that was quoted in one of our sources last time that was like, hey, your living room furniture, it's gonna be made out of water. We never got to the bottom of how that works, but I like it. But so there would be arguments between polywater proponents and polywater skeptics. These arguments, of course, began in the scientific community and the scientific literature, but this eventually spilled out into the popular media around the year nineteen sixty-nine, and it became a subject of controversy that was being covered by the mainstream press, not just in scientific literature. Also in nineteen sixty-nine, you get this letter published in the journal Nature by a chemist named F. J. Donahoe of Wilkes College in Pennsylvania, which sketches out this really alarming idea. He says maybe polywater is not just a revolutionary discovery; it might be the most dangerous substance on Earth. Because, according to Donahoe, it's possible that, like the fictional ice-nine in the novel Cat's Cradle by Kurt Vonnegut, if a seed crystal of polywater were to escape into the natural environment and get deposited in the soil or the ocean (even if you just flush it down the toilet, it's going to end up in the environment), it could provide a nucleation point that would cause all of the water in the world to become polywater. Which I admit is, to me, a gripping image, a truly grotesque and bizarre and amazing image: a Vaseline apocalypse, or a kind of, like, wax end of the world.

Speaker 2: Yeah, and also kind of the ultimate technological whoopsie. Like, oops, we broke water. Everyone, water is now broken.
It doesn't work anymore, not the way that we were accustomed to.

Speaker 3: Now, it's very important to note that this is not, like, the mainstream opinion on polywater at the time. Immediately after this, several prominent pro-polywater scientists respond to the letter in Nature. They argue, with good reasons, it seems, that this is not likely. For one thing, if polywater exists, it must occur sometimes in nature, so if it could turn Earth into a Vaseline world, it would have already done that. And, you know, they had other arguments too. Of course, the whole thing is made pointless by the fact that we would later discover polywater does not exist. But while this big frenzy of excitement and enthusiasm and fear is going on in the mainstream media and popular culture, there are also lots of skeptical scientists just chipping away at the polywater project. The most common objection is: are you sure you're not just seeing the effects of impurities in your water samples? Deryagin and the polywater proponents were always quick to say, no, that is not what's happening; maybe your samples of water are contaminated, but ours are pure. And that's kind of hard to argue with. I mean, all you can do is test the ones you have access to. If somebody is saying the ones you don't have access to are the good ones, that's like, oh, that's a problem.

Speaker 2: But yeah.

Speaker 3: So by the early nineteen seventies, the skeptical undermining of the polywater project has made a lot of progress. It starts to look more and more like the anomalous water is anomalous because it isn't water; it's a bunch of contamination. As for the leading candidates for the contamination, well, some people talked about particles of silica leaching from the glass into the water and forming a kind of gel. Another big contender is human sweat, biological contamination, probably from the researchers themselves who were carrying out the experiment.
And then one famous blow to the polywater project comes in the early seventies, when an American researcher named Denis L. Rousseau does analysis on a bunch of sweat and gets almost the exact same pattern that had famously appeared in the big polywater spectroscopy paper in nineteen sixty-nine. So at this point, consensus starts to turn heavily against the existence of polywater, and in roughly nineteen seventy to seventy-one, most scientists that were initially curious or open-minded about it start saying, no, I don't think this is real. It takes the hardcore polywater boosters a little more time to come around, but by roughly nineteen seventy-three, basically everybody, including Boris Deryagin, realizes and admits that it was probably all just various types of contamination all along. The stuff was not acting like water because it wasn't water.

So that's the story we talked about in the last episode. Like I did in the previous episode, I want to mention a couple of major sources here at the top. One is an article called "Case Studies in Pathological Science," published in American Scientist in the year nineteen ninety-two by Denis L. Rousseau. This article is great because it provides a historical overview of the polywater affair, a short one, but from the point of view of someone who was actually involved in it. Rousseau was initially very interested in the possibilities of polywater; as he talks about in the article, he and a collaborator of his were like, oh, could it be the fountain of youth? Could it unlock, you know, longevity? But he eventually becomes very skeptical about it. He becomes a polywater skeptic, and he leads some experiments showing how it is almost certainly just caused by impurities. Another big source I wanted to mention is an excellent chapter in a book called H2O: A Biography of Water by the British science writer Philip Ball, first published in nineteen ninety-nine. And to kick things off today, I wanted to mention another article I was reading.
Actually, I was reading this one in between when we recorded the two episodes. This was a piece in Distillations magazine from February twenty twenty called "The Rise and Fall of Polywater," by the materials scientist and science communicator Ainissa Ramirez. And this piece raised a few details about the story that we didn't talk much about last time, but that I think would be good to dwell on for a moment, because they might inform the rest of our discussion today.

So one thing that Ramirez's account brings up is how a lot of the early observations about the weird properties of polywater were visual observations made through a microscope. So not getting a sample of this stuff, putting it in a machine, and getting a numerical readout on it; it would be people looking through a microscope. So Fedyakin's original experiments generated only a tiny amount of the anomalous water, far less than a droplet, and he had to study it through a relatively low-resolution optical microscope. When he made the original observations of properties like the alleged greater density than normal water, that was an inference made based on looking at how it behaved through a microscope. And then once Deryagin's team took over, they also used microscope observation rather than bulk measurement to note a lot of things. They used that to find that it allegedly expanded more than regular water when it was heated, and that it had a different pattern of light refraction than normal water. And because of how tiny the quantities of available anomalous water were, these were visual observations of behavior through optical microscopes in strange conditions, like in these extremely tiny glass containers. So that's something to remember for later; we'll come back to that later in the episode.

Another thing that we should dwell on for a minute.
We did sort of mention this in the last episode, but it's worth revisiting: the strange fact that while all of this frenzy was going on in the late sixties, nobody had ever published a high-quality chemical analysis of a polywater sample to prove that it was actually one hundred percent water. Philip Ball highlights this in his chapter, and Ramirez highlights it in her article too: this was in part due to how little polywater could be made. You're making these microscopic amounts of it at a time, and so some researchers reported that they just couldn't make enough of it for a reliable chemical analysis to be possible with their instruments. You know, as we know from a lot of different domains, you need a reasonable sample size in order to have a reliable result. We often think about this in the context of much higher levels of abstraction in the sciences, like psychology studies. You know, if you have a psychology study that's got twenty participants, that's usually not going to be a very powerful result; you don't have a lot of confidence that the same patterns would show up in the general population. And you can say, in a way, the same thing about the physical sciences, though obviously you're dealing with different types of quantities there. But extremely tiny quantities are harder to measure reliably. Ramirez writes in her article, quote: "Chemicals, like humans, have unique fingerprints, and instruments called spectrometers can identify the elements and molecules from a chemical fingerprint, or spectrum. Yet success hinges on the size of the sample, where bigger is better. In published papers, anomalous water believers lamented that there just wasn't enough of it, certainly not enough to identify its molecular makeup." And I think in the last episode we talked about the quote from one of the guys who was working on polywater at the time, one of the students of John Desmond Bernal, who said, if only we had a thimbleful.
So, while this polywater research project was going on, because you couldn't make enough of it to really get a good answer to these core questions about it, core questions like "is this really water?", scientists kept nibbling around the edges. They would measure what they could with the amounts available, including physical characteristics like boiling point and viscosity, but even these results were probably hampered in reliability by the tiny amounts that could be tested at a time. Some skeptics of polywater were sort of brought around to believing in it by a paper that we mentioned last time. This was the one published in the journal Science in June nineteen sixty-nine, which showed the work of scientists named Robert Stromberg, Ellis Lippincott, and Warren Grant, and they had produced what looked like a high-quality spectrometry result showing that the absorption spectrum of polywater did not match that of normal water, or, they claimed, of any known substance. But this was taken to indicate not that the sample was not water, but that the water had a different molecular structure than normal water, hence the idea of polywater. Funny thing here is, they also tried to do a chemical analysis of polywater and they found some contaminants. They found, like, sodium and silicon, but they were like, the quantities observed are too small for it to matter, so we don't have to worry about that. And this is when you get that letter from Donahoe to Nature writing about the dangers of polywater. Ramirez mentions a detail here that I didn't encounter in any of the other sources I was reading, where she says that Robert Stromberg, you know, one of the authors of the sixty-nine paper, started getting angry letters from people who said he was bringing about the end of the world. So that's the public engagement we crave, you know. And also, I think this is maybe important; we didn't talk about this enough last time.
Ramirez mentions briefly how this was affected by the Cold War context. We talked about the Cold War context in the sense of the exchange of information between East and West being limited in some ways. Not to say that there was no exchange of information, because clearly the ideas were making it across; there were international conferences, and journals did get translated back and forth. But there were, you know, just awkward things about how information was shared around the world at the time because of the Cold War context. Ramirez mentions that there are indications that by nineteen sixty-nine the CIA was trying to keep a close eye on polywater research in the Soviet Union, and, according to reports published in the Wall Street Journal, the Pentagon started trying to fund polywater research. So it's, you know, it's like Doctor Strangelove: we cannot allow a polywater gap. And people made this joke at the time, playing on the idea of the missile gap from the nuclear arms race. Now it's like, well, if this is maybe going to be the most important technology in the world, whether that's for good or for ill, and people were saying it could be for both, we better get it first.

Speaker 2: Yeah, yeah. I mean, hindsight is twenty-twenty. We can look back and see this as the race to become the masters of sweaty water. But again, at the time, if it seemed like an adversary would have mastery over some sort of new form of water that had all these applications, then yeah, it was worth keeping an eye on, even though we know how it ended up.

Speaker 3: Right. So we know now, for multiple reasons, that polywater was not actually dangerous. First of all, it didn't exist, and even at the time, people who thought it did exist had good arguments against the idea that it was dangerous.
But if you strip yourself of hindsight, and you strip yourself of access to the good arguments against the danger, if you allow yourself to inhabit the Vaseline apocalypse mindset, it creates, at least for me, a very familiar feeling, actually: the feeling that people somewhere are doing something obscure that could be very dangerous, that could even be the end of the world, and I have no power to stop it, except, like, trying to write letters or, you know. And add to that the knowledge that there's pressure operating in the opposite direction, driving toward this result that you're now afraid of, and that you don't have power to stop that pressure. Like, there's some amount of international great-power competition that is making people think: even if this stuff is dangerous, we've got to have it first; they can't have it first. It feels similar to the nuclear arms race, of course, and also feels similar to, as people have talked about a lot, where we might be with AI. You know, I'm constantly struck by the idea that there are a lot of people in the United States who used to argue that AI is potentially very dangerous, that we might need to make absolutely sure it's safe before developing it, and that it might not be possible to make sure that it's absolutely safe. And now some of the same people are saying, well, you can't let China build it first; we've got to build it first. So there's that consciousness we have of these pressures pushing in the opposite direction against caution, in these scenarios where we don't know exactly how dangerous something could be. You know, you've got these international competitive pressures, you've got money-making pressures. Money-making incentives, of course, are leading people down trails that we don't know could turn out to be okay or could turn out to be disastrous, and we can't really know in advance. It's a maddening feeling.
Speaker 2: Absolutely, yeah. This is going to be at times fun to reflect on, at times maybe concerning, but it is absolutely, you know, a reality of our modern age and our technological anxiety, and some of our anxiety is about science. A lot of it does come down to this idea of containment. Like, if we create something that is potentially dangerous, how do we keep it from getting out, getting places it shouldn't be, and falling into the hands of people who shouldn't have it, right? And then there's that added realization that, no matter what, containment might not be possible for these things, certainly not in the long run, or even in the short run. And I keep coming across examples of this, you know, the idea that you could look at a new technology and think about, like, the absolute realization of the thing, and how that could present some sort of existential risk and some sort of huge danger to everything that we know. But then there are also these short-term possibilities where the thing is not very advanced at all; it is just advanced enough to get out of our hands and quickly get out of control. So we'll look at some examples of these. And a lot of it also just comes back to, I think it's impossible not to think of, the tale of Pandora's box in all of this, in the sense that there's almost this certainty that these boxes will be opened, and then once they are opened, we know how it goes. The ill factors that were contained within cannot be put back in the box. The only thing that remains in the box is hope, or at least we hope that's the case. So yeah, I wanted to roll through some of these, and we can chat about them. A number of these involve some form of runaway reaction, much like the hypothetical polywater reaction or the fictional ice-nine reaction.
We'll get to some examples of that, ones that are, you know, of a certain amount of concern. We have any number of hypothetical apocalypses based on science gone wrong, science that could go wrong, or science that could just get out of our hands. And yeah, like we've been saying, AI is certainly one of the big ones, clearly the one that resonates the most, as we are currently living through the realization of the threat. And we're going to cover a few specific AI-related scenarios, but ultimately there are so many different angles to take on AI and its threat, especially as far as the democratization of AI tools goes for various nefarious purposes. Many of these pose a significant contemporary threat. And that's in addition to all of the various highly observable ways that AI is impacting industries and ways of life and hurting people's jobs already.

Speaker 3: Right. I mean, because AI is so broad as a technology class, its effects could end up being so broad, so it's almost harder to narrow the focus of the conversation like it would be for some of these other hypothetical substances, or real substances, that people have worried about being a kind of runaway reaction or containment danger. But it is also funny that if you go back just ten or twenty years, you can regularly find people imagining the future of AI as something that would be developed in containment, and then there would be a question about: is it safe to let it out? And that's just not what we got at all. It's just, like, it's out from the moment it exists. It's, you know, just set loose; it's always been loose.

Speaker 2: Yeah, there was kind of a vision of AI and its use in creative endeavors that I was exposed to years back, you know.
I want to say this was maybe a panel at the World Science Festival in New York, and it presented this optimistic idea of, like, in the future, in the near future (and to a certain extent in contemporary examples), you'll have certain artists using AI and working alongside it to create new forms of art or music or what have you, but in a very collaborative way, and in a way that wouldn't feel icky. Instead, we have, like, you know, everybody out there using these tools, creating a lot of material that is icky, a lot of AI slop, as we've come to call it. Which is not to say that that original vision isn't a possibility and that there's not a lot of value in it. I mean, especially since the tools are not going away, I hope that that is what we come back to and that it is something we can return to. But yeah, it is alarming to watch in real time all around us. Now, not everything regarding machines and AI is current or near-term. We also have the likes of the gray goo scenario. This is, it's gray goo, not Grey Goose; that would involve some sort of catastrophic event concerning vodka, obviously. But the gray goo scenario, the idea here, is one by which molecular self-replicating nanotechnology ends up consuming the entire biomass of Earth and turning it into itself. So kind of like a global T-1000 mass in its most literal sense, you know, the idea that it would just turn everything into silvery goo.

Speaker 3: Kind of similar to the ice-nine scenario or the Vaseline apocalypse, in that it represents a consumption of the natural environment around us, a transformation of substances in our environment, substances we need, or the substances of ourselves, into something that is not useful to us or is actively harmful to us.
And so yeah, it has that in common with the Donahoe scenario or the Vonnegut scenario. Except, with the way I understood people imagining the gray goo (and obviously this was always just a speculative, kind of science fiction thing, because we don't have nanotechnology of this type), the idea was that because you would have these little tiny robots, nanotechnology robots, that would be designed to make copies of themselves or to turn, you know, molecules into something that they're trying to produce, if that ever got out of hand, there really wouldn't be a good way to stop it. Is that the idea?

Speaker 2: Yeah, yeah, you know. Obviously this idea has been around for a number of decades at this point. The term was coined by molecular nanotechnology pioneer K. Eric Drexler in Engines of Creation back in nineteen eighty-six, and we could easily do an entire episode on it. It's fascinating, I think, not only in this kind of worst-case-scenario, cautionary-tale sort of way, but it also feels kind of metaphorically sound and intimidating, because it is, again, like you said, about the transformation of the natural world into things. And we do this all the time. It's a huge and alarming aspect of human culture: we turn things into other substances that we need, or we think we need, and then those become garbage. And we take advantage, of course, of a lot of the resources of our Earth that are not easily replaced or are irreplaceable.

Speaker 3: Massively speeding up and automating something we do, you know, from an alien's point of view, which is: we take natural substances and turn them into trash.

Speaker 2: Right, right. And you know, we've in various ways always done this.
But we've also observed the way that technology and technological advancements have taken something that has been a part of human culture and made it even more destructive and more problematic. Which is, again, not to say that technology is bad, but it's about what we do with the technology. So the other interesting thing, when you go back to Drexler's original idea for the gray goo scenario, is that the threat wouldn't be bound to some sort of ultimate, highly evolved version of the thing, but rather, to quote, early assembler-based replicators that get out and wind up supplanting advanced organisms in the ecosystem.

Speaker 3: So is the idea that the people working on this technology, in this sort of science fiction scenario, would eventually be able to make the nanotechnology safe, but before they get to the fully refined, safe version, an early version breaks containment, gets out of hand, and starts making copies? So yeah, it's kind of a problem to do the safety steps last, especially if you're dealing with something that, you know, if it broke containment, could destroy everything.

Speaker 2: Yeah. So it's my understanding that this has become less of a realistic threat in the eyes of many technologists. But at the same time it has taken on a life of its own, such as in science fiction. In the Culture novels of Iain M. Banks, you have the hegemonizing swarms, where just occasionally the Culture and/or other interstellar powers have to deal with the fact that machines will start doing this, and they basically treat it like a life form. Like, there's a lot of ethical discussion, like at what point does it make sense to wipe it out? So, you know, in science fiction, again, there are a number of takes on it that are rather fascinating. You also have these different spin-offs of the concept, like green goo, in which engineered organic matter ends up taking the place of the nanotech in this particular scenario.
Speaker 3: Okay, so something that would be based on organic chemistry, or maybe cells or something like that, instead of just tiny machines.

Speaker 2: Yeah, like it would be gray goo, except Cronenberg-y, you know, that sort of thing. Life goo.

Speaker 3: Yeah.

Speaker 2: Now, coming back to AI, there is also something called information gray goo, a term that I believe was coined by British technology journalist Ian Betteridge. I've also seen it referred to as the text apocalypse. And this one feels a lot closer and a lot more fearsome. In this scenario, AI-generated content overwhelms human discourse.

Speaker 3: That'll never happen.

Speaker 2: Yeah, I mean, this one feels a lot more like a real-world threat because we already see the impact. I mean, this has just become part of our standard informational intake, just having to be on guard against AI-generated material. Because, you know, is it authentic? Is it real or not? Like, we want to know.

Speaker 3: Or do we want to know? I mean, I think one of the real dangers is that the scourge of AI slop interacts with our preferences and biases, such that when you see something that you don't like or don't want to be true, you can recognize that it is AI-generated garbage, fake and trying to manipulate you in some way. And when you see something that you do like or do want to be true, very often you'll just be like, yeah, that's probably real.

Speaker 2: Yeah, I believe that's a real danger point, and it's an uneven boundary line, because it kind of applies at different parts of our life, these different areas where we decide, okay, I don't care about authenticity in this area. And it'll be something like, I see it happening, you know, all around me, where someone might well say, hey, you know, I don't want AI doing this or that.
570 00:34:39,360 --> 00:34:43,840 Speaker 2: But then they might think, well, I'm okay with it 571 00:34:43,920 --> 00:34:46,680 Speaker 2: creating my next head shot. You know, I'm okay with that. 572 00:34:46,880 --> 00:34:48,400 Speaker 2: You know, it's like, maybe I have some, you know, 573 00:34:48,400 --> 00:34:50,479 Speaker 2: I have some hang ups about how my last 574 00:34:50,520 --> 00:34:54,759 Speaker 2: headshot looked, and I'm totally okay with AI just creating 575 00:34:54,840 --> 00:34:57,359 Speaker 2: something for me, even though it's not authentic. It's not 576 00:34:57,400 --> 00:35:00,839 Speaker 2: really what I look like, and it's not created via 577 00:35:00,960 --> 00:35:04,160 Speaker 2: authentic means. But you know, you say, all right, I'm 578 00:35:04,160 --> 00:35:06,080 Speaker 2: gonna check that off. But then what's the next thing 579 00:35:06,120 --> 00:35:08,839 Speaker 2: to fall? And then ultimately, what does the boundary line 580 00:35:08,920 --> 00:35:12,360 Speaker 2: end up looking like? Anyway, I want to read a 581 00:35:12,440 --> 00:35:16,640 Speaker 2: quote here from Betteridge, he writes, and you can read 582 00:35:16,680 --> 00:35:18,920 Speaker 2: about this at ianbetteridge dot com. He has a whole 583 00:35:18,960 --> 00:35:21,960 Speaker 2: post on the information gray goo. He says, quote, this 584 00:35:22,040 --> 00:35:24,719 Speaker 2: is the AI grey goo scenario, an Internet choked with 585 00:35:24,840 --> 00:35:28,240 Speaker 2: low quality content which never improves, where it is almost 586 00:35:28,280 --> 00:35:32,840 Speaker 2: impossible to locate public, reliable sources for information because the 587 00:35:32,920 --> 00:35:35,520 Speaker 2: tools we have been able to rely on in the past, 588 00:35:35,680 --> 00:35:39,000 Speaker 2: Google, social media, can never keep up with the scale 589 00:35:39,040 --> 00:35:42,640 Speaker 2: of new content being created, where the volume of content 590 00:35:42,760 --> 00:35:46,880 Speaker 2: created overwhelms human or algorithmic abilities to sift through it 591 00:35:46,960 --> 00:35:49,360 Speaker 2: quickly and find high quality stuff. 592 00:35:50,120 --> 00:35:52,400 Speaker 3: You know, it's funny how when we talk about the 593 00:35:52,719 --> 00:35:57,399 Speaker 3: dangers of AI, people think about The Terminator and Skynet 594 00:35:58,200 --> 00:36:01,680 Speaker 3: or I Have No Mouth, and I Must Scream. And you know, 595 00:36:02,120 --> 00:36:04,680 Speaker 3: as I said earlier, like we can't really know where 596 00:36:04,719 --> 00:36:07,560 Speaker 3: we're gonna end up. You don't even know how 597 00:36:07,680 --> 00:36:10,160 Speaker 3: to rate the likelihood of those kinds of outcomes. You hope that they're 598 00:36:10,280 --> 00:36:14,920 Speaker 3: very unlikely, but you don't really know. 599 00:36:15,440 --> 00:36:17,759 Speaker 3: With stuff like this, I almost feel like you could 600 00:36:17,800 --> 00:36:20,680 Speaker 3: make the case that we're already halfway there. We're sort 601 00:36:20,719 --> 00:36:23,160 Speaker 3: of edging into this scenario, are we not? 602 00:36:23,640 --> 00:36:25,799 Speaker 2: Yeah.
Yeah, I mean it really does feel like a 603 00:36:25,800 --> 00:36:28,759 Speaker 2: lot of the battle is on, you know, who's going 604 00:36:28,760 --> 00:36:30,880 Speaker 2: to win between, like, this effort to sort of 605 00:36:31,600 --> 00:36:34,760 Speaker 2: break down the public to where we don't care about 606 00:36:34,800 --> 00:36:38,279 Speaker 2: authenticity in anything, we don't care about human creation. We're 607 00:36:38,320 --> 00:36:40,480 Speaker 2: just like, it sounds good enough to me, it looks 608 00:36:40,520 --> 00:36:42,200 Speaker 2: good enough to me, it feels good enough to me, 609 00:36:43,280 --> 00:36:46,120 Speaker 2: and then we're just okay with all of the jobs 610 00:36:46,120 --> 00:36:48,520 Speaker 2: that are lost, all of the meaning that is lost 611 00:36:48,560 --> 00:36:53,440 Speaker 2: in people's lives, just because a particular image was a 612 00:36:53,440 --> 00:36:56,560 Speaker 2: little more pleasing to us, or a particular piece of 613 00:36:56,560 --> 00:36:59,720 Speaker 2: writing was just a little more calibrated to our tastes 614 00:36:59,800 --> 00:37:02,560 Speaker 2: and so forth, or something we were able to 615 00:37:02,600 --> 00:37:06,960 Speaker 2: generate quickly and easily through some sort of online interface. 616 00:37:08,160 --> 00:37:12,000 Speaker 2: So yeah, it's alarming. It's frankly terrifying, because I 617 00:37:12,040 --> 00:37:15,880 Speaker 2: wish hope was still in the Pandora's box. I'm not 618 00:37:15,880 --> 00:37:19,239 Speaker 2: sure that it is. A lot of it just comes 619 00:37:19,280 --> 00:37:21,600 Speaker 2: down to, like, what are we going to put 620 00:37:21,640 --> 00:37:21,920 Speaker 2: up with? 621 00:37:22,719 --> 00:37:24,799 Speaker 3: Yeah. And I guess the other thing being that the 622 00:37:25,000 --> 00:37:29,239 Speaker 3: example of AI is much more complicated, I think, than 623 00:37:29,360 --> 00:37:33,600 Speaker 3: the other examples. Like, gray goo is just essentially a 624 00:37:33,600 --> 00:37:36,560 Speaker 3: science fiction scenario at this point. There's nothing really like 625 00:37:36,640 --> 00:37:39,200 Speaker 3: it going on, at least that we know. So, you know, 626 00:37:39,360 --> 00:37:42,759 Speaker 3: it's just a hypothetical, and it is debatable to what 627 00:37:42,920 --> 00:37:46,799 Speaker 3: extent it is plausible. Polywater, of course, you know, 628 00:37:46,840 --> 00:37:50,120 Speaker 3: the fears about that were totally unfounded. It never existed. 629 00:37:50,200 --> 00:37:52,840 Speaker 3: Even if it did exist, it probably wasn't dangerous in 630 00:37:53,920 --> 00:37:57,480 Speaker 3: the ways that were being imagined. I guess AI is 631 00:37:57,560 --> 00:38:00,319 Speaker 3: different for a number of reasons. I mean, for one thing, 632 00:38:00,520 --> 00:38:02,960 Speaker 3: it actually exists and is here. I guess, you 633 00:38:03,000 --> 00:38:06,440 Speaker 3: know, in terms of its existence, you can have 634 00:38:06,480 --> 00:38:09,319 Speaker 3: a debate about whether or not it actually constitutes intelligence, 635 00:38:09,400 --> 00:38:12,200 Speaker 3: and people do have that philosophical debate about intelligence. 636 00:38:12,239 --> 00:38:14,279 Speaker 2: Yeah, AI, the term is sometimes used a 637 00:38:14,320 --> 00:38:16,640 Speaker 2: little loosely with what we're talking about.
638 00:38:16,760 --> 00:38:19,480 Speaker 3: Yeah, so people argue about that, but there's no denying 639 00:38:19,560 --> 00:38:23,560 Speaker 3: that it is here and it's doing something. It's doing 640 00:38:23,600 --> 00:38:25,319 Speaker 3: a lot of things. I mean, I don't want 641 00:38:25,320 --> 00:38:27,000 Speaker 3: to sound too negative about it; it does a 642 00:38:27,040 --> 00:38:29,240 Speaker 3: lot of things that are very useful. It's very useful 643 00:38:29,239 --> 00:38:33,080 Speaker 3: as, you know, a search engine, 644 00:38:32,920 --> 00:38:38,400 Speaker 3: that kind of thing. It makes information processing tasks 645 00:38:38,480 --> 00:38:40,640 Speaker 3: easier in a lot of ways. Of course, it still, 646 00:38:40,680 --> 00:38:43,359 Speaker 3: you know, has a lot of hallucination problems and all that. 647 00:38:43,440 --> 00:38:48,719 Speaker 3: But it is not an imaginary thing like polywater, 648 00:38:49,040 --> 00:38:51,759 Speaker 3: or, you know, a fully speculative science fiction 649 00:38:51,840 --> 00:38:54,920 Speaker 3: thing like gray goo. It's here. It does some 650 00:38:55,000 --> 00:38:58,640 Speaker 3: things that are undeniably useful, and it's integrated into the economy, 651 00:38:59,040 --> 00:39:02,359 Speaker 3: which makes it, you know, harder to, I guess it 652 00:39:02,400 --> 00:39:06,239 Speaker 3: makes the question of thinking about its potential dangers 653 00:39:06,280 --> 00:39:07,359 Speaker 3: even more fraught. 654 00:39:07,560 --> 00:39:10,960 Speaker 2: Yeah, yeah, I would agree. I want to throw out, 655 00:39:11,120 --> 00:39:14,960 Speaker 2: as is always the case, listeners, if you have differing 656 00:39:15,000 --> 00:39:16,960 Speaker 2: opinions on any of this and you would like to 657 00:39:17,440 --> 00:39:19,960 Speaker 2: rationally discuss them with us, you know, write in. We'll 658 00:39:19,960 --> 00:39:22,200 Speaker 2: have that email address at the end of this episode. 659 00:39:22,360 --> 00:39:25,759 Speaker 2: We're always happy to discuss. Now, I'm going to turn 660 00:39:25,760 --> 00:39:29,200 Speaker 2: the page a little bit, get away from 661 00:39:29,239 --> 00:39:32,960 Speaker 2: the AI concerns. Don't worry, they're not going away. 662 00:39:33,000 --> 00:39:34,839 Speaker 2: They'll be there when we come back to them. 663 00:39:35,280 --> 00:39:39,399 Speaker 2: But I want to turn to some examples of potential, 664 00:39:39,680 --> 00:39:44,200 Speaker 2: you know, chain reaction catastrophes tied to advancements in science 665 00:39:44,520 --> 00:39:48,320 Speaker 2: that have turned out to not be the case, cases 666 00:39:48,400 --> 00:39:51,520 Speaker 2: much like polywater, where someone was like, this might 667 00:39:51,600 --> 00:39:54,600 Speaker 2: happen and we need to be wary of it, and 668 00:39:54,640 --> 00:39:57,880 Speaker 2: then, for a variety of reasons, that worst case scenario 669 00:39:57,960 --> 00:40:01,880 Speaker 2: turned out to not be the case.
Yeah, so we 670 00:40:01,960 --> 00:40:06,680 Speaker 2: already mentioned atomic weaponry, like the advent of atomic weaponry 671 00:40:06,719 --> 00:40:11,600 Speaker 2: being something that helped inform the understanding of the 672 00:40:11,600 --> 00:40:16,000 Speaker 2: potential threats of polywater, and it also ends up casting 673 00:40:16,000 --> 00:40:19,880 Speaker 2: a long shadow over any technology to come out after 674 00:40:20,280 --> 00:40:25,960 Speaker 2: the advent of nuclear weaponry especially. And the one I 675 00:40:25,960 --> 00:40:27,840 Speaker 2: want to focus on here is one that, 676 00:40:27,880 --> 00:40:30,440 Speaker 2: I have not actually seen the Oppenheimer film, but 677 00:40:30,480 --> 00:40:32,440 Speaker 2: I believe this actually comes up in the film, and 678 00:40:32,480 --> 00:40:37,279 Speaker 2: that is the idea that, okay, when we carry out 679 00:40:37,280 --> 00:40:41,839 Speaker 2: this first atmospheric detonation of a nuclear device, it might 680 00:40:42,160 --> 00:40:46,400 Speaker 2: ignite the atmosphere, resulting in a global chain reaction of 681 00:40:46,520 --> 00:40:47,640 Speaker 2: atmospheric fire. 682 00:40:48,000 --> 00:40:51,239 Speaker 3: This is addressed in the movie. The characters do talk 683 00:40:51,280 --> 00:40:53,759 Speaker 3: about the possibility. It's been a while since I've seen it, 684 00:40:53,800 --> 00:40:57,840 Speaker 3: so I don't remember the specific scene, but my understanding 685 00:40:58,080 --> 00:41:02,719 Speaker 3: is that in reality and history they did consider this 686 00:41:02,760 --> 00:41:06,040 Speaker 3: as a possibility, but most of the experts who did 687 00:41:06,040 --> 00:41:12,400 Speaker 3: the calculations on it rated the likelihood as very low. Again, 688 00:41:12,440 --> 00:41:14,479 Speaker 3: that raises the question of, I don't know, how low 689 00:41:14,600 --> 00:41:17,200 Speaker 3: does the likelihood of something like this have to be 690 00:41:17,320 --> 00:41:18,680 Speaker 3: for you to feel okay 691 00:41:18,719 --> 00:41:23,560 Speaker 2: continuing? Yeah, yeah, especially in this case, where obviously 692 00:41:23,719 --> 00:41:27,480 Speaker 2: the goal is the creation of a weapon of mass destruction, 693 00:41:28,080 --> 00:41:31,520 Speaker 2: a weapon that's going to cast this long shadow over 694 00:41:32,400 --> 00:41:37,320 Speaker 2: human societies for ages and ages to come. A similar 695 00:41:37,360 --> 00:41:41,800 Speaker 2: possibility was apparently explored for underwater detonation as well. And again, 696 00:41:41,880 --> 00:41:45,759 Speaker 2: like you said, while technically possible, scientists of the day 697 00:41:45,880 --> 00:41:50,000 Speaker 2: agreed that it was terribly unlikely, and our current understanding 698 00:41:50,200 --> 00:41:52,400 Speaker 2: of it seems to fall in line with the idea that the 699 00:41:52,440 --> 00:41:55,400 Speaker 2: density of Earth's atmosphere is just much too low for 700 00:41:55,440 --> 00:41:58,720 Speaker 2: this to take place, as is the atmosphere of Venus, even.
701 00:41:58,760 --> 00:42:02,239 Speaker 2: Jennif Flgarris did a nice write up on this 702 00:42:02,440 --> 00:42:05,520 Speaker 2: in twenty twenty four for Advanced Science News, I believe 703 00:42:05,560 --> 00:42:09,239 Speaker 2: in response to the Oppenheimer movie, interviewing a pair of 704 00:42:09,360 --> 00:42:13,319 Speaker 2: nuclear astrophysicists on the topic, and that seemed to be 705 00:42:13,600 --> 00:42:16,160 Speaker 2: where they landed on it. It's just, as 706 00:42:16,200 --> 00:42:18,920 Speaker 2: far as we understand it, this is not possible in 707 00:42:19,000 --> 00:42:22,120 Speaker 2: Earth's atmosphere. And of course all of this is just 708 00:42:22,160 --> 00:42:25,760 Speaker 2: in addition to the other obvious concerns over the potential 709 00:42:26,000 --> 00:42:29,560 Speaker 2: for the rollout of nuclear weaponry to enable humans to 710 00:42:29,719 --> 00:42:34,280 Speaker 2: destroy each other through devastating warfare, fallout, wide ranging fires, 711 00:42:34,320 --> 00:42:37,800 Speaker 2: and the chilling grip of nuclear winter. This is, of course, 712 00:42:37,800 --> 00:42:40,880 Speaker 2: just another obvious and huge aspect of the 713 00:42:40,960 --> 00:42:42,840 Speaker 2: nuclear age. You know, we knew the world would not 714 00:42:42,880 --> 00:42:44,279 Speaker 2: be the same, and it has not been. 715 00:42:44,920 --> 00:42:48,760 Speaker 3: Right. So there was concern about the possibility of a physical, 716 00:42:48,920 --> 00:42:52,360 Speaker 3: immediate runaway reaction that could destroy the world, and those 717 00:42:52,480 --> 00:42:55,680 Speaker 3: concerns were not founded, or at least it was considered 718 00:42:55,760 --> 00:42:58,120 Speaker 3: very unlikely, and it proved to not be the case that 719 00:42:58,320 --> 00:43:00,600 Speaker 3: a nuclear detonation would do that to the entire atmosphere. 720 00:43:00,960 --> 00:43:05,000 Speaker 3: And yet there was a runaway, a destructive runaway reaction, 721 00:43:05,120 --> 00:43:08,040 Speaker 3: and we just don't know, like, on what time scale, 722 00:43:08,120 --> 00:43:11,600 Speaker 3: if at all, it could prove extremely destructive to the world. 723 00:43:11,600 --> 00:43:15,400 Speaker 3: I mean, the atomic weapons they were designing were used 724 00:43:15,520 --> 00:43:18,520 Speaker 3: in the immediate circumstance, and then, you know, it's hard 725 00:43:18,560 --> 00:43:22,440 Speaker 3: to imagine that going through into the future, that we 726 00:43:22,480 --> 00:43:25,879 Speaker 3: could have nuclear weapons on Earth and they would never 727 00:43:25,960 --> 00:43:28,600 Speaker 3: ever be used ever again for, you know, however many 728 00:43:29,800 --> 00:43:32,840 Speaker 3: thousands or hopefully millions of years humans continue 729 00:43:32,840 --> 00:43:33,360 Speaker 3: to exist. 730 00:43:33,600 --> 00:43:36,000 Speaker 2: Yeah, I mean it would be great, that would be 731 00:43:36,040 --> 00:43:37,000 Speaker 2: great were it the case, but. 732 00:43:37,200 --> 00:43:40,080 Speaker 3: The odds seem to be against that never ever happening. 733 00:43:40,280 --> 00:43:44,080 Speaker 2: Yeah, I mean, yeah, have you met us, the human race? 734 00:43:44,400 --> 00:43:49,200 Speaker 2: This is sadly something we're highly capable of using. And even, 735 00:43:49,840 --> 00:43:52,560 Speaker 2: you know, this is obviously a complex discussion.
There 736 00:43:52,560 --> 00:43:53,759 Speaker 2: are a lot of ins and outs, and you can 737 00:43:53,760 --> 00:43:56,919 Speaker 2: get into various analyses of, like, where we are now 738 00:43:56,960 --> 00:44:02,120 Speaker 2: with various safeguards and so forth. But even if the 739 00:44:02,120 --> 00:44:07,359 Speaker 2: potential for their usage is low, if it's low year 740 00:44:07,400 --> 00:44:10,320 Speaker 2: to year, and then we have to just carry on indefinitely, 741 00:44:10,719 --> 00:44:14,960 Speaker 2: like, what does that look like statistically? So yeah, it continues 742 00:44:15,000 --> 00:44:18,160 Speaker 2: to be a matter of great concern, obviously. 743 00:44:18,280 --> 00:44:20,880 Speaker 3: Exactly. Yeah, I mean, what are the odds of flipping heads 744 00:44:20,880 --> 00:44:24,240 Speaker 3: on a coin eight times in a row? Very low. 745 00:44:24,320 --> 00:44:26,480 Speaker 3: But just keep flipping the coin, I mean, you keep 746 00:44:26,520 --> 00:44:27,839 Speaker 3: doing it, and eventually you'll get there. 747 00:44:28,040 --> 00:44:31,520 Speaker 2: Yeah. Now I want to cover a few more here. 748 00:44:33,080 --> 00:44:35,440 Speaker 2: A number of you may be reminded of some of 749 00:44:35,520 --> 00:44:39,680 Speaker 2: the discussions in science headlines and just general media headlines 750 00:44:39,960 --> 00:44:43,720 Speaker 2: around two thousand and eight regarding the Large Hadron Collider, 751 00:44:43,760 --> 00:44:47,799 Speaker 2: the LHC, specifically. As it smashed protons at greater and 752 00:44:47,840 --> 00:44:51,759 Speaker 2: greater speeds, it was brought up, hey, what if they 753 00:44:51,760 --> 00:44:55,920 Speaker 2: were to accidentally generate micro black holes that could potentially 754 00:44:56,000 --> 00:44:58,960 Speaker 2: grow and, I don't know, consume the entire planet? 755 00:44:59,080 --> 00:45:01,360 Speaker 3: Rob, you've looked into this more recently than I have. But 756 00:45:01,760 --> 00:45:05,520 Speaker 3: my understanding of this, just from my memory, is that 757 00:45:05,760 --> 00:45:09,799 Speaker 3: the main people talking about this were not well informed scientists. 758 00:45:09,840 --> 00:45:12,760 Speaker 3: It was more kind of a fringe conspiracy theory. 759 00:45:13,480 --> 00:45:15,799 Speaker 2: Yeah, and it's definitely one of those things that ends 760 00:45:15,880 --> 00:45:18,680 Speaker 2: up resonating at the headline level everywhere else, because the 761 00:45:18,719 --> 00:45:21,360 Speaker 2: idea of, like, creating a black hole in a lab 762 00:45:21,400 --> 00:45:24,360 Speaker 2: and it then eating everything, you know, that's an evocative idea. 763 00:45:25,280 --> 00:45:30,960 Speaker 2: It certainly jibes with a lot of our science fiction, like... 764 00:45:30,960 --> 00:45:33,319 Speaker 3: The Vaseline apocalypse or like the gray goo. It's like 765 00:45:33,400 --> 00:45:37,440 Speaker 3: the consumption of the world around us by a strange substance. 766 00:45:37,480 --> 00:45:42,960 Speaker 3: It's impossible to deny how interestingly awful that idea is. 767 00:45:43,160 --> 00:45:45,200 Speaker 2: Yeah, and just the idea of the black hole, even 768 00:45:45,239 --> 00:45:48,719 Speaker 2: if you only halfway understand it, and I think that's 769 00:45:48,960 --> 00:45:52,440 Speaker 2: probably most of us, I'm not going to pretend to completely 770 00:45:52,520 --> 00:45:57,279 Speaker 2: understand black holes, but you know they're evocative.
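A quick aside on the coin-flip arithmetic raised just above: a per-trial probability that is individually very low still compounds toward near-certainty if the trials just keep coming. Here is a minimal sketch in Python; the numbers are hypothetical, chosen only to mirror the eight-heads-in-a-row example, not any real estimate of nuclear risk.

def prob_at_least_once(p_per_trial, n_trials):
    # Probability that an event with per-trial probability p_per_trial
    # happens at least once across n_trials independent trials.
    return 1.0 - (1.0 - p_per_trial) ** n_trials

p = 0.5 ** 8  # eight heads in a row on a fair coin: 1/256, roughly 0.4 percent
print(prob_at_least_once(p, 1))     # about 0.004: one attempt, very unlikely
print(prob_at_least_once(p, 1000))  # about 0.98: keep flipping and you all but get there

The same arithmetic is why a risk that looks low year to year stops looking low when it is carried on indefinitely.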
It's a 771 00:45:57,320 --> 00:46:03,840 Speaker 2: fascinating idea. LHC scientists considered the possibility and concluded that, quote, 772 00:46:03,880 --> 00:46:06,520 Speaker 2: if micro black holes do appear in the collisions created 773 00:46:06,520 --> 00:46:10,600 Speaker 2: by the LHC, they would disintegrate rapidly, in around ten 774 00:46:10,680 --> 00:46:14,000 Speaker 2: to the negative twenty seventh power seconds. They would decay 775 00:46:14,040 --> 00:46:19,240 Speaker 2: into Standard Model or supersymmetric particles, creating events containing 776 00:46:19,320 --> 00:46:22,360 Speaker 2: an exceptional number of tracks in our detectors, which we 777 00:46:22,600 --> 00:46:26,200 Speaker 2: would easily spot. Finding more on any of these subjects 778 00:46:26,200 --> 00:46:29,960 Speaker 2: would open the door to yet unknown possibilities. In other words, 779 00:46:31,640 --> 00:46:36,759 Speaker 2: it wasn't happening, and it's not a real concern. Now 780 00:46:37,120 --> 00:46:40,600 Speaker 2: there's another related concern that also popped up. I think 781 00:46:40,640 --> 00:46:46,000 Speaker 2: it maybe resonated at the headline level a little less 782 00:46:46,000 --> 00:46:50,719 Speaker 2: strongly because it's not as evocative as black holes. But there 783 00:46:50,760 --> 00:46:54,479 Speaker 2: was also this idea that hypothetical clumps of strange matter 784 00:46:54,680 --> 00:46:58,959 Speaker 2: called strangelets could be generated in a particle accelerator, either 785 00:46:59,000 --> 00:47:03,799 Speaker 2: the LHC or the Relativistic Heavy Ion Collider, the 786 00:47:04,000 --> 00:47:07,680 Speaker 2: RHIC. And the extreme version of the scenario is that 787 00:47:07,719 --> 00:47:09,759 Speaker 2: it would set off a chain reaction that converts the 788 00:47:09,920 --> 00:47:14,760 Speaker 2: entire planet into a condensed lump of strange goop. However, 789 00:47:15,000 --> 00:47:18,640 Speaker 2: no strangelets have ever been observed at the RHIC or 790 00:47:18,680 --> 00:47:23,120 Speaker 2: the LHC, and the scientists contend that they're even less 791 00:47:23,280 --> 00:47:28,880 Speaker 2: likely at the LHC facility. The planet has not been gooped, 792 00:47:29,160 --> 00:47:31,120 Speaker 2: so we seem to be pretty good on this case 793 00:47:31,160 --> 00:47:31,520 Speaker 2: as well. 794 00:47:32,200 --> 00:47:34,360 Speaker 3: I feel like these examples raise a different kind of 795 00:47:34,440 --> 00:47:39,239 Speaker 3: question about these containment fears we've 796 00:47:39,239 --> 00:47:42,360 Speaker 3: been exploring, which is, you know, since we're obviously talking about very serious, 797 00:47:42,400 --> 00:47:46,439 Speaker 3: heavy stuff like nuclear weapons and potentially AI, talking about 798 00:47:46,480 --> 00:47:51,879 Speaker 3: stuff that quite clearly, I think, most reasonable people would 799 00:47:51,920 --> 00:47:54,719 Speaker 3: agree we should be concerned about, like there should be 800 00:47:54,920 --> 00:47:59,520 Speaker 3: levels of caution dealing with these things. On the other hand, 801 00:47:59,680 --> 00:48:04,960 Speaker 3: the reality is that anybody can raise fears about anything for 802 00:48:05,080 --> 00:48:08,640 Speaker 3: any reasons, good or not.
So, like if people are 803 00:48:08,800 --> 00:48:14,160 Speaker 3: raising concerns, just the fact that somebody is raising concerns 804 00:48:14,200 --> 00:48:18,360 Speaker 3: about something doesn't necessarily mean those concerns are things that 805 00:48:18,400 --> 00:48:21,080 Speaker 3: are well founded or that we should take seriously. And 806 00:48:21,160 --> 00:48:24,960 Speaker 3: part of the problem is, like, in the realm of 807 00:48:25,000 --> 00:48:29,160 Speaker 3: cutting edge science, most people, if they hear about 808 00:48:29,239 --> 00:48:31,759 Speaker 3: a concern with something, oh, you know, they're doing an 809 00:48:31,840 --> 00:48:35,520 Speaker 3: experiment at the particle collider, you know, it could do 810 00:48:35,640 --> 00:48:39,360 Speaker 3: something that destroys the world. Ninety nine point whatever percent 811 00:48:39,400 --> 00:48:41,759 Speaker 3: of people have no idea whether they should take that 812 00:48:41,880 --> 00:48:44,960 Speaker 3: concern seriously or not. But the fact that somebody is 813 00:48:45,040 --> 00:48:47,760 Speaker 3: saying it makes you feel like, well, maybe I should 814 00:48:47,800 --> 00:48:49,880 Speaker 3: be concerned, because you don't know, you know, you don't 815 00:48:49,880 --> 00:48:54,160 Speaker 3: have the background knowledge to evaluate these claims and understand 816 00:48:54,239 --> 00:48:57,400 Speaker 3: whether they're based on reasonable concerns or not. 817 00:48:57,840 --> 00:48:59,200 Speaker 2: Yeah. Yeah, I mean a lot of it comes down 818 00:48:59,239 --> 00:49:02,640 Speaker 2: to trust. You need trust in science, you need trust in scientific institutions, 819 00:49:02,640 --> 00:49:06,399 Speaker 2: and you need regulations in place, and you need all 820 00:49:06,440 --> 00:49:19,080 Speaker 2: of this sort of working together. Another thing I should 821 00:49:19,080 --> 00:49:20,920 Speaker 2: bring up, just briefly. I don't want to spend too 822 00:49:20,960 --> 00:49:25,680 Speaker 2: much time on strangelets and micro black holes, but kind 823 00:49:25,680 --> 00:49:29,759 Speaker 2: of getting back to the polywater: these often raise questions like, okay, 824 00:49:29,760 --> 00:49:32,319 Speaker 2: we're talking about creating something in a lab, but is 825 00:49:32,360 --> 00:49:35,640 Speaker 2: the thing that we're hypothetically creating something that 826 00:49:35,680 --> 00:49:39,719 Speaker 2: is just occurring elsewhere in the world or 827 00:49:39,760 --> 00:49:41,920 Speaker 2: in the universe? And then we have to sort of 828 00:49:41,920 --> 00:49:45,560 Speaker 2: weigh those two things, you know, like, okay, if this 829 00:49:45,640 --> 00:49:48,320 Speaker 2: is a possibility in a lab, then it is surely 830 00:49:48,320 --> 00:49:50,920 Speaker 2: a reality elsewhere in the universe. And what does that 831 00:49:50,960 --> 00:49:51,800 Speaker 2: mean for our fears? 832 00:49:52,239 --> 00:49:55,120 Speaker 3: Right, with the polywater comparison being that if we can 833 00:49:55,160 --> 00:49:58,120 Speaker 3: make it in the lab with, you know, no especially 834 00:49:58,160 --> 00:50:02,480 Speaker 3: weird conditions, just, like, tiny quartz tubes.
It probably 835 00:50:02,520 --> 00:50:06,439 Speaker 3: occurs in nature sometimes, so it doesn't seem like it could 836 00:50:06,520 --> 00:50:09,600 Speaker 3: be the case that some quantity of it in the environment 837 00:50:09,640 --> 00:50:13,239 Speaker 3: leads, immediately or at all, to worldwide catastrophe. 838 00:50:13,520 --> 00:50:17,480 Speaker 2: Yeah. Another case that's interesting to bring up in this 839 00:50:17,520 --> 00:50:21,640 Speaker 2: conversation is the nineteen seventy five Asilomar Conference on 840 00:50:22,120 --> 00:50:27,279 Speaker 2: recombinant DNA. This was a gathering of biologists, lawyers, and 841 00:50:27,280 --> 00:50:31,319 Speaker 2: physicians at the Asilomar conference center near Monterey, California, a 842 00:50:31,320 --> 00:50:34,799 Speaker 2: lovely place, I once attended a wedding there. Oh really? Yeah, 843 00:50:35,080 --> 00:50:37,279 Speaker 2: I highly recommend visiting that area if you get the chance. 844 00:50:37,320 --> 00:50:40,400 Speaker 2: But the conference in question was centered around biohazards and 845 00:50:40,440 --> 00:50:45,640 Speaker 2: regulatory concerns regarding developing, as well as just near 846 00:50:45,680 --> 00:50:51,840 Speaker 2: future, biotechnological advancements in general, especially recombinant DNA, genetic material 847 00:50:51,920 --> 00:50:55,040 Speaker 2: created from two different sources to create all new sequences. 848 00:50:56,400 --> 00:50:59,400 Speaker 2: It was an advancement that at the time was highly promising, 849 00:51:00,120 --> 00:51:03,960 Speaker 2: and it has proven to be very beneficial in biotechnology 850 00:51:04,160 --> 00:51:07,760 Speaker 2: and medicine, in multiple research areas. But at the same time, 851 00:51:09,000 --> 00:51:12,400 Speaker 2: at the time, researchers also recognized the huge potential for 852 00:51:12,440 --> 00:51:16,439 Speaker 2: the creation of organisms with dangerous properties, not so much 853 00:51:16,560 --> 00:51:19,400 Speaker 2: like space werewolves or anything, but more so like 854 00:51:19,480 --> 00:51:23,680 Speaker 2: the accidental creation of deadly pathogens that could conceivably wash 855 00:51:23,719 --> 00:51:27,000 Speaker 2: over the globe, like any number of real and imagined 856 00:51:27,040 --> 00:51:30,120 Speaker 2: plagues and doomsday scenarios. There was a lot of concern 857 00:51:30,920 --> 00:51:36,879 Speaker 2: regarding cancer viruses, cancer causing viruses in particular. I read 858 00:51:36,880 --> 00:51:41,440 Speaker 2: a quote from French biologist Philippe Kourilsky about the conference, 859 00:51:41,480 --> 00:51:45,359 Speaker 2: in which he underlined the excitement surrounding it, but also 860 00:51:45,400 --> 00:51:48,439 Speaker 2: the confusion, quote, because some of the basic questions could 861 00:51:48,440 --> 00:51:51,280 Speaker 2: only be dealt with in great disorder or not confronted 862 00:51:51,320 --> 00:51:54,480 Speaker 2: at all. On the frontiers of the unknown, the analysis 863 00:51:54,560 --> 00:51:57,560 Speaker 2: of benefits and hazards were locked up in concentric circles 864 00:51:57,560 --> 00:52:01,640 Speaker 2: of ignorance. How could one determine the reality without experimenting, 865 00:52:01,880 --> 00:52:05,280 Speaker 2: without taking a minimum of risk. I read this quote 866 00:52:05,320 --> 00:52:08,560 Speaker 2: in Asilomar and Recombinant DNA: The End of the 867 00:52:08,560 --> 00:52:12,920 Speaker 2: Beginning by medical researcher Donald S.
Fredrickson. This came out 868 00:52:12,920 --> 00:52:17,040 Speaker 2: in nineteen ninety one, but I think that succinctly summarizes 869 00:52:17,120 --> 00:52:20,600 Speaker 2: some of the issues regarding a number of these scenarios. 870 00:52:20,600 --> 00:52:23,320 Speaker 2: Science advances, like I said in a previous episode, kind of 871 00:52:23,320 --> 00:52:26,680 Speaker 2: like a slime mold navigating a maze of understanding. And 872 00:52:26,920 --> 00:52:29,440 Speaker 2: when do we decide to not let it explore a 873 00:52:29,440 --> 00:52:32,719 Speaker 2: particular corridor, or to try and slow its pace down 874 00:52:32,719 --> 00:52:33,800 Speaker 2: in a particular corridor? 875 00:52:34,080 --> 00:52:36,480 Speaker 3: Can you even effectively do that? I mean, science is 876 00:52:36,520 --> 00:52:42,200 Speaker 3: not a top down structure. I mean there are structures 877 00:52:42,239 --> 00:52:45,040 Speaker 3: and institutions within it that have some top down control, 878 00:52:45,080 --> 00:52:48,799 Speaker 3: but overall it is sort of like an organism in itself. Yeah, 879 00:52:48,840 --> 00:52:53,480 Speaker 3: it's a worldwide phenomenon where people can independently pursue things, 880 00:52:53,520 --> 00:52:57,880 Speaker 3: I mean, not totally independently. Cooperation is very important to 881 00:52:57,960 --> 00:53:02,920 Speaker 3: how science proceeds. But you know, it's very hard to, 882 00:53:03,320 --> 00:53:05,239 Speaker 3: you know, put the lock on it and just 883 00:53:05,239 --> 00:53:06,839 Speaker 3: say nobody can look into this. 884 00:53:07,000 --> 00:53:09,600 Speaker 2: Yeah, especially if there's money to be made, or there 885 00:53:09,719 --> 00:53:14,279 Speaker 2: is a strategic advantage for, like, a nation state to 886 00:53:14,320 --> 00:53:17,120 Speaker 3: acquire, or people think either of those things. 887 00:53:17,200 --> 00:53:19,759 Speaker 2: Yeah, whether or not. Yeah, even if 888 00:53:19,760 --> 00:53:22,520 Speaker 2: there's nothing substantial to it at all, 889 00:53:22,560 --> 00:53:24,759 Speaker 2: if there's a possibility of either of those things, it 890 00:53:24,840 --> 00:53:27,760 Speaker 2: may get a fair amount of attention. So the conference 891 00:53:27,800 --> 00:53:30,319 Speaker 2: in question, it resulted in a number of safety guidelines. 892 00:53:30,600 --> 00:53:33,680 Speaker 2: Some research was halted and biosafety levels were established. Some 893 00:53:33,680 --> 00:53:36,880 Speaker 2: of those are still in use today. Fredrickson pointed out 894 00:53:36,880 --> 00:53:39,879 Speaker 2: in his paper, though, that some of these guidelines were 895 00:53:39,920 --> 00:53:43,040 Speaker 2: relaxed by seventy eight. But at the 896 00:53:43,080 --> 00:53:47,000 Speaker 2: time of his writing, none of the more dire biotechnology 897 00:53:47,000 --> 00:53:50,520 Speaker 2: outcomes discussed had come to pass, while many great advancements 898 00:53:50,560 --> 00:53:53,480 Speaker 2: had been made. But he stressed that we shouldn't let 899 00:53:53,560 --> 00:53:56,560 Speaker 2: either factor weigh too heavily on our judgment of the 900 00:53:56,600 --> 00:54:00,000 Speaker 2: precaution exercised at the conference.
And I think that is 901 00:54:01,600 --> 00:54:03,680 Speaker 2: something to keep in mind with all these scenarios. Like, 902 00:54:05,160 --> 00:54:06,680 Speaker 2: we know now that either, you know, there were a lot 903 00:54:06,719 --> 00:54:09,319 Speaker 2: of benefits to be gained or there ended up not 904 00:54:09,400 --> 00:54:12,719 Speaker 2: being a threat. But we have to really put ourselves 905 00:54:12,840 --> 00:54:16,160 Speaker 2: in the shoes of the people in the trenches dealing 906 00:54:16,200 --> 00:54:19,239 Speaker 2: with the prospect of some sort of an advancement in 907 00:54:19,280 --> 00:54:20,080 Speaker 2: their given time. 908 00:54:20,560 --> 00:54:22,520 Speaker 3: Well, like we talked about last time, I think it's 909 00:54:22,600 --> 00:54:26,120 Speaker 3: very important when we reflect on the polywater saga for 910 00:54:26,200 --> 00:54:28,880 Speaker 3: our takeaway not to be, what a bunch of dummies, 911 00:54:29,160 --> 00:54:32,720 Speaker 3: you know, they're fools, and I'm smarter 912 00:54:32,800 --> 00:54:35,279 Speaker 3: than them because I wouldn't have fallen for polywater. I mean, 913 00:54:35,560 --> 00:54:39,479 Speaker 3: that's silly. We have the benefit of hindsight. We can, 914 00:54:39,680 --> 00:54:41,799 Speaker 3: you know, read how the whole thing unfolded. We know 915 00:54:41,920 --> 00:54:45,480 Speaker 3: the end of the story. A lot of very smart 916 00:54:45,520 --> 00:54:51,279 Speaker 3: and very productive scientists got confused and led into this 917 00:54:51,400 --> 00:54:55,520 Speaker 3: research dead end. And fortunately, you know, we did eventually 918 00:54:55,520 --> 00:54:58,239 Speaker 3: figure it out. Like there was a clarification process that 919 00:54:58,280 --> 00:55:01,400 Speaker 3: went on in the scientific community, and eventually it was 920 00:55:01,480 --> 00:55:04,600 Speaker 3: figured out that, like, oh, polywater is not real, this 921 00:55:04,800 --> 00:55:08,720 Speaker 3: was mistaken all along. In a way, that is science 922 00:55:08,800 --> 00:55:11,120 Speaker 3: working the way it's supposed to. I mean, it is 923 00:55:11,239 --> 00:55:15,799 Speaker 3: inevitable in science that some incorrect ideas are going to 924 00:55:15,920 --> 00:55:18,719 Speaker 3: be floated, and in some 925 00:55:18,800 --> 00:55:22,560 Speaker 3: cases might attract a lot of attention and enthusiasm. But 926 00:55:22,840 --> 00:55:25,319 Speaker 3: the great thing about science is that it is this 927 00:55:25,480 --> 00:55:30,360 Speaker 3: vast collaborative process of gradual clarification where that stuff will 928 00:55:30,640 --> 00:55:34,080 Speaker 3: get sorted out over time. So it's not quite fair, 929 00:55:35,040 --> 00:55:39,120 Speaker 3: you know, to flog people for having been mistaken for 930 00:55:39,239 --> 00:55:41,520 Speaker 3: some time. It might be fair to flog people if 931 00:55:41,520 --> 00:55:44,360 Speaker 3: they're, like, really stubborn, you know, and once the evidence 932 00:55:44,400 --> 00:55:46,440 Speaker 3: comes out they refuse to acknowledge it. 933 00:55:46,760 --> 00:55:49,640 Speaker 2: Yeah.
Now, another comparison I want to make here: 934 00:55:50,080 --> 00:55:51,919 Speaker 2: I think there's a strong comparison to be made 935 00:55:51,920 --> 00:55:56,040 Speaker 2: between this and the hypothetical polywater scenario, and it 936 00:55:56,160 --> 00:56:01,799 Speaker 2: concerns an HIV/AIDS medication by the name of ritonavir. 937 00:56:04,080 --> 00:56:06,880 Speaker 2: It was originally produced in a crystalline form, and this 938 00:56:07,000 --> 00:56:09,919 Speaker 2: was called form one, discovered in nineteen ninety six, 939 00:56:10,480 --> 00:56:13,640 Speaker 2: and it was offered in an unrefrigerated capsule. But then 940 00:56:13,640 --> 00:56:16,160 Speaker 2: a couple of years later, form two was discovered, with 941 00:56:16,280 --> 00:56:23,560 Speaker 2: significantly lower bioavailability, so less useful as a medication, but 942 00:56:23,680 --> 00:56:27,400 Speaker 2: also more stable in this form. And the 943 00:56:27,520 --> 00:56:31,880 Speaker 2: problem quickly became clear that if form two came in 944 00:56:31,960 --> 00:56:35,479 Speaker 2: contact with form one, it would convert form one into 945 00:56:35,480 --> 00:56:40,120 Speaker 2: form two, and even trace amounts could do this, you know, 946 00:56:40,200 --> 00:56:44,280 Speaker 2: trace amounts in production. And so the pharmaceutical company behind 947 00:56:44,320 --> 00:56:47,680 Speaker 2: it ended up apparently losing millions due to its impact 948 00:56:47,719 --> 00:56:52,160 Speaker 2: on production lines. Eventually, new formulations allowed them to avoid 949 00:56:52,200 --> 00:56:52,920 Speaker 2: the problem. 950 00:56:53,280 --> 00:56:56,040 Speaker 3: So the comparison here would be that, kind of like 951 00:56:56,120 --> 00:57:01,799 Speaker 3: the alleged polywater, you know, the Vaseline apocalypse and or 952 00:57:01,840 --> 00:57:05,279 Speaker 3: the ice-nine thing, you would have a contagion by 953 00:57:05,320 --> 00:57:09,399 Speaker 3: touch of a chemical form. So like, one 954 00:57:09,480 --> 00:57:11,640 Speaker 3: form comes into contact with the other and it 955 00:57:11,760 --> 00:57:15,120 Speaker 3: changes it, and maybe changes properties that you don't 956 00:57:15,120 --> 00:57:17,480 Speaker 3: want changed or that are important. In this case, it wouldn't affect 957 00:57:17,480 --> 00:57:21,160 Speaker 3: the entire biosphere, but, you know, would affect important properties 958 00:57:21,200 --> 00:57:23,320 Speaker 3: of a pharmaceutical working as intended. 959 00:57:23,200 --> 00:57:26,120 Speaker 2: Right, right, that's my understanding of it here. But again, 960 00:57:26,320 --> 00:57:28,920 Speaker 2: it's also my understanding that they end up finding a 961 00:57:28,960 --> 00:57:32,880 Speaker 2: new formulation that allows them to avoid the problem. Now 962 00:57:32,920 --> 00:57:36,480 Speaker 2: there's another one I want to mention here, vacuum decay. Right, 963 00:57:36,520 --> 00:57:39,200 Speaker 2: So this one's complex and it entails quantum field theory.
964 00:57:39,240 --> 00:57:42,760 Speaker 2: But my best understanding of the idea is this: what if our universe 965 00:57:42,920 --> 00:57:46,040 Speaker 2: actually has a vacuum state that you 966 00:57:46,040 --> 00:57:50,560 Speaker 2: could describe as a metastable false vacuum, and 967 00:57:50,760 --> 00:57:53,720 Speaker 2: if a bubble of lower energy true vacuum were to 968 00:57:53,720 --> 00:57:58,200 Speaker 2: suddenly manifest in our universe, via quantum tunneling, it would rapidly expand, 969 00:57:58,320 --> 00:58:02,280 Speaker 2: correcting our vacuum to a lower energy true vacuum, 970 00:58:02,360 --> 00:58:04,960 Speaker 2: rewriting the fundamental laws of physics and altering the masses 971 00:58:04,960 --> 00:58:06,840 Speaker 2: of elementary particles in the process. 972 00:58:07,720 --> 00:58:10,280 Speaker 3: I remember reading about this years back. I also, of course, 973 00:58:10,440 --> 00:58:13,000 Speaker 3: don't understand the finer points of this, you know, the 974 00:58:13,480 --> 00:58:16,960 Speaker 3: physics involved. But yeah, I remember reading about this and 975 00:58:17,000 --> 00:58:20,880 Speaker 3: thinking like, well, that's kind of interesting, except like, what 976 00:58:20,920 --> 00:58:22,520 Speaker 3: would you do about it? Nothing. 977 00:58:24,240 --> 00:58:27,760 Speaker 2: Yeah, I mean, if I'm understanding it correctly, it would unmake 978 00:58:27,880 --> 00:58:30,240 Speaker 2: or remake everything in our universe. Life as we know it 979 00:58:30,280 --> 00:58:32,720 Speaker 2: would no longer be possible. Nothing as we know it 980 00:58:32,760 --> 00:58:37,479 Speaker 2: would be possible. But the upside, according to astrophysicist doctor 981 00:58:37,520 --> 00:58:40,560 Speaker 2: Katie Mack, is that since the bubble would expand at 982 00:58:40,560 --> 00:58:45,000 Speaker 2: the speed of light, unmaking or remaking everything into an 983 00:58:45,080 --> 00:58:48,600 Speaker 2: unknowable new form, you would just never know what hit you. 984 00:58:49,840 --> 00:58:50,040 Speaker 3: Yeah. 985 00:58:50,120 --> 00:58:53,800 Speaker 2: Yeah, things would just simply stop being, and that's all 986 00:58:53,800 --> 00:58:55,360 Speaker 2: there is to it. So it's like, well, you know, 987 00:58:56,280 --> 00:59:01,120 Speaker 2: not really something to lose sleep over. And again, it involves, 988 00:59:01,120 --> 00:59:04,560 Speaker 2: like, something from outside of our universe becoming part of 989 00:59:04,560 --> 00:59:09,040 Speaker 2: our universe. It's like a borderline outside context 990 00:59:09,080 --> 00:59:09,880 Speaker 2: problem here. 991 00:59:10,160 --> 00:59:12,920 Speaker 3: Borderline? That seems like kind of the definition of an 992 00:59:12,920 --> 00:59:14,440 Speaker 3: outside context problem, isn't it? 993 00:59:14,240 --> 00:59:16,360 Speaker 2: Well, I mean, if we've thought of it, then I 994 00:59:16,360 --> 00:59:20,800 Speaker 2: guess we have some context for it, but barely, I 995 00:59:20,840 --> 00:59:23,160 Speaker 2: feel like. Yeah, and there's nothing we could do about it. 996 00:59:23,480 --> 00:59:25,480 Speaker 2: And if we were to 997 00:59:25,480 --> 00:59:28,200 Speaker 2: worry about it, it would be the last thing we worried about.
Yeah, 998 00:59:28,320 --> 00:59:30,800 Speaker 2: all right. And finally, I'd be remiss if I didn't 999 00:59:30,840 --> 00:59:33,960 Speaker 2: at least mention the scenario presented in the series one 1000 00:59:34,000 --> 00:59:37,000 Speaker 2: pilot episode of Look Around You, the two thousand and 1001 00:59:37,000 --> 00:59:41,160 Speaker 2: two parody of nineteen eighties British educational programming. This was 1002 00:59:41,200 --> 00:59:45,720 Speaker 2: created by Robert Popper and Peter Serafinowicz. And this 1003 00:59:45,800 --> 00:59:47,400 Speaker 2: is the Helvetica scenario. 1004 00:59:48,320 --> 00:59:50,040 Speaker 3: Didn't this come up on the show just recently? 1005 00:59:50,480 --> 00:59:52,840 Speaker 2: Well, it may have. I do love it and think 1006 00:59:52,840 --> 00:59:55,840 Speaker 2: about it way too often. But yeah, in the pilot, 1007 00:59:56,120 --> 01:00:01,840 Speaker 2: again, this is all parody, and this is all surrealistic humor, 1008 01:00:02,440 --> 01:00:05,880 Speaker 2: we learn that under certain unstated conditions, a change can 1009 01:00:05,880 --> 01:00:09,760 Speaker 2: take place in the calcium molecule, causing molecular collapse, in 1010 01:00:09,880 --> 01:00:13,400 Speaker 2: something called the Helvetica scenario. It's not explained, but we 1011 01:00:13,440 --> 01:00:17,280 Speaker 2: see a faceless, agitated human scientist in containment, kind of, 1012 01:00:17,320 --> 01:00:21,400 Speaker 2: you know, pawing at the wall. And taking what we've 1013 01:00:21,440 --> 01:00:25,400 Speaker 2: been discussing here, I guess we could loosely imagine some 1014 01:00:25,440 --> 01:00:30,200 Speaker 2: sort of scenario by which cascading calcium collapse would have 1015 01:00:30,440 --> 01:00:34,440 Speaker 2: dire consequences for us and our world, especially since calcium 1016 01:00:34,480 --> 01:00:37,680 Speaker 2: plays a number of crucial roles in our bodies. So 1017 01:00:38,040 --> 01:00:40,440 Speaker 2: I guess we might imagine that we would sort of 1018 01:00:40,480 --> 01:00:42,760 Speaker 2: melt into blobs of some sort. I'm not sure if 1019 01:00:42,760 --> 01:00:46,200 Speaker 2: we would turn into faceless scientist dudes beating on the wall, 1020 01:00:46,280 --> 01:00:49,920 Speaker 2: but it wouldn't be good if it could happen. Luckily, 1021 01:00:49,960 --> 01:00:53,960 Speaker 2: it cannot happen. But I can't help but imagine that 1022 01:00:54,880 --> 01:00:59,560 Speaker 2: polywater and or ice nine partially inspired the Helvetica scenario 1023 01:01:00,160 --> 01:01:02,000 Speaker 2: presented in this program. 1024 01:01:02,400 --> 01:01:05,840 Speaker 3: Yeah, I guess we keep imagining the polywater apocalypse 1025 01:01:05,880 --> 01:01:09,040 Speaker 3: as, it gets into the environment, takes over everything in 1026 01:01:09,040 --> 01:01:11,560 Speaker 3: the environment, and you get like the ending of Cat's 1027 01:01:11,560 --> 01:01:15,000 Speaker 3: Cradle, where the world is ice nine. But yeah, I 1028 01:01:15,000 --> 01:01:19,320 Speaker 3: guess the other version to imagine is like Doctor Manhattan 1029 01:01:19,440 --> 01:01:22,360 Speaker 3: locked in the room, and he gets the polywater 1030 01:01:22,440 --> 01:01:24,320 Speaker 3: inside his body and you just get to watch what 1031 01:01:24,360 --> 01:01:24,960 Speaker 3: happens to him. 1032 01:01:25,000 --> 01:01:27,320 Speaker 2: Yeah, yeah, and just hope it doesn't get out in 1033 01:01:27,520 --> 01:01:30,760 Speaker 2: any way.
Look Around You gives us a best 1034 01:01:30,800 --> 01:01:33,960 Speaker 2: case scenario for something like this, where it's happened, it's bad, 1035 01:01:34,040 --> 01:01:43,760 Speaker 2: but we keep it locked away. 1036 01:01:45,040 --> 01:01:48,440 Speaker 3: Are you cool if we talk a bit about pathological 1037 01:01:47,800 --> 01:01:49,920 Speaker 2: science? Yeah, let's do so. 1038 01:01:50,800 --> 01:01:53,160 Speaker 3: A couple of the big sources that I mentioned in 1039 01:01:53,200 --> 01:01:55,600 Speaker 3: the last episode, that chapter by Philip Ball and the 1040 01:01:55,600 --> 01:02:02,040 Speaker 3: paper by Denis Rousseau, both frame polywater as a key 1041 01:02:02,240 --> 01:02:06,640 Speaker 3: example of what's known as pathological science. I was trying 1042 01:02:06,720 --> 01:02:12,800 Speaker 3: to understand the difference between pathological science and pseudoscience. I 1043 01:02:12,840 --> 01:02:16,080 Speaker 3: don't think there are actually formal definitions that keep these 1044 01:02:16,120 --> 01:02:18,680 Speaker 3: things separate, so instead I just have to infer the 1045 01:02:18,720 --> 01:02:23,200 Speaker 3: difference in how they're usually used. So the distinction I 1046 01:02:23,200 --> 01:02:28,760 Speaker 3: would make is this: pseudoscience is fake science, and it's 1047 01:02:28,800 --> 01:02:33,800 Speaker 3: fake science from the beginning. It might be aesthetically disguised 1048 01:02:33,840 --> 01:02:37,240 Speaker 3: as science. Maybe it is supposed to look and feel 1049 01:02:37,280 --> 01:02:40,000 Speaker 3: like science to people who can't tell the difference, but 1050 01:02:40,200 --> 01:02:45,120 Speaker 3: it is fundamentally not guided by an earnest search for 1051 01:02:45,400 --> 01:02:50,080 Speaker 3: the truth. It is fundamentally not scientific in its methods, 1052 01:02:50,560 --> 01:02:53,160 Speaker 3: and so it is sort of doomed from the start. 1053 01:02:53,600 --> 01:02:57,800 Speaker 3: There's nothing really scientific about pseudoscience in how it works, 1054 01:02:57,920 --> 01:03:02,400 Speaker 3: maybe only in how it looks. Whereas pathological science, I think, 1055 01:03:02,480 --> 01:03:07,080 Speaker 3: usually refers to science that begins as real science 1056 01:03:07,720 --> 01:03:12,360 Speaker 3: and does attempt to follow the scientific method, but it 1057 01:03:12,400 --> 01:03:16,920 Speaker 3: refers to these projects or cases where genuine researchers are 1058 01:03:17,040 --> 01:03:22,000 Speaker 3: led into error by things like unconscious bias and failure 1059 01:03:22,160 --> 01:03:25,720 Speaker 3: to confront central questions or contradictions. A big thing in 1060 01:03:25,760 --> 01:03:30,240 Speaker 3: how people talk about pathological science, at least as it 1061 01:03:30,240 --> 01:03:35,000 Speaker 3: seems to me, is a failure to ask the right questions. 1062 01:03:35,160 --> 01:03:38,480 Speaker 3: So maybe people are following the scientific method, but they 1063 01:03:38,560 --> 01:03:42,680 Speaker 3: are forming their conclusions based on focusing on the wrong questions, 1064 01:03:44,480 --> 01:03:47,200 Speaker 3: and so polywater seems to be a great example of that.
1065 01:03:47,560 --> 01:03:50,080 Speaker 3: In this paper by Denis Rousseau, he also uses the 1066 01:03:50,160 --> 01:03:55,080 Speaker 3: examples of cold fusion, another pathological science project, and infinite 1067 01:03:55,160 --> 01:03:59,600 Speaker 3: dilution, the supposed property of water that's the principle behind, like, homeopathy and stuff. 1068 01:04:01,120 --> 01:04:03,280 Speaker 3: What Rousseau does in his paper is he tries to 1069 01:04:03,440 --> 01:04:09,400 Speaker 3: identify three overriding characteristics of pathological science. Because, of course, 1070 01:04:09,800 --> 01:04:14,360 Speaker 3: you know, it's not entirely useless when people go astray 1071 01:04:14,520 --> 01:04:16,680 Speaker 3: in searching for the truth. You can actually learn a 1072 01:04:16,720 --> 01:04:20,120 Speaker 3: lot from looking at how smart people get things wrong; it's 1073 01:04:20,160 --> 01:04:24,800 Speaker 3: a very useful thing to study. So first, Rousseau calls 1074 01:04:24,840 --> 01:04:27,480 Speaker 3: attention to the fact that he did not invent the concept of 1075 01:04:27,520 --> 01:04:31,080 Speaker 3: pathological science. The term seems to come from, or at 1076 01:04:31,160 --> 01:04:36,000 Speaker 3: least was made very popular by, the Nobel Prize winning 1077 01:04:36,120 --> 01:04:41,840 Speaker 3: chemist Irving Langmuir. Langmuir identified six symptoms of pathological science 1078 01:04:42,120 --> 01:04:45,320 Speaker 3: in a famous talk that he gave in the nineteen fifties, 1079 01:04:45,360 --> 01:04:49,840 Speaker 3: I think. But Rousseau tries to condense them into three characteristics, 1080 01:04:50,120 --> 01:04:53,600 Speaker 3: or rather, he condenses Langmuir's six into two characteristics, and then 1081 01:04:53,640 --> 01:04:56,120 Speaker 3: he adds one of his own. So for the first one, 1082 01:04:56,120 --> 01:04:59,080 Speaker 3: I want to quote from Rousseau here, quote, the effect 1083 01:04:59,240 --> 01:05:03,480 Speaker 3: being studied is often at the limits of detectability or 1084 01:05:03,480 --> 01:05:08,760 Speaker 3: has a very low statistical significance. So what he's 1085 01:05:08,760 --> 01:05:11,840 Speaker 3: getting at here is, if the claimed effect is very 1086 01:05:11,960 --> 01:05:16,480 Speaker 3: weak or hard to detect, errors of bias can creep 1087 01:05:16,520 --> 01:05:22,480 Speaker 3: into the observation or interpretation process. He identifies especially effects 1088 01:05:22,560 --> 01:05:27,120 Speaker 3: that are not measured by objective numerical readings on instruments, 1089 01:05:27,560 --> 01:05:32,600 Speaker 3: but effects that require visual observation by the experimenter. So 1090 01:05:32,720 --> 01:05:35,840 Speaker 3: if you have to rely on looking with your eyes 1091 01:05:35,880 --> 01:05:38,880 Speaker 3: to see the effect, and the effect is very weak, 1092 01:05:39,320 --> 01:05:42,560 Speaker 3: this is the danger zone for unconscious bias to take 1093 01:05:42,600 --> 01:05:47,320 Speaker 3: over in observation. Also, when effects are very weak or 1094 01:05:47,360 --> 01:05:51,920 Speaker 3: of low statistical significance, you might not be able to 1095 01:05:52,400 --> 01:05:56,760 Speaker 3: establish a clear connection between the size of the effect 1096 01:05:56,880 --> 01:06:00,680 Speaker 3: you see and the size of the variable that 1097 01:06:00,800 --> 01:06:04,000 Speaker 3: is supposedly causing it.
So I don't know, you know, 1098 01:06:04,080 --> 01:06:07,200 Speaker 3: I coat my body in three times as much tiger repellent, 1099 01:06:07,560 --> 01:06:10,160 Speaker 3: and I see the exact same number of tigers, which 1100 01:06:10,200 --> 01:06:10,680 Speaker 3: is zero. 1101 01:06:11,520 --> 01:06:11,720 Speaker 2: You know. 1102 01:06:11,920 --> 01:06:17,560 Speaker 3: So, like, if you're not establishing a strong correlation between 1103 01:06:17,640 --> 01:06:21,000 Speaker 3: how much you're tweaking the supposed cause and how big 1104 01:06:21,080 --> 01:06:24,600 Speaker 3: the effect is, when you don't see that correlation, that 1105 01:06:24,640 --> 01:06:29,120 Speaker 3: should raise concerns. But also, Rousseau says, the experimenters 1106 01:06:29,160 --> 01:06:33,760 Speaker 3: involved in pathological science can sometimes wave that concern away 1107 01:06:34,160 --> 01:06:36,840 Speaker 3: by saying, you know, well, we're in the early stages 1108 01:06:36,880 --> 01:06:39,520 Speaker 3: of discovering a new phenomenon, we don't understand all of 1109 01:06:39,560 --> 01:06:42,480 Speaker 3: the variables yet. In some cases that could be true, 1110 01:06:42,680 --> 01:06:45,880 Speaker 3: but that can also be an excuse that allows you 1111 01:06:45,920 --> 01:06:50,160 Speaker 3: to keep believing in a false theoretical model. But I 1112 01:06:50,160 --> 01:06:52,600 Speaker 3: think this is really important when you compare it to 1113 01:06:52,720 --> 01:06:55,240 Speaker 3: the detail that we talked about earlier today, where a 1114 01:06:55,280 --> 01:06:58,840 Speaker 3: lot of these initial observations about the properties of polywater, 1115 01:06:59,240 --> 01:07:01,720 Speaker 3: you know, the ways that stuff was different from 1116 01:07:01,720 --> 01:07:07,640 Speaker 3: regular water, they're observed on tiny, tiny quantities of it, 1117 01:07:08,080 --> 01:07:13,760 Speaker 3: and they're not measured with normal, you know, bulk measurement instruments. 1118 01:07:13,800 --> 01:07:19,280 Speaker 3: They're being inferred from visual observation through microscopes. So researchers 1119 01:07:19,280 --> 01:07:24,640 Speaker 3: are relying on things they see through microscopes, inferring, you know, 1120 01:07:24,880 --> 01:07:28,760 Speaker 3: properties of tiny amounts of something that they're looking at 1121 01:07:28,800 --> 01:07:31,840 Speaker 3: through these lenses. That does seem to be like a 1122 01:07:31,880 --> 01:07:34,920 Speaker 3: real danger zone where you might kind of see what 1123 01:07:34,960 --> 01:07:36,120 Speaker 3: you want to see. 1124 01:07:36,080 --> 01:07:39,080 Speaker 2: You know, from the micro to the macro, this reminds 1125 01:07:39,120 --> 01:07:43,400 Speaker 2: me of some of the pre photography errors in astronomy 1126 01:07:44,120 --> 01:07:47,280 Speaker 2: that we've discussed on the show before, be it the 1127 01:07:47,320 --> 01:07:54,600 Speaker 2: sighting of a potential planetary body or even the supposed 1128 01:07:54,640 --> 01:07:57,400 Speaker 2: identification of things like canals on Mars and so forth. 1129 01:07:57,680 --> 01:08:02,360 Speaker 3: Perfect connection, actually, because people often cite Martian canals as 1130 01:08:02,400 --> 01:08:06,680 Speaker 3: like a core example of pathological science. It's not 1131 01:08:06,840 --> 01:08:11,240 Speaker 3: that it was fake pseudoscience.
It was people trying to 1132 01:08:11,360 --> 01:08:14,160 Speaker 3: use the scientific method and trying to be responsible, but 1133 01:08:14,320 --> 01:08:17,719 Speaker 3: getting fooled by, you know, they were trying to see 1134 01:08:17,720 --> 01:08:20,439 Speaker 3: things with these telescopes where they couldn't get the resolution 1135 01:08:20,560 --> 01:08:23,360 Speaker 3: they needed, and they were making these inferences and then 1136 01:08:23,400 --> 01:08:26,880 Speaker 3: building on that, and that was the problem. Yeah, so 1137 01:08:27,200 --> 01:08:29,920 Speaker 3: we got led into this false belief that Mars had 1138 01:08:29,920 --> 01:08:31,080 Speaker 3: these canals. 1139 01:08:30,840 --> 01:08:32,200 Speaker 2: And, you know, go back and listen to 1140 01:08:32,280 --> 01:08:34,280 Speaker 2: our older episode on it. I forget the title of 1141 01:08:34,280 --> 01:08:35,960 Speaker 2: it offhand. It might have just been Canals of Mars 1142 01:08:36,040 --> 01:08:38,719 Speaker 2: or something. But you know, it was an exciting idea 1143 01:08:38,800 --> 01:08:43,040 Speaker 2: that wasn't impossible at the time. But yeah, 1144 01:08:43,080 --> 01:08:44,800 Speaker 2: a lot of it was based on, what do I 1145 01:08:44,840 --> 01:08:46,920 Speaker 2: think I see with, well, not the naked eye, 1146 01:08:46,960 --> 01:08:48,599 Speaker 2: but what do I think I see with my eye 1147 01:08:48,600 --> 01:08:49,440 Speaker 2: through a telescope? 1148 01:08:49,680 --> 01:08:55,720 Speaker 3: Yeah, okay. Characteristic number two that Rousseau identifies of pathological 1149 01:08:55,760 --> 01:09:00,320 Speaker 3: science is a readiness to disregard prevailing ideas and theories. 1150 01:09:01,720 --> 01:09:04,160 Speaker 3: The way he puts it, it's when, you know, my 1151 01:09:04,280 --> 01:09:07,840 Speaker 3: new observation, yeah, it conflicts with all previous experience and 1152 01:09:07,880 --> 01:09:11,360 Speaker 3: with our current best theories, but you're too dogmatically wedded 1153 01:09:11,400 --> 01:09:16,799 Speaker 3: to the past; I'm changing everything, you know. Within science, 1154 01:09:17,040 --> 01:09:22,759 Speaker 3: there is always a balance between open mindedness and explanatory conservatism. 1155 01:09:23,240 --> 01:09:26,320 Speaker 3: It is, of course, true across the history of science 1156 01:09:26,400 --> 01:09:30,160 Speaker 3: that older theories are superseded by newer, more accurate theories. 1157 01:09:31,160 --> 01:09:34,360 Speaker 3: You know, you can talk about ways that Newton's model 1158 01:09:34,360 --> 01:09:37,360 Speaker 3: of gravity was superseded by Einstein's, though it also doesn't 1159 01:09:37,360 --> 01:09:41,759 Speaker 3: really mean Newton was wrong. Like, at certain scales, Newton 1160 01:09:41,840 --> 01:09:43,960 Speaker 3: is still incredibly useful. You know, you can still do 1161 01:09:44,080 --> 01:09:48,240 Speaker 3: Newtonian calculations and make good predictions. But it turns out 1162 01:09:48,280 --> 01:09:52,000 Speaker 3: that you can't use Newton to understand all gravitational phenomena 1163 01:09:52,120 --> 01:09:54,640 Speaker 3: in the world, in the universe. And so, you know, 1164 01:09:54,680 --> 01:09:58,240 Speaker 3: Einstein provides an update and a refinement that gets more 1165 01:09:58,280 --> 01:10:00,960 Speaker 3: accuracy in certain ways, and, you know, you get that 1166 01:10:01,240 --> 01:10:04,879 Speaker 3: all throughout science.
So you have to remain open minded 1167 01:10:05,040 --> 01:10:07,680 Speaker 3: that there might be better theories of reality than the 1168 01:10:07,680 --> 01:10:11,040 Speaker 3: ones we're using now. At the same time, we have 1169 01:10:11,120 --> 01:10:13,439 Speaker 3: the theories we have now because they're really good and 1170 01:10:13,479 --> 01:10:18,160 Speaker 3: they consistently make good, accurate predictions. And most of the time, 1171 01:10:18,160 --> 01:10:20,280 Speaker 3: when somebody thinks they've come up with a new model 1172 01:10:20,360 --> 01:10:23,920 Speaker 3: that changes everything, they're probably wrong. Yeah. 1173 01:10:24,000 --> 01:10:29,080 Speaker 2: Yeah, it's exceptional moments in time and exceptional individuals that have 1174 01:10:29,280 --> 01:10:32,840 Speaker 2: proven the exception. You know, are you the one that 1175 01:10:32,920 --> 01:10:37,679 Speaker 2: has reinvented things and broken through to the other side? Maybe, 1176 01:10:38,080 --> 01:10:39,480 Speaker 2: but it is unlikely. 1177 01:10:39,880 --> 01:10:42,599 Speaker 3: Well, also, I think if your theory is revolutionary and 1178 01:10:42,680 --> 01:10:47,639 Speaker 3: it contradicts a lot of common experience or prevailing theories, 1179 01:10:47,720 --> 01:10:50,880 Speaker 3: it should have a good account of why, 1180 01:10:51,160 --> 01:10:53,760 Speaker 3: like, what's going on there? Why did we get this 1181 01:10:54,200 --> 01:10:56,880 Speaker 3: previous mistaken result, and why is this theory actually better? 1182 01:10:58,560 --> 01:11:01,000 Speaker 3: So, one thing I was thinking about is this 1183 01:11:01,080 --> 01:11:05,479 Speaker 3: is one way that the social virtues of science come in. 1184 01:11:05,479 --> 01:11:08,200 Speaker 3: In my view, what we mean when we talk about 1185 01:11:08,240 --> 01:11:13,000 Speaker 3: science can't really be done by one person alone or 1186 01:11:13,040 --> 01:11:18,400 Speaker 3: by one team. It is a diverse, global, professional, and 1187 01:11:18,479 --> 01:11:23,360 Speaker 3: social culture that strengthens itself by acting as a community 1188 01:11:23,920 --> 01:11:28,120 Speaker 3: to process new ideas and information and sort truth from error. 1189 01:11:28,680 --> 01:11:31,360 Speaker 3: And one of the tools in its arsenal, of course, 1190 01:11:31,520 --> 01:11:36,240 Speaker 3: is replication of experiments. New ideas need to be tested 1191 01:11:36,280 --> 01:11:41,280 Speaker 3: through critical experiments, and if somebody claims revolutionary experimental results 1192 01:11:41,320 --> 01:11:43,439 Speaker 3: that don't fit with our best theories of the world, 1193 01:11:43,960 --> 01:11:47,000 Speaker 3: other scientists try to repeat that experiment themselves and see 1194 01:11:47,000 --> 01:11:49,760 Speaker 3: if they get the same results. And here we get 1195 01:11:49,840 --> 01:11:54,759 Speaker 3: to Rousseau's third characteristic of pathological science. He says, essentially, 1196 01:11:54,880 --> 01:12:00,360 Speaker 3: the investigator avoids decisive experiments that could potentially rule out 1197 01:12:00,400 --> 01:12:01,799 Speaker 3: their revolutionary theory. 1198 01:12:02,439 --> 01:12:02,879 Speaker 2: Quote.
1199 01:12:03,240 --> 01:12:07,000 Speaker 3: To avoid confronting the truth, the investigator selects experiments that 1200 01:12:07,040 --> 01:12:11,000 Speaker 3: do nothing except perhaps add another significant figure to the 1201 01:12:11,080 --> 01:12:15,000 Speaker 3: result or measure a variant of the phenomenon. The investigator 1202 01:12:15,040 --> 01:12:18,120 Speaker 3: never finds the time to complete the critical measurement, which 1203 01:12:18,120 --> 01:12:21,799 Speaker 3: could bring down the whole house of cards. This connects 1204 01:12:21,800 --> 01:12:24,040 Speaker 3: to what we were talking about with polywater, where all 1205 01:12:24,080 --> 01:12:27,400 Speaker 3: this stuff's going on while there's never been a really good, 1206 01:12:27,560 --> 01:12:30,840 Speaker 3: high quality chemical analysis of polywater to say, are we 1207 01:12:30,960 --> 01:12:33,280 Speaker 3: sure we know what it is? Are we sure this 1208 01:12:33,439 --> 01:12:34,200 Speaker 3: is just water? 1209 01:12:35,280 --> 01:12:39,759 Speaker 2: Yeah? Yeah, Like the one thing that could really answer 1210 01:12:39,760 --> 01:12:43,120 Speaker 2: the question and determine whether we should proceed or not. Well, 1211 01:12:43,200 --> 01:12:46,479 Speaker 2: let's not do that. Let's just keep proceeding anyway. And 1212 01:12:46,520 --> 01:12:48,599 Speaker 2: there may be more to that. As we've discussed with polywater, 1213 01:12:48,640 --> 01:12:53,360 Speaker 2: there are other practical reasons why that major step 1214 01:12:53,400 --> 01:12:57,240 Speaker 2: hasn't been taken, one being the amount of polywater 1215 01:12:57,320 --> 01:12:58,120 Speaker 2: that is available. 1216 01:12:58,280 --> 01:13:01,599 Speaker 3: If it is practically difficult, that almost kind of makes 1217 01:13:01,640 --> 01:13:04,839 Speaker 3: it easier to perpetuate, you know, to move on without 1218 01:13:04,880 --> 01:13:09,360 Speaker 3: asking this critical central question. Now, what if somebody else 1219 01:13:09,479 --> 01:13:12,479 Speaker 3: does that critical experiment and reveals your new discovery to 1220 01:13:12,520 --> 01:13:16,000 Speaker 3: be wrong? In some cases, you will get people ignoring 1221 01:13:16,040 --> 01:13:19,320 Speaker 3: the finding or criticizing the method. For some reason, your 1222 01:13:19,360 --> 01:13:24,519 Speaker 3: results are not acceptable; maybe you did the experiment wrong. Again, 1223 01:13:24,640 --> 01:13:28,840 Speaker 3: you know, this is not usually regarded as good behavior retrospectively, 1224 01:13:28,920 --> 01:13:31,120 Speaker 3: but people can get really caught up, you know, in 1225 01:13:31,439 --> 01:13:32,880 Speaker 3: trying to pursue their programs. 1226 01:13:33,200 --> 01:13:34,719 Speaker 2: Yeah, I mean there are a number of human factors 1227 01:13:34,760 --> 01:13:37,759 Speaker 2: that go into this too, anyway: the ego, sunk time, 1228 01:13:38,280 --> 01:13:41,280 Speaker 2: sunk costs, so much, so many different factors. 1229 01:13:41,640 --> 01:13:46,360 Speaker 3: I would argue science is a superhuman process, but individual 1230 01:13:46,439 --> 01:13:51,559 Speaker 3: scientists are humans.
And one thing Rousseau points out that 1231 01:13:51,800 --> 01:13:54,559 Speaker 3: I think I would totally agree with is that in 1232 01:13:54,600 --> 01:13:58,360 Speaker 3: these historical examples where you get scientists getting caught up 1233 01:13:58,360 --> 01:14:03,360 Speaker 3: in pathological science, they're usually not doing so consciously. They 1234 01:14:03,400 --> 01:14:08,560 Speaker 3: are falling victim to unconscious bias while genuinely believing themselves 1235 01:14:08,560 --> 01:14:10,679 Speaker 3: to be on the right track. You might get little 1236 01:14:10,680 --> 01:14:12,880 Speaker 3: moments of doubt, like you could ask the question, 1237 01:14:13,240 --> 01:14:15,760 Speaker 3: why isn't more of an effort made to pursue that 1238 01:14:15,880 --> 01:14:19,760 Speaker 3: one really decisive central experiment? But, you know, that kind 1239 01:14:19,760 --> 01:14:23,559 Speaker 3: of thing, our motivations, can be unconsciously affected by biases as well. 1240 01:14:24,439 --> 01:14:26,840 Speaker 3: You know, like what seems like the most important thing 1241 01:14:26,840 --> 01:14:29,000 Speaker 3: to spend your time on can also be affected by 1242 01:14:29,120 --> 01:14:33,400 Speaker 3: unconscious biases. You may get the occasional conscious fraud, but 1243 01:14:33,600 --> 01:14:36,040 Speaker 3: I think most people who have the will 1244 01:14:36,080 --> 01:14:37,920 Speaker 3: and intent to do that sort of thing do not 1245 01:14:38,040 --> 01:14:41,360 Speaker 3: end up working in scientific research. It's not a very 1246 01:14:41,439 --> 01:14:47,160 Speaker 3: natural fit. And yeah, so Rousseau argues that 1247 01:14:47,200 --> 01:14:50,719 Speaker 3: in his experience, deliberate fraud in science is extremely rare. 1248 01:14:51,040 --> 01:14:55,120 Speaker 3: The more dangerous thing is these kinds of sketchy 1249 01:14:55,200 --> 01:14:59,040 Speaker 3: areas where you're relying on these weak results and maybe 1250 01:14:59,200 --> 01:15:03,960 Speaker 3: visual observations. It's this cocktail of self delusion and 1251 01:15:04,040 --> 01:15:07,759 Speaker 3: sloppiness that can sustain a project like this. And again 1252 01:15:07,960 --> 01:15:13,480 Speaker 3: it's important to note that, like, you know, unlike pseudoscience, 1253 01:15:13,520 --> 01:15:17,160 Speaker 3: which tends to never go away, you know, the pseudoscience 1254 01:15:17,160 --> 01:15:19,960 Speaker 3: comes up and it just keeps on going forever. One 1255 01:15:19,960 --> 01:15:22,479 Speaker 3: distinction does seem to be that these things identified 1256 01:15:22,520 --> 01:15:26,240 Speaker 3: as pathological science are transient phenomena. They pop up 1257 01:15:26,320 --> 01:15:28,519 Speaker 3: for a few years and people get excited about them, and 1258 01:15:28,560 --> 01:15:32,240 Speaker 3: eventually everybody's like, oh yeah, that was 1259 01:15:32,280 --> 01:15:33,400 Speaker 3: all wrong, and they move on. 1260 01:15:33,840 --> 01:15:35,840 Speaker 2: I guess one of the dangers is that you have 1261 01:15:35,920 --> 01:15:39,799 Speaker 2: pathological science, but then what if pseudoscience comes along 1262 01:15:39,880 --> 01:15:41,639 Speaker 2: and says like, hey, what do you got over there? 1263 01:15:41,840 --> 01:15:42,880 Speaker 3: Well, I think that does happen.
1264 01:15:42,960 --> 01:15:45,680 Speaker 2: Yeah, and then they latch onto it, or, you 1265 01:15:45,680 --> 01:15:48,720 Speaker 2: know, someone who's like, this is a great story. It's 1266 01:15:48,720 --> 01:15:51,479 Speaker 2: an entertaining story. Let's tell that story and people will 1267 01:15:51,479 --> 01:15:53,400 Speaker 2: love to hear it. And now it's taken on a life 1268 01:15:53,400 --> 01:15:57,439 Speaker 2: of its own outside, completely removed from the science system, 1269 01:15:57,800 --> 01:16:01,080 Speaker 2: and therefore now it will live on for an 1270 01:16:01,120 --> 01:16:05,480 Speaker 2: extended period of time no matter what science decides on it. Yeah, 1271 01:16:05,520 --> 01:16:08,639 Speaker 2: and of course the other factor on top of that is, well, 1272 01:16:08,680 --> 01:16:11,000 Speaker 2: if I buy into this story, if I buy into 1273 01:16:11,000 --> 01:16:15,000 Speaker 2: the pseudoscience, does it make something easier for me? Does 1274 01:16:15,040 --> 01:16:17,880 Speaker 2: it, like, does it give me some sense of hope? 1275 01:16:18,680 --> 01:16:23,840 Speaker 2: Does it sort of dispel some anxiety or fear that 1276 01:16:23,920 --> 01:16:26,160 Speaker 2: I have about the world or my place in it? 1277 01:16:26,400 --> 01:16:28,880 Speaker 2: You know, if it becomes useful in that regard to 1278 01:16:28,960 --> 01:16:35,200 Speaker 2: the individual or to corporations or to nation states, then yeah, 1279 01:16:35,280 --> 01:16:36,720 Speaker 2: that just feeds into it more. 1280 01:16:37,120 --> 01:16:40,280 Speaker 3: Or is it entertaining, or does it flatter 1281 01:16:40,360 --> 01:16:44,960 Speaker 3: my biases? Yeah, yeah. This may be counterintuitive, but 1282 01:16:46,360 --> 01:16:50,320 Speaker 3: framing it within thinking about the superhuman qualities of the 1283 01:16:50,320 --> 01:16:53,559 Speaker 3: scientific process as a whole, apart from just the individual 1284 01:16:53,680 --> 01:16:58,240 Speaker 3: human qualities of each researcher as a person, I kind 1285 01:16:58,240 --> 01:17:02,439 Speaker 3: of find the whole polywater story a bit inspiring. You know, 1286 01:17:02,600 --> 01:17:05,720 Speaker 3: it's weird to think of a story of a, you know, 1287 01:17:05,960 --> 01:17:11,760 Speaker 3: several years long enterprise of error as inspiring, but I 1288 01:17:11,760 --> 01:17:13,760 Speaker 3: do think of it as kind of inspiring because it 1289 01:17:13,880 --> 01:17:16,040 Speaker 3: shows that, you know, you can have a lot of 1290 01:17:16,080 --> 01:17:18,519 Speaker 3: sunk costs, you can have a lot of error and 1291 01:17:19,880 --> 01:17:24,360 Speaker 3: people fooling themselves into following something that isn't real, but 1292 01:17:24,560 --> 01:17:28,839 Speaker 3: you also get to see how the process eventually corrects 1293 01:17:28,840 --> 01:17:33,439 Speaker 3: itself and see what that involves. Like, how is it 1294 01:17:33,479 --> 01:17:36,400 Speaker 3: that we sort fact from fiction? And we can learn 1295 01:17:36,439 --> 01:17:37,720 Speaker 3: a lot from this kind of thing.
1296 01:17:38,600 --> 01:17:41,840 Speaker 2: Yeah, yeah, again, the idea that the system ultimately worked 1297 01:17:41,840 --> 01:17:44,560 Speaker 2: the way it was supposed to with polywater, that we 1298 01:17:44,640 --> 01:17:47,559 Speaker 2: figured out that it wasn't a thing and we moved on, 1299 01:17:48,360 --> 01:17:50,240 Speaker 2: and at the same time we can still look back 1300 01:17:50,280 --> 01:17:53,519 Speaker 2: on it and find the concept and the story of 1301 01:17:53,560 --> 01:17:58,600 Speaker 2: it entertaining and thought provoking. You know, it doesn't 1302 01:17:58,680 --> 01:18:02,040 Speaker 2: diminish that at all. So yeah, I agree, this 1303 01:18:02,720 --> 01:18:05,719 Speaker 2: is a fascinating journey, and it serves as an interesting 1304 01:18:05,760 --> 01:18:10,080 Speaker 2: model again to compare to these various other scenarios and 1305 01:18:10,160 --> 01:18:13,560 Speaker 2: hypothetical scenarios. All right, well, shall we go ahead and 1306 01:18:13,600 --> 01:18:14,400 Speaker 2: close it out there? 1307 01:18:14,760 --> 01:18:15,200 Speaker 3: I think so. 1308 01:18:16,000 --> 01:18:18,240 Speaker 2: All right. Just a reminder to everyone out there that 1309 01:18:18,320 --> 01:18:20,360 Speaker 2: Stuff to Blow Your Mind is primarily a science and 1310 01:18:20,360 --> 01:18:24,240 Speaker 2: culture podcast, with core episodes on Tuesdays and Thursdays, short 1311 01:18:24,240 --> 01:18:26,800 Speaker 2: form episodes on Wednesdays, and on Fridays we set aside 1312 01:18:26,800 --> 01:18:28,840 Speaker 2: most serious concerns to just talk about a weird film 1313 01:18:28,880 --> 01:18:32,360 Speaker 2: on Weird House Cinema. We have been around for years 1314 01:18:32,360 --> 01:18:35,760 Speaker 2: and years at this point, so if you go to 1315 01:18:35,920 --> 01:18:38,120 Speaker 2: wherever you get your podcasts and look up Stuff to Blow 1316 01:18:38,160 --> 01:18:42,240 Speaker 2: Your Mind, you'll find a pretty deep audio archive there. You 1317 01:18:42,240 --> 01:18:46,040 Speaker 2: can go back and find topics we've discussed in the past. 1318 01:18:46,080 --> 01:18:48,840 Speaker 2: You can also find topics that we've referenced here in 1319 01:18:48,880 --> 01:18:51,599 Speaker 2: this episode, and we encourage you to do that. Wherever 1320 01:18:51,640 --> 01:18:54,639 Speaker 2: you get the podcast, just rate, review, and subscribe if 1321 01:18:54,640 --> 01:18:54,960 Speaker 2: you can. 1322 01:18:55,640 --> 01:18:59,240 Speaker 3: Huge thanks as always to our excellent audio producer JJ Posway. 1323 01:18:59,400 --> 01:19:00,800 Speaker 3: If you would like to get in touch with us 1324 01:19:00,840 --> 01:19:03,439 Speaker 3: with feedback on this episode or any other, to suggest 1325 01:19:03,479 --> 01:19:05,320 Speaker 3: a topic for the future, or just to say hello, 1326 01:19:05,479 --> 01:19:07,960 Speaker 3: you can email us at contact at stuff to Blow 1327 01:19:07,960 --> 01:19:16,360 Speaker 3: your Mind dot com. 1328 01:19:16,400 --> 01:19:19,360 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 1329 01:19:19,439 --> 01:19:22,240 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, 1330 01:19:22,400 --> 01:19:39,600 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.