Toby Ball: Strange Arrivals is a production of iHeart 3D Audio. For full exposure, listen with headphones.

In October, in the midst of the pandemic, I spoke by Zoom with author Colin Dickey. He is the author of Ghostland, about American haunted houses, and teaches at National University. I spoke with him about his latest book, The Unidentified: Mythical Monsters, Alien Encounters, and Our Obsession with the Unexplained. I'm Toby Ball, and this is Strange Arrivals.

Colin Dickey: My name is Colin Dickey. I am the author of The Unidentified: Mythical Monsters, Alien Encounters, and Our Obsession with the Unexplained.

Toby Ball: So how did you come to write this book? What interest led you to do the research and then the writing?
Colin Dickey: I think I've always been interested in a set of beliefs and ideas you could maybe call fringe beliefs, for lack of a better term: the stuff that's not taken seriously by mainstream science or journalism or religion, the usual authorities, and thus gets shunted to the side as superstition. But I found that those kinds of beliefs, whether or not one believes they're true, have a pretty enormous effect on culture and on the way the world works; they have material effects. And it seemed necessary to talk about these things, not necessarily in terms of whether they're real or fake (although that's an interesting question), but in terms of what these beliefs do in the world and how they affect the world around us, even for those of us who don't necessarily believe in them.
And my previous book, Ghostland: An American History in Haunted Places, was similarly based around this question. Whether or not you believe that ghosts exist, it's undeniable that the language of ghosts and haunted places drives American culture and says a great deal about our history. So that's how I gradually got more and more interested in these quote-unquote fringe topics.

Toby Ball: Did you have a central thesis for this book? For people who haven't read it, it encompasses a variety of different fringe topics, including cryptids and UFOs. Is there some sort of central organizing thesis?

Colin Dickey: Well, it's funny how it started. This book in particular began in the wake of the election.
There was a lot of discussion about the ways in which social media companies like Facebook had been responsible, directly or indirectly, for pushing a lot of conspiracy theories into the mainstream. At the time, I thought that's undeniably true, and these companies have a real obligation that they aren't taking seriously. But at the same time, I wanted to push back on the idea that the rise in conspiracy theories was somehow unique to the social media era. What I wanted to do was a kind of genealogical history of a set of fringe beliefs and conspiracy theories. In the beginning, I cast a pretty wide net.
The first draft of the book had everything from antisemitic conspiracy theories like the Protocols of the Elders of Zion to flat Earth conspiracy theories, as well as cryptids and UFOs. As I worked through the material, I found that I didn't want to go that wide; I wanted to narrow it down. So I started to pare back my list of subjects. But I couldn't quite get away from the fact that the three topics that ended up being the driving force of this book, not just cryptids and aliens but also the lost continents of Atlantis and Lemuria, were all interrelated in some form historically.
The rise of the modern UFO movement comes out of Ray Palmer's magazine Fate, and before that, Ray Palmer had introduced the world to the Shaver Mystery, these sci-fi but supposedly true stories about weird underground races that controlled our minds and thoughts. Palmer had titled the first installment of the Shaver Mystery "I Remember Lemuria." So I couldn't get to UFOs without first talking about Lemuria, this fabled lost continent, the Pacific Ocean's version of Atlantis, which has a long history in and of itself. This is a long answer to your question, but it did start with a wide net.
I found these three topics intertwining in such a way that I wanted to figure out a way to account for them, a kind of hypothesis that would encompass these three areas and explain why they seem so interrelated.

Toby Ball: While you were talking, it occurred to me that there are certain figures in the book who seem to drive certain of these ideas. Do you have any sense of why certain ideas make it into wider fringe belief while others die on the vine?

Colin Dickey: Yeah, that was one of the questions that drove me to figure out how this book was going to work. For example, one of the things I talk about in the book is the Gloucester sea serpent, an extremely well-documented cryptid sighting in the eighteen-teens, I think 1817, that has been more or less ignored by a lot of mainstream
understandings of cryptids in favor of the Loch Ness Monster, Bigfoot, and a couple of others. So partly I was asking why this thing, which by any objective standard is much better documented than Bigfoot or the Yeti, nonetheless gets placed second fiddle. I really wanted to understand that. And what I found was that in a lot of cases these beliefs, these beings, these objects, whatever you want to call them, light up a kind of central narrative that works for the person who believes in them. That's an abstract way of saying that they do something for people; they explain the world in a way. For everyone who has actually had a UFO sighting of some form or another,
there are a bunch of other people who haven't, but who, to quote The X-Files, want to believe, because the idea of UFOs is pleasing. The possibility is exciting, or what it says about our relationship to the government checks out with their beliefs. It does something for them in a way that goes beyond just the question of documentary evidence.

Toby Ball: You've got a long narrative section on UFOs. Can you talk a little bit about your take on that history and timeline?

Colin Dickey: Yeah. The idea that there are unidentified objects in the sky goes back millennia; you can find it in ancient Greek and Roman writing. So it's not as though when Kenneth Arnold sees nine bat-wing-shaped silver objects flying over Mount Rainier, that's the beginning of UFO sightings by any means.
To start there is in some sense a little bit arbitrary, but it is also the moment at which those strange objects in the sky become crystallized as a certain kind of thing. Prior to that, again going back millennia, you have objects in the sky that are omens or signs from God or whomever, and that's one stable narrative trajectory; or they're magical in some sense or another. What happens then is that suddenly these things are no longer considered supernatural. They're paranormal while also being scientifically plausible. And I think that changes how we see what a UFO is, or what a thing in the sky might be, in a way that's new and different.

Toby Ball: So there's this progression in conceptualizing what the UFOs are like.
You start off with the contactees, who by modern standards seem a little quaint. But beyond that, there's this sense that people's ideas about UFOs, and about how the government is dealing with them, reflect what's going on in government outside of fringe matters, and people's perceptions of that. You brought up in the book Vietnam, COINTELPRO, Watergate; there are lots of examples. What's your sense of that?

Colin Dickey: Yeah. One of the things that became clear to me really early on is that in a lot of ways belief in cryptids parallels belief in UFOs, and you see that in terms of who believes what. But very early on it also became clear to me that there's one really specific way in which they're different.
You either believe in the Loch Ness Monster or you don't, but you don't believe that the existence of the Loch Ness Monster is being covered up by the government. Whereas if you believe in UFOs, almost by default you believe that the government knows more than it's letting on, and that it's being hidden from us in some fashion. I found that striking, considering that prior to World War II both of these ideas were pretty synchronous in terms of the shape of the narrative that accompanied them. That changes suddenly in the wake of World War II, and for me really in the wake of the Manhattan Project, which was an incredibly successful attempt to keep a completely unknown, unknowable, and world-changing technological weapon entirely out of the public eye until it was used. The amount of secrecy required to keep the Manhattan Project secret, and the success they had, gave the world,
and specifically Americans, the sense that the American government was capable of doing absolutely unheard-of things in utter secrecy. And I don't think it's a coincidence that so much of modern UFO lore is set in the deserts of New Mexico and Nevada and Arizona, precisely where the Manhattan Project was located, outside of Santa Fe. So the Manhattan Project suddenly makes the idea that the government is capable of covering up something beyond belief quite plausible. The modern UFO myth really comes to reflect that idea: that it is absolutely within the realm of plausibility that the government has these secret, supernatural, paranormal things going on at all times, and that it's only a matter of time before we all know about them.
Toby Ball: One of the things you cover in your book is the story of Paul Bennewitz, William Moore, and Richard Doty. What's your sense of how that fits into what you were just talking about?

Colin Dickey: The story of Paul Bennewitz. So what I was just talking about is, I think, a credulity among people about what the government is capable of doing, the lengths of secrecy it can sustain, which frankly I myself have difficulty accepting a lot of the time, because I look at actual conspiracies, like Watergate. As soon as Woodward and Bernstein at The Washington Post started asking questions about Watergate, the whole thing fell apart; it was barely two years before Nixon resigned. Once you actually start asking around about government secrets, it's usually the case that if they are big enough and momentous enough, you'll find people who can talk.
You can find secretaries, you can find janitors, you can find bookkeepers. Somebody knows something and is willing to talk. So that's why I've always been somewhat skeptical about some of the more outlandish government conspiracy theories. And then I came across the story of Paul Bennewitz, and I was frankly flabbergasted by the depths of cynicism and disdain with which a government operation would treat not just an American citizen but a veteran, someone who worked for the government and deeply loved his government. Paul Bennewitz, living on the edge of Kirtland Air Force Base in Albuquerque, began to see things above the skies of the base, photographing them and distributing the images to his UFO-believer networks.
It seems to me that the Occam's razor explanation is that there was some sort of top-secret craft being tested out there. The Air Force could have called in Paul Bennewitz and said: look, what you saw was classified, and you, being a good patriot, we're just going to ask you to stop sending out those images. And I have no reason to think that Paul Bennewitz, who not only was a patriot but whose livelihood, his company, was an Air Force contractor, would have sacrificed his livelihood. But they don't do that.
Instead, they say: we are going to pump this guy with an elaborate campaign of disinformation. It got to the point where they gave him a computer that they said was receiving extraterrestrial communications of unknown origin and asked whether he could help them decode any of it, and then they just bombarded this computer with gibberish that they were broadcasting from across the street, to make him think he was getting alien signals. It was this very bizarre, elaborate, sustained campaign to take a law-abiding, patriotic American citizen and more or less drive him crazy, to push him to the point where nobody would take him seriously because he was advancing these wild and unverifiable theories about aliens. It's a really sad and tragic story; there's so much pathos involved. And I think it speaks to a kind of weird cynicism that seems to drive some levels of the U.S.
government.

Toby Ball: Yeah, it just seems so unnecessary. Like you were saying, I was struck by that too. This seems like a lot of effort to do something that's completely unnecessary, and with malicious intent.

Colin Dickey: My take, in some sense, is that they were testing to see what they could get away with. Not only was it unnecessary, but I think there was some element among the Air Force intelligence officers involved of "what happens if we do this?" It was a little guinea-pig experiment that they carried out on an unwitting guy who ended up really suffering through what seem like a lot of the hallmarks of clinical psychosis as a result.

Toby Ball: I'm going to read a brief part. This is actually in the context of cryptid hoaxes, but I thought it was widely applicable.
You're talking about Kevin Young's maxim, quote, "hoaxes prove that believing is seeing," end quote. "Whatever is in those documents is what you choose to put into them, whatever you need them to be." And you pick up the quote again: "The hoax is rather a kind of coded confession, revealing not only a deep-seated cultural wish, but also a common set of themes or feints or strategies that add up to a ritual." That seems to me fairly widely applicable to almost all the stuff you're talking about here.

Colin Dickey: Oh yeah, definitely. Kevin Young's book Bunk, for those of you who don't know it, is absolutely fantastic. What Young is interested in is particularly hoaxes that reflect American ambivalence about race: various white authors who published fake memoirs as black or Native American writers and then were found out, that kind of thing.
And so what Young is tracing, which I think is really valuable and important, is the way in which, as that quote implies, a hoax is a kind of attempt to make a myth into reality. Again, this is his argument more than mine, but I'm paraphrasing an argument I agree with: if you have a narrative about inner-city crime or urban poverty or whatever, and then a white person comes along and writes a memoir as a Black person, bearing out those myths and misconceptions, that's what the hoax is there to do. The hoax is there to make the false thing look true. And so with my book, when I'm moving into things like cryptids and UFOs, there's a lot I talk about where I don't have any evidence to believe a given story is a hoax. I don't have a good explanation.
I think there's a lot that remains unexplained, so I by no means want to assert that the whole field is full of hoaxes. But we do have a number of very obvious, documented hoaxes. In fact, one of the more tragically weird stories is this guy near Kalispell, Montana, who buys a ghillie suit, a sort of extreme camouflage where the suit is supposed to make you look like a pile of leaves, so you can lie prone and be a sniper or whatever. He wears the suit and wanders out onto the highway at night, hoping that a motorist will catch just a glimpse of him and believe it to be a Bigfoot. So it's a Bigfoot hoax. But he gets hit by not one but two cars, and he ends up dead.
And so there's this really strange story about an attempt to create a hoax, to gin up belief in a thing this person knows to be false, that ends up backfiring in this horrible way. And I think when you catalog a lot of the history of hoaxes, the UFO photos where it's clearly a hubcap or a side-view mirror or something like that, and you aggregate all of them into a kind of compendium, what you get is, again and again, an aching desire to make a thing true at all costs, even to the point of inventing the thing from whole cloth.

Toby Ball: That's interesting. Going back to Doty and Bennewitz: one of the big differences, although it's a big hoax, is that it's not coming from this sincere hope to make something you wish was real and have other people also believe is real. It's just this absolutely cynical exercise.

Colin Dickey: Yeah, that's a good point.
And at the same time, what you're saying makes me think it's also an attempt to see how far it goes. I guess the thing is, the more people desire something to be true, the easier it is to feed them lies. People buy things because they want to believe that they're true. So I feel like with Doty trying to spread these hoaxes throughout the UFO community, it's also a test to see, to quote the Bee Gees, how deep is this love. How easy will it be to get people to believe this? Because that becomes an index of how strong the belief is. And it turns out that, until Bill Moore gave his mea culpa, the belief was really strong. People were desperate, even to the point of accepting things that didn't hold up under any scrutiny whatsoever, but they accepted them as true because of that strong desire to have the thing be real.
Toby Ball: Another thing you bring up that I thought would be interesting to the people who are listening is this idea of stigmatized knowledge. I realize I'm kind of pulling a term out of the book, but do you...

Colin Dickey: Yeah. This is difficult for me to parse, because I find myself often on both sides of it in some ways. When I was trying to make sense of the genealogy of some of these beliefs, I kept coming back to this period at the end of the nineteenth century when scientific methodology and thought become institutionalized. You have the rise of PhD programs, which don't exist in America before the eighteen seventies. You have the rise of professional organizations, so you can't just be a country doctor anymore; now you have to belong to the American Physicians Association or whatever it is. And so things get ossified in these institutions.
And in some ways I think that's good. It prevents, I don't know, quacks from taking people's money for dubious medical practices. It allows you to make scientific discoveries at scale, which are much easier in a university with grant funding or whatever. But it also creates this group of outsiders who, rightly or wrongly, feel that they are not going to be taken seriously because they're not part of this institution, and that their contributions should be considered just as valid.
What I found is that this period of the institutionalization of science is when you first get the rise of what I call in the book, and what they were known as then, cranks: people like Ignatius Donnelly, who was not an archaeologist or an anthropologist, nor did he do any fieldwork, but who writes the first modern book on Atlantis, arguing that it was a real place, that it existed and sank into the sea. He has no scientific basis for this, but he's not daunted by that, and the book turns out to be pretty popular. So this institutionalization of science also brings the rise of cranks, people on the outside of science arguing for scientific knowledge and fact. And stigmatized knowledge, I think, comes out of that divide.
I mean, what do you do with this stuff that sits outside the bounds of acceptable science and thus isn't going to be taken seriously in mainstream scientific discourse? On the one hand, that seems good, that seems right. On the other hand, science is only ever as good as the scientists who do it. Science gets it wrong all the time, and we depend on a kind of external system of checks and balances to make sure that institutionalized science isn't itself making dangerous errors of fact and judgment, which it is often capable of doing, everything from, I don't know, eugenics on down. So on the one hand, I find myself often skeptical of a lot of the stuff that falls under that category of stigmatized knowledge.
I also find it a really vital and valuable arena, precisely because it offers one of the few checks we have against an institutionalization of science that might be prone to its own problems.

Toby Ball: It's interesting, you quote Don Donderi in the book. I had a fairly long conversation with him for the first season of Strange Arrivals, and a fair amount of it was about the frustrations of trying to do scientific research on UFOs, and how it's just not part of science, right? There are no grants you can get; there's no place that's going to peer-review papers about UFOs. It just sort of exists in this place, and he was expressing his frustration at trying to get anybody to take it seriously for those reasons.
Colin Dickey: On the one hand, I think that's a valid complaint. And I also saw that a lot with Loch Ness monster believers: it's not cheap to trawl the bottom of Loch Ness with a sonar array looking for unidentified large creatures. It takes funding, and mainstream science is not going to pay for it, and thus it doesn't get done, and it becomes a sort of circular argument, where people say, well, we have no scientific evidence, but we're also not going to pay for the scientific inquiry that might turn up that evidence. I get that that's frustrating. The flip side is that the cost of doing that kind of research has actually plummeted with advances in technology, with what you can accomplish and document with a freaking iPhone, to say nothing of ten dollars or whatever worth of high-tech equipment, which can get you quite far.
And so we see, on the one hand, this sense of, well, we're not putting enough money into possibly finding these discoveries. But at the same time, we're at a point, technologically, where these discoveries actually don't take that much money anymore, and we're still not finding them.

Toby Ball: All right, well, I really appreciate your time. This was awesome.

Colin Dickey: Yeah, a lot of fun. Thanks for reaching out.

Toby Ball: Thanks. Have a good rest of your day.

Colin Dickey: Yeah, you too. Thanks a lot.

Toby Ball: Strange Arrivals is a production of iHeart 3D Audio and Grim and Mild from Aaron Mahnke. This episode was written and hosted by Toby Ball and produced by Miranda Hawkins and Josh Thain, with executive producers Alex Williams, Matt Frederick, and Aaron Mahnke. Learn more about Strange Arrivals over at grimandmild.com, and find more podcasts from iHeartRadio by visiting the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.