Speaker 1: Mission Implausible is now something you can watch. Just go to YouTube and search Mission Implausible podcast, or click on the link to our channel in our show notes. I'm John Sipher, and I'm Jerry O'Shea.
Speaker 2: We have over sixty years of experience as clandestine officers in the CIA, serving in high-risk areas all around the world.
Speaker 3: And part of our job was creating conspiracies to deceive our adversaries.
Speaker 2: Now we're going to use that experience to investigate the conspiracy theories everyone's talking about, as well as some you may not have heard.
Speaker 4: Could they be true, or are we being manipulated?
Speaker 2: We'll find out now on Mission Implausible. So today's guest is Tamsin Shaw. She's a professor of European and Mediterranean Studies and Philosophy at NYU. She's written about psychology, Silicon Valley and the effects of social media, the rise of oligarchs overseas and at home, American intelligence, Edward Snowden, and much else, often in The New York Review of Books and elsewhere.
She's now working on a book on our susceptibility to influence operations. So welcome.
Speaker 5: Thank you.
Speaker 2: I know you teach a course at NYU on psychological manipulation in politics, including Nazi propaganda and the Stasi, etc. But in it you talk about conspiracy theories and how they might be changing with the Internet and AI.
Speaker 5: Yeah, I'm very interested in the form that they take nowadays, partly because everybody's so focused on disinformation as a national security concern and a general political concern. But a lot of the conspiracies and conspiracy theories now are just put together with true facts, and they leave the audience to make the inferences, the causal inferences, about what's going on. So a good example would be the Loose Change documentary about 9/11, where it just starts in with reciting facts, actually starting with the Reichstag fire, going through Roosevelt having known about Pearl Harbor. I'm not saying that all of these facts are true, by the way, but it's just presenting them as facts, and then everything up through Vietnam and JFK, of course, to the present day.
So it just presents the sequence of events, and it doesn't tell you who the conspirators are or what the conspiracy is. It just leaves you to draw the inferences yourself, and I think that's something people like doing, because then they feel as though they're discovering something if they put it together for themselves. And that's, I think, something the Internet has really encouraged, because people used to get information in some kind of critical context. If you wanted to find out what happened, you'd go to a book or a newspaper or congressional papers, whatever it was. But now, on the Internet, you can just look up facts in this very piecemeal way, join the dots yourself, and it doesn't exist in any critical context at all, so there aren't really any constraints on what people are going to do with those facts. And I think it's encouraged this way of putting something together yourself that is a very exciting narrative, and you're the hero in the narrative, of course, because you're finding out about this big plot that nobody else knows about.
And I think organizations like WikiLeaks encouraged that a bit by putting out, as they said, raw information. Of course it was always very highly edited, in fact, but they're saying, look, we're just giving you the information and you can put together yourself what's happening. And all of their audience were putting together the deep state being in charge of everything.
Speaker 3: We use the terms conspiracy theory and conspiracy, and they mean different things, right? And you deal a lot with philosophy, and, like, why is it that people want to deal in conspiracies and conspiracy theories? And we're talking about Steve Bannon, right, great conspiracy theorist, and he loves the Knights Templar. Let's go back to the thirteen hundreds with the Knights Templar. There's this powerful organization. They're rich, they're actually the bankers of Europe. They bring in all these, for those days, revolutionary new banking procedures. And in the end, the King of France accuses them of heresy and of worshiping a goat-headed figure called Baphomet, and he burns them at the stake. But this is this grand conspiracy that's been pushed by the King of France.
Speaker 4: But in the end, of course, he takes their money.
Speaker 3: And the whole point of this is that the King of France then seizes all the assets that they have, and the King of France is in a dire financial situation. So there was a real set of conspiracy theories pushed out to the larger population. There are a lot of facts involved in this, and yet at the end of the day it was a conspiracy to push a conspiracy theory, in that it was pretty clear that the Knights Templar were not worshiping a goat-headed figure. And we could play that all the way to today. So when people are looking at their phones, sitting on the toilet, connecting these facts, oftentimes they're led in a certain direction. So I was wondering, returning back to this century, have things changed all that much, with people being directed in certain ways?
Speaker 5: No, I think people are directed. Even in these fact-listing conspiracy theories, as I call them, they're still selecting the facts carefully to lead people in a particular direction. So I think they always tend to be led top down.
Maybe QAnon evolved in a slightly more organic way, because it originated as LARPing, or live action role-play, on 4chan, but then there's been a lot of directing it toward a particular political agenda, of course.
Speaker 2: So essentially what you're saying is, it used to be that conspiracy theorists had to come up with sort of complex elaborations, right? Whatever, Nazi propaganda about the Jews and things like that. But now, you're saying, essentially because of the Internet, really, and Twitter and all these other social media things, you don't need to really fabricate anything sophisticated. You can just do your own research, put a few facts together, and the Internet helps all those things connect. And then what comes out of there is you get the thrill of discovery, of thinking you're putting together some really cool and complex thing that nobody knows about. It does seem to me, if that's true, that means there are more and more conspiracy theories making their way to the Internet, and it seems like AI would then suck that in and turbocharge the process.
Speaker 4: Yeah, you worry about it.
Speaker 5: I do worry about that, not because I think AI develops conspiracy theories. If you actually ask it to, it's very old school. So I asked for a conspiracy theory showing that Frank Sinatra killed JFK, for instance, and it gives you a whole lot of complex motives and relationships between people that are really dispensable for today's conspiracy theories. You don't need them. What you need to do is just list some pertinent facts and then imply a causal relationship between them, and that's very easy to do now with facts on the Internet. And the way that AI generates knowledge for people, again, if they ask it a question, it will give them a factual answer, but one that isn't part of any critical context unless they look at the sources. And then the sources are often very bad, because they have to be ones that are just most appropriate for these large language models to use. And I've found that if it's anything to do with, say, JFK, they go straight for the conspiracy theory ones, and of course there's way more of that content out there for them to use.
And there's going to be this recursive problem where people rely on the LLMs to get information, and then they use that information in what they write, and then that's all sucked back up into the LLMs. And it's so much harder to get the truth out there, and to research and find out the truth, that it just seems clear to me that we're going to end up with endless nonsense and people having very little capacity to differentiate between what's true and what's not.
Speaker 3: Before the advent of the Internet, by and large, we would look to experts for facts, right? And what a fact is, the very definition of the word. I spent three decades, as John did, in CIA. What do I know about making vaccines? I'm not even sure what an RNA-enabled vaccine is. I don't even have the terminology for it. And I look to the FDA, and I look to medical experts, right? And yet I don't know these things, right? And so it seems, before, you studied the Third Reich, and Hitler could claim that, oh, we were attacked by the Soviet Union.
We had no choice but to invade Poland, the Poles were... And yet even he had to sort of, you know, he had to set up a fake attack to present that. Are we losing the ability to have rational discourse if we can make up our own facts and there are no referees? And I'm just thinking RFK Junior, right? I mean, you have basically the entire medical community saying one thing, and him saying, I've got a brain worm, but it's because I've got the facts and I have the power.
Speaker 5: People can't invent facts, because facts just exist in the world, but people can invent claims about facts, and a lot of them are going to be false. Actually discovering the facts is relatively difficult. Making them up is very easy. And that's why, with the immediacy of communication on the Internet, it's just bound to be the case that you drown out the true facts with the nonsense unless you have, as you say, gatekeepers or expertise.
And people believe in that less and less, partly for what is a good motivation, which is that people want to know not just what the facts are, but what somebody's justifications are for saying those are the facts. So they want to go through the whole process themselves and find out where this comes from. But the Internet has encouraged these forms of interaction that are very fast, and they don't actually give people time to do that research and find out whether a claim is justified.
Speaker 3: Well, you know, it is a fact that forty-seven point three percent of all statistics are made up on the spot.
Speaker 5: Yeah, so I've heard.
Speaker 2: So when you work with a class like this and you talk about these things, AI and Silicon Valley, what do the students take out of it? What is it that surprises them? Is this generation different in how they process these things?
Speaker 5: Yeah, sometimes they surprise me, because they're so much more online than people of my generation that they know a huge amount of this stuff.
One of them was just telling me this week about white nationalist propaganda on, I think it's [inaudible], which doesn't say anything explicit about white nationalism. It just shows you, for instance, these streets from the nineteen fifties, full of white people who are nicely dressed or shopping or whatever it is, and there's no explicit message. But the people who subscribe to that content are going to understand what the message is supposed to be, and other people are going to get sucked in just out of curiosity.
Speaker 3: Very often, it seems to me... I also studied the rise of fascism in Germany and other places, and a lot of it was around the basic myths that people believed.
Right. In Germany, it was the time of past greatness, the Völkerwanderung and the Germanic peoples coming in. And the whole sense, not so much of racial purity, but the sense of pride in German nationalism, comes much earlier, with the Wandervogel, where people were playing folk music, singing with guitars as they wandered through the streets of Germany. And then this was taken in a different direction by the Nazis. And white nationalists today are also doing this. It's a myth that in the nineteen fifties everybody was, like, Leave It to Beaver, right? I mean, if you were African American, the United States was not all that great in the nineteen fifties. Or there was polio in the nineteen fifties, and there were higher instances of domestic abuse.
Speaker 5: I'm very interested, as you say, in the example of the Nazis, because they really exploited these existing myths in German culture. I was playing my students a bit of Wagner to show them how powerful this was.
Of course, Hitler was obsessed with Wagner, and I don't think you can understand the power of somebody like Hitler without seeing how this myth of his charisma developed. And that's because people already had this really strong desire for a leader. That was part of this cult of Germany being in need of salvation, and people were looking for some kind of strong, charismatic leader. In fact, Hitler was, early on, and then it only gradually became clear to everybody that it should be him. People like Goebbels were looking for this savior everywhere and describing it in very pseudo-religious terms. So there's, I guess, the mythic content, which refers to the past, and all forms of nationalism are mythic in the sense that they create this imaginary past. And then there's what I would call fantasy, which is the idea that's somehow infusing the present, and something like National Socialism contains an enormous element of fantasy and just fantastical beliefs.
Speaker 3: I just want to chip in and say it wasn't just the Nazis. They're the easiest, because, yeah, we've got movies about them, but the same tendencies were present in Japan. You know, that they are the chosen race, their emperor was descended from the Sun God himself. And Mussolini going back to the Roman Empire, and the Hungarians going back to the time of the Magyar hordes coming in. So this is really something that was widespread, this looking back to the glories of the past, looking back for a leader who can take us back there, and then matching that with intolerance of others. And most importantly, but also vital to this, was you have to blame someone for why we're not there.
Speaker 5: You see this playing out in Britain now, because of course the English especially have been really good at this myth making. They've co-opted a lot of artists and musicians, and we all sing Land of Hope and Glory at the Proms, and the kind of imperial patriotism that existed.
I'm reluctant to call it nationalism, because we have English and Scottish and Welsh nationalisms that are separate from that patriotism about Britain and about the Empire. So Britain, I think, developed in the Victorian times, especially at the height of the Victorian Empire, a huge amount of mythology around this, and a lot of it was just very flimsy. Of course, it's based on untruth, or just the pseudo-religious stuff. And now Britain is no longer a great power, and people are, I don't think exactly resentful of that, but there's a kind of nostalgia for the Britain that was a great power. And people like Nigel Farage and Reform UK are obviously playing on that, and they're identifying the enemy, which of course is immigrants, and the immigrant is somehow polluting that idea of Britain, or sometimes it's that idea of Englishness. The conservative philosopher Roger Scruton, whom a lot of these guys liked, wrote a lot about Englishness, and it's very much this myth of the village green and playing cricket and tea and waving the flag. And it's a very kind of clichéd Victorian image.
Speaker 2: So we're doing the same thing. MAGA is looking back at a mythic past. These immigrants are some sort of enemy, when in fact we're all immigrants. I wondered, when you talk about these things, are leaders knowingly taking advantage of this, or is this something that's just deep in human beings, and they just want to grasp for something to make themselves feel better about themselves, and that's about others and about sort of power against their enemies in the next state over?
Speaker 5: People will always say it's something deep in human beings and that they're just tribal. But if you try to give explanations in any particular case, you find that nationalism especially hasn't always existed as a form of belonging. It was invented. I don't want to always go back to the fascists, because it's such a cliché, but Hitler wanted to create this idea of populist leadership where he and the people were one.
And he says in his nineteen thirty-four speech at the Nuremberg Rally that it's not the state ruling the people, but the people and the state are the same thing, and everybody has to believe in this "we," so that they're part of this "we" with the leader. And the way that you can most easily get that isn't by making everyone subscribe to some common values or set of common beliefs. It's just by having a common enemy. You just have them and us, and that's the easiest way to do it. And it's clearly what people are doing now on the populist right.
Speaker 4: The "we" changes, the "we."
Speaker 3: During the Third Reich, just running with this, it was the Germanic people, and they would test bloodlines. Going to join the SS, you had to go back generations to make sure there was no Jewish or French or Italian blood in you. And the United States was no different. My favorite political party was the Know-Nothings.
So, the eighteen fifties, right? And they were built on a conspiracy theory that the Irish and the Germans, the Catholics, were coming in, the Papists were coming in to destroy their Protestant republic by being there. And yet today the descendants of those Irish and Catholics, who were supposedly a fifth column to destroy the republic, are now themselves embracing conspiracy theories about Central Americans coming in, right, fellow Roman Catholics, as the other. And then in the nineteen twenties we had the KKK, where, like, the President of the United States was in the KKK. And the only explanation I can come up with, I think, is that it's being manipulated, I know this sounds conspiratorial, but it's being manipulated by people with power.
Speaker 4: Right. Who is us? Who were they?
Speaker 3: And how do I... You may actually believe it, I think Stephen Miller probably does, right? But who is us and who is they?
Speaker 5: It's always something that's constructed.
So Irish people weren't originally 333 00:19:03,144 --> 00:19:05,304 Speaker 5: considered to be white, and if you look at what 334 00:19:05,344 --> 00:19:08,624 Speaker 5: white means in America, there are a lot of people 335 00:19:08,744 --> 00:19:11,664 Speaker 5: who in fact don't really know whether they count as 336 00:19:12,104 --> 00:19:14,984 Speaker 5: part of this white nation that the white nationalists are 337 00:19:15,024 --> 00:19:18,744 Speaker 5: talking about. It's a big gray area because these terms 338 00:19:18,784 --> 00:19:21,864 Speaker 5: are just invented and they don't necessarily have anything to 339 00:19:21,904 --> 00:19:26,224 Speaker 5: do with any physically existing reality, whether it's genetic or 340 00:19:26,264 --> 00:19:29,344 Speaker 5: skin color or whatever it is. And that's, I think, 341 00:19:29,784 --> 00:19:34,744 Speaker 5: always the case, that this exclusion takes place, and it's 342 00:19:34,824 --> 00:19:38,384 Speaker 5: people in power defining categories in order to do that, 343 00:19:38,744 --> 00:19:41,024 Speaker 5: and then other people who want to be part of 344 00:19:41,064 --> 00:19:44,264 Speaker 5: the in-group, of course, willingly subscribe to it, so 345 00:19:44,424 --> 00:19:46,384 Speaker 5: it takes on a power of its own. 346 00:19:46,424 --> 00:19:49,384 Speaker 3: I believe it's in the nineteen twenties, the Supreme Court 347 00:19:49,504 --> 00:19:53,184 Speaker 3: had to decide whether Armenians were white. Yes, and it's 348 00:19:53,224 --> 00:19:55,264 Speaker 3: actually a court case that came up, and then 349 00:19:55,304 --> 00:19:57,584 Speaker 3: they had to figure out whether Punjabis are white. 350 00:19:57,624 --> 00:20:00,504 Speaker 3: And they're like, okay, Armenians are white because they're Christians.
351 00:20:00,824 --> 00:20:04,184 Speaker 3: But Punjabis from northern India, who may have, like, green 352 00:20:04,304 --> 00:20:07,344 Speaker 3: or blue eyes, yeah, they're not, even if they're Christians. 353 00:20:07,344 --> 00:20:09,783 Speaker 5: It's so interesting you use that example, because my 354 00:20:09,944 --> 00:20:14,704 Speaker 5: daughter's father is Armenian and she considers herself Armenian, and 355 00:20:14,744 --> 00:20:17,944 Speaker 5: she's very recently been coming and asking me, am I 356 00:20:18,024 --> 00:20:21,704 Speaker 5: white, because of content she's seeing online where people are 357 00:20:21,744 --> 00:20:26,584 Speaker 5: saying Armenians are Middle Eastern or Armenians are Asian, and 358 00:20:27,024 --> 00:20:30,263 Speaker 5: it's an area of confusion. And mostly what people in 359 00:20:30,304 --> 00:20:33,904 Speaker 5: America know about Armenians comes from the Kardashians. And I 360 00:20:33,944 --> 00:20:37,584 Speaker 5: don't think that's a... Cher, Cher, yeah, oh yeah, 361 00:20:37,624 --> 00:20:38,184 Speaker 5: and Cher. 362 00:20:38,264 --> 00:20:53,024 Speaker 2: Of course. Let's turn to some of the tools, because 363 00:20:53,064 --> 00:20:55,864 Speaker 2: you write a lot about Silicon Valley and a lot 364 00:20:55,904 --> 00:20:57,664 Speaker 2: of the tools that we're using now. You write about 365 00:20:57,664 --> 00:21:01,144 Speaker 2: the sort of toxic role of Facebook, WhatsApp, YouTube, how 366 00:21:01,184 --> 00:21:03,984 Speaker 2: they in some ways are rewiring our minds. But 367 00:21:04,024 --> 00:21:05,704 Speaker 2: I know you just did a book review of someone 368 00:21:05,784 --> 00:21:07,864 Speaker 2: in the New York Times and talked about how some 369 00:21:07,904 --> 00:21:09,904 Speaker 2: people are trying to look at why not 370 00:21:09,984 --> 00:21:12,664 Speaker 2: everybody gets sucked in.
So what is it about the 371 00:21:12,784 --> 00:21:16,584 Speaker 2: mind's wiring that protects some and then brings in others? 372 00:21:16,904 --> 00:21:19,704 Speaker 5: I wish I knew the answer to that. Yeah, I know, 373 00:21:19,944 --> 00:21:25,264 Speaker 5: it's also designed to exploit certain people. The divisiveness, 374 00:21:25,344 --> 00:21:28,504 Speaker 5: the trolls and the bots that are responsible for it, 375 00:21:28,784 --> 00:21:33,263 Speaker 5: they know what they're doing. Yeah, 376 00:21:33,464 --> 00:21:38,024 Speaker 5: people think that information has become more democratic because we 377 00:21:38,064 --> 00:21:41,384 Speaker 5: have the Internet and we have AI and we can 378 00:21:41,424 --> 00:21:44,344 Speaker 5: consult these LLMs. We just have to remember that all 379 00:21:44,424 --> 00:21:49,464 Speaker 5: of those companies are owned by these oligarchs, essentially an 380 00:21:49,584 --> 00:21:53,344 Speaker 5: oligarchy, and they have their own agendas. They're all far right, 381 00:21:53,624 --> 00:21:56,584 Speaker 5: a lot of them are pushing white nationalism, and they 382 00:21:56,704 --> 00:22:00,064 Speaker 5: set the algorithm to tell people what they want to 383 00:22:00,104 --> 00:22:03,304 Speaker 5: tell them. So if you ask ChatGPT, 384 00:22:03,544 --> 00:22:07,304 Speaker 5: for instance, describe Ukraine to me, it will give you 385 00:22:07,344 --> 00:22:10,344 Speaker 5: a description. And then if you say describe 386 00:22:10,784 --> 00:22:14,984 Speaker 5: Ukraine in the style of RT, the Russian propaganda channel, 387 00:22:15,224 --> 00:22:18,944 Speaker 5: it will give you a completely different description, just selecting the 388 00:22:18,984 --> 00:22:23,704 Speaker 5: facts differently, creating a different narrative.
And you realize whatever 389 00:22:23,744 --> 00:22:27,144 Speaker 5: the default setting is for the algorithm is set by 390 00:22:27,184 --> 00:22:31,104 Speaker 5: these guys like Musk and Thiel and their friends, who 391 00:22:31,224 --> 00:22:36,024 Speaker 5: all share a very particular political agenda. And that's not 392 00:22:36,104 --> 00:22:41,504 Speaker 5: something democratic. It's everybody being manipulated by the same group 393 00:22:41,544 --> 00:22:43,664 Speaker 5: of people. Again, this sounds like a conspiracy theory. 394 00:22:43,704 --> 00:22:44,464 Speaker 4: Well, 395 00:22:44,384 --> 00:22:47,344 Speaker 3: it sounds like a conspiracy, but it's one that I'm open to. 396 00:22:47,664 --> 00:22:50,504 Speaker 3: Who does control the algorithm? John and I, coming out 397 00:22:50,544 --> 00:22:53,944 Speaker 3: of CIA, know the algorithms in Russia are controlled by the 398 00:22:54,064 --> 00:22:56,784 Speaker 3: state. What you get to read and what is 399 00:22:56,824 --> 00:23:00,744 Speaker 3: a fact is controlled by the state. And increasingly in the US, 400 00:23:00,224 --> 00:23:04,664 Speaker 3: it is a fact that the algorithms, by which 401 00:23:04,704 --> 00:23:08,144 Speaker 3: people are strongly influenced, are controlled by a small cabal 402 00:23:08,224 --> 00:23:12,864 Speaker 3: of politically influential people who want to, I 403 00:23:12,904 --> 00:23:15,824 Speaker 3: don't get it, make yet another billion dollars 404 00:23:15,944 --> 00:23:17,344 Speaker 3: or want yet more power. 405 00:23:17,384 --> 00:23:19,384 Speaker 4: I mean, you know, I don't know how many. 406 00:23:19,464 --> 00:23:22,904 Speaker 5: Well, you have to understand the fantasies to understand him 407 00:23:22,944 --> 00:23:28,584 Speaker 5: and his followers. These people are just such florid fantasists.
408 00:23:28,944 --> 00:23:33,024 Speaker 7: They were all gamers, all gamers, yeah, and they all 409 00:23:33,024 --> 00:23:36,864 Speaker 7: think that they're going to pioneer a life on other planets, 410 00:23:36,984 --> 00:23:38,664 Speaker 7: or that they're going to discover how we can all 411 00:23:38,704 --> 00:23:39,384 Speaker 7: live forever. 412 00:23:39,824 --> 00:23:42,824 Speaker 5: They're not interested in real world problems. They just have 413 00:23:42,944 --> 00:23:48,224 Speaker 5: this fantastical set of beliefs and ambitions. That's partly what's 414 00:23:48,384 --> 00:23:52,264 Speaker 5: so worrying about having them be 415 00:23:52,424 --> 00:23:56,664 Speaker 5: in charge, because they're actually very detached from reality. But yeah, 416 00:23:56,704 --> 00:24:00,824 Speaker 5: you mentioned Putin having to control the media, including the 417 00:24:00,824 --> 00:24:05,424 Speaker 5: internet there. I mean, if anybody wants to establish themselves 418 00:24:05,544 --> 00:24:08,464 Speaker 5: as a dictator in the long term now, they're going 419 00:24:08,544 --> 00:24:11,064 Speaker 5: to have to bring those oligarchs to heel, just 420 00:24:11,104 --> 00:24:14,184 Speaker 5: the way that Putin did. So I don't know why 421 00:24:14,184 --> 00:24:20,024 Speaker 5: those guys would feel so comfortable recommending an authoritarian form 422 00:24:20,024 --> 00:24:23,744 Speaker 5: of government when they would probably be what's most standing 423 00:24:23,784 --> 00:24:25,784 Speaker 5: in the way of that. Ultimately, a lot of people 424 00:24:25,864 --> 00:24:28,144 Speaker 5: were saying this about Elon Musk a few months ago. 425 00:24:28,264 --> 00:24:30,264 Speaker 2: But when I grew up, the Republican Party was, quote, 426 00:24:30,264 --> 00:24:32,624 Speaker 2: the party of capitalism. But now it seems more and 427 00:24:32,664 --> 00:24:34,784 Speaker 2: more like the party of state monopoly.
Like, yeah, it's what we've 428 00:24:34,904 --> 00:24:37,584 Speaker 2: seen in Russia. It's more like state capitalism or 429 00:24:37,584 --> 00:24:41,344 Speaker 2: economic nationalism. And you've written about Bannon and some of 430 00:24:41,344 --> 00:24:44,824 Speaker 2: these social media folks. How did we get here? How 431 00:24:44,864 --> 00:24:47,424 Speaker 2: did a party that thought it believed in one thing 432 00:24:47,784 --> 00:24:49,744 Speaker 2: turn itself into believing something else? 433 00:24:49,904 --> 00:24:53,704 Speaker 5: Think about the evolution of Silicon Valley and the establishment 434 00:24:53,824 --> 00:24:58,384 Speaker 5: of those huge monopolies, not really in terms of one 435 00:24:58,464 --> 00:25:01,824 Speaker 5: party or another, because of course it was under Clinton 436 00:25:01,904 --> 00:25:04,864 Speaker 5: that you had all of this deregulation. Bob Rubin and 437 00:25:04,944 --> 00:25:08,384 Speaker 5: Larry Summers and all of those people, and it was Clinton 438 00:25:08,464 --> 00:25:13,304 Speaker 5: that wanted to privatize these industries that were essential to 439 00:25:13,544 --> 00:25:17,344 Speaker 5: national security, as a way of fighting terrorism. I mean, 440 00:25:17,344 --> 00:25:20,224 Speaker 5: already in nineteen ninety eight, that's what Clinton was saying 441 00:25:20,224 --> 00:25:23,664 Speaker 5: we needed to do. And then, of course, the Silicon 442 00:25:23,744 --> 00:25:28,264 Speaker 5: Valley economy becomes established in a context where we need 443 00:25:28,304 --> 00:25:32,424 Speaker 5: to develop these technologies for the government and harness our 444 00:25:32,464 --> 00:25:36,864 Speaker 5: innovation for those ends, for national security ends. But you're 445 00:25:37,384 --> 00:25:41,144 Speaker 5: letting the private companies keep all the profits, even though 446 00:25:41,184 --> 00:25:45,824 Speaker 5: they're heavily subsidized with government funds.
And you're allowing these 447 00:25:45,864 --> 00:25:49,904 Speaker 5: monopolies to develop, and the government very quickly becomes 448 00:25:49,944 --> 00:25:53,824 Speaker 5: dependent on them. When they update their software, everybody has 449 00:25:53,864 --> 00:25:56,544 Speaker 5: to update with them, and they're competing with China, of 450 00:25:56,544 --> 00:25:59,864 Speaker 5: course, in a commercial marketplace, and that has a lot 451 00:25:59,864 --> 00:26:03,744 Speaker 5: of ramifications for government. So in the end, you would 452 00:26:03,824 --> 00:26:07,424 Speaker 5: think government would want to align their interests much more 453 00:26:07,864 --> 00:26:11,944 Speaker 5: closely with the interests of the Silicon Valley 454 00:26:11,744 --> 00:26:14,424 Speaker 2: oligarchs. Because it is true, when Jerry and I started, 455 00:26:14,624 --> 00:26:17,224 Speaker 2: we weren't tied to these companies to 456 00:26:17,264 --> 00:26:19,544 Speaker 2: the same extent. We often created our own technology that 457 00:26:19,584 --> 00:26:24,584 Speaker 2: eventually made its way into companies in the market. So 458 00:26:24,624 --> 00:26:27,704 Speaker 2: how did you get interested in that? And then what 459 00:26:27,864 --> 00:26:30,064 Speaker 2: is your sense of the intelligence community as you look 460 00:26:30,104 --> 00:26:30,704 Speaker 2: at it now?
461 00:26:30,904 --> 00:26:33,584 Speaker 5: I was always interested in Snowden, actually, as soon as 462 00:26:33,584 --> 00:26:36,304 Speaker 5: that movie came out, because I just had a deep 463 00:26:36,424 --> 00:26:39,864 Speaker 5: distrust of the way they presented the information in the movie, 464 00:26:39,944 --> 00:26:43,944 Speaker 5: Laura Poitras and Glenn Greenwald, and it was so much 465 00:26:44,104 --> 00:26:47,824 Speaker 5: part of this gaming fantasy, with a character just 466 00:26:47,904 --> 00:26:51,424 Speaker 5: like Edward Snowden being the lone hero battling these great 467 00:26:51,504 --> 00:26:55,464 Speaker 5: powers and leaking information. Glenn Greenwald, I guess, started out 468 00:26:55,504 --> 00:26:58,064 Speaker 5: as a lawyer defending people on the far right, and 469 00:26:58,104 --> 00:27:01,904 Speaker 5: here they were being celebrated by the left as these 470 00:27:02,344 --> 00:27:07,304 Speaker 5: great heroes, and I was just instinctively suspicious of that. 471 00:27:07,424 --> 00:27:09,504 Speaker 2: The hero fantasy is, I think, a big problem. 472 00:27:09,664 --> 00:27:11,904 Speaker 2: I mean, I know I'm oversimplifying here, but even 473 00:27:11,944 --> 00:27:14,304 Speaker 2: our gun culture, I think there's this view, and I 474 00:27:14,424 --> 00:27:17,064 Speaker 2: watch these movies, it's all about heroes and defending yourself, 475 00:27:17,064 --> 00:27:18,904 Speaker 2: and if I have these guns and they're coming after me, 476 00:27:19,264 --> 00:27:21,224 Speaker 2: I can take on the state and all that. I 477 00:27:21,304 --> 00:27:24,344 Speaker 2: just think people love being the hero of their own story. 478 00:27:24,424 --> 00:27:28,504 Speaker 2: It's dangerous actually, because in real life individuals can't take 479 00:27:28,544 --> 00:27:31,744 Speaker 2: on the state like they think they can. Yeah.
480 00:27:31,224 --> 00:27:34,104 Speaker 5: And you asked what I think about the intelligence services now. 481 00:27:34,464 --> 00:27:38,344 Speaker 5: I do worry about the revolving door with the private companies, 482 00:27:38,704 --> 00:27:42,544 Speaker 5: just because people can get government positions 483 00:27:42,584 --> 00:27:46,184 Speaker 5: and the training and then take that to a private 484 00:27:46,224 --> 00:27:49,024 Speaker 5: company run by one of these oligarchs. And of course 485 00:27:49,024 --> 00:27:52,104 Speaker 5: that happens all the time, and it happens with Mossad 486 00:27:52,104 --> 00:27:54,424 Speaker 5: as well, especially where they've got a really strong 487 00:27:54,504 --> 00:27:57,944 Speaker 5: tech sector. There's always been this interaction of the two, 488 00:27:58,344 --> 00:28:01,904 Speaker 5: but that is going to increasingly mean that there are 489 00:28:01,984 --> 00:28:07,384 Speaker 5: these big financial motivations and maybe also ideological motivations for 490 00:28:07,504 --> 00:28:12,184 Speaker 5: people when they're in government. That will mean a decline 491 00:28:12,224 --> 00:28:15,624 Speaker 5: in the sense of mission that probably your generation had. 492 00:28:16,024 --> 00:28:17,664 Speaker 5: So that's something that I worry about. 493 00:28:17,864 --> 00:28:22,624 Speaker 3: The real danger is ex-CIA guys who start up podcasts 494 00:28:22,704 --> 00:28:24,304 Speaker 3: to, like, manipulate people. 495 00:28:24,584 --> 00:28:26,984 Speaker 2: Well, I think there are a few ex-CIA people who 496 00:28:27,024 --> 00:28:29,264 Speaker 2: have podcasts who are lying about 497 00:28:28,984 --> 00:28:30,104 Speaker 4: Stuff. Like us? Unlike us. 498 00:28:30,104 --> 00:28:32,304 Speaker 2: Oh no, there's good ones. But there's also people who 499 00:28:32,384 --> 00:28:35,184 Speaker 2: have much bigger followings than we do.
It's true, essentially 500 00:28:35,544 --> 00:28:38,424 Speaker 2: pretending that they were these superheroes when 501 00:28:38,424 --> 00:28:42,224 Speaker 2: they were nobodies. Yeah, people follow that stuff and 502 00:28:42,224 --> 00:28:42,944 Speaker 2: they're full of shit. 503 00:28:43,064 --> 00:28:45,384 Speaker 5: Frankly, I know, it's interesting you talk about the death 504 00:28:45,384 --> 00:28:48,184 Speaker 5: of expertise. But there are all these people who are 505 00:28:48,904 --> 00:28:51,864 Speaker 5: self-appointed experts that have huge followings. 506 00:28:52,304 --> 00:28:56,624 Speaker 3: So expertise arguably is what makes the human race function. 507 00:28:56,824 --> 00:28:57,024 Speaker 4: Right. 508 00:28:57,104 --> 00:29:00,064 Speaker 3: We can't all do everything. I don't 509 00:29:00,104 --> 00:29:02,664 Speaker 3: know how to make a computer, and the parts come 510 00:29:03,104 --> 00:29:04,024 Speaker 3: from all over the world. 511 00:29:04,104 --> 00:29:05,424 Speaker 4: I mean, we have to specialize. 512 00:29:05,464 --> 00:29:09,944 Speaker 5: And it's also ironic, because knowledge has become more and more 513 00:29:09,984 --> 00:29:16,544 Speaker 5: specialized with scientific advances and engineering advances, and people used 514 00:29:16,544 --> 00:29:19,064 Speaker 5: to be able to repair their cars. They can't repair 515 00:29:19,104 --> 00:29:22,944 Speaker 5: their own computers, the same way the military could repair 516 00:29:23,664 --> 00:29:26,824 Speaker 5: their planes and now can't repair their software.
The 517 00:29:26,944 --> 00:29:31,984 Speaker 5: Soviets in general really hampered science there. They had 518 00:29:32,144 --> 00:29:36,104 Speaker 5: many great scientists, but they hampered what they could do 519 00:29:36,304 --> 00:29:40,184 Speaker 5: just because they were so ideological that they had to 520 00:29:40,264 --> 00:29:44,104 Speaker 5: censor or get rid of anybody who wasn't ideologically 521 00:29:44,184 --> 00:29:46,624 Speaker 5: aligned with them. And once you do that and you 522 00:29:46,664 --> 00:29:50,144 Speaker 5: can't have open scientific debate, then you can't have the 523 00:29:50,184 --> 00:29:54,704 Speaker 5: cultivation of that independent expertise, and you really need it 524 00:29:54,744 --> 00:29:57,624 Speaker 5: to have any meaningful scientific progress. 525 00:29:57,944 --> 00:30:03,184 Speaker 2: We see very senior people, politicians, congressmen, people you associate 526 00:30:03,184 --> 00:30:05,504 Speaker 2: with government, who, if we asked them a question about 527 00:30:05,544 --> 00:30:07,984 Speaker 2: something, might have a very strong opinion on what they believe 528 00:30:07,984 --> 00:30:10,144 Speaker 2: based on their experience. Then if President Trump says the 529 00:30:10,144 --> 00:30:12,864 Speaker 2: opposite is true, they just immediately change and 530 00:30:12,944 --> 00:30:13,984 Speaker 2: say that's the right thing. 531 00:30:14,104 --> 00:30:17,064 Speaker 5: And also, do they really believe it? I'm assuming that 532 00:30:17,184 --> 00:30:19,504 Speaker 5: if they change their view that quickly, they're just going 533 00:30:19,544 --> 00:30:23,344 Speaker 5: along with something and don't really care what they believe. 534 00:30:23,544 --> 00:30:25,984 Speaker 5: So in the nineties, I guess it was, no.
The 535 00:30:26,024 --> 00:30:29,784 Speaker 5: two thousands, a philosopher at Princeton called Harry Frankfurt wrote 536 00:30:29,784 --> 00:30:32,824 Speaker 5: a book called On Bullshit. It was about this phenomenon 537 00:30:32,864 --> 00:30:36,064 Speaker 5: where people just don't really care what's true and what's not, 538 00:30:36,624 --> 00:30:39,584 Speaker 5: and so they'll say anything regardless. So they're not lying 539 00:30:39,824 --> 00:30:42,304 Speaker 5: and they're not being hypocritical, they just don't really care. 540 00:30:42,344 --> 00:30:46,464 Speaker 3: In a US where myth controls fact, you get to things 541 00:30:46,544 --> 00:30:50,064 Speaker 3: like, if you don't like the labor statistics, fire the 542 00:30:50,104 --> 00:30:52,664 Speaker 3: head of the Bureau of Labor Statistics. And then the 543 00:30:52,664 --> 00:30:55,384 Speaker 3: same with climate change. If you don't like climate change, 544 00:30:55,584 --> 00:30:59,344 Speaker 3: you know, stop acquiring data to support or not 545 00:30:59,384 --> 00:31:03,584 Speaker 3: support it, right, just defund it. And then basically, 546 00:31:03,624 --> 00:31:05,184 Speaker 3: my myth is bigger than your myth. 547 00:31:05,384 --> 00:31:08,664 Speaker 2: So to get back to intelligence. Are there questions that you'd 548 00:31:08,664 --> 00:31:10,504 Speaker 2: like to talk to us about? Because a lot 549 00:31:10,584 --> 00:31:12,904 Speaker 2: of bad things have happened in the intelligence world. Mostly 550 00:31:12,944 --> 00:31:15,584 Speaker 2: they're Jerry's fault, and I'm willing to talk about them openly. Yeah. 551 00:31:15,624 --> 00:31:18,344 Speaker 5: No, I've got so many questions, because you guys devise 552 00:31:18,464 --> 00:31:21,624 Speaker 5: conspiracies, so of course I want to know how 553 00:31:21,664 --> 00:31:22,264 Speaker 5: you did it.
554 00:31:22,464 --> 00:31:25,064 Speaker 3: If you're going to make a conspiracy, let's just pretend 555 00:31:25,464 --> 00:31:28,584 Speaker 3: the conspiracy is you want to bump into a North 556 00:31:28,664 --> 00:31:29,744 Speaker 3: Korean scientist. 557 00:31:29,864 --> 00:31:32,064 Speaker 4: Right, you can't walk over to him and 558 00:31:32,024 --> 00:31:35,424 Speaker 3: say, Hi, I'm with the CIA, I'd like to talk 559 00:31:35,504 --> 00:31:39,024 Speaker 3: to you. You have to engineer a set of circumstances 560 00:31:39,064 --> 00:31:42,624 Speaker 3: that will allow that to happen. So the conspiracy is 561 00:31:42,664 --> 00:31:47,224 Speaker 3: something that is small and manageable, and even then doesn't 562 00:31:47,264 --> 00:31:50,544 Speaker 3: always stand up to deep scrutiny, just enough to get 563 00:31:50,584 --> 00:31:51,464 Speaker 3: you in the door. 564 00:31:51,464 --> 00:31:54,824 Speaker 2: So you might pretend to be a Malaysian businessman who meets 565 00:31:54,824 --> 00:31:57,944 Speaker 2: this person when they're traveling and develops some sort of 566 00:31:57,944 --> 00:32:00,384 Speaker 2: relationship and then turns them over to someone else who 567 00:32:00,464 --> 00:32:02,664 Speaker 2: might have a business interest in them. So 568 00:32:02,864 --> 00:32:05,304 Speaker 2: you're stringing somebody along so that you can get 569 00:32:05,344 --> 00:32:07,544 Speaker 2: to understand them and understand what makes them tick and stuff. 570 00:32:07,544 --> 00:32:10,144 Speaker 2: So you're creating a mini conspiracy. But I would argue 571 00:32:10,184 --> 00:32:13,544 Speaker 2: that intelligence officers are often the least conspiratorial people, 572 00:32:13,544 --> 00:32:17,824 Speaker 2: because they know how essentially impossible it is to create these big, 573 00:32:17,904 --> 00:32:22,584 Speaker 2: complex conspiracies. Because it's a bureaucracy.
People sign off on things, 574 00:32:22,864 --> 00:32:26,264 Speaker 2: there's lawyers, rules, regulations, there's lots of people involved. So 575 00:32:26,304 --> 00:32:29,824 Speaker 2: the notion that you could create this very complex conspiracy. 576 00:32:30,024 --> 00:32:32,104 Speaker 2: Look at the JFK thing. Okay, they made sure the 577 00:32:32,224 --> 00:32:34,664 Speaker 2: phones were all turned off in this city at that time, 578 00:32:34,704 --> 00:32:37,424 Speaker 2: and they made sure that the plane was over this 579 00:32:37,544 --> 00:32:39,504 Speaker 2: certain area when this thing happened, and they made sure, 580 00:32:39,584 --> 00:32:42,144 Speaker 2: this group of people, they turned off all the security 581 00:32:42,144 --> 00:32:45,264 Speaker 2: and they brought in new security. All of those people involved 582 00:32:45,264 --> 00:32:47,424 Speaker 2: having to understand what they're doing and understand they're doing 583 00:32:47,424 --> 00:32:48,224 Speaker 2: something very different. 584 00:32:48,344 --> 00:32:51,104 Speaker 5: Just imagine governments were that competent. 585 00:32:51,384 --> 00:32:52,384 Speaker 2: Nobody's that competent. 586 00:32:52,504 --> 00:32:54,464 Speaker 3: So there was, when the DNI 587 00:32:54,584 --> 00:32:57,944 Speaker 3: was created, right, the Director of National Intelligence. 588 00:32:57,984 --> 00:33:00,384 Speaker 3: I'll say it was Dennis Blair, and he was 589 00:33:00,664 --> 00:33:01,664 Speaker 3: the second or third one. 590 00:33:01,744 --> 00:33:03,264 Speaker 4: He's like, I'm the big guy. 591 00:33:03,304 --> 00:33:06,264 Speaker 3: He looked at this org chart and, you all, 592 00:33:06,304 --> 00:33:09,344 Speaker 3: you motherfuckers, you've got to come and bow to me, right. 593 00:33:09,624 --> 00:33:12,744 Speaker 3: So he puts out this worldwide thing saying all chiefs 594 00:33:12,784 --> 00:33:15,944 Speaker 3: of station must report to me by next week.
And 595 00:33:15,984 --> 00:33:20,064 Speaker 3: then we, the chiefs of station, said, like, okay, where do 596 00:33:20,104 --> 00:33:21,904 Speaker 3: we get our plane tickets? We don't have money in 597 00:33:21,944 --> 00:33:24,384 Speaker 3: our budget for this. So, DNI, what's our 598 00:33:24,504 --> 00:33:27,704 Speaker 3: budget number that we can use for this? And where are 599 00:33:27,704 --> 00:33:30,784 Speaker 3: we going to stay? And who pays for the rental cars? 600 00:33:31,184 --> 00:33:33,744 Speaker 3: So it was like the bureaucracy just didn't, you know. 601 00:33:33,784 --> 00:33:35,464 Speaker 3: It's like, yeah, you may be the boss, dude, but 602 00:33:35,504 --> 00:33:37,704 Speaker 3: you don't have a budget. You've got your lawyers, like, 603 00:33:37,744 --> 00:33:40,784 Speaker 3: I've got shit going on. And then it was a disaster. 604 00:33:41,504 --> 00:33:45,104 Speaker 3: And so people by and large who think that in 605 00:33:45,184 --> 00:33:48,464 Speaker 3: the federal government, even in the CIA and the intelligence community, 606 00:33:48,544 --> 00:33:52,064 Speaker 3: somebody can by fiat just say, do this. 607 00:33:52,104 --> 00:33:54,864 Speaker 3: And then everybody talked about it for years. It was 608 00:33:55,184 --> 00:33:58,824 Speaker 3: like, you know, everybody knew, because it's not like 609 00:33:58,904 --> 00:34:00,664 Speaker 3: all the chiefs of station kept their mouths shut. They 610 00:34:00,704 --> 00:34:03,184 Speaker 3: all had coffee with their buddies and people were overhearing, 611 00:34:03,224 --> 00:34:04,944 Speaker 3: and that was a disaster.
612 00:34:04,784 --> 00:34:07,384 Speaker 2: And they hired us for a personality type; we actually 613 00:34:07,384 --> 00:34:10,143 Speaker 2: are the kind of people who are happy saying 614 00:34:10,184 --> 00:34:13,944 Speaker 2: no to leadership. And so you've got 615 00:34:13,984 --> 00:34:15,623 Speaker 2: a bunch of, like, you know, you're trying to herd 616 00:34:15,663 --> 00:34:18,584 Speaker 2: cats, and all these people think they're smarter than everybody else. 617 00:34:18,623 --> 00:34:21,263 Speaker 2: They're not, but they're pretty smart, and they 618 00:34:21,304 --> 00:34:24,224 Speaker 2: have their own sort of views, and they don't care 619 00:34:24,464 --> 00:34:28,464 Speaker 2: what people say. Because intelligence is oftentimes about providing information 620 00:34:28,504 --> 00:34:30,183 Speaker 2: to policy makers who don't want to hear it. Yeah, 621 00:34:30,304 --> 00:34:32,304 Speaker 2: so that's part of the culture. And so the notion 622 00:34:32,384 --> 00:34:35,424 Speaker 2: that this group could keep a deep conspiracy for 623 00:34:35,464 --> 00:34:36,623 Speaker 2: decades is nuts. 624 00:34:36,864 --> 00:34:37,823 Speaker 4: I have to say that. 625 00:34:38,183 --> 00:34:40,664 Speaker 3: Every once in a while in the Washington Post, 626 00:34:40,743 --> 00:34:43,584 Speaker 3: less lately, or the New York Times, there will be some 627 00:34:43,703 --> 00:34:48,143 Speaker 3: exposé on some past operation that the agency pulled off, 628 00:34:48,544 --> 00:34:49,904 Speaker 3: and I'll look at that and go, oh, I know 629 00:34:50,024 --> 00:34:52,623 Speaker 3: about this one. And then I'll start reading it and, like, oh, 630 00:34:52,663 --> 00:34:53,944 Speaker 3: I didn't know those details. 631 00:34:54,263 --> 00:34:54,703 Speaker 4: WHOA.
632 00:34:55,344 --> 00:34:58,343 Speaker 5: I know a bunch of investigative journalists and they will say 633 00:34:58,344 --> 00:35:01,183 Speaker 5: it's really hard to find out what's going on, because 634 00:35:01,223 --> 00:35:03,664 Speaker 5: you don't just get this one source, like Deep Throat, 635 00:35:03,703 --> 00:35:08,464 Speaker 5: who knows everything. People have their own area of specialization, 636 00:35:08,864 --> 00:35:12,143 Speaker 5: and they have some sense of what's going on elsewhere, 637 00:35:12,223 --> 00:35:15,863 Speaker 5: and there isn't just this one font of information that 638 00:35:15,944 --> 00:35:16,623 Speaker 5: you can go to. 639 00:35:16,984 --> 00:35:18,424 Speaker 2: No. And there's need to know. You might be in 640 00:35:18,424 --> 00:35:20,504 Speaker 2: the middle of an operation and know a lot of 641 00:35:20,504 --> 00:35:23,584 Speaker 2: stuff about that, but there's other pieces that you don't 642 00:35:23,584 --> 00:35:25,384 Speaker 2: have any sense for. And then you say, well, the 643 00:35:25,384 --> 00:35:28,024 Speaker 2: people above you must know. But they're managing tons and 644 00:35:28,024 --> 00:35:30,504 Speaker 2: tons of these operations, so they know only as much 645 00:35:30,544 --> 00:35:32,344 Speaker 2: as they need to run it or get money or to brief 646 00:35:32,384 --> 00:35:34,704 Speaker 2: that up. If the North Koreans or Russians had drugged 647 00:35:34,703 --> 00:35:37,544 Speaker 2: the director of CIA and asked them, okay, 648 00:35:37,544 --> 00:35:40,064 Speaker 2: tell us all the secrets, you wouldn't get a lot. 649 00:35:40,024 --> 00:35:42,863 Speaker 3: And the things that count, like sources and methods.
I'll tell 650 00:35:42,864 --> 00:35:47,223 Speaker 3: you, if you get three CIA guys drinking around a bar, right, 651 00:35:47,304 --> 00:35:51,024 Speaker 3: what they're really gonna say is most of us 652 00:35:51,064 --> 00:35:54,464 Speaker 3: could not remember the names of our agents or what 653 00:35:54,544 --> 00:35:56,984 Speaker 3: was the name of that operation again, even just a 654 00:35:57,064 --> 00:36:00,263 Speaker 3: few years afterwards, because these are small details, and in 655 00:36:00,344 --> 00:36:02,384 Speaker 3: our thing it's like, you may look at it once 656 00:36:02,464 --> 00:36:04,424 Speaker 3: and then a year later you don't really use that 657 00:36:04,864 --> 00:36:05,663 Speaker 3: small detail. 658 00:36:05,944 --> 00:36:09,223 Speaker 5: The myth of the deep state really relies on 659 00:36:09,344 --> 00:36:11,424 Speaker 5: the idea that it's a bunch of people who know 660 00:36:11,544 --> 00:36:14,384 Speaker 5: exactly what they're doing and have a common cause. But 661 00:36:14,464 --> 00:36:17,383 Speaker 5: I've been looking at the origins of that myth, and 662 00:36:17,544 --> 00:36:20,824 Speaker 5: one of them, of course, is the Bay of Pigs, 663 00:36:20,864 --> 00:36:24,024 Speaker 5: when people started to become aware of what CIA were 664 00:36:24,064 --> 00:36:28,544 Speaker 5: doing and that they had this disastrous operation, and Kennedy 665 00:36:29,183 --> 00:36:32,304 Speaker 5: said that it was all on him, that he took responsibility, 666 00:36:32,344 --> 00:36:34,464 Speaker 5: but at the same time he fired Allen Dulles.
667 00:36:34,824 --> 00:36:38,024 Speaker 3: I think one of the great misunderstandings, a misunderstanding 668 00:36:38,064 --> 00:36:40,824 Speaker 3: at the time, was I think a lot of people 669 00:36:40,944 --> 00:36:43,183 Speaker 3: involved in the op and lower down thought that if 670 00:36:43,223 --> 00:36:47,784 Speaker 3: it went bad, Kennedy would have been free to employ 671 00:36:47,864 --> 00:36:50,223 Speaker 3: the US Air Force. And I think there were some 672 00:36:50,344 --> 00:36:52,983 Speaker 3: indications that he would have, but I don't 673 00:36:53,024 --> 00:36:54,943 Speaker 3: think he ever said I will and then went back 674 00:36:54,984 --> 00:36:56,783 Speaker 3: on his word. I think a lot of 675 00:36:56,783 --> 00:36:58,544 Speaker 3: it was just a human fuck up. 676 00:36:58,703 --> 00:37:01,663 Speaker 2: Yeah, they made assumptions about what... and of course this 677 00:37:01,783 --> 00:37:04,663 Speaker 2: is pre-reform CIA. So when you'd go to see 678 00:37:04,663 --> 00:37:06,663 Speaker 2: the president, they would maybe not give you a 679 00:37:06,663 --> 00:37:08,703 Speaker 2: direct order. They might say, listen, we're going to do 680 00:37:08,703 --> 00:37:11,183 Speaker 2: something about the leader in Chile or whatever it is, 681 00:37:11,223 --> 00:37:14,104 Speaker 2: and the director would come back without a specific order, 682 00:37:14,143 --> 00:37:15,783 Speaker 2: but with a general thing, and go back down to 683 00:37:15,783 --> 00:37:17,544 Speaker 2: people: generally, we need to do this, we need that.
684 00:37:17,584 --> 00:37:19,943 Speaker 2: So people were always making assumptions about what those above them 685 00:37:20,304 --> 00:37:22,584 Speaker 2: want and think, and then the real world is complex 686 00:37:22,623 --> 00:37:25,703 Speaker 2: and dirty, and it gets carried out and then everybody's like, oh, 687 00:37:25,783 --> 00:37:28,303 Speaker 2: I thought you wanted this or wanted that. Once we 688 00:37:28,344 --> 00:37:31,223 Speaker 2: got to the reforms in the seventies, it became much 689 00:37:31,223 --> 00:37:33,703 Speaker 2: clearer that presidents, when they wanted CIA 690 00:37:33,743 --> 00:37:36,384 Speaker 2: to do covert action, which is this kind of stuff, 691 00:37:36,504 --> 00:37:39,224 Speaker 2: had to write written orders that were then shared 692 00:37:39,263 --> 00:37:41,943 Speaker 2: with the congressional committees as well as the agency, which 693 00:37:41,944 --> 00:37:44,623 Speaker 2: could then write back. And then everything we do is 694 00:37:44,783 --> 00:37:49,303 Speaker 2: incredibly documented. Yeah, so I think the intelligence community changed fundamentally 695 00:37:49,304 --> 00:37:51,303 Speaker 2: at that time. But I also think there was still 696 00:37:51,304 --> 00:37:54,704 Speaker 2: a culture that people from intelligence don't speak, 697 00:37:54,703 --> 00:37:57,143 Speaker 2: they don't speak publicly. And it wasn't until I think 698 00:37:57,223 --> 00:38:00,183 Speaker 2: like the nineties, maybe early two thousands, that if you retired... 699 00:38:00,223 --> 00:38:02,343 Speaker 2: So Jerry and I were both under State Department cover. 700 00:38:02,623 --> 00:38:04,024 Speaker 2: When we retired, we were allowed to, what we 701 00:38:04,064 --> 00:38:07,263 Speaker 2: call, roll back our cover and then write articles and speak. 702 00:38:07,344 --> 00:38:09,104 Speaker 2: And you know, I can remember General Hayden saying this:
703 00:38:09,623 --> 00:38:12,143 Speaker 2: It's a responsibility, if you work for a secret organization, 704 00:38:12,424 --> 00:38:14,184 Speaker 2: that if you have the opportunity to try to explain 705 00:38:14,263 --> 00:38:16,424 Speaker 2: what you can to the American public, you have an 706 00:38:16,424 --> 00:38:18,823 Speaker 2: obligation to do that. The public relies on the fact 707 00:38:18,824 --> 00:38:20,664 Speaker 2: that we are doing a mission for the American public, 708 00:38:20,703 --> 00:38:21,983 Speaker 2: so we can give you as much as we can. 709 00:38:22,223 --> 00:38:28,064 Speaker 3: But there's an inherent contradiction in having a clandestine, a 710 00:38:28,223 --> 00:38:33,864 Speaker 3: secret intelligence agency inside a transparent democracy. The two don't fit, 711 00:38:34,024 --> 00:38:37,304 Speaker 3: and they shouldn't fit. We should control our intelligence agencies 712 00:38:37,424 --> 00:38:40,743 Speaker 3: very closely through elected officials. In other countries it's not 713 00:38:40,824 --> 00:38:43,863 Speaker 3: the case: the GRU and the SVR fit just 714 00:38:43,984 --> 00:38:47,224 Speaker 3: fine, hand in glove, and are central to Putin's Russia, 715 00:38:47,304 --> 00:38:50,623 Speaker 3: but CIA and MI6, they're not. You're trying to 716 00:38:50,663 --> 00:38:54,183 Speaker 3: save sources and methods. You're trying to keep people safe for us. 717 00:38:54,223 --> 00:38:56,104 Speaker 3: People come to us and they say, I'll tell you 718 00:38:56,143 --> 00:38:58,904 Speaker 3: what's really going on, but you've got to keep me alive. 719 00:38:59,064 --> 00:39:01,984 Speaker 3: And we take that very seriously. And yeah, so we're 720 00:39:02,024 --> 00:39:04,024 Speaker 3: not going to tell, we're not going to tell News 721 00:39:04,064 --> 00:39:06,623 Speaker 3: of the World.
Well, according to this guy, you know, 722 00:39:06,663 --> 00:39:08,904 Speaker 3: the Russians are lying to us, because he and his 723 00:39:08,944 --> 00:39:10,944 Speaker 3: family would, you know, be executed. 724 00:39:33,584 --> 00:39:36,743 Speaker 5: So I think Dylan Avery is the name of the 725 00:39:36,783 --> 00:39:40,304 Speaker 5: person who made the Loose Change documentary about nine to eleven. 726 00:39:40,424 --> 00:39:42,144 Speaker 5: The conspiracy theory one. 727 00:39:42,223 --> 00:39:44,663 Speaker 4: Please expand on that. Not everybody knows what that is. 728 00:39:44,864 --> 00:39:47,383 Speaker 6: Well, the nine to eleven truthers who think that it 729 00:39:47,424 --> 00:39:51,704 Speaker 6: was all a government conspiracy have various sources, but the 730 00:39:51,703 --> 00:39:55,624 Speaker 6: most mainstream one is this documentary called Loose Change. 731 00:39:55,703 --> 00:39:56,743 Speaker 5: The final version of 732 00:39:56,703 --> 00:40:00,903 Speaker 6: it ended up having Roger Stone as one of its producers, so 733 00:40:00,824 --> 00:40:03,743 Speaker 5: you can imagine... oh wait, maybe it was Alex Jones. Yeah, 734 00:40:03,743 --> 00:40:06,584 Speaker 5: it was Alex Jones. But it started out as just this 735 00:40:06,663 --> 00:40:10,863 Speaker 5: guy who was actually writing a fictional screenplay, so it 736 00:40:10,944 --> 00:40:15,504 Speaker 5: was precisely supposed to be one of these Hollywood stories 737 00:40:16,223 --> 00:40:20,303 Speaker 5: about the lone hero discovering the big conspiracy, and then 738 00:40:20,384 --> 00:40:23,064 Speaker 5: he became the hero of his own movie and turned 739 00:40:23,064 --> 00:40:26,064 Speaker 5: it into a documentary. And I think that's just something 740 00:40:26,104 --> 00:40:29,504 Speaker 5: that is really tempting for people. It's what's been glamorized 741 00:40:29,584 --> 00:40:33,064 Speaker 5: more than anything else in our societies.
The fact is, most 742 00:40:33,183 --> 00:40:36,984 Speaker 5: spy movies are about people like Jason Bourne, who is 743 00:40:37,104 --> 00:40:41,623 Speaker 5: that lone hero going against all of the 744 00:40:41,783 --> 00:40:46,984 Speaker 5: authorities and finding out everything for themselves and changing the world. 745 00:40:47,424 --> 00:40:50,864 Speaker 2: But the reality of an intelligence organization is that it's a team sport. 746 00:40:50,904 --> 00:40:53,264 Speaker 2: Like, Jerry and I are trying to meet and recruit 747 00:40:53,263 --> 00:40:56,424 Speaker 2: a source, but we're counting on lots of people behind 748 00:40:56,464 --> 00:40:59,744 Speaker 2: us: expertise if we have to get special equipment, people 749 00:40:59,743 --> 00:41:01,944 Speaker 2: who have records back home that can help us understand 750 00:41:02,024 --> 00:41:04,343 Speaker 2: what's going on, what the issues are, what kind 751 00:41:04,384 --> 00:41:07,384 Speaker 2: of questions to ask. And so you talk about common 752 00:41:07,424 --> 00:41:11,104 Speaker 2: cause; to us the mission is very clear: the mission is 753 00:41:11,663 --> 00:41:15,663 Speaker 2: providing unbiased information to policymakers so they make better policy and 754 00:41:15,703 --> 00:41:19,584 Speaker 2: protect American citizens. And it is nonpartisan. 755 00:41:19,623 --> 00:41:20,183 Speaker 4: I don't talk... 756 00:41:20,304 --> 00:41:23,183 Speaker 2: I don't remember ever talking to anybody inside, even my 757 00:41:23,223 --> 00:41:26,823 Speaker 2: best friends, about politics or politicians, even when we were drinking, 758 00:41:26,864 --> 00:41:29,223 Speaker 2: even when we were overseas.
It wasn't really till you 759 00:41:29,424 --> 00:41:31,223 Speaker 2: retired and you see them on Facebook or these other 760 00:41:31,263 --> 00:41:33,703 Speaker 2: places talking, and you're like, oh my god, I had 761 00:41:33,743 --> 00:41:35,823 Speaker 2: no idea. And like, until 762 00:41:35,904 --> 00:41:37,584 Speaker 2: Jerry and I got to know each other after we retired, 763 00:41:37,623 --> 00:41:39,024 Speaker 2: I didn't know Jerry's politics. 764 00:41:39,223 --> 00:41:41,303 Speaker 4: Yeah, no clue. He just knew I was right all 765 00:41:41,344 --> 00:41:41,663 Speaker 4: the time. 766 00:41:41,824 --> 00:41:44,584 Speaker 2: You're focused on doing your work, and 767 00:41:44,663 --> 00:41:47,664 Speaker 2: if anything, we sort of looked down on politicians, because 768 00:41:47,743 --> 00:41:49,623 Speaker 2: they come out to our stations, we have to brief 769 00:41:49,623 --> 00:41:52,064 Speaker 2: them all the time, the questions are often stupid. They 770 00:41:52,143 --> 00:41:55,024 Speaker 2: don't really understand us. They're in a game where they're 771 00:41:55,024 --> 00:41:57,263 Speaker 2: trying to self-promote. We're in a game where it's 772 00:41:57,263 --> 00:41:58,264 Speaker 2: not about self-promotion. 773 00:41:58,464 --> 00:42:04,263 Speaker 5: So, you know, if any president comes into office and 774 00:42:04,384 --> 00:42:07,623 Speaker 5: has a whole new vision for what CIA should be 775 00:42:07,743 --> 00:42:11,303 Speaker 5: and what everybody should be doing that conflicts with the 776 00:42:11,384 --> 00:42:13,863 Speaker 5: sense of mission, are people going to push back? 777 00:42:13,663 --> 00:42:16,344 Speaker 2: That's a great question, because it never happened before.
778 00:42:16,424 --> 00:42:19,104 Speaker 2: Jerry and I worked for Reagan and Bush and Obama 779 00:42:19,223 --> 00:42:21,424 Speaker 2: and Clinton and everybody, and there would be 780 00:42:21,623 --> 00:42:24,944 Speaker 2: policy changes and questions about what happened with this country 781 00:42:24,944 --> 00:42:26,504 Speaker 2: and that country and what we're doing. But in terms 782 00:42:26,544 --> 00:42:29,104 Speaker 2: of our business, almost no real change in 783 00:42:29,104 --> 00:42:29,863 Speaker 4: the day to day work. 784 00:42:29,864 --> 00:42:32,904 Speaker 2: If you're in India trying to collect information on Russians 785 00:42:32,904 --> 00:42:34,623 Speaker 2: and Chinese and other kinds of things and a new 786 00:42:34,623 --> 00:42:38,264 Speaker 2: president comes in, there's not a lot of difference. However, 787 00:42:38,344 --> 00:42:40,863 Speaker 2: now it might be true, because the administration wants the 788 00:42:40,864 --> 00:42:43,544 Speaker 2: CIA to be a partisan weapon to help the president 789 00:42:43,544 --> 00:42:45,823 Speaker 2: stay in power or to carry through his thing. So 790 00:42:46,143 --> 00:42:48,343 Speaker 2: now might be the real challenge, to try to 791 00:42:48,384 --> 00:42:51,783 Speaker 2: see how much of that is filtering down through the organization. 792 00:42:52,183 --> 00:42:54,544 Speaker 3: Yeah, so, Tamsin, you talked about being a hero in 793 00:42:54,584 --> 00:42:56,744 Speaker 3: your own movie, and I would be remiss if I didn't 794 00:42:56,783 --> 00:42:59,743 Speaker 3: throw in this anecdote.
I was in a place, and 795 00:42:59,783 --> 00:43:01,464 Speaker 3: it was in a war zone, right? It was a 796 00:43:01,584 --> 00:43:05,104 Speaker 3: very dangerous place, and a congressman flew in, and he 797 00:43:05,223 --> 00:43:09,184 Speaker 3: was inside this secure military base, but outside the base 798 00:43:09,703 --> 00:43:12,424 Speaker 3: the bad guys were there and they kind of like 799 00:43:13,263 --> 00:43:16,384 Speaker 3: ruled the turf. And he said, I want to go 800 00:43:16,504 --> 00:43:19,263 Speaker 3: out and I want to get in a firefight. Right? 801 00:43:19,304 --> 00:43:22,023 Speaker 3: I want to go out, have the bad guys attack us, 802 00:43:22,064 --> 00:43:23,743 Speaker 3: and I want to get in a firefight, and I 803 00:43:23,783 --> 00:43:27,544 Speaker 3: want it photographed, me shooting a weapon at them. 804 00:43:27,944 --> 00:43:29,584 Speaker 4: And first of all, we would go out in a 805 00:43:29,623 --> 00:43:32,423 Speaker 4: convoy with jeeps and things like that, and they would 806 00:43:32,504 --> 00:43:33,183 Speaker 4: ambush us. 807 00:43:33,263 --> 00:43:34,863 Speaker 3: The rule of an ambush is they go first 808 00:43:34,864 --> 00:43:37,703 Speaker 3: with automatic weapons. They get the first thousand rounds to 809 00:43:37,743 --> 00:43:40,703 Speaker 3: shoot at us, and then, if and when we survive, 810 00:43:40,783 --> 00:43:41,703 Speaker 3: we get to shoot back. 811 00:43:41,743 --> 00:43:43,544 Speaker 4: But they're hiding. But I didn't tell him that. 812 00:43:43,824 --> 00:43:47,143 Speaker 3: And then he went a step further and said, oh, it'd 813 00:43:47,384 --> 00:43:50,223 Speaker 3: be good if I just got winged, you know, just 814 00:43:50,304 --> 00:43:53,343 Speaker 3: like a little, you know, just like, just, you know, 815 00:43:53,464 --> 00:43:55,823 Speaker 3: just a little bit of blood. 816 00:43:55,703 --> 00:43:57,944 Speaker 4: You know what a modern round does to you?
817 00:43:57,944 --> 00:43:59,944 Speaker 3: You know, it's like, you know, you're more likely to... 818 00:44:00,263 --> 00:44:03,863 Speaker 3: This was a guy from the House of Representatives, and 819 00:44:03,944 --> 00:44:06,904 Speaker 3: he was so steeped in the mythology 820 00:44:06,344 --> 00:44:08,864 Speaker 4: of the movies, because he had seen 821 00:44:08,584 --> 00:44:10,623 Speaker 3: this on TV so many times. And I'm not an ex 822 00:44:10,703 --> 00:44:13,343 Speaker 3: military guy. The military guys were like, what the...? 823 00:44:13,464 --> 00:44:18,143 Speaker 4: Yeah, and you know, I knew too. I was like, 824 00:44:18,263 --> 00:44:19,623 Speaker 4: I'm not going with you. I'm not. 825 00:44:20,024 --> 00:44:23,544 Speaker 3: Yeah. And so I think I jokingly told him that 826 00:44:23,584 --> 00:44:26,104 Speaker 3: it wouldn't be good for my career. It's not a 827 00:44:26,104 --> 00:44:29,383 Speaker 3: career improver to get a congressman killed, like, who's your guest, right? 828 00:44:29,864 --> 00:44:33,223 Speaker 3: By and large, we have some really good officers, and 829 00:44:33,304 --> 00:44:35,783 Speaker 3: at CIA the vast majority are, because these are people 830 00:44:35,783 --> 00:44:38,703 Speaker 3: who could have made lots more money on the outside. 831 00:44:39,024 --> 00:44:41,343 Speaker 3: They wouldn't have taken their kids to places where there 832 00:44:41,344 --> 00:44:44,303 Speaker 3: are cobras and malaria. You worry about your family, and 833 00:44:44,384 --> 00:44:46,423 Speaker 3: yet they did it because of mission, because of those 834 00:44:46,544 --> 00:44:49,384 Speaker 3: values and the Constitution you talked about; we believe 835 00:44:49,464 --> 00:44:50,264 Speaker 3: in those things. 836 00:44:50,623 --> 00:44:51,303 Speaker 4: But we have a few.
837 00:44:51,424 --> 00:44:53,544 Speaker 3: Like any place else, we have a few people who 838 00:44:53,623 --> 00:44:55,784 Speaker 3: either are shitballs or become crazy. 839 00:44:56,104 --> 00:44:56,343 Speaker 2: Yeah. 840 00:44:56,464 --> 00:44:59,984 Speaker 5: I just read Tim Weiner's book on CIA in the 841 00:45:00,024 --> 00:45:02,584 Speaker 5: twenty-first century, and that was amazing, because there are 842 00:45:02,623 --> 00:45:05,703 Speaker 5: people in there who he names that I either met 843 00:45:05,824 --> 00:45:08,624 Speaker 5: or just follow on social media, and I'm suddenly finding 844 00:45:08,663 --> 00:45:11,744 Speaker 5: out about the things they've done that are just mind 845 00:45:11,783 --> 00:45:17,223 Speaker 5: blowing to ordinary people who live safe lives in urban settings. 846 00:45:17,623 --> 00:45:20,143 Speaker 2: It was a fun job. Like, honestly, I 847 00:45:20,304 --> 00:45:22,823 Speaker 2: sound like we're proselytizing here, but you know, I spent 848 00:45:22,864 --> 00:45:26,104 Speaker 2: almost thirty years in the organization. I don't regret any 849 00:45:26,143 --> 00:45:28,223 Speaker 2: of it. I think in what I did, you know, we were 850 00:45:28,223 --> 00:45:30,423 Speaker 2: trying to do the right thing. There were so many 851 00:45:30,504 --> 00:45:33,223 Speaker 2: fun and interesting people. We got to work around the world, 852 00:45:33,344 --> 00:45:36,584 Speaker 2: different cultures, meeting and working with foreign organizations. It was 853 00:45:36,663 --> 00:45:39,783 Speaker 2: really interesting, fun, stimulating. You felt like you were doing 854 00:45:39,783 --> 00:45:42,304 Speaker 2: something bigger than yourself. I don't have any regrets. 855 00:45:42,464 --> 00:45:45,064 Speaker 5: How do you guys pick up languages just like that? 856 00:45:45,424 --> 00:45:45,984 Speaker 5: You don't.
857 00:45:45,743 --> 00:45:47,823 Speaker 3: You don't pick them up just like that. Well, that's part of the 858 00:45:47,864 --> 00:45:51,263 Speaker 3: diversity, that we have languages, right? We take a 859 00:45:51,344 --> 00:45:53,744 Speaker 3: year off and you get okay with the language. 860 00:45:53,783 --> 00:45:56,263 Speaker 2: Yeah, I learned Finnish and Russian and Serbo-Croatian, 861 00:45:56,584 --> 00:45:58,903 Speaker 2: but it was painful. So when I learned Russian, I 862 00:45:58,984 --> 00:46:01,984 Speaker 2: was in a room with one or two other people 863 00:46:02,384 --> 00:46:05,544 Speaker 2: for a year, all day long, trying to speak, trying 864 00:46:05,544 --> 00:46:08,104 Speaker 2: to read, and then going back and back. Back then 865 00:46:08,104 --> 00:46:11,104 Speaker 2: there were cassettes, having to listen at home and do homework. 866 00:46:10,783 --> 00:46:12,544 Speaker 4: And it's just, it's your job. 867 00:46:12,623 --> 00:46:14,824 Speaker 2: You work for CIA, and that whole year you're working 868 00:46:14,904 --> 00:46:17,304 Speaker 2: in a language lab learning the language. 869 00:46:17,384 --> 00:46:20,344 Speaker 5: So can I ask, how does it feel when members 870 00:46:20,344 --> 00:46:24,584 Speaker 5: of the public, who now know that you're former CIA 871 00:46:24,864 --> 00:46:28,504 Speaker 5: but without knowing anything about what you did, obviously, will 872 00:46:28,504 --> 00:46:32,184 Speaker 5: start to blame you for every covert op that they've 873 00:46:32,223 --> 00:46:37,143 Speaker 5: heard about, like Operation Paperclip with the Nazis or Operation 874 00:46:37,304 --> 00:46:39,743 Speaker 5: Phoenix or whatever it is, that, you know, are 875 00:46:40,064 --> 00:46:43,344 Speaker 5: part of the culpability of CIA. 876 00:46:43,424 --> 00:46:46,143 Speaker 3: Operation Paperclip is a good one.
So do you 877 00:46:46,344 --> 00:46:50,264 Speaker 3: allow... do you allow Stalin, who's murdering millions 878 00:46:50,263 --> 00:46:52,903 Speaker 3: of his own people, to get scientists who know how 879 00:46:52,904 --> 00:46:55,343 Speaker 3: to make a nuclear bomb and missiles? Or do you 880 00:46:56,064 --> 00:46:58,223 Speaker 3: take these Nazis and bring them to the US where 881 00:46:58,263 --> 00:47:01,623 Speaker 3: Stalin doesn't get them? It's not a black or white issue, 882 00:47:01,663 --> 00:47:05,544 Speaker 3: it's a gray issue, and oftentimes in CIA we deal 883 00:47:05,584 --> 00:47:09,224 Speaker 3: with those, right? And that's one of the really interesting 884 00:47:09,263 --> 00:47:12,424 Speaker 3: things about being in CIA. If it was easy, black 885 00:47:12,464 --> 00:47:15,104 Speaker 3: and white, obviously good or bad, then you wouldn't need us. 886 00:47:15,504 --> 00:47:18,423 Speaker 3: So we deal often with gray, as in, how gray is it? 887 00:47:18,743 --> 00:47:22,064 Speaker 3: And what's our mission, and what do our political masters want? 888 00:47:22,263 --> 00:47:25,104 Speaker 3: What's good for the democracy? And what's the downside 889 00:47:25,104 --> 00:47:26,024 Speaker 3: of not doing it? 890 00:47:26,143 --> 00:47:28,544 Speaker 5: But when it goes wrong, you guys seem to get 891 00:47:28,584 --> 00:47:31,743 Speaker 5: the blame rather than the president who did it. Is that 892 00:47:31,824 --> 00:47:33,664 Speaker 5: because of plausible deniability? 893 00:47:33,743 --> 00:47:35,904 Speaker 2: I think that's the idea. But if you go 894 00:47:35,944 --> 00:47:37,863 Speaker 2: to the working level, like I said, when people bring 895 00:47:37,864 --> 00:47:39,824 Speaker 2: that stuff up, often I'm glad to have the discussion, 896 00:47:39,864 --> 00:47:42,183 Speaker 2: because of some of the things that the early CIA did.
897 00:47:42,263 --> 00:47:43,943 Speaker 2: First of all, it was a new institution brought out 898 00:47:43,984 --> 00:47:46,743 Speaker 2: of nowhere, and the fear was World War Three was 899 00:47:46,824 --> 00:47:49,223 Speaker 2: right around the bend. Russians were going to come over and 900 00:47:49,344 --> 00:47:52,263 Speaker 2: take over Europe. We had to act like the Russians did. 901 00:47:52,544 --> 00:47:54,343 Speaker 2: And the people who were in the early CIA came 902 00:47:54,384 --> 00:47:57,984 Speaker 2: out of working in World War Two. They were paramilitary officers, 903 00:47:58,024 --> 00:48:00,904 Speaker 2: and so in those early years, the presidents used the 904 00:48:00,944 --> 00:48:03,703 Speaker 2: CIA as like a secret weapon, and the people 905 00:48:03,743 --> 00:48:05,303 Speaker 2: there were learning on the fly, and so I think 906 00:48:05,304 --> 00:48:06,703 Speaker 2: we did make a lot of mistakes on a 907 00:48:06,703 --> 00:48:09,504 Speaker 2: lot of stuff. You look at Iran or Guatemala, 908 00:48:09,544 --> 00:48:11,944 Speaker 2: all of these kinds of things. Number one, presidents were 909 00:48:12,024 --> 00:48:14,064 Speaker 2: behind it. Number two, the agency didn't have any real 910 00:48:14,104 --> 00:48:16,743 Speaker 2: oversight and didn't have much experience, and so it's fair 911 00:48:16,783 --> 00:48:18,464 Speaker 2: to have those discussions about what was right and what 912 00:48:18,504 --> 00:48:21,624 Speaker 2: was wrong. I think it's changed a lot. And it's true, 913 00:48:21,904 --> 00:48:24,944 Speaker 2: you don't hear people say the Department of Defense invasion 914 00:48:24,984 --> 00:48:27,783 Speaker 2: of Iraq. They say the US invasion of Iraq. But 915 00:48:27,824 --> 00:48:30,544 Speaker 2: when you talk about enhanced interrogation, it's the CIA enhanced interrogation. 916 00:48:30,824 --> 00:48:34,904 Speaker 2: It was actually a US program, which is interesting. The
President 917 00:48:34,984 --> 00:48:38,104 Speaker 2: ordered it, the Justice Department cleared it. I mean, Congress 918 00:48:38,223 --> 00:48:40,943 Speaker 2: was briefed fully on all of it. And you can 919 00:48:41,024 --> 00:48:43,023 Speaker 2: argue whether it was good or bad. But the CIA 920 00:48:43,104 --> 00:48:44,904 Speaker 2: didn't invent it and say we're just going to do 921 00:48:44,944 --> 00:48:46,184 Speaker 2: this without anybody knowing. 922 00:48:45,984 --> 00:48:47,344 Speaker 4: And context is important, 923 00:48:47,424 --> 00:48:50,783 Speaker 3: right? Almost twenty-five years later, 924 00:48:50,783 --> 00:48:53,064 Speaker 3: after nine to eleven, we haven't had another major attack 925 00:48:53,143 --> 00:48:56,503 Speaker 3: inside the US. So people are feeling safe now. 926 00:48:56,864 --> 00:49:00,143 Speaker 3: But twenty-three years ago, we were picking up chatter from 927 00:49:00,223 --> 00:49:03,623 Speaker 3: al Qaeda that another attack was coming, right? They were 928 00:49:03,703 --> 00:49:05,384 Speaker 3: working on things like a 929 00:49:05,464 --> 00:49:07,984 Speaker 3: dirty bomb, all sorts of things. And we're looking at 930 00:49:08,104 --> 00:49:12,024 Speaker 3: not just three thousand dead, but tens of thousands potentially dead. Okay, 931 00:49:12,064 --> 00:49:15,304 Speaker 3: so what would you do morally to stop ten thousand 932 00:49:15,344 --> 00:49:19,504 Speaker 3: Americans dying? Would you make someone uncomfortable? You know, you're 933 00:49:19,504 --> 00:49:21,783 Speaker 3: not pulling their fingernails out, that's illegal, that's torture. But 934 00:49:21,904 --> 00:49:25,143 Speaker 3: how uncomfortable would you make them? And who decides, and 935 00:49:25,223 --> 00:49:27,863 Speaker 3: how is that decided? So it gets into a cost.
936 00:49:27,864 --> 00:49:30,544 Speaker 3: If you don't do it and ten thousand people die... 937 00:49:30,743 --> 00:49:33,343 Speaker 3: you know. At the time this 938 00:49:33,504 --> 00:49:37,784 Speaker 3: was considered a genuine possibility. How do you live with yourself? 939 00:49:37,864 --> 00:49:38,423 Speaker 4: Right? And so... 940 00:49:40,143 --> 00:49:43,903 Speaker 2: The same congressional people who complain that we went too far, 941 00:49:44,504 --> 00:49:46,663 Speaker 2: if there were further strikes, they would have said, you 942 00:49:46,703 --> 00:49:49,183 Speaker 2: guys didn't do enough. And so we understand that we're 943 00:49:49,183 --> 00:49:51,104 Speaker 2: sort of out on the edge, and so you're going 944 00:49:51,183 --> 00:49:52,504 Speaker 2: to get criticism. 945 00:49:52,743 --> 00:49:55,823 Speaker 4: Is it torture to put somebody in isolation? Right? 946 00:49:55,864 --> 00:49:57,504 Speaker 3: We do it in prisons all the time. So is 947 00:49:57,544 --> 00:49:59,664 Speaker 3: that torture or isn't it? You can make an argument 948 00:49:59,783 --> 00:50:03,304 Speaker 3: both ways. And so where is the line? Torture 949 00:50:03,384 --> 00:50:06,823 Speaker 3: just to be sadistic is one thing, but 950 00:50:07,344 --> 00:50:10,863 Speaker 3: pressure on another person to save lives, when it's a 951 00:50:10,904 --> 00:50:14,223 Speaker 3: ticking time bomb and you don't have six weeks, is quite 952 00:50:14,223 --> 00:50:14,703 Speaker 3: something else. 953 00:50:14,944 --> 00:50:16,743 Speaker 2: I think it's in Tim Weiner's book where he talks 954 00:50:16,743 --> 00:50:19,343 Speaker 2: about the head lawyer of the Counterterrorism Center: when 955 00:50:19,344 --> 00:50:21,663 Speaker 2: this came up, that they were talking about waterboarding, 956 00:50:21,663 --> 00:50:24,263 Speaker 2: she said, well, I'm not signing off on waterboarding.
I 957 00:50:24,304 --> 00:50:26,623 Speaker 2: know we use it against our own troops in training, 958 00:50:26,663 --> 00:50:28,944 Speaker 2: but I'm not signing off on that until you waterboard me. 959 00:50:29,304 --> 00:50:32,343 Speaker 2: And she was waterboarded, and she said that 960 00:50:32,504 --> 00:50:35,464 Speaker 2: was awful, but it's not worse than childbirth. And so 961 00:50:35,864 --> 00:50:37,823 Speaker 2: these are hard choices, and if you 962 00:50:37,824 --> 00:50:40,464 Speaker 2: can't make hard choices, you shouldn't get in that business. 963 00:50:40,464 --> 00:50:41,303 Speaker 4: Do something else, right? 964 00:50:41,344 --> 00:50:43,423 Speaker 3: So we're not defending it. We're just simply saying it's 965 00:50:43,424 --> 00:50:46,503 Speaker 3: a very interesting conversation. And at the end of the day, 966 00:50:46,703 --> 00:50:49,984 Speaker 3: I think those decisions need to be made by our 967 00:50:50,024 --> 00:50:53,304 Speaker 3: elected representatives, not by members of the intelligence community. 968 00:50:53,384 --> 00:50:56,744 Speaker 2: But then we need to elect smarter people, then. Yeah, listen, 969 00:50:56,783 --> 00:50:59,024 Speaker 2: I think we've probably kept you too long. It's been fascinating. 970 00:50:59,384 --> 00:51:01,623 Speaker 2: We want to thank you for coming on with us today. 971 00:51:01,864 --> 00:51:07,944 Speaker 5: Thank you, it was great talking to you guys. 972 00:51:08,504 --> 00:51:13,544 Speaker 8: Mission Implausible was produced by Adam Davidson, Jerry O'Shea, John Cipher, 973 00:51:13,864 --> 00:51:18,264 Speaker 8: and Jonathan Stern. The associate producer is Rachel Harner. Mission 974 00:51:18,304 --> 00:51:22,343 Speaker 8: Implausible is a production of Honorable Mention and Abominable Pictures 975 00:51:22,384 --> 00:51:23,743 Speaker 8: for iHeart Podcasts.