1 00:00:01,320 --> 00:00:04,440 Speaker 1: I'm John Sipher and I'm Jerry O'Shea. I was a 2 00:00:04,480 --> 00:00:07,840 Speaker 1: CIA officer stationed around the world in high threat posts 3 00:00:07,840 --> 00:00:09,520 Speaker 1: in Europe, Russia, and in Asia. 4 00:00:09,600 --> 00:00:12,680 Speaker 2: And I served in Africa, Asia, Europe, the Middle East 5 00:00:12,760 --> 00:00:16,840 Speaker 2: and in war zones. We sometimes created conspiracies to deceive 6 00:00:16,880 --> 00:00:17,799 Speaker 2: our adversaries. 7 00:00:18,040 --> 00:00:21,720 Speaker 1: Now we're going to use our expertise to deconstruct conspiracy 8 00:00:21,760 --> 00:00:23,520 Speaker 1: theories large and small. 9 00:00:23,800 --> 00:00:26,280 Speaker 2: Could they be true? Or are we being manipulated? 10 00:00:26,800 --> 00:00:33,800 Speaker 1: This is Mission Implausible. Today's guest is David McCloskey. He's 11 00:00:34,159 --> 00:00:36,760 Speaker 1: actually a friend and a former colleague, and he is 12 00:00:36,880 --> 00:00:41,280 Speaker 1: now best known for his best selling spy novels Damascus Station, 13 00:00:41,560 --> 00:00:44,879 Speaker 1: Moscow X and The Seventh Floor, all worth reading. We'll 14 00:00:44,880 --> 00:00:47,040 Speaker 1: probably talk about them in the podcast, but he's probably 15 00:00:47,080 --> 00:00:50,080 Speaker 1: even better known nowadays, especially in England, for his well 16 00:00:50,080 --> 00:00:54,000 Speaker 1: regarded podcast The Rest Is Classified with British security journalist 17 00:00:54,000 --> 00:00:54,640 Speaker 1: Gordon Corera. 18 00:00:54,840 --> 00:00:55,240 Speaker 2: It's good. 19 00:00:55,720 --> 00:00:58,200 Speaker 1: David previously worked at the CIA on issues related to 20 00:00:58,240 --> 00:01:00,840 Speaker 1: the Middle East and worked in embassies abroad, but he 21 00:01:00,920 --> 00:01:02,920 Speaker 1: was in the sort of snotty, wimpy side of the agency.
22 00:01:03,360 --> 00:01:05,920 Speaker 1: Jerry and I who were busy saving the world so 23 00:01:05,959 --> 00:01:10,240 Speaker 1: that it could be destroyed later. So sorry about that. 24 00:01:10,480 --> 00:01:15,120 Speaker 2: So we can't be friends, you know, Mars and Venus, 25 00:01:15,160 --> 00:01:17,640 Speaker 2: DO and DI people. We'll explore that. 26 00:01:17,760 --> 00:01:18,760 Speaker 2: I'm sorry, over here, you go. 27 00:01:19,240 --> 00:01:21,880 Speaker 3: I was I was trying to translate all the garbage 28 00:01:21,920 --> 00:01:23,880 Speaker 3: you were writing up in the field so the people 29 00:01:23,920 --> 00:01:25,920 Speaker 3: making the decisions in DC would know what the hell 30 00:01:26,000 --> 00:01:26,800 Speaker 3: you were talking about. 31 00:01:29,440 --> 00:01:32,400 Speaker 1: I have lots of questions, but here's where I'd like 32 00:01:32,440 --> 00:01:34,319 Speaker 1: to start off. What is it like now to 33 00:01:34,360 --> 00:01:36,119 Speaker 1: be making a living off of our exploits? 34 00:01:39,440 --> 00:01:42,640 Speaker 3: I like, I like to, I like to say, John, 35 00:01:43,319 --> 00:01:47,480 Speaker 3: that, you know, I was certainly not a, I was 36 00:01:47,480 --> 00:01:50,320 Speaker 3: certainly not a case officer like you out there saving 37 00:01:50,360 --> 00:01:54,440 Speaker 3: the world regularly day by day. But, but, you know, 38 00:01:54,960 --> 00:01:58,160 Speaker 3: every day. Yeah, what better person to actually be able 39 00:01:58,200 --> 00:02:02,280 Speaker 3: to write down on paper those exploits than an analyst, right? 40 00:02:02,400 --> 00:02:05,040 Speaker 3: I mean, this is, this, it's just a continuation of 41 00:02:05,120 --> 00:02:08,280 Speaker 3: my sort of prior career, right?
Now I get 42 00:02:08,320 --> 00:02:12,160 Speaker 3: to write more fantastical stories and uh and occasionally, you know, 43 00:02:12,680 --> 00:02:15,880 Speaker 3: poke fun at you guys, and uh and some of 44 00:02:15,880 --> 00:02:19,040 Speaker 3: my analyst friends as well. So it's not too bad. 45 00:02:19,200 --> 00:02:19,960 Speaker 3: It's not too bad. 46 00:02:20,639 --> 00:02:23,720 Speaker 2: Yeah, I have to say as we start that when 47 00:02:23,720 --> 00:02:26,400 Speaker 2: I entered the agency the first couple of tours, I 48 00:02:26,600 --> 00:02:29,840 Speaker 2: was like the agency version of a hermaphrodite, sort of, 49 00:02:30,480 --> 00:02:33,720 Speaker 2: because I was between, I was between the 50 00:02:33,800 --> 00:02:39,919 Speaker 2: DI, the intelligence folk, the analysts, 51 00:02:40,040 --> 00:02:43,600 Speaker 2: and the operations officers, tough guys like John. 52 00:02:44,000 --> 00:02:48,440 Speaker 2: And both of you, the DI analysts and the 53 00:02:48,560 --> 00:02:51,400 Speaker 2: DO operators, both of you gave me noogies and swirlies, 54 00:02:51,520 --> 00:02:53,160 Speaker 2: right, and looked good. 55 00:02:53,200 --> 00:02:58,200 Speaker 3: Point, this is a real besmirching of our reports officer friends, 56 00:02:58,280 --> 00:03:01,000 Speaker 3: isn't it. It is. It's terrible. 57 00:03:02,240 --> 00:03:04,760 Speaker 1: I, all of us deserve a besmirching, so that's, 58 00:03:04,919 --> 00:03:06,959 Speaker 1: that's true. I was trying to go on offense so 59 00:03:07,000 --> 00:03:08,760 Speaker 1: that my besmirching is a little further down the pike. 60 00:03:08,960 --> 00:03:13,480 Speaker 3: That's true. That's true. Yeah, no, I mean, uh, I 61 00:03:13,520 --> 00:03:16,280 Speaker 3: mean analysts, though, I don't think they were, analysts weren't 62 00:03:16,320 --> 00:03:18,920 Speaker 3: the type to be giving out, like, noogies and swirlies.
Jerry, 63 00:03:19,080 --> 00:03:21,280 Speaker 3: and I feel like it was very much a heads down, 64 00:03:21,400 --> 00:03:24,040 Speaker 3: eating lunch at your desk crowd for the most part, right? 65 00:03:24,639 --> 00:03:25,520 Speaker 3: That's what I remember. 66 00:03:25,639 --> 00:03:29,119 Speaker 2: Mostly I got like eye rolls and like, oh yeah, 67 00:03:29,280 --> 00:03:31,160 Speaker 2: you got, oh. And then they'd use a big 68 00:03:31,200 --> 00:03:32,400 Speaker 2: word and I'd have to go look it up and 69 00:03:32,440 --> 00:03:33,720 Speaker 2: realize I'd been besmirched. 70 00:03:33,880 --> 00:03:36,520 Speaker 3: But this is an influencing thing. We were talking, we 71 00:03:36,560 --> 00:03:39,000 Speaker 3: were talking about this before we even hit record, that 72 00:03:39,360 --> 00:03:42,680 Speaker 3: what Jerry just did is a classic case officer influencing 73 00:03:42,720 --> 00:03:45,720 Speaker 3: tactic for analysts, which is, when you're sitting down in 74 00:03:45,760 --> 00:03:48,360 Speaker 3: a briefing, they tell you, they, the case officers, tell 75 00:03:48,360 --> 00:03:50,280 Speaker 3: you how smart you are and how they don't know anything, 76 00:03:50,360 --> 00:03:52,800 Speaker 3: and how they're just desperate for your knowledge, and then 77 00:03:52,920 --> 00:03:54,760 Speaker 3: you, as the analyst, you know, even if you kind of 78 00:03:54,800 --> 00:03:58,000 Speaker 3: know what's happening, you feel good when the briefing starts, 79 00:03:58,120 --> 00:04:00,680 Speaker 3: right, and it's just a classic. I swear that they 80 00:04:00,720 --> 00:04:02,560 Speaker 3: taught you that somewhere along the line, because I got 81 00:04:02,840 --> 00:04:05,400 Speaker 3: it from every case officer I sat down with, ever, period. 82 00:04:05,960 --> 00:04:07,560 Speaker 2: I have to get this out, so let me get this 83 00:04:07,600 --> 00:04:10,840 Speaker 2: out early.
We had an analyst, a very 84 00:04:10,920 --> 00:04:15,160 Speaker 2: senior analyst, a woman. She came to Manila and we were 85 00:04:15,400 --> 00:04:18,360 Speaker 2: talking, you know, around the table, and we were 86 00:04:18,360 --> 00:04:22,000 Speaker 2: giving a briefing. And at the time, one of the 87 00:04:22,160 --> 00:04:25,240 Speaker 2: terrorist groups, the Moro Islamic Liberation Front, was something we 88 00:04:25,240 --> 00:04:31,039 Speaker 2: were concerned with. And so the reports officer says, well, well, 89 00:04:31,040 --> 00:04:33,440 Speaker 2: there's no way around this, you know. He says that 90 00:04:33,880 --> 00:04:37,680 Speaker 2: it's called the MILF, and everybody smiles around the table 91 00:04:37,800 --> 00:04:41,600 Speaker 2: except for the analyst, and she's like, yeah, and he's 92 00:04:41,640 --> 00:04:44,200 Speaker 2: like, the MILF, and, uh, she's like, and she's 93 00:04:44,360 --> 00:04:46,279 Speaker 2: smart enough to pick up that's supposed to mean something. 94 00:04:46,360 --> 00:04:48,320 Speaker 2: He says, well, maybe you should look it up later, 95 00:04:48,520 --> 00:04:50,039 Speaker 2: but don't use a government computer. 96 00:04:52,480 --> 00:04:57,720 Speaker 3: Well, it was abbreviated MILF. I mean, that's the name. 97 00:04:57,880 --> 00:05:00,800 Speaker 2: Yeah, yeah, the MILF. Yeah. Yeah, we're going down.
We're 98 00:05:00,800 --> 00:05:05,080 Speaker 2: already sort of launching down a path, and conspiracy theories 99 00:05:05,120 --> 00:05:08,320 Speaker 2: come out of, like, the CIA. And David, I was 100 00:05:08,360 --> 00:05:11,119 Speaker 2: wondering if you could talk a little bit about maybe 101 00:05:11,200 --> 00:05:14,640 Speaker 2: some of the tribes inside of CIA really briefly, and 102 00:05:14,680 --> 00:05:18,640 Speaker 2: then discuss what the DI, the analysts, do, and then 103 00:05:18,720 --> 00:05:21,800 Speaker 2: we can sort of talk about conspiracy theories and conspiracies 104 00:05:22,520 --> 00:05:26,000 Speaker 2: both within the agency and who gets blamed for conspiracy 105 00:05:26,040 --> 00:05:28,320 Speaker 2: theories and so forth. So why don't you, why don't 106 00:05:28,360 --> 00:05:28,960 Speaker 2: I kick it over to you? 107 00:05:29,000 --> 00:05:30,880 Speaker 3: Sure. Well, yeah, I mean, the most 108 00:05:30,880 --> 00:05:33,119 Speaker 3: important tribe is the analysts, because they're the ones writing 109 00:05:33,160 --> 00:05:36,280 Speaker 3: the stuff that goes to the policymakers, right, that's 110 00:05:36,320 --> 00:05:38,520 Speaker 3: what the CIA is there to do, for the most part. 111 00:05:39,279 --> 00:05:41,680 Speaker 3: Occasionally we'll spin up some covert action stuff, which usually 112 00:05:41,680 --> 00:05:44,160 Speaker 3: doesn't go that well. But, you know, the analysts are 113 00:05:44,200 --> 00:05:48,120 Speaker 3: sort of, we're holding the flame. I mean, so the 114 00:05:48,160 --> 00:05:52,120 Speaker 3: analysts are certainly one tribe, right. Your, your case officer 115 00:05:52,200 --> 00:05:53,200 Speaker 3: brethren are another. 116 00:05:53,880 --> 00:05:54,920 Speaker 2: Relations officers. 117 00:05:55,000 --> 00:05:55,600 Speaker 4: Yeah, that's right.
118 00:05:55,680 --> 00:05:59,479 Speaker 3: There's, there's a whole host of, like, tiers and, and 119 00:05:59,480 --> 00:06:03,400 Speaker 3: tiers in tiers. Yeah, yeah, the support guys. But like even 120 00:06:03,440 --> 00:06:05,320 Speaker 3: within, like, the DO, right, there's a bunch of 121 00:06:05,360 --> 00:06:09,159 Speaker 3: different, sort of, you know, there's like a sort of hierarchy. 122 00:06:08,800 --> 00:06:12,200 Speaker 4: Right. And how do you see the hierarchy? That's, that's all. 123 00:06:12,520 --> 00:06:15,440 Speaker 4: How do I see it on the operations side? 124 00:06:15,480 --> 00:06:15,680 Speaker 2: Yeah? 125 00:06:15,720 --> 00:06:19,919 Speaker 3: Well, I was thinking that. Yeah, there's, there's reports officers, 126 00:06:20,000 --> 00:06:24,840 Speaker 3: there's CMOs, there's, there's SOOs, right, and I always forgot 127 00:06:24,920 --> 00:06:31,360 Speaker 3: what that stood for, essentially, like, essentially like an analyst. 128 00:06:30,839 --> 00:06:31,359 Speaker 1: In the DO. 129 00:06:31,720 --> 00:06:37,159 Speaker 3: Is that right? Okay? There were targeting officers, which I 130 00:06:37,200 --> 00:06:39,560 Speaker 3: guess could be in either the DI or the 131 00:06:39,640 --> 00:06:43,520 Speaker 3: DO, as I recall. You know, interestingly enough, the 132 00:06:43,600 --> 00:06:47,359 Speaker 3: last time I went into Langley, up in the newly 133 00:06:47,400 --> 00:06:50,960 Speaker 3: renovated museum, there is a wall that has a list 134 00:06:51,040 --> 00:06:54,719 Speaker 3: of every, or what seems like every, possible job title 135 00:06:54,760 --> 00:06:56,800 Speaker 3: you could have at the CIA, and it is an 136 00:06:56,960 --> 00:07:01,480 Speaker 3: entire wall. Have you seen this? 137 00:07:01,440 --> 00:07:05,360 Speaker 4: Well, they don't let us in there anymore, to be clear.
138 00:07:05,640 --> 00:07:07,359 Speaker 2: And they still have pictures of me on a horse 139 00:07:07,400 --> 00:07:10,160 Speaker 2: in Afghanistan. Those are all going, we didn't go real 140 00:07:10,200 --> 00:07:13,440 Speaker 2: well, yeah, yeah, I mean. 141 00:07:13,320 --> 00:07:16,320 Speaker 3: My perception was always the case officers believed themselves to 142 00:07:16,360 --> 00:07:18,880 Speaker 3: be the reason the entire place existed, and thus the 143 00:07:18,920 --> 00:07:23,800 Speaker 3: most important people there. That they believed, yes, believed themselves to 144 00:07:23,840 --> 00:07:26,280 Speaker 3: be the most important. Who, I mean, who else? There was, 145 00:07:26,480 --> 00:07:30,040 Speaker 3: you know, a variety of techie type roles. There were 146 00:07:30,040 --> 00:07:32,640 Speaker 3: those guys in the green jackets who monitored the, uh, 147 00:07:33,280 --> 00:07:35,520 Speaker 3: the contractors who came in, who I always like to 148 00:07:35,840 --> 00:07:37,600 Speaker 3: include in any kind of book, if I could 149 00:07:37,600 --> 00:07:39,080 Speaker 3: just randomly throw a reference at them. 150 00:07:39,120 --> 00:07:39,440 Speaker 2: I don't know. 151 00:07:40,440 --> 00:07:42,080 Speaker 4: You've done a good job with your books. You've thrown 152 00:07:42,080 --> 00:07:44,480 Speaker 4: in that you had the hot dog machine. 153 00:07:44,320 --> 00:07:47,880 Speaker 3: And the hot dog machine in there, sadly, was, it was ripped out 154 00:07:47,880 --> 00:07:50,360 Speaker 3: at some point in the past decade, probably for, like, 155 00:07:50,440 --> 00:07:53,560 Speaker 3: code violations.
I tried to get a reference to the, 156 00:07:53,640 --> 00:07:55,440 Speaker 3: you know, the gift shop in there as much as 157 00:07:55,440 --> 00:07:57,600 Speaker 3: I can, and I used, I think, John, I actually 158 00:07:58,040 --> 00:08:00,720 Speaker 3: took this line from you, with permission, of course, because 159 00:08:00,720 --> 00:08:03,120 Speaker 3: at one point in some conversation you said that we 160 00:08:03,160 --> 00:08:06,280 Speaker 3: stopped being a real spy service as soon as the 161 00:08:06,320 --> 00:08:08,640 Speaker 3: gift shop went in. You know, we put that, put 162 00:08:08,640 --> 00:08:09,600 Speaker 3: that in the mouth of. 163 00:08:11,440 --> 00:08:13,160 Speaker 4: The CIA on it at the end. 164 00:08:13,600 --> 00:08:13,840 Speaker 3: Yeah. 165 00:08:13,880 --> 00:08:16,640 Speaker 2: But David, so there's, there's the analysts and the operators, 166 00:08:16,680 --> 00:08:19,679 Speaker 2: so the two core tribes. But there's another split, another 167 00:08:19,800 --> 00:08:22,280 Speaker 2: schism inside of the agency. And I want to know 168 00:08:22,320 --> 00:08:25,000 Speaker 2: which side you fall on. Were you Dunkin' Donuts or 169 00:08:25,160 --> 00:08:28,600 Speaker 2: were you Starbucks? There's two places to get coffee. 170 00:08:29,160 --> 00:08:31,560 Speaker 3: That's right, that's right. You know what, I, I migrated 171 00:08:31,560 --> 00:08:33,839 Speaker 3: over the course of my career. I began as a 172 00:08:33,960 --> 00:08:35,280 Speaker 3: Dunkin' Donuts man. 173 00:08:34,840 --> 00:08:37,719 Speaker 4: Made more money, went to Starbucks. 174 00:08:37,240 --> 00:08:40,080 Speaker 3: Yeah, exactly. And then once I made GS-10, I started, 175 00:08:40,160 --> 00:08:42,760 Speaker 3: I went to Starbucks. It was Dunkin' to start, and 176 00:08:42,760 --> 00:08:46,040 Speaker 3: then it became Starbucks. Although I heard a rumor, I 177 00:08:46,200 --> 00:08:48,280 Speaker 3: wasn't there.
A director at some point wanted to, like, 178 00:08:48,360 --> 00:08:52,040 Speaker 3: rip out the coffee stands, right? I don't know which one. 179 00:08:52,080 --> 00:08:54,040 Speaker 4: Our friend Rob Dannenberg wanted to do that. 180 00:08:54,080 --> 00:08:57,800 Speaker 3: He, oh, okay, yeah, yeah, get rid of them. But yeah, no, 181 00:08:57,880 --> 00:08:59,840 Speaker 3: I, I became a Starbucks man as it went on, 182 00:08:59,880 --> 00:09:04,720 Speaker 3: and there were plenty of my analyst friends who probably 183 00:09:04,760 --> 00:09:07,520 Speaker 3: spent good portions of their days kind of, like, just 184 00:09:07,640 --> 00:09:12,480 Speaker 3: milling around the Starbucks, right, or having various, like, ostensible work 185 00:09:12,559 --> 00:09:15,959 Speaker 3: chats in the cafeteria or at the Starbucks that were, 186 00:09:16,400 --> 00:09:16,880 Speaker 3: that were not. 187 00:09:17,200 --> 00:09:19,079 Speaker 4: That's why he wanted to rip them out, because. 188 00:09:19,200 --> 00:09:22,160 Speaker 3: Yeah, exactly, exactly. 189 00:09:22,280 --> 00:09:24,480 Speaker 1: Let's talk about you for a second. I mean, really, 190 00:09:24,559 --> 00:09:26,880 Speaker 1: you have become quite popular both for your books, which 191 00:09:26,920 --> 00:09:30,920 Speaker 1: are highly praised, but also this podcast, which is newer. 192 00:09:31,480 --> 00:09:33,719 Speaker 1: So which do you like better nowadays, the writing or 193 00:09:33,760 --> 00:09:34,559 Speaker 1: the podcasting? 194 00:09:34,840 --> 00:09:37,720 Speaker 3: Oh, that's hard. I mean, I enjoy both of them. 195 00:09:37,720 --> 00:09:40,480 Speaker 3: I think the writing is the central thing that, that, 196 00:09:40,720 --> 00:09:42,920 Speaker 3: I don't know, I will never, I will never stop writing, 197 00:09:42,960 --> 00:09:44,800 Speaker 3: and I hope the podcast never stops. But kind of 198 00:09:44,840 --> 00:09:48,080 Speaker 3: the central piece of all of it's the books.
But 199 00:09:48,120 --> 00:09:50,240 Speaker 3: Gordon and I are having a ton of fun on the podcast, 200 00:09:50,440 --> 00:09:53,959 Speaker 3: and I see it as very, like, symbiotic with the 201 00:09:54,000 --> 00:09:58,760 Speaker 3: writing, because it's an opportunity to go into spy stories 202 00:09:58,760 --> 00:10:01,320 Speaker 3: that I know something about and then actually go 203 00:10:01,440 --> 00:10:03,920 Speaker 3: deep and do the research to come up with, kind of, 204 00:10:04,559 --> 00:10:06,800 Speaker 3: you know, the arc for a two or four or six 205 00:10:06,840 --> 00:10:09,319 Speaker 3: part series, and, and learn a whole bunch of stuff 206 00:10:09,320 --> 00:10:12,480 Speaker 3: that I don't, I didn't otherwise know, and then, frankly, 207 00:10:12,840 --> 00:10:15,280 Speaker 3: you know, try to find ways to work some of 208 00:10:15,320 --> 00:10:17,480 Speaker 3: that material into books if I can. And so kind 209 00:10:17,480 --> 00:10:22,400 Speaker 3: of going into real stuff and finding interesting characters or 210 00:10:22,440 --> 00:10:25,800 Speaker 3: situations is really fruitful. So I think the two things 211 00:10:25,840 --> 00:10:26,480 Speaker 3: really work well. 212 00:10:26,480 --> 00:10:28,920 Speaker 2: Again, fiction and John's war stories are pretty much the 213 00:10:28,920 --> 00:10:29,280 Speaker 2: same thing. 214 00:10:29,480 --> 00:10:33,160 Speaker 4: Yeah, yeah, there I was, it was me and Igor. 215 00:10:34,120 --> 00:10:35,520 Speaker 4: Let's take a break. We'll be right back. 216 00:10:49,800 --> 00:10:52,480 Speaker 2: So I want to, I'm gonna endeavor to say something 217 00:10:52,880 --> 00:10:54,000 Speaker 2: good about. 218 00:10:53,760 --> 00:10:56,520 Speaker 3: Analysts? Okay. And I know it's gonna be hard for you. 219 00:10:56,559 --> 00:10:59,000 Speaker 2: And it's, and I'm going, I want to get down 220 00:10:59,040 --> 00:11:02,600 Speaker 2: this rabbit hole.
And this gets into conspiracies and conspiracy theories. 221 00:11:02,600 --> 00:11:05,800 Speaker 2: So one of the bigger ones is Iraq WMD, 222 00:11:06,040 --> 00:11:08,680 Speaker 2: and it's something that I have an indirect insight into, 223 00:11:08,720 --> 00:11:12,360 Speaker 2: and I imagine you do too, indirect, through hearsay and so forth. 224 00:11:12,840 --> 00:11:16,360 Speaker 2: But my sense was, it is billed as, and 225 00:11:16,520 --> 00:11:18,760 Speaker 2: permission to swear, because this is a big one, this 226 00:11:18,920 --> 00:11:23,319 Speaker 2: is like a huge agency fuck up. And I think, 227 00:11:23,440 --> 00:11:26,120 Speaker 2: I'm not going to shy away from that, but I 228 00:11:26,120 --> 00:11:29,400 Speaker 2: think the real, for me, in any case, the real 229 00:11:29,480 --> 00:11:34,520 Speaker 2: conspiracy here is that the Bush administration basically had decided 230 00:11:34,760 --> 00:11:39,040 Speaker 2: to go to war, and they built their own, with 231 00:11:39,120 --> 00:11:41,840 Speaker 2: Doug Feith, they built their own intelligence service in the 232 00:11:41,880 --> 00:11:44,760 Speaker 2: Pentagon for a little while, for two years, to basically 233 00:11:44,840 --> 00:11:48,160 Speaker 2: prove that case and to throw out any evidence that 234 00:11:48,280 --> 00:11:50,400 Speaker 2: didn't support it. And then of course they went down 235 00:11:50,400 --> 00:11:53,439 Speaker 2: the road of Saddam and al Qaeda were very close, 236 00:11:53,440 --> 00:11:57,320 Speaker 2: which was completely untrue. And then they said that, you know, 237 00:11:57,520 --> 00:12:00,920 Speaker 2: they hooked into a whole host of scammers that claimed 238 00:12:00,920 --> 00:12:04,240 Speaker 2: that Saddam had this, you know, Curveball and all the 239 00:12:04,240 --> 00:12:08,719 Speaker 2: rest of this.
And my understanding is that the analysts 240 00:12:08,760 --> 00:12:13,240 Speaker 2: who wouldn't go along with this were systematically weeded out, 241 00:12:13,400 --> 00:12:15,600 Speaker 2: you know, sort of pushed off to the side or 242 00:12:15,640 --> 00:12:18,160 Speaker 2: demoted or asked to move off, and that the analysts 243 00:12:18,240 --> 00:12:21,960 Speaker 2: at the end said, the evidence that we have indicates 244 00:12:22,000 --> 00:12:24,480 Speaker 2: that they likely do. There was sort of a probability, I 245 00:12:24,520 --> 00:12:27,440 Speaker 2: don't know, as high, medium, low, and the White House 246 00:12:27,440 --> 00:12:29,880 Speaker 2: said that's not good enough. And George Tenet, who I 247 00:12:29,920 --> 00:12:33,360 Speaker 2: think highly of, caved, is my understanding, and he said, look, 248 00:12:33,400 --> 00:12:36,319 Speaker 2: it's, it's yes or no. There's no maybes here. We're 249 00:12:36,360 --> 00:12:39,160 Speaker 2: going to war. Say yes or no, and you'd better 250 00:12:39,200 --> 00:12:41,920 Speaker 2: be friggin yes, right. And so they did, they said yes. 251 00:12:42,440 --> 00:12:46,760 Speaker 2: And so the agency, through a conspiracy, has been stuck 252 00:12:46,840 --> 00:12:50,120 Speaker 2: with this, has been pinned to us as one 253 00:12:50,160 --> 00:12:52,600 Speaker 2: of our great failures. And I think it's, you know, 254 00:12:52,679 --> 00:12:54,719 Speaker 2: the analysts are sort of at the center of this, 255 00:12:54,840 --> 00:12:57,360 Speaker 2: the pressure is on them. So I'm wondering, from where you sat, 256 00:12:57,440 --> 00:13:00,160 Speaker 2: from what you heard, you weren't involved in this, what's 257 00:13:00,160 --> 00:13:02,800 Speaker 2: your sense of the pressure on analysts? And this goes today 258 00:13:02,840 --> 00:13:07,080 Speaker 2: with this administration to all sorts of things, like, you know, 259 00:13:07,480 --> 00:13:09,480 Speaker 2: where COVID came from and so forth.
So I wonder, 260 00:13:09,559 --> 00:13:11,120 Speaker 2: wonder if you could go into that. 261 00:13:11,320 --> 00:13:14,720 Speaker 3: Yeah, that's my understanding as well. The Iraq WMD story 262 00:13:14,760 --> 00:13:18,559 Speaker 3: is that it's a, it is in some sense a 263 00:13:18,600 --> 00:13:22,680 Speaker 3: story about analytic failure, because at the time there wasn't, 264 00:13:23,520 --> 00:13:27,400 Speaker 3: I actually don't think that there were, embedded across the DI, 265 00:13:27,520 --> 00:13:31,040 Speaker 3: there weren't sort of the, the confidence statements and all 266 00:13:31,080 --> 00:13:32,800 Speaker 3: these kinds of other, and frankly, a lot of the 267 00:13:32,840 --> 00:13:37,440 Speaker 3: clarity around how good or bad or fragmentary or otherwise 268 00:13:37,480 --> 00:13:40,560 Speaker 3: the sourcing was, like, those were not as consistently embedded 269 00:13:40,559 --> 00:13:45,000 Speaker 3: in the assessments, and so it was easier in some ways, 270 00:13:45,120 --> 00:13:50,079 Speaker 3: I think, to sort of handwave, maybe, or for consumers 271 00:13:50,120 --> 00:13:53,040 Speaker 3: of the intel to take away different stories from what 272 00:13:53,080 --> 00:13:57,040 Speaker 3: they were reading. And a lot of the changes made, 273 00:13:57,200 --> 00:13:59,760 Speaker 3: like just as I was joining, were to try to, 274 00:13:59,800 --> 00:14:04,080 Speaker 3: like, systematize the way that we communicated intel assessments and 275 00:14:04,160 --> 00:14:06,280 Speaker 3: to be very clear about what we knew and what 276 00:14:06,320 --> 00:14:07,959 Speaker 3: we didn't know and things like that. 277 00:14:08,160 --> 00:14:08,360 Speaker 2: You know.
278 00:14:08,400 --> 00:14:10,840 Speaker 3: The arc of the story you told on Iraq WMD, 279 00:14:10,960 --> 00:14:15,240 Speaker 3: I think, is a, that's a particularly egregious example of 280 00:14:16,160 --> 00:14:19,960 Speaker 3: the politics kind of dipping in and dictating, or really 281 00:14:20,000 --> 00:14:25,520 Speaker 3: influencing, what the CIA puts down on paper. I think 282 00:14:25,560 --> 00:14:31,000 Speaker 3: the politicization of analysis, though, it's rarely that easy, I 283 00:14:31,040 --> 00:14:33,080 Speaker 3: think, to sort of go back and say that's exactly 284 00:14:33,080 --> 00:14:34,960 Speaker 3: what happened, I think, with Iraq WMD, because, I mean, 285 00:14:35,000 --> 00:14:37,720 Speaker 3: my understanding is that sometimes, like, there were NSC or 286 00:14:37,720 --> 00:14:39,720 Speaker 3: Pentagon or White House officials who would literally, I mean, 287 00:14:39,720 --> 00:14:42,000 Speaker 3: came out to Langley on multiple occasions to, like, 288 00:14:43,240 --> 00:14:47,720 Speaker 3: basically sit with the analysts, right, and, right, and, like, 289 00:14:47,760 --> 00:14:49,880 Speaker 3: so, I mean, that's a lot of pressure, right? And 290 00:14:50,040 --> 00:14:52,040 Speaker 3: to your point, like, if the decision has already been 291 00:14:52,080 --> 00:14:55,600 Speaker 3: made to do this, to go to war, it I guess 292 00:14:55,640 --> 00:14:58,160 Speaker 3: becomes more of a political question for, you know, the 293 00:14:58,160 --> 00:15:00,280 Speaker 3: director, for the seventh floor, to figure out how do 294 00:15:00,280 --> 00:15:02,680 Speaker 3: you not lose the trust of the White House, right, 295 00:15:02,760 --> 00:15:05,560 Speaker 3: if they've already made the decision.
But, but yeah, I 296 00:15:05,560 --> 00:15:09,600 Speaker 3: mean, the politicization pressure and, sort of, I guess, the 297 00:15:09,760 --> 00:15:15,280 Speaker 3: risk, is real. And it is really, I think, hard 298 00:15:15,320 --> 00:15:17,520 Speaker 3: to, it's not, it shouldn't be hard for the line 299 00:15:17,560 --> 00:15:19,520 Speaker 3: analysts to navigate, because they're the ones that are looking 300 00:15:19,520 --> 00:15:21,680 Speaker 3: at the stuff, and ideally they're being backed up the 301 00:15:21,760 --> 00:15:25,440 Speaker 3: chain by people who have an interest in, like, maybe it's corny, 302 00:15:25,520 --> 00:15:28,640 Speaker 3: but, like, yeah, speaking the truth regardless of what the 303 00:15:28,680 --> 00:15:35,720 Speaker 3: politics are. But there are really many different ways that the 304 00:15:35,800 --> 00:15:38,800 Speaker 3: analysis can get politicized at different parts of the chain. 305 00:15:39,560 --> 00:15:43,600 Speaker 3: A lot of different people are touching it, and frankly, 306 00:15:43,680 --> 00:15:47,200 Speaker 3: stuff can get politicized by what's not written or what's 307 00:15:47,280 --> 00:15:49,960 Speaker 3: not reported into the White House or what doesn't get 308 00:15:50,000 --> 00:15:56,840 Speaker 3: read in the PDB, the President's Daily Brief. Yeah, yeah, so, you know, 309 00:15:57,160 --> 00:15:59,960 Speaker 3: it's really, and it's really subtle, I think, because, frankly, 310 00:16:00,680 --> 00:16:03,920 Speaker 3: the politicization could start with, you know, someone who's editing 311 00:16:03,960 --> 00:16:07,480 Speaker 3: one of those PDBs just kind of softening some stuff 312 00:16:07,480 --> 00:16:11,200 Speaker 3: a little bit, or saying, is this really a moderate 313 00:16:11,320 --> 00:16:14,840 Speaker 3: confidence judgment?
This seems more like a low confidence judgment 314 00:16:14,920 --> 00:16:17,240 Speaker 3: to me based on this definition, and that's more art 315 00:16:17,280 --> 00:16:20,480 Speaker 3: than science. And if something's a low confidence judgment, then 316 00:16:20,480 --> 00:16:23,080 Speaker 3: maybe it's, you know, maybe it's not worth actually briefing 317 00:16:23,120 --> 00:16:25,720 Speaker 3: to the president, period. And so it never makes, you know, 318 00:16:26,080 --> 00:16:28,560 Speaker 3: it never makes it into any kind of higher level 319 00:16:28,920 --> 00:16:32,120 Speaker 3: conversation down in the Oval. So there's lots of different ways. 320 00:16:32,360 --> 00:16:34,120 Speaker 3: I mean, frankly, I'd be curious for you guys' thoughts 321 00:16:34,120 --> 00:16:37,720 Speaker 3: on this. And, you know, on the collection side, I'd 322 00:16:37,720 --> 00:16:40,840 Speaker 3: imagine there's plenty of ways that someone on the seventh 323 00:16:40,880 --> 00:16:43,960 Speaker 3: floor in the DDO shop could say, on these three 324 00:16:44,080 --> 00:16:47,200 Speaker 3: or four topics, I want to see the stuff before 325 00:16:47,320 --> 00:16:52,240 Speaker 3: it gets dissemed. And you could decide, for any reason 326 00:16:52,280 --> 00:16:55,560 Speaker 3: you kind of cook up, maybe we don't dissem this 327 00:16:55,760 --> 00:16:59,200 Speaker 3: kind of thing, because it's gonna set off fireworks downtown, 328 00:16:59,280 --> 00:17:01,600 Speaker 3: and maybe I make up some stuff about how the 329 00:17:01,640 --> 00:17:05,000 Speaker 3: sourcing's bogus or whatever in order to justify it. So 330 00:17:05,000 --> 00:17:07,359 Speaker 3: there's a lot of different ways you could play around 331 00:17:07,359 --> 00:17:10,240 Speaker 3: with it, I think, throughout the chain, right? I 332 00:17:10,200 --> 00:17:13,160 Speaker 2: think it's important to discuss what dissem means, right.
People 333 00:17:13,240 --> 00:17:16,760 Speaker 2: may not realize this. So people like John and I, 334 00:17:16,920 --> 00:17:20,080 Speaker 2: we would collect the information from a source, let's 335 00:17:20,359 --> 00:17:23,800 Speaker 2: just putatively say a human source. He gives us the information. 336 00:17:24,480 --> 00:17:26,960 Speaker 2: We don't, like, hand it to the president. We actually 337 00:17:27,040 --> 00:17:31,359 Speaker 2: hand it to a reports officer or a CMO, and 338 00:17:31,440 --> 00:17:35,200 Speaker 2: they take the information, decide how it's written up. Then 339 00:17:35,240 --> 00:17:38,240 Speaker 2: it goes to headquarters, where they look at it again, 340 00:17:38,400 --> 00:17:42,040 Speaker 2: and then they dissem it to whomever needs it, whoever 341 00:17:42,080 --> 00:17:44,920 Speaker 2: has a need to know, so to the analysts. And 342 00:17:44,960 --> 00:17:48,400 Speaker 2: there's always a danger that if that raw disseminated report 343 00:17:48,760 --> 00:17:51,720 Speaker 2: goes directly to the White House and it's wrong, or 344 00:17:52,200 --> 00:17:55,320 Speaker 2: if it's slanted, then you've got other problems. But so 345 00:17:55,480 --> 00:17:57,560 Speaker 2: that's the way it should go. But there's lots of 346 00:17:57,800 --> 00:18:01,600 Speaker 2: places for a slip twixt the cup and the 347 00:18:01,640 --> 00:18:04,080 Speaker 2: lip, right, I mean, and not always to influence it. 348 00:18:04,640 --> 00:18:06,560 Speaker 1: But yeah, if you go, to go back to the 349 00:18:06,640 --> 00:18:10,639 Speaker 1: Iraq WMD problem, a lot of people have 350 00:18:10,720 --> 00:18:13,160 Speaker 1: gone back after the analysts for making a bad call there, 351 00:18:13,200 --> 00:18:15,359 Speaker 1: but frankly there was no good collection then either. So 352 00:18:15,800 --> 00:18:19,560 Speaker 1: it was a real intelligence failure. We had no good sources.
Really, 353 00:18:20,320 --> 00:18:22,080 Speaker 1: we had a White House that decided they needed to 354 00:18:22,119 --> 00:18:24,399 Speaker 1: go to war, and we had no Americans living in 355 00:18:24,400 --> 00:18:26,920 Speaker 1: the country. We had no good sources, and the analysts 356 00:18:26,920 --> 00:18:29,720 Speaker 1: were being pressured to come up with an analytic judgment, 357 00:18:29,760 --> 00:18:32,800 Speaker 1: and I think they tried to use what would be logical. Well, 358 00:18:32,880 --> 00:18:35,439 Speaker 1: he had WMD before and he tried to hide it, 359 00:18:35,480 --> 00:18:38,639 Speaker 1: so he must be doing so again. And so it 360 00:18:38,680 --> 00:18:40,360 Speaker 1: was a real problem. And I think we've seen even 361 00:18:40,359 --> 00:18:42,280 Speaker 1: more recently, you talked about it in one of your 362 00:18:42,960 --> 00:18:48,240 Speaker 1: podcasts recently, about the twenty twenty two invasion of Ukraine. 363 00:18:48,320 --> 00:18:49,040 Speaker 4: It seems in that. 364 00:18:49,080 --> 00:18:51,600 Speaker 1: Case, we actually had good collection. We had some sort 365 00:18:51,640 --> 00:18:55,119 Speaker 1: of source or sources. They were telling us that Putin 366 00:18:55,200 --> 00:18:58,600 Speaker 1: was about to invade. But some of the analysis clearly 367 00:18:58,640 --> 00:19:01,800 Speaker 1: was wrong, because we had information Putin was going to invade. 368 00:19:02,160 --> 00:19:04,760 Speaker 1: But then the decision was, what's going to happen next? 369 00:19:04,880 --> 00:19:07,760 Speaker 1: And essentially our analysts in some fashion must have told 370 00:19:07,800 --> 00:19:09,879 Speaker 1: the White House, well, Ukraine's gonna lose quickly, and so 371 00:19:09,960 --> 00:19:12,159 Speaker 1: you better get Zelensky out of there, and that's not 372 00:19:12,240 --> 00:19:12,840 Speaker 1: what happened. 373 00:19:13,080 --> 00:19:13,600 Speaker 4: Someone had to.
374 00:19:13,560 --> 00:19:16,400 Speaker 1: make a decision on how willing the Ukrainians were to fight, 375 00:19:16,480 --> 00:19:20,080 Speaker 1: and we obviously did not have good intelligence on that either. 376 00:19:20,240 --> 00:19:22,960 Speaker 2: And David, so you were involved a lot with Syria; 377 00:19:23,040 --> 00:19:25,640 Speaker 2: that's reflected in your book. You were there for the Prague... 378 00:19:25,840 --> 00:19:31,720 Speaker 2: Prague Spring? Christ, that dates me. For the Arab 379 00:19:33,480 --> 00:19:38,600 Speaker 2: Spring. And people tend to think, in conspiratorial thinking, that 380 00:19:38,680 --> 00:19:43,760 Speaker 2: the agency is omnipotent, when things surprise us. So like 381 00:19:43,840 --> 00:19:46,760 Speaker 2: the Arab Spring, people tend to think that the agency 382 00:19:46,800 --> 00:19:49,560 Speaker 2: is involved in prophesying, that you guys have 383 00:19:49,600 --> 00:19:51,880 Speaker 2: a crystal ball. So I was wondering if you could 384 00:19:51,880 --> 00:19:54,040 Speaker 2: give us your sense as an analyst of what we could 385 00:19:54,119 --> 00:19:56,480 Speaker 2: know about the Arab Spring. Right, a fruit seller gets 386 00:19:56,480 --> 00:19:59,199 Speaker 2: slapped and the world falls apart. And 387 00:19:59,200 --> 00:20:02,280 Speaker 2: then more recently, like when Assad fell, like 388 00:20:02,359 --> 00:20:06,040 Speaker 2: everybody was like, holy shit, I mean nobody predicted it. 389 00:20:06,080 --> 00:20:07,919 Speaker 2: So I was just wondering if you could give folks a 390 00:20:07,960 --> 00:20:10,480 Speaker 2: sense of, like, how you guys are just as much 391 00:20:10,480 --> 00:20:12,480 Speaker 2: at sea as the rest of us, right, and it's 392 00:20:12,480 --> 00:20:15,560 Speaker 2: not a conspiracy to hide, or that we've got all 393 00:20:15,600 --> 00:20:18,040 Speaker 2: this information.
Yeah, all this information, we have all this power 394 00:20:18,080 --> 00:20:19,320 Speaker 2: that people assume we have. 395 00:20:19,800 --> 00:20:22,320 Speaker 3: I think the way I would maybe frame it is, 396 00:20:22,480 --> 00:20:28,680 Speaker 3: I think making a call on the eruption of mass, 397 00:20:28,720 --> 00:20:33,400 Speaker 3: sort of, popular uprising in the Middle East is a mystery. 398 00:20:33,520 --> 00:20:37,480 Speaker 3: It is a sort of mass psychological phenomenon that is 399 00:20:37,520 --> 00:20:42,800 Speaker 3: inherently unpredictable, and actually nobody is capable of predicting it. 400 00:20:42,920 --> 00:20:48,840 Speaker 3: Whereas, does Saddam possess chemical, biological weapons, whatever, is a secret. 401 00:20:49,320 --> 00:20:52,639 Speaker 3: We in theory could have collected information that would have, 402 00:20:52,840 --> 00:20:54,960 Speaker 3: you know, sort of proved or disproved whether or not 403 00:20:55,000 --> 00:20:58,720 Speaker 3: that's true. Which is why, I think, I would say, 404 00:20:58,880 --> 00:21:01,480 Speaker 3: to your point, John, I'd never thought about WMD actually 405 00:21:01,560 --> 00:21:05,199 Speaker 3: as a DO failure, a collection failure; that's also a 406 00:21:05,240 --> 00:21:07,399 Speaker 3: piece of it. The whole kind of chain failed, because 407 00:21:07,760 --> 00:21:11,199 Speaker 3: the analysts didn't have good information, and then the 408 00:21:11,200 --> 00:21:14,760 Speaker 3: analysts, sort of through pressure and otherwise, you know, messed 409 00:21:14,840 --> 00:21:18,120 Speaker 3: up the call.
But I think an intelligence failure involves, 410 00:21:18,160 --> 00:21:20,240 Speaker 3: like, are there secrets to collect that should have been 411 00:21:20,280 --> 00:21:24,359 Speaker 3: collected, or analysis that should have been conducted given the 412 00:21:24,400 --> 00:21:29,040 Speaker 3: available information, that wasn't, right? Whereas I think predicting Syria, 413 00:21:29,880 --> 00:21:33,520 Speaker 3: it's not possible, and so we certainly did not have 414 00:21:33,520 --> 00:21:36,960 Speaker 3: a crystal ball, and most of what the country analysts 415 00:21:36,960 --> 00:21:40,000 Speaker 3: were doing in those late twenty ten, early twenty eleven 416 00:21:40,040 --> 00:21:44,919 Speaker 3: months was either writing pieces that explained, sort of, what 417 00:21:45,000 --> 00:21:47,439 Speaker 3: are the hurdles to a protest movement or to an 418 00:21:47,480 --> 00:21:50,080 Speaker 3: insurgency or a civil war, like what has to be 419 00:21:50,160 --> 00:21:52,880 Speaker 3: overcome to get there? So, said differently, like what would 420 00:21:52,880 --> 00:21:55,679 Speaker 3: you have to believe for this to happen? Or writing 421 00:21:55,760 --> 00:22:00,000 Speaker 3: scenarios that lay out, you know, three or four worlds 422 00:22:00,200 --> 00:22:03,280 Speaker 3: or paths you could travel down should the sort of 423 00:22:03,920 --> 00:22:07,159 Speaker 3: security apparatus and general kind of pillars of stability in 424 00:22:07,240 --> 00:22:11,200 Speaker 3: a given country start to crumble. And none of us 425 00:22:11,400 --> 00:22:13,760 Speaker 3: had any idea.
I mean, I had friends who were 426 00:22:13,760 --> 00:22:16,840 Speaker 3: working on other, you know, Mideast accounts at that point 427 00:22:16,880 --> 00:22:20,159 Speaker 3: in time, who were literally, essentially writing pieces, 428 00:22:20,359 --> 00:22:22,439 Speaker 3: had them in draft form, that were like, this is 429 00:22:22,480 --> 00:22:24,240 Speaker 3: why it's not going to happen here, and then the 430 00:22:24,280 --> 00:22:27,080 Speaker 3: protest movement would start and they'd be like, delete these, 431 00:22:27,200 --> 00:22:30,600 Speaker 3: you know. So, like, no one saw that stuff coming, right? 432 00:22:30,640 --> 00:22:34,000 Speaker 3: I mean, no more than, like, nobody saw the sort 433 00:22:34,040 --> 00:22:38,120 Speaker 3: of collapse of communism across Eastern Europe happening so quickly 434 00:22:38,320 --> 00:22:42,199 Speaker 3: in eighty nine either, right? So it's just, it's not predictable, 435 00:22:42,600 --> 00:22:44,119 Speaker 3: those kind of mass events. 436 00:22:44,440 --> 00:22:46,679 Speaker 2: Well, I don't know. I don't know about that. I 437 00:22:46,680 --> 00:22:51,359 Speaker 2: mean, I went to West Berlin in April 438 00:22:51,359 --> 00:22:54,800 Speaker 2: of eighty nine, and you know, five months later it 439 00:22:54,840 --> 00:22:57,600 Speaker 2: all fell apart, the wall came down. Coincidence? 440 00:23:01,520 --> 00:23:03,280 Speaker 4: Well, that's an analysis right there. 441 00:23:05,000 --> 00:23:05,760 Speaker 2: In my mind. 442 00:23:05,880 --> 00:23:26,720 Speaker 1: No, and more of this enlightening banter after a quick break. 443 00:23:28,880 --> 00:23:32,040 Speaker 2: Just because I was sort of involved in this: it 444 00:23:32,119 --> 00:23:35,120 Speaker 2: was also a process issue. So there was this source 445 00:23:35,320 --> 00:23:40,840 Speaker 2: called Curveball. Now Curveball, he was the main source.
(You 446 00:23:40,960 --> 00:23:43,160 Speaker 2: never do things on a single source.) He was the main 447 00:23:43,240 --> 00:23:47,760 Speaker 2: source that told us with great confidence and specificity about 448 00:23:47,800 --> 00:23:51,840 Speaker 2: the state of the WMD programs. And Curveball was living 449 00:23:51,880 --> 00:23:55,440 Speaker 2: in Munich at the time, and the German station (and I 450 00:23:55,560 --> 00:23:58,719 Speaker 2: got a pretty good view into this) talked to the 451 00:23:58,760 --> 00:24:01,040 Speaker 2: Germans about him, because he refused to meet the CIA, 452 00:24:01,119 --> 00:24:03,880 Speaker 2: but he did talk to German intelligence. And he said, 453 00:24:03,920 --> 00:24:06,560 Speaker 2: I won't meet the CIA, but you know, you can 454 00:24:06,600 --> 00:24:09,760 Speaker 2: give the information to them, to US intelligence. 455 00:24:10,359 --> 00:24:12,480 Speaker 2: I think the station took a look at his stuff 456 00:24:12,520 --> 00:24:15,000 Speaker 2: and said, we can't verify any of this, and maybe 457 00:24:15,000 --> 00:24:16,920 Speaker 2: it's true, maybe it's not, but it's like, we can't 458 00:24:17,040 --> 00:24:19,840 Speaker 2: verify it, so it's really not any good. And then 459 00:24:19,880 --> 00:24:22,760 Speaker 2: what happened is the military came in and they said, well, 460 00:24:22,760 --> 00:24:24,919 Speaker 2: we'll do it, right, but they didn't tell the station. 461 00:24:25,400 --> 00:24:29,400 Speaker 2: So all that reporting that went out went out through 462 00:24:29,480 --> 00:24:33,320 Speaker 2: the US military, and we in the station and our 463 00:24:33,359 --> 00:24:35,560 Speaker 2: analysts were getting it, but we didn't know where it 464 00:24:35,640 --> 00:24:37,600 Speaker 2: was coming from, right, I mean, we didn't have a 465 00:24:37,640 --> 00:24:39,240 Speaker 2: need to know that it was coming out of Munich.
466 00:24:39,320 --> 00:24:41,480 Speaker 2: And it only sort of at the very end did 467 00:24:41,520 --> 00:24:45,320 Speaker 2: we solve that. Yeah, it was like... 468 00:24:44,680 --> 00:24:46,240 Speaker 3: A big circular reference. 469 00:24:46,400 --> 00:24:47,400 Speaker 4: This is that same guy. 470 00:24:47,880 --> 00:24:50,760 Speaker 2: It's the same guy. And I think it's been in 471 00:24:50,800 --> 00:24:54,680 Speaker 2: the press that the station chief actually wrote and said, 472 00:24:54,200 --> 00:24:58,600 Speaker 2: I hope this isn't the case. And Tyler Drumheller famously, 473 00:24:58,640 --> 00:25:01,200 Speaker 2: again according to the media, went and said, I hope 474 00:25:01,200 --> 00:25:03,800 Speaker 2: you're not basing it on this guy in Munich, 475 00:25:04,160 --> 00:25:06,640 Speaker 2: on Curveball. And by then it was sort of too late. 476 00:25:06,680 --> 00:25:09,440 Speaker 2: But the process: the military and, you know, they 477 00:25:09,560 --> 00:25:12,040 Speaker 2: talked to each other, but they didn't talk about sourcing 478 00:25:12,080 --> 00:25:13,920 Speaker 2: to each other, right. So, I don't want 479 00:25:13,920 --> 00:25:15,280 Speaker 2: to get too much into the weeds, but yeah, it 480 00:25:15,320 --> 00:25:17,400 Speaker 2: was also a process issue for sure. 481 00:25:17,400 --> 00:25:22,000 Speaker 3: And the way I remember this kind of rolling downhill 482 00:25:22,040 --> 00:25:23,640 Speaker 3: to me as an analyst, when I joined a few 483 00:25:23,720 --> 00:25:28,440 Speaker 3: years after this, was being really, really explicit when you're 484 00:25:28,440 --> 00:25:32,000 Speaker 3: writing a finished intelligence product.
You know, you're not giving, 485 00:25:32,040 --> 00:25:34,120 Speaker 3: obviously, the name of the source, you're not describing where 486 00:25:34,119 --> 00:25:37,680 Speaker 3: they work, but you do get, like, extensive descriptions 487 00:25:37,720 --> 00:25:39,680 Speaker 3: of how much confidence we have in the source, and 488 00:25:39,720 --> 00:25:43,080 Speaker 3: whether they actually have good access, and all of that, 489 00:25:43,240 --> 00:25:45,359 Speaker 3: like, have they been vetted? So there's, there was a 490 00:25:45,359 --> 00:25:49,560 Speaker 3: lot more information that got included post, like, oh three, 491 00:25:49,640 --> 00:25:52,200 Speaker 3: oh four, oh five, than if you go back and read 492 00:25:52,240 --> 00:25:56,240 Speaker 3: stuff that was written in oh one or oh two; like, oftentimes 493 00:25:56,240 --> 00:25:57,880 Speaker 3: the sourcing statements were very thin. 494 00:25:58,400 --> 00:26:00,840 Speaker 2: Trust me, dude. You know, that's a sourcing statement. 495 00:26:00,880 --> 00:26:03,480 Speaker 3: Yeah, yeah, I mean, kind of, yeah, that's the way 496 00:26:03,480 --> 00:26:03,840 Speaker 3: it read. 497 00:26:04,119 --> 00:26:04,639 Speaker 4: Yeah. 498 00:26:04,840 --> 00:26:06,680 Speaker 1: To go back to Syria, I think Syria is a 499 00:26:07,520 --> 00:26:09,840 Speaker 1: big deal. Like, there were environmental concerns and things that 500 00:26:09,920 --> 00:26:12,200 Speaker 1: led to, you know, lots of people leaving Syria, which 501 00:26:12,240 --> 00:26:15,879 Speaker 1: led to problems in Europe and political problems that related 502 00:26:15,880 --> 00:26:18,879 Speaker 1: to immigration and things that put pressure on political leaders. 503 00:26:19,520 --> 00:26:21,800 Speaker 1: What do you think is happening now? And, 504 00:26:22,440 --> 00:26:24,440 Speaker 1: this is a policy question, not an intelligence question:
Do 505 00:26:24,480 --> 00:26:26,960 Speaker 1: you think we are handling it well? We have a 506 00:26:27,000 --> 00:26:29,000 Speaker 1: new leader in Syria, and it sounds to me like, 507 00:26:29,560 --> 00:26:31,760 Speaker 1: you know, the administration wants to sort of blow him off, 508 00:26:31,800 --> 00:26:35,200 Speaker 1: and I worry that more problems in Syria are a 509 00:26:36,000 --> 00:26:38,960 Speaker 1: potential problem outside of Syria as well. 510 00:26:39,280 --> 00:26:42,480 Speaker 3: Yeah. Weirdly, as an analyst, I'm a natural pessimist, 511 00:26:42,520 --> 00:26:44,920 Speaker 3: and I think Syria, for like fifteen years, has given 512 00:26:44,960 --> 00:26:47,760 Speaker 3: me plenty of reasons to be pessimistic about pretty much everything, 513 00:26:48,600 --> 00:26:50,960 Speaker 3: a lot of reasons to continue to be pessimistic, although 514 00:26:51,000 --> 00:26:53,840 Speaker 3: strangely enough, at this point I'm sort of cautiously hopeful 515 00:26:53,920 --> 00:26:57,080 Speaker 3: or optimistic about the future. I mean, it is. 516 00:26:57,240 --> 00:27:00,359 Speaker 3: It's not to say that I think that 517 00:27:00,400 --> 00:27:04,040 Speaker 3: we get, like, a flowering Jeffersonian democracy in Syria.
That 518 00:27:04,200 --> 00:27:08,080 Speaker 3: seems unlikely, but I think it's also very possible, if 519 00:27:08,080 --> 00:27:12,560 Speaker 3: not even likely, that we get a better outcome than 520 00:27:12,960 --> 00:27:17,320 Speaker 3: Assad, by a wide margin. It is 521 00:27:17,320 --> 00:27:21,000 Speaker 3: a low bar, but I think it's possible that 522 00:27:21,080 --> 00:27:23,920 Speaker 3: we end up with a Syria that is at least 523 00:27:24,000 --> 00:27:28,119 Speaker 3: not sort of actively creating, you know, massive problems for 524 00:27:28,160 --> 00:27:32,280 Speaker 3: all of its neighbors, and maybe, hopefully, in the future 525 00:27:33,040 --> 00:27:37,679 Speaker 3: politically more stable and also more open, right, than the 526 00:27:37,760 --> 00:27:41,920 Speaker 3: regime Assad ran, and frankly managing, you know, its repressive 527 00:27:41,920 --> 00:27:45,320 Speaker 3: apparatus on a much less sort of industrial, horrific scale. 528 00:27:45,400 --> 00:27:47,439 Speaker 3: It will cross that bar for sure, right, in that 529 00:27:47,480 --> 00:27:49,399 Speaker 3: this is going to be a far less repressive regime 530 00:27:49,400 --> 00:27:52,879 Speaker 3: than the one Assad ran. That said, I mean, you 531 00:27:52,920 --> 00:27:57,000 Speaker 3: know, where to start with the problems? And frankly, you know, 532 00:27:57,080 --> 00:28:01,080 Speaker 3: even though we still have troops in the northeast, 533 00:28:01,200 --> 00:28:04,400 Speaker 3: I think maybe about nine hundred or so that are 534 00:28:04,400 --> 00:28:09,200 Speaker 3: there supporting our comrades in the SDF, the Syrian Democratic Forces. 535 00:28:09,320 --> 00:28:13,399 Speaker 3: You know, I don't think this administration has any... Do 536 00:28:13,480 --> 00:28:16,320 Speaker 3: they have a Syria policy or an interest in developing one? 537 00:28:16,359 --> 00:28:19,000 Speaker 3: I mean, and if so, what would it even be?
538 00:28:19,280 --> 00:28:21,960 Speaker 3: You know, I think this is kind of one of 539 00:28:21,960 --> 00:28:25,439 Speaker 3: those pockets of the world that's just probably not going 540 00:28:25,480 --> 00:28:28,080 Speaker 3: to be getting a lot of interest from Trump two 541 00:28:28,119 --> 00:28:28,840 Speaker 3: point zero. 542 00:28:28,640 --> 00:28:30,120 Speaker 4: But it will be getting a lot of interest from 543 00:28:30,240 --> 00:28:31,119 Speaker 4: Israel and Turkey. 544 00:28:31,160 --> 00:28:32,840 Speaker 3: And it will be getting a lot of interest from 545 00:28:32,880 --> 00:28:35,200 Speaker 3: Israel and Turkey, and I think they're the two major 546 00:28:35,480 --> 00:28:39,360 Speaker 3: geopolitical winners here. I read that the Israeli Air Force, 547 00:28:39,760 --> 00:28:44,160 Speaker 3: in the days after Assad fled, flew more air sorties 548 00:28:44,560 --> 00:28:47,720 Speaker 3: over Syria than they had in any day in any 549 00:28:47,760 --> 00:28:53,040 Speaker 3: conflict since nineteen sixty seven. They essentially destroyed most of 550 00:28:53,120 --> 00:28:56,920 Speaker 3: Syria's military capacity in, like, about seventy two hours. And 551 00:28:56,960 --> 00:29:00,160 Speaker 3: the Turks, you know, are probably the group with 552 00:29:00,200 --> 00:29:02,560 Speaker 3: the most influence, both in the north and then with 553 00:29:03,000 --> 00:29:06,280 Speaker 3: the new crew in Damascus. So, big winners. Iran, 554 00:29:07,040 --> 00:29:10,360 Speaker 3: US, not so much. So we'll see. 555 00:29:10,360 --> 00:29:14,240 Speaker 2: We just need an independent Kurdistan, that'll fix it, exactly. 556 00:29:15,360 --> 00:29:17,880 Speaker 3: I've got an idea for the Middle East. I do 557 00:29:17,960 --> 00:29:19,800 Speaker 3: feel, I mean, I do feel bad for them.
It's 558 00:29:19,840 --> 00:29:23,560 Speaker 3: like, the Kurds are just the constant; every time, like, 559 00:29:23,680 --> 00:29:25,920 Speaker 3: every time one of these conflicts ends, the 560 00:29:25,960 --> 00:29:28,520 Speaker 3: Kurds end up getting screwed. So it's really tragic. 561 00:29:30,040 --> 00:29:32,840 Speaker 5: We're gonna stop here for today and continue next week 562 00:29:32,920 --> 00:29:36,160 Speaker 5: with more of our conversation with David McCloskey here on 563 00:29:36,400 --> 00:29:45,280 Speaker 5: Mission Implausible. Mission Implausible is produced by Adam Davidson, Jerry O'Shea, 564 00:29:45,800 --> 00:29:50,760 Speaker 5: John Sipher, and Jonathan Stern. The associate producer is Rachel Harner. 565 00:29:51,000 --> 00:29:54,800 Speaker 5: Mission Implausible is a production of Honorable Mention and Abominable 566 00:29:54,840 --> 00:29:56,760 Speaker 5: Pictures for iHeart Podcasts.