Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. Today, some good news. The Ebola outbreak in the Democratic Republic of Congo seems to be coming to an end, finally. Since the outbreak started in August 2018, over two thousand people have died, but now infection rates are measurably slowing down. Plus, in other good news, a vaccine for Ebola is likely to be approved by the European Commission very soon. That means there is now a real prospect of substantially reducing future outbreaks. To learn more about these important developments, I spoke to Pardis Sabeti, a cutting-edge biologist at Harvard and the Broad Institute. Pardis was actually in West Africa during the 2014 Ebola outbreak researching the disease, on which she's made important scientific contributions. I started by asking her exactly how the Ebola vaccine works.

Pardis Sabeti: Ebola is an RNA virus. Some viruses are made of DNA and some are made of RNA. RNA is less stable; the machinery that copies it introduces more mutations. So RNA viruses mutate faster, and flu and Ebola are both RNA viruses. But even within RNA viruses there are differences. HIV and flu are really hard to develop vaccines for, because they have a lot of different versions of themselves and they change quickly. Ebola is not quite like that. It's actually not as difficult to develop a vaccine for, and that's why we've seen such early, really promising success with the vaccines. Merck has a lead vaccine that's being used in the DRC right now, and a number of other companies, like Johnson & Johnson and GSK, have one. A lot of people are developing vaccines because they know a vaccine is likely to be effective.
Noah Feldman: Was there a single breakthrough or development over the last five years that made it possible for multiple companies to develop vaccines here? Obviously that was not really in contemplation when the disease was still poorly understood and when just characterizing it was the name of the game. Is the process what we should expect, that over the course of five years there was simply time? Or is it more that there was some particular intervening event or discovery that made it possible to develop these vaccines?

Pardis Sabeti: No. I mean, the vaccine that Merck is using right now is based on VSV, which is a particular type of rhabdovirus that is really easy to engineer. Essentially, what they do is take another virus that's less dangerous and put into it a protein of Ebola virus that is presented to the immune system. So this other virus brings in a piece of Ebola virus, which gives our immune system a chance to see it and develop immunity to it. It's a pretty well understood way of developing a vaccine, and other groups, like Johnson & Johnson, I think, have one that's based on an adenovirus as well as another virus. So they've got some other things they're doing to try to make vaccines even better. But even the very simple Merck vaccine, based on just one virus engineered to carry a protein from Ebola, is working very well. So I don't think this is an instance like HIV or flu, where we need to do something very fancy to get it to work. This is really more a situation in which there was no need to invest that kind of money. I think one of the really big issues we have in vaccine development in general is that there's no incentive for big companies to put in a lot of effort to develop vaccines for anything that might possibly become an outbreak. No financial incentive.
Noah Feldman: You mean, there's a strong humanitarian incentive, but these are for-profit companies and there's just not enough money to be made in a crisis?

Pardis Sabeti: In the simplest terms, yes. But you could also say, well, we don't know what the next big outbreak will be, so we can't necessarily build vaccines for everything that might be possible. Before the West African outbreak, it wasn't as if a lot of people were worried that Ebola was going to be something that would infect them. So there's definitely no financial incentive. But even beyond that, out of the universe of many viruses you might want to worry about and build a large-scale vaccination program for, one might argue over which of these viruses are the ones to go after.

Noah Feldman: That's really interesting, because the point you're making, if I understand correctly, is that there's a cost-benefit calculation that goes into not only developing a vaccine but also spending all the money to go out and get people vaccinated. And to do that, you really need to think there's a relatively high probability of some event occurring that would have bad consequences. I guess what you really want to do in a cost-benefit analysis is figure out not only the probability of an outbreak but also the consequences of that outbreak, then multiply the probability by how bad it would be to get an expected value, and that goes into your cost-benefit analysis. And you're saying that it wasn't clear before the 2014 West Africa outbreak that Ebola was that kind of virus.

Pardis Sabeti: Yeah. I mean, obviously it was that kind of virus. But was it more that kind of virus than Machupo, or Junin, or hantavirus, or Lassa virus?
If you wanted a vaccine that would hit the buttons of being both a really big global pandemic threat and an ongoing public health crisis, where developing a vaccine now would also help a lot of people, Lassa virus might be a better choice. So it's not necessarily that one couldn't make an argument that they should have gone after Ebola. It's just that there are, I think, almost four hundred viruses known to infect humans, and it's hard to make the case for which one of those you would choose. And when you talk about the financials: at least some of these vaccines were already developed when the West African outbreak hit. They just hadn't been tested in clinical trials, and that's really the hard part. The cost of developing a vaccine may be relatively reasonable, and reasonable means within the millions of dollars. But once you go to testing first the safety and then the efficacy of the vaccines through multiple clinical trials, you're getting up into the tens to hundreds of millions of dollars. When you start to think about that, it becomes a whole different ball game. So when the West African outbreak hit, there were vaccines that could be deployed, and one of the big things happening there was that at first people didn't even want to do vaccinations, because they considered that research. To me, to say we're not going to try anything is wild. It was such a big deal to say we're going to try something, when anybody on the front lines, if asked whether they would take a vaccine, even an experimental one, as long as it's known to be safe, would absolutely do it. But it took a long time to even get the community to say we're going to try some stuff. And it wasn't really until it hit, and probably hit, Western individuals that people said, okay, let's try these vaccines. And now I think we understand more that we need to try things in real time.
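The cost-benefit arithmetic Feldman sketches above, multiplying the probability of an outbreak by how bad it would be and weighing that expected value against the cost of a program, can be made concrete in a few lines. A minimal Python sketch follows; every number in it is an invented placeholder, except the tens-to-hundreds-of-millions trial-cost range, which comes from the conversation.

```python
# Expected-value sketch of the vaccine cost-benefit argument above.
# All inputs are hypothetical placeholders; only the trial-cost range
# (tens to hundreds of millions of dollars) comes from the interview.

def expected_deaths(p_outbreak: float, deaths_if_outbreak: float) -> float:
    """Probability of an outbreak multiplied by how bad it would be."""
    return p_outbreak * deaths_if_outbreak

p_outbreak = 0.01            # assumed chance of a major outbreak (hypothetical)
deaths_if_outbreak = 10_000  # assumed deaths a vaccine could avert (hypothetical)
value_per_life = 1_000_000   # assumed dollar value per death averted (hypothetical)
program_cost = 100_000_000   # clinical trials alone can cost this much, per the interview

expected_benefit = expected_deaths(p_outbreak, deaths_if_outbreak) * value_per_life
print(f"expected benefit ${expected_benefit:,.0f} vs. program cost ${program_cost:,.0f}")
# -> expected benefit $100,000,000 vs. program cost $100,000,000
```

With placeholder numbers like these, the expected benefit only just matches the cost for a single virus, which is the incentive problem described in the interview: with roughly four hundred known human viruses as candidates, few clear the bar in advance.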
Noah Feldman: I want to unpack the many fascinating things you said there, Pardis. One is that some of these vaccines actually did exist in 2014 and were not used, because they hadn't been properly tested in a randomized controlled experiment. That runs very much counter to what the ordinary person imagines from watching the movies, namely that when people are dying in large numbers in the field, that's the moment when you say, okay, we have this untested vaccine, but it has a chance of working, so let's get out there and save some lives. We have this sort of fantasy that when that happens, emergency measures kick in. And I hear you saying that that's not exactly what happened, or at least not what happened here.

Pardis Sabeti: That's just not what happens. It's not what happened at all. And vaccines are one thing: they're preventative, and so one might argue that if you don't feel sure a vaccine will be effective, you might not want to take it. What's more shocking is treatments. Once you have Ebola, would you rather take your chances on riding out Ebola, or try a treatment that is not completely well vetted but that people are confident in? Given that the fatality rates early in the outbreak were about eighty percent, you would say, I'll try anything. But even there, and I think that was the place where it was even more remarkable, a lot of the international community wasn't willing to try treatments. There are some famous cases of battles over trying treatments on infected individuals, where organizations chose not to, and the individuals died. They didn't do it until it was actually Americans who were infected; then they said, oh, now we'll try the treatments.
But there were a lot of really prominent healthcare workers who were infected, and they were not even given the opportunity to take an experimental drug.

Noah Feldman: So willingness to try experimental treatments was actually higher when it was Americans getting sick than when it was Africans getting sick. That doesn't seem intuitive to me. You would imagine that, if anything, Western professionals, and they're not all Western, but many, would be more willing to try out something experimental in Africa. So what's the background to that? Is it a case of bending over backwards to try not to appear to be experimenting with an experimental treatment on people in poorer countries, a kind of perverse reversal of our deep past that leads to this strange result? Or is it more a story of just valuing American lives more highly? You were deep inside this. What's your theory about how that happened?

Pardis Sabeti: Well, actually, I was part of the conversations, though I did not get to witness all of them. The one where I witnessed the early conversations, and then didn't get to witness what happened at the end, was with Dr. Humarr Khan, Sheik Humarr Khan, who was the head of the clinical ward at Kenema Government Hospital in Sierra Leone. That's where my lab had worked for many years before the Ebola outbreak hit. He was a very prominent, very heroic Sierra Leonean doctor on the front lines of first Lassa virus and then Ebola over the years, and a friend of mine. And he was infected, and it was actually the first time that the international community said, now... Well, we had actually been banging down the doors trying to get experimental drugs and vaccines tested early in the outbreak, and there was just no discussion of that.
204 00:11:26,996 --> 00:11:28,716 Speaker 1: It was it was that was that no one even 205 00:11:28,756 --> 00:11:31,676 Speaker 1: wanted to do it. And then finally actually when doctor 206 00:11:31,756 --> 00:11:34,156 Speaker 1: Khn became ill and people recognize he was very prominent 207 00:11:34,436 --> 00:11:38,276 Speaker 1: and this would be really devastating, there was discussions, but 208 00:11:38,356 --> 00:11:42,596 Speaker 1: even there it was interesting. So I think a lot 209 00:11:42,636 --> 00:11:43,756 Speaker 1: of it has to do with the fact that they 210 00:11:43,756 --> 00:11:46,396 Speaker 1: want to be perceived as testing on Africans. A lot 211 00:11:46,436 --> 00:11:48,276 Speaker 1: of it is see what if something goes wrong, what 212 00:11:48,316 --> 00:11:50,276 Speaker 1: if he gets the drug and dies, would they blame 213 00:11:50,316 --> 00:11:52,316 Speaker 1: the drug? So I think a lot of us around 214 00:11:52,356 --> 00:11:55,276 Speaker 1: that political perception, and some of it is this idea 215 00:11:55,436 --> 00:11:58,356 Speaker 1: that this kind of paternalistic view of like we have 216 00:11:58,436 --> 00:12:02,316 Speaker 1: to protect them, them being being Africans. Africans, yeah, they 217 00:12:02,636 --> 00:12:05,596 Speaker 1: like you know, when I do ethical protocols in Africa, 218 00:12:05,876 --> 00:12:07,756 Speaker 1: I've taken a lot of ethics courses. I got really 219 00:12:07,756 --> 00:12:09,556 Speaker 1: invested in it because I wanted to under stand how 220 00:12:09,636 --> 00:12:12,116 Speaker 1: are we doing these things and thinking about how to 221 00:12:12,156 --> 00:12:14,396 Speaker 1: do things ethically right. And there's a lot of like 222 00:12:14,436 --> 00:12:16,356 Speaker 1: we must protect them or we must be you know, 223 00:12:16,356 --> 00:12:18,956 Speaker 1: their vulnerable populations. We don't want to exploit them, but 224 00:12:19,076 --> 00:12:22,516 Speaker 1: sometimes it's almost infantalizing the way we describe their ability 225 00:12:22,556 --> 00:12:24,876 Speaker 1: to interpret it. And in the case of doctor Khan, 226 00:12:25,516 --> 00:12:28,476 Speaker 1: it was an extreme case of in fact, he's more 227 00:12:28,516 --> 00:12:30,796 Speaker 1: informed than I am to make this decision than any 228 00:12:30,876 --> 00:12:32,476 Speaker 1: than And you know, even though I have an MD 229 00:12:32,556 --> 00:12:35,356 Speaker 1: as well, he's still probably the most informed person on 230 00:12:35,396 --> 00:12:37,476 Speaker 1: the planet to make a really good decision as to 231 00:12:37,476 --> 00:12:38,876 Speaker 1: whether or not he should take the start or not. 232 00:12:39,156 --> 00:12:40,836 Speaker 1: It's like it is like the scenario for a movie, 233 00:12:40,876 --> 00:12:42,316 Speaker 1: isn't it. You know, like the person with the most 234 00:12:42,356 --> 00:12:44,716 Speaker 1: on the ground clinical experience in the disease is now 235 00:12:44,756 --> 00:12:46,596 Speaker 1: infected with the disease and who's going to make the 236 00:12:46,596 --> 00:12:49,116 Speaker 1: decision about his treatment? Yeah, And so out of a 237 00:12:49,156 --> 00:12:51,956 Speaker 1: lot of those conversations that seemed pretty obvious that he 238 00:12:52,076 --> 00:12:53,956 Speaker 1: was the right person, it made a lot of sense, 239 00:12:54,076 --> 00:12:56,356 Speaker 1: and this is the time to try it. You know, 240 00:12:56,356 --> 00:12:57,996 Speaker 1: I wasn't in the situation room. 
I was just in the kind of pre-conversations. But when it got to the situation room, they got cold feet, and I think for these other reasons, these sort of larger implications. And as for the part you asked about before: like I said, I don't want to speculate, but obviously things changed when it was Americans on the ground affected.

Noah Feldman: Pardis, before we go on, tell us: what actually happened to Dr. Khan?

Pardis Sabeti: Oh, he passed away. He passed away.

Noah Feldman: So they ultimately chose not to. I just want to make sure that we get the depth of this. There he was, and in the end the relevant people in the room were not prepared to try the experimental treatment, and he died.

Pardis Sabeti: Yeah. And the experimental treatment was at the site. It was just feet away from where he was. It was there, it was ready, it could have been used, and it wasn't. And when they tried that treatment later on others, they took that exact dose and gave it to two Americans, Nancy Writebol and Kent Brantly, and they both survived.

Noah Feldman: Wow. Wow.

Pardis Sabeti: Yeah.

Noah Feldman: So, I mean, you can never say with one hundred percent certainty, but there's very, very strong reason to believe that had Dr. Khan received that treatment that was right there in the building with him, he would have lived.

Pardis Sabeti: Definitely. That was the risk. I'm sure he knows, his family knows, and I know: he was never given the choice. If he had been given the choice, he certainly would have rolled the dice with the drug.

Noah Feldman: Can I ask, because this is a bit opaque to the ordinary person, definitely to me, and I'm not asking you to name any names, of course: institutionally, who gets to make that call? You mentioned that you had only been in the preliminary conversations. But when the decision is ultimately being made, are we talking about international organizations? Are we talking about local government? Are we talking about the US government?
Is it the WHO? Is it the private companies that control these things? Who is in the room when the decision is made?

Pardis Sabeti: In this particular case, it was MSF, and they have talked about it themselves.

Noah Feldman: That's Médecins Sans Frontières, the international NGO, Doctors Without Borders.

Pardis Sabeti: Yeah. And I have a lot of sympathy for them and their decision. I often say it's not the decision I would have made, but I do think, at least in the early conversations I was part of, they were being thoughtful, they were being mindful. There were a number of things that came into play, and they made a choice, and they certainly regretted that choice. So in that case, yes, it was this particular organization. They were in charge of his care, and they were making a choice about what to do with his care. So I wrote a book called Outbreak Culture, and it talks about the sort of toxic culture that can emerge amidst the crucible of an outbreak. And the reason I wrote it, and I wrote it with Lara Salahi, a fantastic journalist, was not to blame a particular person. I have deep sympathy for MSF and deep respect for everything they're trying to do. I'm trying to understand the complexity of considerations that these organizations face, the suspicion and paranoia and perverse incentives that emerge in the middle of a completely bananas situation, and how we can make better decisions given the many, many complexities of an outbreak.

Noah Feldman: In your book that you just mentioned, Outbreak Culture, you go through in a lot of systematic detail things that went wrong in the West African outbreak and in the response to the outbreak.
And I'm wondering, as you look at the responses that are happening right now in the Democratic Republic of Congo, which is obviously a different geographical location, with different political circumstances, do you think to yourself: oh my god, here we go again? Here are the mistakes that I just pointed out in a book published just last year, for everybody to see, saying don't do this wrong, people could die, and now those things are happening? Or maybe there are examples of lessons learned that you pointed out that are being taken into account, which would be more optimistic: we're thinking better. Or maybe it's both. I would really love to hear, from your perspective, how we are doing this time relative to the last time.

Pardis Sabeti: The thing is, these books have been written time and time again. In fact, when Lara and I talk about recommendations, we actually don't give specific recommendations, because so many people have given them before. What we try instead to do is give a framework for how we should think about responding to outbreaks, and specifically we talk about organizational justice. The idea of organizational justice is that people believe there's due process, there's a meritocracy, there's cooperation, there's transparency. And I think we're getting better at recognizing that we need to work with the communities and allow them to understand what we're doing. But I'd say there's a move toward that being seen as important, not necessarily that we've learned how to do it right, because there's still the challenge of getting people vaccinated when there's a lack of trust. A lot of the treatment centers are being burned down because we haven't effectively communicated with the communities in which we're working, and so we haven't given them a sense of organizational justice, even if we may be developing it ourselves.

Noah Feldman: And how much of that depends on the government that's in play?
I mean, in the case of the Democratic Republic of Congo, you're working in a place that has a history of serious civil war, autocracy of a totalitarian type before the civil war, and disorder that arguably was even worse than the totalitarian dictatorship. It's not credible or realistic to think that the government would have popular legitimacy as a deliverer of justice, definitely not in the places where the outbreak is occurring. So given that, is it at all credible to even imagine that local people will say, oh well, that might be true, but this international health organization that's looking to help us, they're full of justice?

Pardis Sabeti: I absolutely agree, and I think that's a major issue: obviously, in a society in which there's always been corruption and injustice, it's really hard to get anybody to believe in anything. At the same time, I don't like pointing to that as the issue, because it's so easy. We're so good at anthropomorphizing other cultures and sort of saying, that's their problem, that's not us, so that we don't look to see what we're doing, what we could be doing better. Essentially we chalk everything up to, oh, those are corrupt governments. But ultimately, and I witnessed it personally, we can be bad actors ourselves.

Noah Feldman: What's an example of where you saw "we," by which I assume you mean the international community of scientists and healthcare workers who are trying to help, being bad actors?

Pardis Sabeti: Oh, in a million different ways. And again, I'm not going to name any names, but I'll say that there were a number of people there in the West African outbreak... I mean, we actually saw public fighting between a number of major international organizations.
People were trying to throw other people out because they wanted to be the ones to do the first diagnosis or get the first paper out. We got kicked out of multiple sites. I got kicked out of participating. They said, you have no business being here.

Noah Feldman: And so there you are on the ground, the person who had been doing the most groundwork in the field scientifically for the previous several years, and they're saying, leave, we don't want you here.

Pardis Sabeti: Yeah, my team was told to leave. They were told they were not to return, that we had no business there. And our colleagues were the only ones doing diagnosis. They had a diagnostic that was working, and they were the only ones doing diagnosis at a particular site, and an individual from a major organization kicked us out because they didn't want us. We were doing it for free. We weren't doing it for credit. But they didn't want us doing it, and our African partner there fought with this person and said, but you recognize that we're the only ones who can do this; if we leave, there's nothing. And he said, I don't care, I want you out. And then there's the amount of libel that gets thrown around, and everybody gets it. I had multiple cease-and-desists. Before my paper, which got published in a major journal, came out, somebody sent a basically scathing letter to the journal saying that we were criminals. I mean, it's endless. Literally eighty percent of my time during that period was spent fighting off attacks, not actually doing positive work.

Noah Feldman: That's genuinely astonishing. You and I have spoken about this a little bit before, so I know a little bit about it. But what you're describing is something really pervasive: competition, under conditions of outbreak, for credit. And I guess the point of the credit is to get future funding.
Pardis Sabeti: Yeah, to get future funding, to get recognition, all of those things.

Noah Feldman: I have to say, it's a depressing picture that you're painting.

Pardis Sabeti: Depressing, well... the thing is, it's not so different. I mean, have you been seeing what's happening in Washington right now? Ultimately everything is about that.

Noah Feldman: Right, but the difference is that in this instance you're describing, people are dying of disease in real time. I mean, Washington is bad; I'm not defending it, for goodness' sakes. But by the same token, we sort of expect that. Maybe it has to do with our expectations: we expect politicians to be politicians. But here are scientists and public health workers and physicians. Almost every one of them swore the Hippocratic oath. They've devoted their lives to doing this, and not to making billions of dollars, because they wanted to help people. They're willing to take significant risks by being in country, as you were, in the middle of an outbreak. Maybe I don't expect everyone to be quite Albert Schweitzer, but we at least expect people to be better than Congress.

Pardis Sabeti: Yeah, in that situation, yeah. I mean, it's a low standard. But, and I should say, I want to make very clear, there are some of those people there, right? You see the best of humanity and the worst of humanity; that's what you see. You see these frontline workers who are just doing everything they can, anonymously, to help their fellow man. You see all of that. It's tremendous. That's why I call it a crucible. It is truly a crucible, a melting pot in which all of these emotions happen. But the thing is, people expect that suddenly we'd all behave really well.
Yet even when people make Hollywood characterizations of what happens during an outbreak, people are punching each other in the grocery store for food. It's a crazy environment. There are legitimate reasons to be frightened, suspicious, and concerned, and so it definitely amps up the paranoia level a lot.

Noah Feldman: It was not as extreme, but when I was in Iraq, in the period immediately following the US invasion and occupation, you could just see the kind of amped-up cortisol levels in everybody. It leads to bad decision making. It leads to paranoia: one person shoots at you, and now you think everyone's shooting at you. There's no question that it affects decision making. And I imagine that soldiers serving in war, people who were on a literal battlefield, explore this same idea in their writing: there's a whole series of forces that kick in and make it very difficult to exercise excellent judgment, to do things right, to look out for anybody other than your team.

Pardis Sabeti: That's very much what's going on there as well. I mean, the virus is actually just a backdrop to what's happening, to the politics happening between people. In a way, the virus is the thing people are the least worried about. They're really worried about who's slandering them, who's trying to take them out of their position.

Noah Feldman: And that's really people taking their eye off the ball, right? There's this huge existential threat, and everybody's just worried about the office politics.

Pardis Sabeti: We're primates. We worry a lot about hierarchy.
Noah Feldman: Let me ask: when you look at the upside of the current outbreak, namely the successes, at least the preliminary successes, of the vaccine and of the treatments, do you think, look, it took a little while, but science, broadly speaking, is working? That we're getting to a place where, when we hear about an Ebola outbreak, we're no longer going to feel, sitting here in the United States, existential dread and worry about a global pandemic? That we're increasingly going to think, okay, this is rough, and it's going to be difficult, and no doubt some people will die, but this is going to be fixable and manageable? And if so, is that a win for science broadly, a vindication of the kind of work that you've been doing?

Pardis Sabeti: Yeah, I think we have the technology and the technological capabilities to likely be able to respond to any virus that's circulating. I feel pretty confident that we can. Even the really hard things, like flu and HIV, we're starting to figure out, or hepatitis C; we're starting to see breakthroughs. The technology is there to be able to respond. The thing that's still looming... you know, one of the reasons why flu is such a concern is that the Spanish flu of 1918 is estimated to have killed twenty-five million people in twenty-five weeks. We're still not moving fast enough to ensure that a significant percentage of the world's population wouldn't die if a new outbreak came along that we weren't ready for. So it's one of those places where we actually have all the technological capabilities to move faster; we just need to put the resources behind making it possible to move fast enough to be ready for any threat.
That's why folks like Bill Gates are on the front lines saying this is the thing they're most worried about. In the history of humanity, infectious diseases have been among the biggest killers. They're really effective at what they do. And we're not even talking yet about the fact that there's such a thing as suicide bio-bombers, that there's such a thing as biothreats. You may have an index case who doesn't want to be found, and that changes everything as well. So when we talk about bad actors: we've been talking about bad actors who just want a paper. What about the bad actors who are actually trying to spread the virus and infect as many people as possible? We have to be ready for that too, the rise of a deviant culture and the power that we can give it.

Noah Feldman: I heard two different messages in there, and I want to disentangle them. I hear an optimistic message that says, we've really made, scientifically... I should say you, the scientific community who work on this, really have made huge strides in studying and responding to viruses. And in that sense, it sounds like you're saying that the danger of a global pandemic, sort of high on the list of things that could go very wrong for the world, which was certainly in the curriculum I was taught as a kid, is declining relative to some of the other threats that are still there, like global warming, a topic for another day, where we don't have all the answers. It sounds like you're saying, in an optimistic way, that when it comes to the threat of a pandemic by virus, we're actually in a way better place today than we were even a short time ago. But at the same time, I also hear you saying that, because of speed, and the danger of terrorists intentionally spreading disease, and of hidden index cases, that danger remains about as pressing as it ever was.
So am I hearing you right? Which is it?

Pardis Sabeti: Oh, I think both can be true. I think that's right: the same science that made it possible for us to have a vaccine for Ebola very, very quickly, and to be able to launch it and have it be very effective, is the same science that makes it possible for someone to synthesize Ebola, and for people to understand how it works and how it spreads. So we are at the point where we have the capabilities to do tremendous good, but we have to be very committed, and we have to be very collaborative, and we have to keep moving forward so that we navigate very treacherous waters.

Noah Feldman: I can only tell you that it makes me feel a lot better to know that you're out there working on that, Pardis. So go forth, please, and solve some more diseases while you're at it, and we'll all be extremely grateful. Pardis, thank you so much for your time. I really appreciate it.

Pardis Sabeti: Thanks, Noah. It's always a pleasure to talk to you.

Noah Feldman: The story of Ebola is going to go down in the history of medicine as a fascinating case study of how a very contagious, extremely dangerous disease can draw intense global scrutiny, can highlight tremendous inequalities, can emphasize the irrationality and difficulty of our attempts to treat disease, and yet eventually lead to important treatment interventions and potentially even a vaccine that renders the disease obsolete. It's important to keep in mind, then, that the story of Ebola has two sides. There is the side of the story, which we've heard about very clearly from Pardis, in which dysfunctional elements, unnecessary competition, the wrong incentives, even bias of the most fundamental kind, stand in the way of effective cures and effective treatment. Some of those biases and problems might even have led to the deaths of researchers and medical care workers, to say nothing of ordinary people who contracted the disease.
Nevertheless, the alternative story, the heroic story 578 00:30:25,716 --> 00:30:30,276 Speaker 1: of how those sacrifices ultimately led to real improvements, is 579 00:30:30,396 --> 00:30:34,996 Speaker 1: a crucial element of this tale. I hope and look 580 00:30:35,036 --> 00:30:37,396 Speaker 1: forward to a day when we'll only hear the word 581 00:30:37,476 --> 00:30:41,676 Speaker 1: Ebola in the history books. When that happens, those histories 582 00:30:41,796 --> 00:30:45,476 Speaker 1: will include appreciation for the work of scientists like Pardis 583 00:30:45,836 --> 00:30:50,196 Speaker 1: who've tried to use reason and logic in the attempt 584 00:30:50,436 --> 00:30:55,876 Speaker 1: to address this deeply serious problem of disease. And now, 585 00:30:56,116 --> 00:31:01,156 Speaker 1: our sound of the week. Last night, the United States 586 00:31:01,196 --> 00:31:06,356 Speaker 1: brought the world's number one terrorist leader to justice. Abu 587 00:31:07,356 --> 00:31:12,796 Speaker 1: Bakr al-Baghdadi is dead. That's President Donald Trump, of course, 588 00:31:12,996 --> 00:31:17,516 Speaker 1: last Sunday, describing a military operation that is arguably one 589 00:31:17,516 --> 00:31:21,076 Speaker 1: of the most significant of his presidency, the one which 590 00:31:21,156 --> 00:31:23,916 Speaker 1: led to the death of the leader of the Islamic State, 591 00:31:24,036 --> 00:31:29,036 Speaker 1: Abu Bakr al-Baghdadi, the self-declared caliph of the organization 592 00:31:29,596 --> 00:31:33,596 Speaker 1: that managed to conquer substantial amounts of territory in Syria 593 00:31:33,676 --> 00:31:38,636 Speaker 1: and Iraq, and along the way combined an extraordinary capacity 594 00:31:38,956 --> 00:31:43,396 Speaker 1: to murder, to rape, and to kill with a remarkable 595 00:31:43,436 --> 00:31:47,516 Speaker 1: ability to attract followers from around the Muslim world who 596 00:31:47,596 --> 00:31:51,916 Speaker 1: came to the Islamic State with the aspiration of participating 597 00:31:52,356 --> 00:31:58,956 Speaker 1: in a millennial, idealistic, and to them utopian community. What 598 00:31:59,116 --> 00:32:02,516 Speaker 1: will happen to the Islamic State in the aftermath of 599 00:32:02,556 --> 00:32:06,236 Speaker 1: Baghdadi's death and in the aftermath of the collapse of 600 00:32:06,276 --> 00:32:10,836 Speaker 1: his caliphate? To understand how the Islamic State is likely to develop, 601 00:32:10,956 --> 00:32:14,676 Speaker 1: what's crucial to realize is that ultimately the Islamic State turned 602 00:32:14,676 --> 00:32:17,796 Speaker 1: out to be a very different kind of movement than 603 00:32:17,836 --> 00:32:21,796 Speaker 1: al Qaeda, the terrorist organization from which it originally grew. 604 00:32:22,556 --> 00:32:25,996 Speaker 1: Recall that al Qaeda defined itself globally as a movement 605 00:32:26,036 --> 00:32:30,956 Speaker 1: designed to fight the jihad by Muslims against Western occupiers, 606 00:32:31,396 --> 00:32:34,276 Speaker 1: those people who were perceived to be, and in some cases actually 607 00:32:34,316 --> 00:32:39,076 Speaker 1: were, occupying Muslim lands on behalf of non-Muslim countries. 608 00:32:39,996 --> 00:32:42,956 Speaker 1: Those people who joined al Qaeda were signing up to be, 609 00:32:43,116 --> 00:32:46,676 Speaker 1: in their own minds, warriors, and in most instances they 610 00:32:46,756 --> 00:32:50,756 Speaker 1: knew they would die.
They engaged in terrorist activities all 611 00:32:50,756 --> 00:32:54,596 Speaker 1: over the world, always with the theoretical aspiration of using 612 00:32:54,596 --> 00:32:59,316 Speaker 1: those terrorist interventions to make change in Western policy. But 613 00:32:59,396 --> 00:33:01,876 Speaker 1: at no point in its history was al Qaeda effective, 614 00:33:02,036 --> 00:33:03,836 Speaker 1: for more than a few days in a few places 615 00:33:03,876 --> 00:33:09,316 Speaker 1: at a time, in actually achieving sovereign government over substantial 616 00:33:09,396 --> 00:33:14,196 Speaker 1: amounts of territory. What made the Islamic State different from 617 00:33:14,196 --> 00:33:17,836 Speaker 1: al Qaeda, and indeed unique in the annals of Islamic 618 00:33:17,876 --> 00:33:21,836 Speaker 1: jihadi terrorism, is that the Islamic State did manage to 619 00:33:21,876 --> 00:33:26,316 Speaker 1: conquer a large swath of territory in which it actually 620 00:33:26,356 --> 00:33:30,236 Speaker 1: created a functioning government. Using its reign of terror, but 621 00:33:30,396 --> 00:33:35,436 Speaker 1: also using the ordinary bureaucratic tools of regular, everyday municipal governance, 622 00:33:35,836 --> 00:33:40,716 Speaker 1: it managed to run the show. The end of the Islamic 623 00:33:40,756 --> 00:33:44,916 Speaker 1: State and the death of Baghdadi himself capture the reality 624 00:33:45,116 --> 00:33:48,596 Speaker 1: that the Islamic State is no longer distinct or different. 625 00:33:48,956 --> 00:33:52,516 Speaker 1: It does not govern territory in any meaningful way anywhere 626 00:33:52,516 --> 00:33:54,476 Speaker 1: in the areas of Syria and Iraq, where it had 627 00:33:54,516 --> 00:33:57,636 Speaker 1: its heart. It's true that in a few places around 628 00:33:57,636 --> 00:34:01,956 Speaker 1: the world, various terrorist jihadi groups have used the brand 629 00:34:02,036 --> 00:34:05,116 Speaker 1: name of the Islamic State to try to claim for themselves 630 00:34:05,156 --> 00:34:09,516 Speaker 1: the authority to govern in small enclaves. That will probably 631 00:34:09,596 --> 00:34:12,116 Speaker 1: continue to be the case. It will probably still be 632 00:34:12,156 --> 00:34:14,356 Speaker 1: the case that if you're the leader of a jihadi group 633 00:34:14,356 --> 00:34:16,116 Speaker 1: and you actually manage to grab some land and you 634 00:34:16,156 --> 00:34:18,796 Speaker 1: want to legitimate what you're doing, you may say, I 635 00:34:18,956 --> 00:34:21,636 Speaker 1: am the local branch of the Islamic State. But as 636 00:34:21,676 --> 00:34:24,556 Speaker 1: that becomes rarer and rarer, and as there is no 637 00:34:24,716 --> 00:34:27,956 Speaker 1: substantive caliphate of the Islamic State to which such an 638 00:34:27,996 --> 00:34:31,036 Speaker 1: oath of allegiance could refer, what's going to happen to the 639 00:34:31,116 --> 00:34:34,916 Speaker 1: Islamic State, overwhelmingly likely, is that it will gradually morph 640 00:34:35,116 --> 00:34:37,836 Speaker 1: back into the organization from which it came in the 641 00:34:37,876 --> 00:34:42,916 Speaker 1: first place, namely al Qaeda.
Ultimately, historians will look at 642 00:34:42,916 --> 00:34:46,556 Speaker 1: the Islamic State and conclude that it represented a distinct 643 00:34:46,676 --> 00:34:50,316 Speaker 1: moment when there was an attempt to instantiate in real 644 00:34:50,436 --> 00:34:54,876 Speaker 1: life a set of old, indeed medieval, ideas about utopian 645 00:34:54,956 --> 00:34:59,636 Speaker 1: Islamic governance, and that that effort failed. The death of 646 00:34:59,676 --> 00:35:03,596 Speaker 1: Abu Bakr al-Baghdadi is therefore significant as a kind 647 00:35:03,636 --> 00:35:07,956 Speaker 1: of endpoint to the aspirational ideal of the Islamic State 648 00:35:08,036 --> 00:35:12,436 Speaker 1: as a real-world state. As a terrorist organization like 649 00:35:12,476 --> 00:35:15,876 Speaker 1: al Qaeda, the Islamic State can continue to exist without 650 00:35:15,916 --> 00:35:20,516 Speaker 1: a caliph, but in its character as a distinct aspirational 651 00:35:20,556 --> 00:35:24,116 Speaker 1: caliphate trying to govern space, it can't do without a 652 00:35:24,196 --> 00:35:27,556 Speaker 1: leader at the top, and in that sense, Baghdadi's death 653 00:35:27,796 --> 00:35:35,236 Speaker 1: is historically significant. Deep Background is brought to you by 654 00:35:35,276 --> 00:35:38,916 Speaker 1: Pushkin Industries. Our producer is Lydia Jean Kott, with engineering by 655 00:35:38,996 --> 00:35:43,076 Speaker 1: Jason Gambrell and Jason Roskowski. Our showrunner is Sophie McKibben. 656 00:35:43,356 --> 00:35:46,476 Speaker 1: Our theme music is composed by Luis Guerra. Special thanks 657 00:35:46,476 --> 00:35:50,076 Speaker 1: to the Pushkin brass: Malcolm Gladwell, Jacob Weisberg, and Mia Lobel. 658 00:35:50,556 --> 00:35:52,836 Speaker 1: I'm Noah Feldman. You can follow me on Twitter at 659 00:35:52,836 --> 00:35:55,796 Speaker 1: Noah R. Feldman. This is Deep Background.