Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. These days, it seems we live and die by the model. You can't turn on your phone, or open the newspaper, or watch the television without being hit in the face by some sort of model or graph or chart that purports to show you how fast the coronavirus is likely to spread, when it will peak, whether it will plateau, how many people will die, you name it. These graphs and charts and models are constantly changing, and sometimes they are in conflict with one another. So how should we be making sense of all of this? Here to help is Carl Bergstrom. He's a computational biologist at the University of Washington who's got a deep background both in epidemiology and in model building and model analysis. He's also an expert on the spread of misinformation. With his co-author Jevin D. West, he's written a forthcoming book, Calling Bullshit: The Art of Skepticism in a Data-Driven World. I spoke to Carl on Monday afternoon.

Noah Feldman: Carl, you are, by profession, some combination of a modeler, an explainer of models, and a debunker of bad models, and we now live in model world twenty-four seven. So my first question, before I ask you to do all three of those things simultaneously, is: how weird has this time been for you?

Carl Bergstrom: That's a good question. I think it's hard when you're in the middle of it. You don't even really pause to think about whether it's weird or normal. It just is. The thing that strikes me as the strangest, in a sense, is that I spent many years doing infectious disease epidemiology, and then, by various convoluted paths, ended up over the last few years studying the spread of misinformation on social networks. And to have those two things come together the way that they're coming together right now has been really striking.
Noah Feldman: Let's start with a model that you've spoken in favor of, in fact strongly in favor of, namely the simple model that produced the expression, which has now entered the lingo, of flattening the curve. Say a word about why that basic curve with the flattening was so effective.

Carl Bergstrom: Well, I think it's the notion of a very simple idea made concrete with a nice picture. The most important thing about that picture was that it is worth a thousand words, and those thousand words are worth thousands of lives, I believe, because they showed people at the time why one argument that was floating around was not a good argument. What had been floating around at the time was: look, we're all going to get this, or a large fraction of us are going to get this, and there's nothing we can do about it, so let's get this over with as quickly as possible. There was talk about taking it on the chin, talk about, you know, why would you prolong the economic disruption, and so forth. And one thing that people hadn't been thinking about was the way that if we didn't do anything to control the pandemic, then we would badly exceed hospital capacity. So what this one little picture did was stress that you have to worry not only about the area under the epidemic curve, but also about the height of that epidemic curve at any given time, because we have limited hospital capacity, and for this particular disease, ICU care saves lives. It shifted the framework of people's thinking in a very simple way, and people definitely started rallying to this notion of flattening the curve, and people now take that for granted. It's time to start thinking about new models and new pictures now that this is something we all take for granted. But I think certainly some of the reason we haven't exceeded healthcare capacity any worse than we have is that people took this message to heart and realized that there is an important public health role to be played by trying to slow down the spread of this early on.

Noah Feldman: Carl, I want to ask you a sort of philosophical question about models that's been very much front of mind for me, and that is, broadly speaking, the relationship between a model that's meant to tell you what you should do in life, a kind of normative model, like the flattening-the-curve model you were talking about, and a descriptive model, a model that's meant to describe the world as it is, or as you think there's some probability of it being. This is a very hard line to draw, but I think it's a distinction that really matters, because we're in this delicate moment now where people are observing that models that were meant to say, hey, if you don't do anything, you'll have results X and Y and Z, are now giving way to new models that say, well, we've updated the models in light of the social distancing that we've done. And that's leading some part of the public, including even the somewhat educated public, I think, to say, well, wait a minute, maybe those initial models were grossly overstated. It seems to me that part of the distinction here is the difference between a model that is meant to say "take action" and a model that's meant to say "here's the description of the world, updating for what we have done." Could you say a word about that distinction?

Carl Bergstrom: Yeah. It's not actually a distinction I had been thinking all that much about. I would have thought more about the fact that the models have a feedback: the models influence our behavior, which influences the data that go into the models, and you keep going through this loop.
And so what's happening, of course, is that early models, including flatten the curve, which is just a conceptual model, influence people's behavior, which then changes the situation that we're in. And so now people can say, oh, well, that was stupid, why did we flatten the curve? Hospitals haven't been overrun. It's like, well, yeah, hospitals haven't been overrun precisely because we locked places down and flattened the curve. That's why, except for New York City, most places are managing pretty reasonably in terms of hospital capacity. Rather than saying there are models that are normative and models that are descriptive, I would say that models can be used in both of those directions. You can come up with a model, like the flatten-the-curve model, and it sort of describes two different things that could happen, and then you have to draw your own normative conclusions from that. So, you know, we could have this big, fast peak; we'd be done with it by the middle of the summer; we'd all be back, those of us who are still alive, to our normal lives. But we would have had to live through this period where the hospitals are massively over capacity. We would have lost friends and relatives and neighbors and so on. Or we can try this other approach, where we don't even know exactly how we're going to manage to keep this thing down in the future, but we can at least get ourselves onto that track so that we can solve the problem. If we do that, we're probably going to be looking at a more protracted period of life being different, but we're going to avoid these really catastrophic periods of exceeding health capacity. So I guess that would be sort of my distinction, which is perhaps a little different than the one you're driving at.
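[The two futures Carl contrasts here fall out of even the simplest textbook SIR model. Below is a minimal sketch of the flatten-the-curve picture in that spirit; every number, including the hospital-capacity line, is a hypothetical illustration, not a figure from the interview.]

```python
# Flatten-the-curve sketch: one SIR model, two transmission levels.
# All parameters are hypothetical illustrations, not Carl Bergstrom's model.

def sir(r0, infectious_days=10.0, i0=1e-4, days=800, dt=0.1):
    """Euler-stepped SIR. Returns (peak infected fraction, fraction ever infected)."""
    gamma = 1.0 / infectious_days   # recovery rate
    beta = r0 * gamma               # transmission rate implied by R0
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak, r

CAPACITY = 0.01  # hypothetical: hospitals can care for 1% of the population at once

for label, r0 in [("unmitigated    ", 2.5), ("with distancing", 1.4)]:
    peak, total = sir(r0)
    status = "EXCEEDS" if peak > CAPACITY else "stays under"
    print(f"{label} (R0={r0}): peak {peak:.1%} infected at once "
          f"({status} capacity); {total:.1%} ever infected")
```

[Run as written, the unmitigated epidemic peaks with roughly 23 percent of the population infected at once, while the distanced one peaks near 5 percent: the point of the picture is the height of the curve, not just its area.]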
Noah Feldman: So if I understand you correctly, what you're saying is that the model itself is at least arguably neutral, and then it can be used either to make a descriptive point about how the world is, or a normative point. And I guess my pushback on that, if I'm understanding you correctly, would be that because, as you say, you're modeling either taking into account probable effects on the world or not taking those things into account, it's tricky, maybe not impossible, but tricky, to separate out a model that says, hey, this is the way the world would be if you do nothing, take it or leave it, versus, here's a model based on what we think will happen if you do take these steps.

Carl Bergstrom: Yeah, I think that's right, and I agree with you about that. And with respect to the pushback, I would push back on the claim that they're absolutely neutral myself, because models are tools. Models are designed for purposes, and people are making each of these models for some particular reason. You know, flatten the curve was made to try to help people see that it's not just the total number of cases, it's also the timing of those cases, whether they all happen at the same time or not. It had a purpose. I think you can kind of see them that way when I look at some of the models that people are doing now. For example, there's this IHME model that's being used a lot, which predicts hospital needs by state, and also death rates by state, and so on. That's a model that's been designed to be used precisely for thinking about what kind of equipment and how many beds we need. So even though you can say, well, it's just a model of what happens, it's still designed for this purpose. And then once that purpose comes in, that purpose is usually to make a decision, and so the whole normative aspect of modeling flows in through that channel.

Noah Feldman: If we're going to try to reopen the economy before we have the capacity, assuming we ever get the capacity, to do millions and millions of tests, then there are going to have to be, as it were, many spikes in the curve. We just want those spikes to be below the point where they overwhelm the hospitals. What's a model that sort of pictures that? Is it a kind of sine curve model, where the top of the curve is just beneath the number of people who would flood the hospitals? Is that what we're realistically talking about, until such time as we have either a vaccine or very extensive testing?

Carl Bergstrom: Well, I'm not very optimistic about that approach, because you don't get something like a nice smooth sine curve. What you usually get is a pretty rapid ramp-up to being at hospital capacity and a fairly slow drop-off, then another rapid ramp-up and a fairly slow drop-off, so you actually spend most of your time with social distancing still on, and only have these little gaps with social distancing off. That's what came out of the Imperial College model in that scenario, and there have been other plans along these lines, a two-day work week plan that various groups have proposed, and so on, and they have this general form. So that's one problem. The other problem is that you're dealing in these cases with exponential growth and imperfect measurement, and so when you try to manage exponential growth to top it out right below hospital capacity, that's a very difficult call to make given the amount of information you have. And so what's going to happen is you're going to miss, and you're going to miss by a lot, because it's exponential growth.
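[A toy version of the on/off pattern Carl describes, with distancing triggered on when prevalence crosses one threshold and off when it falls below another. All rates and triggers are hypothetical, chosen only to reproduce the shape he mentions: fast ramps, slow declines, and most of the calendar spent with distancing on.]

```python
# On/off "trigger" sketch: SIR with distancing toggled by prevalence thresholds.
# All parameters are hypothetical illustrations.

def intermittent_distancing(days=540, dt=0.1):
    gamma = 0.1                          # ~10-day infectious period
    beta_open, beta_closed = 0.25, 0.08  # R0 ~2.5 open, ~0.8 under distancing
    on_at, off_at = 0.005, 0.001         # prevalence triggers (on above, off below)
    s, i = 1.0 - 1e-4, 1e-4
    distancing = False
    time_on = 0.0
    for _ in range(int(days / dt)):
        beta = beta_closed if distancing else beta_open
        new_inf = beta * s * i * dt
        s, i = s - new_inf, i + new_inf - gamma * i * dt
        if i > on_at:
            distancing = True
        elif i < off_at:
            distancing = False
        if distancing:
            time_on += dt
    return time_on / days

print(f"fraction of time under distancing: {intermittent_distancing():.0%}")
```

[With these made-up numbers, prevalence climbs between the triggers in about ten days but takes a couple of months to decay back, so distancing ends up on for roughly four-fifths of the time.]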
Carl Bergstrom: The other thing, you know, there was a nice paper that looked at sort of the optimal management of a pandemic like this while keeping it under hospital capacity. But the conclusion, informally stated, was essentially: don't try to get cute with epidemic growth. And I think that's kind of a key message that comes out of the modeling there. As soon as you start thinking about the uncertainties that are in play, you recognize that actually keeping it just under capacity is an almost impossible challenge.

Noah Feldman: You've also been making a really powerful point about the likelihood of best-case and worst-case scenarios, which I think is quirky and totally non-obvious to laypeople, of whom I am a good example. We often think, well, there are two models, there's a best-case and there's a worst-case scenario, and in life things will probably be somewhere in the middle. That's how we often think; it's a kind of Goldilocks principle we go through life with. And you have been arguing that that's exactly wrong in thinking about an epidemic. Say more for us.

Carl Bergstrom: Well, epidemics are an interesting kind of dynamical process. There are sort of attractors in an epidemic. You can have a dynamic where you relatively quickly suppress the epidemic and go to a relatively small number of cases, or you can have a dynamic where it sort of blows up and races through the population. And whether you have the former case, where you suppress it, or the latter case, where you fail, depends basically on this number R-naught that everyone's talking about, the basic reproductive number. It's how many new cases an initial infected case generates downstream. And of course that number depends not only on the properties of the virus, but on the amount of social distancing we're doing, and so forth. As we've gone into this lockdown period and the social distancing, we've tried to knock our R-naught down pretty low, and we've got it down around one, but we don't know exactly where it is. If it's a little bit below one, then each case generates fewer than one additional case, and we get this exponential fall-off in the number of cases, and we might end up with, say, about three percent of the US having been infected by the virus after the first wave. If we're just a little bit above one, then each case generates more than one new case, and it grows and grows, exponentially, and races through the whole population. If that happens, you're going to end up with somewhere in the range of thirty to seventy percent of the population infected. In the absence of continued intervention by people changing their behavior, so that this R-naught is adjusted over time, you really end up on one of these two trajectories. It can be compared to rolling a ball down a ridge, or a fence line: the ball may stay on the ridge line for a while, but sooner or later it's going to drop off one side or the other, just because your trajectory is slightly in one direction or the other. What happens in the pandemic is a bit like that. What's really interesting, something that people have suggested to me, is that we may have this element of what physicists call self-organized criticality, where, even though you've got this unstable point where R-naught is one, where you're kind of rolling down the ridge line, the pandemic starts to take off and really accelerate, then people get scared and do more social distancing, and they push it back down. If the pandemic starts to get shut down, then people start to relax and do less. So, on one hand, you've got the fundamental dynamics of the pandemic.
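[Carl's knife edge at R-naught equals one can be made concrete with the standard final-size relation z = 1 - exp(-R0 * z), where z is the fraction of the population ever infected if behavior stays fixed. A small sketch follows; the R0 values are hypothetical, and this is the textbook relation rather than a calculation from the interview.]

```python
# Final-size sketch: how total infections jump as R0 crosses one.
# Uses the standard final-size relation z = 1 - exp(-R0 * z), solved by
# fixed-point iteration. R0 values are hypothetical illustrations.
import math

def final_size(r0, iters=20000):
    """Fraction ever infected in a simple closed-population epidemic model."""
    z = 0.999  # start near 1 so we converge to the epidemic root when R0 > 1
    for _ in range(iters):
        z = 1.0 - math.exp(-r0 * z)
    return z

for r0 in (0.90, 0.99, 1.01, 1.10, 1.50, 2.00):
    print(f"R0 = {r0:4.2f} -> about {final_size(r0):5.1%} ever infected")
```

[Just below one, the big outbreak never happens and cases decay away; just above one, the total jumps to a few percent and climbs quickly, with R0 = 1.5 already giving nearly 60 percent. That discontinuity is the ridge line in his analogy.]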
Carl Bergstrom: If everyone's behavior held constant, we'd either have a great big epidemic or we'd shut it down pretty quickly. But people are modulating their behavior, and they may actually be pushing us along this ridge.

Noah Feldman: Or keeping us on the ridge, exactly. So that's an interesting sort of second layer on all of this. I'm glad you raised that issue, because that's exactly what I was going to ask about. In a dynamic picture, where we're always reacting to whatever we see happening out there, you could imagine a government at least trying very hard to get its people to titrate, going to work and then coming home from work, going back and forth, and then getting to something in the middle. I thought, in that context, of the Swedish example. Sweden is sort of functioning as a kind of experiment: they're doing a fair amount of social distancing, but in other ways they aren't. It'll be really interesting to see; maybe they will turn out somewhere in the middle.

Carl Bergstrom: Yeah, it's certainly possible. I think even there you're kind of unlikely to, because not only do you have a system with exponential growth, you've also got long delays. There are big delays between when people are infected and when we start to really observe the consequences of those infections. With full-on testing, it would be five days to a week. If we're waiting for people to end up in the hospital, we're looking at more like ten to fifteen days. And whenever you're trying to control a system that has exponential growth, amplification, and major delays in place, that control becomes extremely hard. You're overshooting, you're fishtailing and overcorrecting, and you're going to end up off the road.

Noah Feldman: We'll be back in just a moment.
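[Before the break, Carl argues that exponential growth plus a ten-to-fifteen-day observation lag makes feedback control fishtail. Here is a toy sketch of that steering problem; the growth rates, thresholds, and delays are all hypothetical.]

```python
# Delayed-feedback sketch: toggling distancing based on *reported* (lagged)
# cases overshoots, and the overshoot grows with the delay.
# All numbers are hypothetical illustrations.

def peak_with_delay(delay_days, days=365):
    cases = [10.0]                  # daily infections, arbitrary units
    distancing = False
    for t in range(1, days):
        r = 0.7 if distancing else 1.5          # effective R per ~5-day generation
        cases.append(cases[-1] * r ** (1 / 5))  # per-day growth factor
        observed = cases[max(0, t - delay_days)]  # we only see the lagged past
        if observed > 1000:
            distancing = True
        elif observed < 200:
            distancing = False
    return max(cases)

for delay in (0, 7, 14):
    print(f"{delay:2d}-day reporting delay -> peak {peak_with_delay(delay):7.0f} "
          f"(policy trigger was 1000)")
```

[With no delay, the peak sits just above the policy trigger; with a two-week reporting delay, it overshoots roughly threefold.]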
Noah Feldman: Can I ask you to put on your misinformation scientist's hat? What are the most egregious examples that you've seen thus far in this pandemic of misinformation with observable real-world consequences?

Carl Bergstrom: You know, unfortunately, I think some of the most egregious misinformation, in terms of the magnitude of the consequences, has been coming out of the White House. There was a protracted period where there was a serious effort to downplay the magnitude of what was going on, and that delayed the national response in a whole bunch of ways. I think it also has the consequence of really hemorrhaging the trust that people need in order to comply over the long term with public health measures, which are more or less our only way of controlling the pandemic right now. If you have a story that is changing all the time, where you've got different agencies presenting different versions of the story, where you're hearing from some that this is going to go away on its own, that it's on the way down, that it's going to disappear in April, or whatever the case is, and then the same government has other agencies telling you that this is a serious threat, that there's going to be protracted human-to-human spread in the United States, et cetera, people start to not be sure whom they can trust. When that happens, it becomes harder and harder to manifest the political will that you need to take unpopular measures like closing schools and closing non-essential businesses. So in terms of the measurable consequences, one of the worst things that's happened with misinformation has been this initial attempt to downplay the seriousness of the pandemic in order to, I think, prop up the stock market.

Noah Feldman: You have a book coming out soon with the wonderful title Calling Bullshit. Presumably, when you were writing the book, it was before the pandemic broke out. What were the things that, at the time, you wanted to call bullshit on? And then maybe from there we'll go to how that's relevant in the present moment.

Carl Bergstrom: So when we wrote the book, we were really thinking about the ways that quantitative information is used to mislead people. What we thought was very, very important was to teach people that you are not at the mercy of the person who's bringing data and statistical analyses and machine learning or whatever it is to the table. You don't have to just accept those because you are not a PhD statistician, or you're not a computer scientist, or whatever. The basic way that we think about this is that when people are drawing conclusions based on data, they collect a bunch of data, that data gets put into some kind of machinery, which you can think of as a black box, out the other end come some results, and then people draw conclusions from the results. And the bullshit is rarely in the black box, we find. It's rarely an artifact of the technical construction of the model. Of course, there are cases where that is what happens, but that's a small minority of the bullshit that we see out there. Almost all of it is because people have data that are not necessarily representative or appropriate for the question that they're asking. They're comparing apples and oranges. They've got a biased sample. There are all these kinds of things that can go wrong. Or people get results out and then draw unjustified conclusions from those results: they overgeneralize, they infer causality where they've only got an observational study and only know about correlation, these kinds of things. So what we wanted to really stress with the book was that you don't have to be held hostage by the people who have the numbers, because you can use the basic critical thinking skills that anybody has in order to see through this sort of stuff. You ask: are these data appropriate for answering that question? Do these conclusions actually follow from those results?

Noah Feldman: I love the idea that careful critical thinking and analyzing the premises can really help you identify bullshit when it is being dealt to you.
Noah Feldman: I want to ask you, though, about a variant on that which I think may be pretty different, and which we've been seeing throughout the current pandemic: it's not just that non-experts in fields are calling bullshit on models, they also think that they can build their own models better. At first I thought to myself, this problem is mostly out there on Medium, or it's, you know, the co-worker in the office. And I actually wrote a column for Bloomberg saying, will the amateur epidemiologists please just sit down and shut up. So this phenomenon seems to be kind of everywhere right now, and I'm wondering if there's a cognitive thing going on here: if people think they can find a problem, which is true on your view, they also think that they can then devise a better model. And at least as far as I can tell, that's not true; they usually cannot devise a better model.

Carl Bergstrom: Well, on one hand, we need input from a lot of sources, and so I think it's really important to recognize that there are going to be good ideas coming from outside of the epidemiology community. At the same time, one wants to be aware of the Dunning-Kruger effect. The basic idea of the Dunning-Kruger effect is that you don't know enough to know you're wrong, essentially, and so you think that you have a very good understanding. The Dunning-Kruger effect typically has this sort of non-monotone distribution of people's confidence: if they don't know much, they're quite confident; if they know some, they aren't confident; and if they know enough, they start to get relatively confident again. And so we're seeing a lot from both of those confident peaks, professional epidemiologists and otherwise. There are some really good ideas coming from people who are not professional epidemiologists. But I think one of the things that's hard is when things are presented as fact instead of suggestion, particularly in a context of "these stupid epidemiologists, they don't even know about such and such." That's the not-helpful side of things. I'm personally collaborating with a number of people who are outside of the epidemiology community, not only economists on the economics side, but, for example, I've been consulting with a group of baseball analytics people who wanted to find a small, tractable problem that they could contribute to, because they're really, really good at figuring out things from numbers. They called me, and they understood that they needed some background in order to be able to do something useful; they couldn't just sit down and make their own model. So we spent a lot of time talking about what's an unsolved problem that they could take a crack at. I think that kind of thing is really constructive. So I guess it's a double-edged sword.

Noah Feldman: Well, if you're helping sabermetrics to save the world, that'll be a further contribution that you're making.

Carl Bergstrom: Yeah, it's kind of fun. Epidemiologists are trying to figure out how to save baseball, and these guys are trying to figure out how to do epidemiology. So it's a nice combination.

Noah Feldman: A mutual assistance society, in a moment when we very much need that. Carl, what am I not asking you that you think I should be asking you? What are the salient problems, on any of your dimensions of expertise, that you're observing, that you think people should know about?

Carl Bergstrom: I think at this point the really big salient problem is to think about how we emerge from the situation that we're in now. We all need to do that. We can't wait around for twelve to eighteen months. And so, what are the possible solutions?
There are so many different disciplines involved in finding these solutions. How do we need to restructure the economic system so that we can, in healthy ways, help businesses weather this shutdown? As you're asking those questions, at the same time you're trying to figure out things that are really technical immunology problems, like how long does immunity last, what fraction of people generate it, is it even safe for people who've had the disease to go back to work, et cetera. So somehow we need to find very good ways for us, as an extremely broad community, to sit down and talk across these boundaries, without epidemiologists saying that economists don't know anything about the economy, or vice versa. Finding that way to communicate is really important. The other thing that has been a really pressing challenge, one that none of us saw coming, really: in the epidemiology community we've been planning for a crisis like this for a long time, thinking about what it would look like and how we might react, and hoping it would never come. Now we're here, of course, and the thing we weren't really thinking about was the way that all of this was going to be so heavily politicized, so that we feel like we're fighting a battle on two fronts. We're fighting a battle against the virus, but then we're fighting a battle against misinformation around the virus that's being promulgated by people up to and including the White House. That's a challenge and a threat that we just hadn't been thinking about seriously, and I think it was a missed opportunity on our part not to be thinking about the information side of this.

Noah Feldman: Do you think that's partly the result of a kind of unexpressed but mistaken sense of cultural superiority?
I know epidemiologists who have worked extensively in sub-Saharan Africa on Ebola, for example, and who were acutely attuned to, and have written extensively about, how you deal with public misinformation, but also with governmental distortions, you know, regimes that aren't willing to play ball, or that distort facts and circumstances under some conditions. And that was, I think, pretty commonplace in the epidemiological community and in the public health community as well. But there was somehow this unspoken thought that that can't happen here. Not that a pandemic can't happen here, but that in the United States, or in some Western European country, we would all have Angela Merkels; you know, we would have rational, honest political leadership. And of course that's totally false, and it was knowably false in advance, I would have said.

Carl Bergstrom: I think that's a very sharp observation, and I completely agree. At least I fell into that trap; I don't want to speak for everybody. You're completely right that there are people working in other parts of the world who are well aware of these challenges, and I just don't think we were thinking as seriously about it as we should have been. You know, Obama was in some ways, in a lot of ways, close to that Merkel mold, and we weren't really thinking about that. I remember, going back to the Bush administration: in the discussions with the Bush administration, there were concerns about the degree to which the government should step in and be involved in something like pandemic planning. Why can't it be privatized, and so on?
So we had these sort of ideological disagreements about the best way to handle something like pandemic planning. But we very much had the sense that if the thing actually broke out, we'd all be on the same page, we'd all acknowledge that it was happening, we'd try to do the best we could to get rid of it, and there wouldn't be this period where we were pretending that it wasn't happening at all.

Noah Feldman: Carl, thank you for an extremely illuminating conversation and for your work. Please keep calling bullshit on the things that need bullshit called on them, and keep clarifying things for us. I really appreciate your time.

Carl Bergstrom: Thanks so much.

Noah Feldman: Talking to Carl really brought home to me just how dependent we are on models for making sense of what's going on here. It's not only that they're everywhere; it's that they have the capacity to fundamentally shape our ideas about what we ought to do. Indeed, the very phrase "flatten the curve," which has been a kind of motto for all of us in this early period of the pandemic, is itself language directly taken from modeling, and I can't think of the last time that a modeling term became our guide for how we should be behaving at the most fundamental level of our ordinary lives. Yet at the same time that Carl shows us the importance of models, he's also very attuned to the idea that models can be deceptive, and indeed that they can lie. And they can lie, he says, if we fail to take into account, using our critical faculties, what's going into them. The problem, he says, is not that the models themselves are fallacious; it's that if the premises are wrong, we can be led to very, very bad conclusions. Carl is also closely focused on the question that we've been thinking about here at Deep Background, and that all of us are going to continue thinking about going forward, namely: how do we come out from behind our social distancing and slowly and carefully begin the process of reopening the economy?
I promise you, we'll be talking more about that in the episodes ahead. Until the next time I speak to you, be careful, be safe, and be well.

Deep Background is brought to you by Pushkin Industries. Our producer is Lydia Jean Kott, with research help from Zoe Wynn. Mastering is by Jason Gambrell and Martin Gonzalez. Our showrunner is Sophie McKibben. Our theme music is composed by Luis Guerra. Special thanks to the Pushkin brass: Malcolm Gladwell, Jacob Weisberg, and Mia Lobel. I'm Noah Feldman. I also write a regular column for Bloomberg Opinion, which you can find at bloomberg.com/feldman. To discover Bloomberg's original slate of podcasts, go to bloomberg.com/podcasts. You can follow me on Twitter at @NoahRFeldman. This is Deep Background.