Noah Feldman: Pushkin, from Pushkin Industries. This is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman.

We've been talking a lot in the last year about the possibilities of herd immunity, or community immunity, but it's now overwhelmingly clear that the US population is not going to hit, in the foreseeable future, a rate that would satisfy the herd immunity standard. And the main reason for that is not distribution of vaccines, but what is called, slightly as a euphemism, vaccine hesitancy. Some studies have shown that as many as one in five Americans say they wouldn't want to take a coronavirus vaccine, and there are some indications from other studies that those numbers are in fact rising rather than declining. Public health officials and governments, and indeed all of us, therefore need to think about ways to understand and increase public acceptance of the vaccine, provided they believe, as I do, that the vaccine is an important tool in helping us get beyond this pandemic.
Today's guest is one of the world's leading researchers on precisely the question of why people hesitate to take vaccines, why they don't want to take vaccines, and what might be done about it. Doctor Heidi Larson is an anthropologist. She's the founding director of the Vaccine Confidence Project, an interdisciplinary research group at the London School of Hygiene and Tropical Medicine. She also headed the Global Immunization Communication Program at UNICEF, and she's the author of a new book, Stuck: How Vaccine Rumors Start and Why They Don't Go Away. Doctor Larson will help us understand the contributing factors that undermine vaccine confidence and what we might think about doing differently if we want to improve the situation going forward.

Heidi, thank you so much for being here. I want to begin by asking you a sort of top-of-the-line question, which is this: in your very deep cross-cultural comparisons of vaccine hesitancy, do you think that there are universal, or roughly universal, causes for vaccine hesitancy? Or do you think that each culture has its own reasons, making it difficult to speak in very broad general terms comparing places?
Heidi Larson: I would say that the common things are issues of liberty and choice, and freedom of choice, the anti-government-control sentiments versus liberty. The second one would be nature: is it natural, or does it feel like it's against God's plan? That tension between, you know, natural versus technical or chemical is another tension. And I think the third one, as a universal, is just safety, safety, safety, safety. And I think the other kind of critical dimension is trust, an underlying trust in government and authorities, in quote-unquote experts.

Noah Feldman: I know you've emphasized trust very much. You said to the New York Times, we have a trust problem, not a misinformation problem, which is a provocative claim. Could you say more about why you think trust is at the heart of the concerns?

Heidi Larson: I honestly believe that we don't have a misinformation problem as much as a relationship problem. And I say that because if we had a stronger trust relationship between public and science, public and authorities, people cope with some risk. If they trust you, they're willing to put up with a little risk. If they don't, they're questioning, they're concerned.
They start from a position of distrust. So if we can build that underlying relationship and make it more trusting, we'll have a bit more resilience and acceptance of scientific advice. That's what I mean by the underlying resilience and willingness to take that little risk.

Noah Feldman: May I ask a follow-on question there? Because it looks to me like there's some tension between the trust analysis and your three big drivers of vaccine misinformation. For libertarians, some distrust of authority is kind of constitutive of their worldview: that we need to be fundamentally skeptical of aggregations of power and authority. And then, with respect to people who think that it's in God's hands, those folks too have a principled reason to be distrustful of human interventions.
And so I'm really wondering if the trust problem is overcomable at all, or maybe even shouldn't be overcome from the standpoint of those two kinds of hesitancy objections. The sophisticated analysis that you're offering may suggest almost a kind of impossibility of overcoming some of these things, because of a contradiction that exists between the value of trust and these principled objections to trust.

Heidi Larson: I do believe that there are going to be certain kinds of hesitancy, and actually deep refusers, that we won't be able to overcome. I think we need to accept that. But we should, as a health and medical community, strive to get as many people on board as it were, for the sake of the public health. And there are also people who can't take vaccines because of underlying medical conditions. So, for instance, with COVID, we don't need one hundred percent of people vaccinated to get community immunity, as they say, but we do want to get as many as we can. I'm not saying give up, but we will have these deep challenges.

Noah Feldman: Why can't we actually nevertheless do better?
I mean, go back to smallpox, one of the great successes of immunizations in global history, right? Effective eradication over a long period of time, with a lot of coordination, but not so very long ago; that process ended in the seventies, if I'm not mistaken. So why was it possible to do that, but it's not, quote unquote, possible to do this now? What were the tools and techniques that were used to bring us to smallpox eradication, and why do we seem so stuck in so many places today?

Heidi Larson: We're much more democratic. This is a really important point you're raising, and I thought about it a lot when I was going with the polio workers door to door in northern Nigeria and in India, in some of the most resistant communities, and thinking, you know, these are some of the same communities, particularly in India, which went through the same thing around smallpox.
But for the smallpox eradication, there was some police force and there were pretty coercive measures that would absolutely not be acceptable today. And certainly in the context of the polio eradication initiative, we could not do, in some of the states I was in, what was done in the previous campaigns. I'm not saying it was all that way, but there were certain types of coercion that just aren't tolerated today.

Noah Feldman: I'm going to ask a subversive question, and I want to preface it by saying, you know, my day job is that I'm a constitutional law professor. I spend all of my time thinking about, you know, how you could make liberal democracy work alongside the need for government authority. So take this with that background in mind: maybe we're just doing it wrong.
You know, maybe the idea that there should be a democratic right to refuse vaccination is not only a mistake from a practical standpoint, but is generating the kind of hesitancy that you're talking about to a greater degree than it would otherwise exist, in the sense that if you ask people, do you want the vaccine or not, you're putting them almost in an existential situation of having to weigh many, many, many different factors: ideology, personal belief, chance, risk, knowledge, ignorance. Maybe that's just asking too much of a bunch of people. And maybe if we simply required it universally, there would still be some objections, but maybe people just wouldn't spend as much time dreaming them up. And just to finish the thought about why this isn't maybe so crazy: you know, when the US Supreme Court was asked to consider this issue in the early twentieth century, it answered unequivocally: look, you have lots of liberal rights against the state, but you don't have a right to say no to vaccines, because vaccination is necessary to help everybody, and so you just don't have that as a fundamental constitutional right.
And you know what, if it went to the Supreme Court today, they might well reach the same conclusion. So it's not necessary that we say that forced immunization is undemocratic. We could just say forced immunization is consistent with our values because it's necessary to save lives.

Heidi Larson: I absolutely agree with you. I mean, we do have required immunization to go to school, but it's about settings. In my work in the UN, I worked a lot on rights issues, and particularly the Convention on the Rights of the Child, and we often talked about this fine line where rights become responsibilities. You have your individual right until you get to a point where it harms others, and vaccines sit on that cusp. I agree with you that we do need to rethink our whole approach. I think we really need a whole different approach, because we also have a lot more vaccines. And I know that, for instance, in France they added a number of additional vaccines required for school in the context of some very serious measles outbreaks in twenty eighteen.
I think there were eighty thousand cases across Europe, and there was outrage in the streets against it. But ultimately a number of the healthcare professionals told colleagues at the Ministry of Health, thank you. It takes the onus off of me as a healthcare provider to have to persuade someone to take it. I feel like the government is behind me; I'm supporting that, I'm helping implement that, but the onus is not on me to make that persuasive argument.

Noah Feldman: Well, can you say more about the approach that you would advocate? Because we don't train physicians or scientists very much in convincing people to take up the pro-social, pro-health interventions that they invent, right? I mean, scientists are supposed to invent things that make the world better, and physicians are supposed to give treatments that make the world better. But we don't think that their job is to do persuading. So what is the approach that you think would be better for the next time around?
Heidi Larson: Well, I think actually, in the current environment, a lot of the medical community, particularly the more senior medical community, is not used to being challenged, is not used to having their authority challenged. And I think we've come to a different point, where we have publics that are very different than they were, certainly, twenty years ago: much more questioning, with the world of information at their fingertips, not hesitant to challenge the authority of their doctors. And what I've seen happen is some doctors actually shut down, because they don't want to go there. They don't want to have that argument. So I think what needs to be trained is less of the promotional side and more of how to have a difficult conversation.

Noah Feldman: But can I just ask, do you think that would work? I mean, in light of the subtle social pressures that you're describing, I don't think... I mean, I'm just thinking of the physicians I know. Many of them are the most impressive people, you know, that I come into contact with, and they have many, many amazing skills.
But I'm not sure that, even with, you know, a sophisticated training seminar, they would be able to convince people who are truly vaccine hesitant, partly because I'm just not sure what arguments are gonna work. I mean, you know, you could say, well, gee, you trust me the rest of the time, why don't you trust me this time? And that's the honest answer, right? That's the true answer: I can't demonstrate to you the truth of the scientific evidence right now; you have to trust me because you trust me the rest of the time. That's the actual epistemological answer. But I don't think people would be very inclined to believe them when they said that, given what you're describing.

Heidi Larson: Yeah, you know, it's interesting. I agree with you. And I know a number of doctors who have told me that, you know, they've tried all kinds of angles in some situations, and it's just not going to change some people's minds. And they've kind of gotten to a point where they say, I try many different angles. I talk about vaccinating my own children, I talk about vaccinating myself. They still don't.
So there are going to be some people that are difficult to change. But at the end of the day, in all of our surveys globally, and in any kind of trust barometers or whatever, we still see that, more than ever, doctors and healthcare providers have more trust than just about any other institution going. So there is a trust there. But I think right now they need more than the doctor; they need somebody else in their social spheres to change their mind. Now, I don't pretend to have any easy answers, but all I would say is, don't give up. If you're a doctor, keep trying.

Noah Feldman: One of the puzzles that strikes me as so rich and interesting around vaccine hesitancy is that it's unlike almost any other situation where we're required to make a decision about our healthcare, which is usually individualized; that is to say, my decision only affects me.
In the case of vaccines, there is a free-rider dynamic, right? If enough other people are vaccinated and I'm not vaccinated, I'm still reducing my odds of getting sick, because the prevalence of the disease will decline by whatever percentage of people are vaccinated. And that makes me wonder: are there any examples that you've come across in your work of situations where lots of people decline a vaccine, and then the disease is bad enough that it spreads, it continues to do harm in the community where the people live, and then you get some kind of systematic shift where people say, whoa, I thought I was going to get away with this, but now I can't get away with it, and so I'm shifting my views and now I'm going to go out there and get a vaccine? Or is the free-rider effect so powerful that once people have said no, they're probably never going to say yes?

Heidi Larson: Yeah. I think at the end of the day, the free riders, if they think that there's enough going on, they might be opportunistic.
But if they see that it's a pretty serious pandemic or whatever, or, in the case of measles, a pretty serious wave coming back, they could be more open to getting vaccinated. That could be enough to change their mind, if we want to leverage that situation. People don't, in general... how many people know what percentage of their community is actually getting vaccinated, to even know if they can relax? One strategy would be to let people know, like, how many people in your community are vaccinated. I don't know if that's a good strategy, in a sense, because if the community's doing well, it might make more people say, oh, I don't have to get vaccinated.

Noah Feldman: Do you think, Heidi, that it's too late this time around to make substantial inroads in, let's say, the United States and Western Europe against people who are vaccine hesitant? I mean, I understand that this is going to be a global fight, and there may be places in the world where vaccines still haven't spread that much at all and where the fight really needs to be concentrated.
Or do you think there's still time to make a meaningful difference in this particular round? So, you know, an optimistic view, which didn't turn out to be true, was that we would vaccinate enough people fast enough that community immunity, or herd immunity, could be reached, and we would not have to worry about the variants that will now inevitably spread, some of which may eventually evolve to be vaccine resistant. That ship seems to have sailed, at least in the West.

Heidi Larson: Yeah, I think we still need to get as many people as possible vaccinated. I don't think it's too late, and I don't think we should give up, because, out of principle, it's really important to get people on board. I'd like to say it's never too late. It's a very dynamic, changing environment. I think I remember saying, I think it was in late January, that we were going to hit a wall in late March or April. So it was a bit later than that, because at the beginning of the year we were in the thick of a serious second wave.
We had recent news of these vaccines being highly effective, more effective than most vaccines, and there was a limited supply, so all of those things would drive people. So we had the willing, the eager, up front, wanting to get whatever limited supply there was, seeing that, you know, this is bad, still bad. But as we get more supply, as the willing have been vaccinated, and as the pandemic appears to be waning, you know, we're starting to hit a more difficult population. So I think we have to change some strategies. And I also don't think we should give up, because it's not just about the COVID vaccine. Everything we do around COVID and building confidence around the vaccines could be foundational moving forward for other vaccines.

Noah Feldman: We'll be right back.

I am fascinated by the role that fear has played on all sides of the COVID pandemic and the various treatments that we've been looking at for it.
So it sometimes seems to me that at first, many, many people were afraid of getting COVID, and those folks eventually masked up and engaged in social distancing. And others said, well, look, it's not the end of the world; statistically, you probably won't die from it, and you may not even get it, so don't be so fear-based. And then, with the rise of the vaccines, we've seen a shift, and now there are lots of people saying, well, I am afraid of the vaccine more than I am afraid of the possibility of getting COVID. Now, obviously there's a lot of overlap between those people and the people who said they weren't afraid of COVID in the first place. But before, they were saying, we're not afraid of COVID, and now they're saying, we are afraid of a vaccine. Meanwhile, the people who before were afraid of COVID are now saying that they're not afraid of a vaccine. I realize there's not a perfect match, but that does seem to be the case. I mean, I'm genuinely curious.
I don't really understand how 316 00:20:48,676 --> 00:20:52,396 Speaker 1: the economy of fear is working for each group, except 317 00:20:52,436 --> 00:20:56,636 Speaker 1: to say that each group is afraid of something different. Yeah, 318 00:20:56,676 --> 00:21:00,396 Speaker 1: and I think it's a risk perception thing too. I mean, 319 00:21:00,916 --> 00:21:04,916 Speaker 1: there was more fear about the virus back a few 320 00:21:04,956 --> 00:21:08,956 Speaker 1: months ago, because there was more virus, the mortality rates 321 00:21:09,196 --> 00:21:14,716 Speaker 1: were higher. It was more fearful. And now as we 322 00:21:14,756 --> 00:21:19,316 Speaker 1: see that waning a bit, what seemed like a smaller risk, 323 00:21:19,916 --> 00:21:23,516 Speaker 1: the relative risk has changed. In the meanwhile, too, people 324 00:21:23,556 --> 00:21:27,596 Speaker 1: didn't have before the information about the rare risk of 325 00:21:27,676 --> 00:21:32,756 Speaker 1: the blood clots, for instance. So there's new information 326 00:21:32,876 --> 00:21:36,236 Speaker 1: in that mix that we didn't have before. We see 327 00:21:36,236 --> 00:21:39,996 Speaker 1: this even with childhood vaccines. We've got a lot of 328 00:21:40,076 --> 00:21:44,436 Speaker 1: mothers now who are skeptical about childhood vaccines. They're just 329 00:21:44,596 --> 00:21:49,236 Speaker 1: doing a very basic risk calculation. They don't see the 330 00:21:49,316 --> 00:21:54,156 Speaker 1: threat of all these childhood diseases, but ironically, that's because the 331 00:21:54,236 --> 00:21:59,356 Speaker 1: vaccines work. Yeah. But to them and their child, it's 332 00:21:59,396 --> 00:22:01,796 Speaker 1: like the thing that has the risk is the vaccine.
333 00:22:02,996 --> 00:22:07,076 Speaker 1: And there's also this kind of way that risk plays 334 00:22:07,116 --> 00:22:11,356 Speaker 1: with our minds in a way, if a mother gives 335 00:22:11,356 --> 00:22:15,036 Speaker 1: a child the vaccine or gets a child vaccinated and 336 00:22:15,116 --> 00:22:21,156 Speaker 1: there's a problem, she feels far more regret and responsibility 337 00:22:21,316 --> 00:22:26,076 Speaker 1: than if a child naturally gets measles, because it's nature. 338 00:22:26,756 --> 00:22:31,836 Speaker 1: So it's another factor that weighs in there. Yeah, that 339 00:22:31,876 --> 00:22:34,116 Speaker 1: actually leads me to a question that I imagine you've 340 00:22:34,116 --> 00:22:37,236 Speaker 1: spent a lot of time thinking about. It seems like 341 00:22:37,356 --> 00:22:41,116 Speaker 1: vaccine hesitancy has some components that are grounded in quote 342 00:22:41,156 --> 00:22:46,516 Speaker 1: unquote reason, mathematical reason, risk assessment that an economist would 343 00:22:46,596 --> 00:22:50,196 Speaker 1: say is rational to undertake, and then some of it 344 00:22:50,236 --> 00:22:54,516 Speaker 1: consists in beliefs and values, and who's to say exactly 345 00:22:54,516 --> 00:22:58,756 Speaker 1: which are right and wrong? And then some inheres in 346 00:22:59,756 --> 00:23:05,756 Speaker 1: true irrationality, paranoia, fantasies, false claims about the world that 347 00:23:05,796 --> 00:23:09,556 Speaker 1: are demonstrably false, not just false opinion, but false claims 348 00:23:09,556 --> 00:23:12,916 Speaker 1: of fact.
And I guess what I'm wondering is, I mean, 349 00:23:12,996 --> 00:23:15,676 Speaker 1: having spent so much of your career thinking about these questions, 350 00:23:15,996 --> 00:23:18,636 Speaker 1: do you ever think about, like, roughly what percentage is 351 00:23:18,636 --> 00:23:21,956 Speaker 1: contributing to each? You know, how much of vaccine hesitancy 352 00:23:22,036 --> 00:23:27,436 Speaker 1: is coming from rational calculations, even if they're unconscious rational calculations. 353 00:23:27,916 --> 00:23:30,636 Speaker 1: How much of it is coming from beliefs and values 354 00:23:30,676 --> 00:23:33,156 Speaker 1: which aren't really subject to being shown true or false 355 00:23:33,476 --> 00:23:35,956 Speaker 1: in the same way that facts are. And how much 356 00:23:35,956 --> 00:23:38,956 Speaker 1: of it is coming from just false beliefs about the 357 00:23:38,956 --> 00:23:42,196 Speaker 1: world that we might be willing to label as irrational. 358 00:23:44,516 --> 00:23:48,036 Speaker 1: I have thought about that. It really depends on the person. 359 00:23:48,396 --> 00:23:51,756 Speaker 1: But I think that in the broader group of hesitant 360 00:23:52,396 --> 00:23:56,236 Speaker 1: people around vaccines, I mean, I think we don't give 361 00:23:56,316 --> 00:24:01,876 Speaker 1: enough credit sometimes to parents, to others who are kind 362 00:24:01,916 --> 00:24:07,036 Speaker 1: of weighing things. They're not just, you know, emotional, crazy 363 00:24:07,116 --> 00:24:12,076 Speaker 1: beliefs. And my point about anti vaccine, 364 00:24:12,676 --> 00:24:16,516 Speaker 1: I don't mind the word. The thing I don't like 365 00:24:16,956 --> 00:24:20,316 Speaker 1: is that it's used so loosely for anyone who doesn't 366 00:24:20,356 --> 00:24:24,036 Speaker 1: want a vaccine. It's often applied to a lot of 367 00:24:24,076 --> 00:24:27,916 Speaker 1: hesitant parents who aren't at all anti vaccine.
They have 368 00:24:28,036 --> 00:24:31,676 Speaker 1: some, you know, they're asking some questions, and then, oh, 369 00:24:31,756 --> 00:24:35,916 Speaker 1: she's just an anti vaxxer, and then she becomes 370 00:24:36,236 --> 00:24:40,276 Speaker 1: more anti vax because of that judgment. Although vaccine 371 00:24:40,396 --> 00:24:44,036 Speaker 1: hesitancy is then also, in a sense, over inclusive, 372 00:24:44,076 --> 00:24:46,396 Speaker 1: because for some of the people, sure they're hesitating, and 373 00:24:46,516 --> 00:24:49,676 Speaker 1: the implication of hesitancy is that it hints, oh, you can 374 00:24:49,716 --> 00:24:52,556 Speaker 1: be convinced. But it seems a bit like a euphemism 375 00:24:52,556 --> 00:24:55,276 Speaker 1: to me to describe people who are saying, oh, hell no, 376 00:24:55,516 --> 00:24:57,596 Speaker 1: I'm not going anywhere near this vaccine. And there are 377 00:24:57,596 --> 00:24:59,476 Speaker 1: a lot of people who are saying that they're not 378 00:24:59,556 --> 00:25:02,996 Speaker 1: hesitant at all, they're just a clear no. I agree. 379 00:25:03,276 --> 00:25:06,836 Speaker 1: Hesitancy was not a word I chose. This was a 380 00:25:06,836 --> 00:25:11,556 Speaker 1: word that was decided by the World Health Organization, and 381 00:25:11,836 --> 00:25:16,316 Speaker 1: I was part of the advisory group to kind of 382 00:25:16,396 --> 00:25:20,676 Speaker 1: characterize the scope and scale of it, but we weren't 383 00:25:20,716 --> 00:25:24,116 Speaker 1: able to name it. We were given that framing of it. 384 00:25:24,676 --> 00:25:28,796 Speaker 1: And I've written some things about the ambiguity of that term. 385 00:25:28,836 --> 00:25:31,596 Speaker 1: So I fully, fully agree with you. What would you 386 00:25:31,676 --> 00:25:34,556 Speaker 1: choose if you could choose any term?
Well, I've 387 00:25:34,556 --> 00:25:37,596 Speaker 1: picked the framing of confidence because also I was thinking 388 00:25:37,596 --> 00:25:40,796 Speaker 1: of the consumer confidence index, and we have a vaccine 389 00:25:40,836 --> 00:25:43,956 Speaker 1: confidence index. You can be zero percent confident and you 390 00:25:43,956 --> 00:25:47,596 Speaker 1: can be one hundred percent confident. But I don't think 391 00:25:47,596 --> 00:25:51,436 Speaker 1: there's any magic one word. Just to try to 392 00:25:51,476 --> 00:25:56,996 Speaker 1: close with something slightly optimistic, what is the case study that, 393 00:25:57,076 --> 00:26:00,116 Speaker 1: in your mind, is the most optimistic or positive case, 394 00:26:00,756 --> 00:26:04,996 Speaker 1: in which a population which had hesitancy gradually shifted to 395 00:26:05,076 --> 00:26:08,876 Speaker 1: being less hesitant? There are, I hope, some examples 396 00:26:08,876 --> 00:26:11,196 Speaker 1: of that out there that you've encountered in your research. 397 00:26:11,676 --> 00:26:14,436 Speaker 1: There's a few of them. I mean, the most recent one, 398 00:26:14,556 --> 00:26:17,196 Speaker 1: for instance, on the COVID work. One of the more 399 00:26:18,116 --> 00:26:23,396 Speaker 1: successful engagement strategies was through barbers and hairdressers in Maryland. 400 00:26:24,436 --> 00:26:26,796 Speaker 1: I know in some of my work in India and 401 00:26:27,716 --> 00:26:33,756 Speaker 1: in Africa, when you started to address other things in 402 00:26:33,796 --> 00:26:38,996 Speaker 1: the community that people cared about, they were more accepting 403 00:26:38,996 --> 00:26:41,356 Speaker 1: of the vaccine because they felt like, actually, you're not 404 00:26:41,516 --> 00:26:45,316 Speaker 1: just here to give me my jab and keep moving.
Oh, 405 00:26:45,316 --> 00:26:47,396 Speaker 1: maybe you do care about what I think or my 406 00:26:47,476 --> 00:26:52,116 Speaker 1: well being. And I think moving forward on COVID, we 407 00:26:52,236 --> 00:26:58,276 Speaker 1: do need to somehow embed it in COVID recovery more broadly, 408 00:26:58,316 --> 00:27:04,036 Speaker 1: addressing mental health things, addressing other things. So I think 409 00:27:04,076 --> 00:27:07,156 Speaker 1: we need to step back from the needle, as it were, 410 00:27:07,676 --> 00:27:12,756 Speaker 1: and really think about context and never assume what's in 411 00:27:12,796 --> 00:27:15,996 Speaker 1: the minds of people. They may be telling you they 412 00:27:15,996 --> 00:27:18,556 Speaker 1: think it's a safety issue, but there may be something else. 413 00:27:19,036 --> 00:27:22,676 Speaker 1: I think we need to hear out people, because in India, 414 00:27:22,716 --> 00:27:26,356 Speaker 1: I remember one example that I always think about is 415 00:27:26,356 --> 00:27:29,836 Speaker 1: everyone was saying, oh, it's a rumor that's 416 00:27:29,876 --> 00:27:32,316 Speaker 1: going to sterilize us. They're never going to let go 417 00:27:32,396 --> 00:27:35,396 Speaker 1: of this rumor. Well, spending some time in some of 418 00:27:35,436 --> 00:27:38,356 Speaker 1: these villages, talking to people, and not just a one 419 00:27:38,436 --> 00:27:41,436 Speaker 1: time survey, but going back and saying, well, what else 420 00:27:41,516 --> 00:27:44,156 Speaker 1: is bugging you? You know? And it turned out that 421 00:27:44,516 --> 00:27:51,316 Speaker 1: this community didn't want men coming from Delhi to their village. 422 00:27:52,556 --> 00:27:55,796 Speaker 1: One, they didn't want men vaccinating their children, and two, 423 00:27:56,076 --> 00:27:59,636 Speaker 1: they wanted people who were from the community, so if 424 00:27:59,676 --> 00:28:02,356 Speaker 1: something happened they could find them.
Well, these are pretty 425 00:28:02,396 --> 00:28:08,076 Speaker 1: reasonable things. Once that changed, somehow, the rumor thing disappeared. 426 00:28:08,676 --> 00:28:12,596 Speaker 1: So I think trying to understand if there is something 427 00:28:12,636 --> 00:28:16,716 Speaker 1: else going on here that, you know, is more tangible, 428 00:28:17,076 --> 00:28:21,036 Speaker 1: that maybe it's more straightforward than you think, and 429 00:28:21,156 --> 00:28:26,036 Speaker 1: maybe it's not, maybe it's more complicated. Well, my barber, 430 00:28:26,036 --> 00:28:27,556 Speaker 1: who's been cutting my hair since I was nine, 431 00:28:27,556 --> 00:28:29,436 Speaker 1: is definitely the wisest person that I know. So 432 00:28:29,636 --> 00:28:33,316 Speaker 1: I like the idea of relying on barbers. 433 00:28:33,356 --> 00:28:35,556 Speaker 1: I mean, in the Indian case, of course, in many 434 00:28:35,636 --> 00:28:39,876 Speaker 1: villages in India people were forcibly sterilized, sometimes against their will, 435 00:28:39,956 --> 00:28:43,156 Speaker 1: sometimes without their knowledge, as recently as the nineteen seventies, 436 00:28:43,156 --> 00:28:44,676 Speaker 1: so you could sort of understand. That seems to me 437 00:28:44,716 --> 00:28:47,036 Speaker 1: to fall into the category of very reasonable things for people to 438 00:28:47,076 --> 00:28:49,596 Speaker 1: be afraid of; they had a real world experience. 439 00:28:51,556 --> 00:28:54,636 Speaker 1: What should I be asking you that I'm not asking you, Heidi? Well, 440 00:28:54,676 --> 00:28:58,516 Speaker 1: I can tell you what I'm most worried about is 441 00:28:58,516 --> 00:29:03,316 Speaker 1: where we're going with social media.
I think we need 442 00:29:03,396 --> 00:29:10,836 Speaker 1: to find some way to allow for opinion, find a 443 00:29:10,836 --> 00:29:14,076 Speaker 1: different way to handle the way we're dealing with it, 444 00:29:14,156 --> 00:29:17,996 Speaker 1: particularly around vaccines. I see us going in a direction 445 00:29:18,316 --> 00:29:24,036 Speaker 1: of shutting a lot of things down that might backfire. 446 00:29:24,636 --> 00:29:27,236 Speaker 1: It is something that keeps me up at night, because 447 00:29:27,276 --> 00:29:32,196 Speaker 1: I see some extreme behavior on both sides. This is 448 00:29:32,196 --> 00:29:34,116 Speaker 1: something I also spend a huge amount of my time 449 00:29:34,396 --> 00:29:36,476 Speaker 1: working on, and listeners of the show know that I've 450 00:29:36,516 --> 00:29:39,356 Speaker 1: advised Facebook on their free expression policy, so I care 451 00:29:39,396 --> 00:29:42,036 Speaker 1: a lot about this. I thought I heard you hinting 452 00:29:42,916 --> 00:29:45,756 Speaker 1: that maybe the social media companies are going too far 453 00:29:46,356 --> 00:29:52,076 Speaker 1: in taking down content that they label as anti vaccination misinformation. 454 00:29:52,156 --> 00:29:53,956 Speaker 1: And that surprised me to hear you say that, because 455 00:29:53,996 --> 00:29:57,156 Speaker 1: so many people from the medical establishment are out there 456 00:29:57,196 --> 00:30:00,356 Speaker 1: pressuring the social media companies to do still more to 457 00:30:00,356 --> 00:30:04,076 Speaker 1: take down what is described as COVID misinformation, including anti 458 00:30:04,156 --> 00:30:07,596 Speaker 1: vaccine misinformation. So did I hear you right? Yeah, 459 00:30:07,596 --> 00:30:09,796 Speaker 1: and governments are too.
Did I hear you right there, 460 00:30:09,796 --> 00:30:11,276 Speaker 1: that you think it would be actually a mistake for 461 00:30:11,316 --> 00:30:13,756 Speaker 1: the social media companies to go too far in shutting 462 00:30:13,756 --> 00:30:18,596 Speaker 1: down skeptical discourse? I think it's a risk, and I 463 00:30:18,636 --> 00:30:21,396 Speaker 1: think that it's not just the health authorities who are 464 00:30:21,436 --> 00:30:24,516 Speaker 1: putting that pressure. It's government. I mean, it's coming from 465 00:30:24,556 --> 00:30:30,836 Speaker 1: the top. But my Vaccine Confidence Project group here, after 466 00:30:30,876 --> 00:30:36,676 Speaker 1: spending eleven years and continuing to listen and understand 467 00:30:36,676 --> 00:30:40,276 Speaker 1: the dynamics of what's going on out there. You can't 468 00:30:40,276 --> 00:30:45,516 Speaker 1: just flip a switch. You cannot delete doubt. And some 469 00:30:45,556 --> 00:30:48,596 Speaker 1: of the key strategies right now that are being used 470 00:30:48,596 --> 00:30:52,036 Speaker 1: by those who want to disrupt are quicker and more 471 00:30:52,036 --> 00:30:58,516 Speaker 1: clever and nimble than the more promotional positive ones. And 472 00:30:58,556 --> 00:31:02,916 Speaker 1: we're just either driving it underground, or it's 473 00:31:03,076 --> 00:31:09,396 Speaker 1: embedding in a lot of other networks, and 474 00:31:09,476 --> 00:31:12,116 Speaker 1: I think we need a different strategy.
I mean, 475 00:31:12,156 --> 00:31:16,916 Speaker 1: I think I fully and absolutely agree with taking down 476 00:31:16,956 --> 00:31:20,796 Speaker 1: things that are overtly harmful, and I do think and 477 00:31:20,956 --> 00:31:24,796 Speaker 1: fully agree that we need to work on mitigating the 478 00:31:24,836 --> 00:31:28,796 Speaker 1: amplification of risk. That, I think, is one of the 479 00:31:28,836 --> 00:31:34,756 Speaker 1: real issues, how it spreads. My red flag is that 480 00:31:36,036 --> 00:31:39,156 Speaker 1: we need a lot more work to understand the dynamics 481 00:31:39,196 --> 00:31:42,796 Speaker 1: of this space. And I worry that in trying to 482 00:31:42,836 --> 00:31:46,636 Speaker 1: clean it up, we're pushing it underground, we're pushing it 483 00:31:46,676 --> 00:31:50,316 Speaker 1: into spaces where we're going to be less able to 484 00:31:50,436 --> 00:31:54,116 Speaker 1: engage with it, less able to address it, and not 485 00:31:54,316 --> 00:31:58,956 Speaker 1: able to get cues on where we need to build 486 00:31:59,396 --> 00:32:02,596 Speaker 1: more resilience. And I think it should be also a 487 00:32:02,716 --> 00:32:07,716 Speaker 1: challenge to the public health and scientific community: are 488 00:32:07,756 --> 00:32:12,836 Speaker 1: we not strong enough to stand up to this? And you 489 00:32:12,876 --> 00:32:16,156 Speaker 1: can't just take something down without giving a better story, 490 00:32:16,356 --> 00:32:19,476 Speaker 1: because they're going to find it somewhere else. It's kind 491 00:32:19,476 --> 00:32:23,836 Speaker 1: of an almost existential task in sorting this out. And 492 00:32:24,236 --> 00:32:27,516 Speaker 1: I think it's not just about vaccines. It's in other areas, 493 00:32:27,516 --> 00:32:31,716 Speaker 1: but vaccines, I think, because of the public health implications, 494 00:32:32,556 --> 00:32:34,876 Speaker 1: is a serious one.
And I'd love to talk to 495 00:32:34,876 --> 00:32:39,156 Speaker 1: you more about this if you're working on it, because yeah, 496 00:32:39,196 --> 00:32:42,636 Speaker 1: it's really important. Very gladly. I mean, you've really described 497 00:32:43,076 --> 00:32:45,676 Speaker 1: one of the classic free speech arguments, which is that 498 00:32:45,796 --> 00:32:48,476 Speaker 1: if speech is suppressed too much, it tends to go 499 00:32:48,596 --> 00:32:50,836 Speaker 1: underground and then it can do greater harm. I mean, 500 00:32:50,916 --> 00:32:54,316 Speaker 1: one of the early arguments for free speech, in the 501 00:32:54,356 --> 00:32:57,876 Speaker 1: twentieth century, when Western governments started adopting it more actively, 502 00:32:58,236 --> 00:33:01,356 Speaker 1: was just the one you're making, that we actually need 503 00:33:01,396 --> 00:33:03,316 Speaker 1: the full range of arguments to be made in public 504 00:33:03,356 --> 00:33:06,476 Speaker 1: in order to achieve some kind of consensus, in order 505 00:33:06,516 --> 00:33:09,996 Speaker 1: for people to have trust in underlying institutions. And now 506 00:33:10,036 --> 00:33:12,116 Speaker 1: there's of course a lot of skepticism of that view 507 00:33:12,156 --> 00:33:13,716 Speaker 1: in the light of the rise of social media, and 508 00:33:13,756 --> 00:33:15,956 Speaker 1: that view is very much under attack, so I think 509 00:33:15,956 --> 00:33:19,876 Speaker 1: your voice is extremely important on this subject. I want 510 00:33:19,876 --> 00:33:21,876 Speaker 1: to thank you for your fascinating work, and for taking 511 00:33:21,916 --> 00:33:25,596 Speaker 1: time out of your incredibly busy crusade, as it were, 512 00:33:25,876 --> 00:33:28,516 Speaker 1: to understand hesitancy better, to speak with us. Thank you 513 00:33:28,556 --> 00:33:31,436 Speaker 1: so much, doctor Larson. Thanks. Very nice to meet you.
514 00:33:37,476 --> 00:33:41,396 Speaker 1: I found my conversation with doctor Heidi Larson genuinely eye 515 00:33:41,396 --> 00:33:45,236 Speaker 1: opening and more than a little bit disturbing. Not because 516 00:33:45,276 --> 00:33:48,196 Speaker 1: of her research, which seems to me thoughtful and brilliant. No, 517 00:33:48,556 --> 00:33:52,596 Speaker 1: what scared me the most was the realization that what 518 00:33:52,636 --> 00:33:57,756 Speaker 1: we call vaccine hesitancy is almost certainly a necessary feature 519 00:33:58,076 --> 00:34:03,276 Speaker 1: of our contemporary liberal democratic approach to vaccines. When I 520 00:34:03,316 --> 00:34:06,596 Speaker 1: asked Heidi, why is it that in the past we 521 00:34:06,596 --> 00:34:08,316 Speaker 1: were able to get lots of people to take the 522 00:34:08,396 --> 00:34:11,916 Speaker 1: vaccine and now we're not, she answered unequivocally that the difference 523 00:34:12,036 --> 00:34:15,756 Speaker 1: was democracy. Now, she was very positive about democracy, but 524 00:34:15,836 --> 00:34:18,396 Speaker 1: I have to say that her comment really made me 525 00:34:18,476 --> 00:34:21,716 Speaker 1: think that we don't have to, as a constitutional democracy, 526 00:34:22,076 --> 00:34:25,116 Speaker 1: necessarily take the view that people can choose whether or 527 00:34:25,156 --> 00:34:28,076 Speaker 1: not to get vaccines. We could, in principle, say that 528 00:34:28,116 --> 00:34:31,916 Speaker 1: it is a legal obligation, and under current Supreme Court precedent, 529 00:34:32,316 --> 00:34:35,516 Speaker 1: the government would be empowered, if Congress passed a law, 530 00:34:35,756 --> 00:34:40,076 Speaker 1: to say that everybody must get this vaccine.
I don't 531 00:34:40,076 --> 00:34:42,356 Speaker 1: think that's practically going to happen in the world in 532 00:34:42,396 --> 00:34:45,156 Speaker 1: which we currently live, but it did occur to me 533 00:34:45,356 --> 00:34:48,636 Speaker 1: that that might actually be desirable, because in a world 534 00:34:48,956 --> 00:34:52,716 Speaker 1: where ultimately people are given the choice of having vaccines, 535 00:34:53,076 --> 00:34:57,236 Speaker 1: then the points that Heidi brought up, namely libertarianism of 536 00:34:57,276 --> 00:34:59,636 Speaker 1: the left and the right, people who think that it's 537 00:34:59,676 --> 00:35:03,076 Speaker 1: against nature, against God's plan to have vaccines, people who 538 00:35:03,116 --> 00:35:07,036 Speaker 1: worry about the vaccine's safety, and overall, people who have 539 00:35:07,116 --> 00:35:10,476 Speaker 1: less trust in the capacities of government or medical authority 540 00:35:10,516 --> 00:35:13,956 Speaker 1: to do well by them are inevitably, I think, going 541 00:35:13,996 --> 00:35:19,676 Speaker 1: to substantially undercut the possibility of broadly adopted vaccinations. In 542 00:35:19,716 --> 00:35:23,396 Speaker 1: other words, by choosing to define liberal democratic rights in 543 00:35:23,396 --> 00:35:26,596 Speaker 1: the way that we have, we've invited the possibility of 544 00:35:26,716 --> 00:35:30,196 Speaker 1: much greater harm. That's a cost benefit analysis that I 545 00:35:30,276 --> 00:35:33,556 Speaker 1: don't think I fully thought through in the right way 546 00:35:33,796 --> 00:35:40,156 Speaker 1: before this conversation with Heidi. 
Finally, and significantly, Heidi made 547 00:35:40,196 --> 00:35:43,516 Speaker 1: a very counterintuitive point, which I think is very well 548 00:35:43,516 --> 00:35:46,276 Speaker 1: worth listening to carefully, and that is that we need 549 00:35:46,316 --> 00:35:48,596 Speaker 1: to think about whether we might be going too far 550 00:35:48,956 --> 00:35:52,636 Speaker 1: in some contexts, including in the context of social media, 551 00:35:52,876 --> 00:35:57,956 Speaker 1: in shutting down discourse that undermines vaccine confidence. Her concern 552 00:35:58,036 --> 00:36:00,116 Speaker 1: is that if we go too far, we will drive 553 00:36:00,356 --> 00:36:04,396 Speaker 1: vaccine confidence questions underground, and that that will make it harder, 554 00:36:04,636 --> 00:36:08,316 Speaker 1: not easier, for medical authorities and the government to convince 555 00:36:08,356 --> 00:36:12,556 Speaker 1: people to take vaccines. Throughout the conversation, our theme 556 00:36:12,596 --> 00:36:16,676 Speaker 1: of power here on Deep Background was absolutely essential: the 557 00:36:16,716 --> 00:36:20,236 Speaker 1: power of government to make people take vaccines, the power 558 00:36:20,316 --> 00:36:24,996 Speaker 1: of people's beliefs to lead them to places of uncertainty 559 00:36:25,156 --> 00:36:30,676 Speaker 1: or questioning, and perhaps most significantly, the power of the 560 00:36:30,756 --> 00:36:36,756 Speaker 1: medical establishment, limited by its capacity to be trusted. These 561 00:36:36,876 --> 00:36:40,316 Speaker 1: questions all could not be more pressing at the moment, 562 00:36:40,836 --> 00:36:43,636 Speaker 1: and I'm grateful to doctor Larson for joining us to 563 00:36:43,676 --> 00:36:47,556 Speaker 1: explain them so intelligently. Until the next time I speak 564 00:36:47,556 --> 00:36:52,796 Speaker 1: to you, be careful, be safe, and please be well.
565 00:36:55,556 --> 00:36:58,596 Speaker 1: Deep Background is brought to you by Pushkin Industries. Our 566 00:36:58,636 --> 00:37:02,236 Speaker 1: producer is Mola Board, our engineer is Ben Talliday, and 567 00:37:02,316 --> 00:37:07,236 Speaker 1: our showrunner is Sophie Crane McKibbon. Editorial support from Noam Osband. 568 00:37:07,756 --> 00:37:11,116 Speaker 1: Theme music by Luis Gara. At Pushkin, thanks to Mia Lobell, 569 00:37:11,316 --> 00:37:16,156 Speaker 1: Julia Barton, Lydia Jeancott, Heather Faine, Carlie Migliori, Maggie Taylor, 570 00:37:16,276 --> 00:37:19,836 Speaker 1: Eric Sandler, and Jacob Weissberg. You can find me on 571 00:37:19,876 --> 00:37:22,676 Speaker 1: Twitter at Noah R. Feldman. I also write a column 572 00:37:22,676 --> 00:37:25,396 Speaker 1: for Bloomberg Opinion, which you can find at bloomberg dot 573 00:37:25,396 --> 00:37:29,676 Speaker 1: com slash Feldman. To discover Bloomberg's original slate of podcasts, 574 00:37:29,876 --> 00:37:33,156 Speaker 1: go to Bloomberg dot com slash podcasts, and if you 575 00:37:33,236 --> 00:37:35,876 Speaker 1: like what you heard today, please write a review or 576 00:37:35,956 --> 00:37:38,916 Speaker 1: tell a friend. This is Deep Background.