Speaker 1: Hi, and welcome back to the Karol Markowicz Show on iHeartRadio. A study out of London last week found something called the dating app effect. The New York Post reports this occurs when dating app devotees experience such a severe chemical imbalance that it resembles a chronic stress disorder and addictive behavior. When a user gets a match or engagement, this messes with the brain's reward system pathway, eventually resulting in neurochemical dependence. While this initial dopamine dump might seem like a good thing, the uncertainty of getting matched prompts the user to adopt seeking behavior so they can get their fix, much like a gambling addict playing the slots, per the study. And there's...

Speaker 2: No win here.

Speaker 1: Unlike when you're playing the slots, there's no amount of likes or connections that you could walk away with and feel like you succeeded. The study also found that dating app usage can paradoxically torpedo one's sex drive as well by messing with testosterone levels, a side effect that affects both sexes.
Speaker 1: A match can cause male sex hormone levels to spike by fifteen to twenty percent in twenty minutes, while getting ghosted or unmatched can send testosterone production plummeting by ten to twenty-five percent. The latter plunge can in turn cause symptoms ranging from decreased energy to reduced libido. One doesn't have to get rejected to have match-induced mood swings. Constantly awaiting feedback puts the user in a state of anticipatory anxiety, where they experience elevated levels of the stress hormone cortisol for several hours, which can disrupt the body's hormone and thyroid production. I think a lot of this stuff is so obvious if you think about it. We're feeling fake emotions when we're on the internet, and as you're scrolling potential matches on these apps, these fake emotions are kicking in. You're feeling like people like you, like you have options. But then your body kind of gets that nothing is actually happening: you might feel something real, but no actual connection is occurring. I talk a lot on here about getting off the phones, but getting off the dating apps has to be a priority.
Speaker 1: I'm sure there's a way to use the apps in a healthy way, but I'm also sure that that very rarely actually happens. The reactions your body and your brain have to being on the apps, I think those reactions are actually holding potential daters back. Last summer, I challenged my single listeners to try having an analog summer and try to meet someone in person. I know several listeners wrote in to say they tried it but didn't make a love connection. I would say it takes time. And if you're single and don't want to be, try getting off the apps, make dating something that you really only do in the real world, and take your time with it. Maybe you won't meet your person right away, maybe it will be harder to meet people. I understand that having a catalog of people to scroll through on your phone is the easier option here, but I think that you will feel better in and of yourself, and I think that will lead to better options dating-wise. Try it. Thanks for listening. Coming up, my interview with David Zweig. But first, Israel is still under attack.
Speaker 1: Missile fire has resumed from Israel's enemies, terrorists who seek utter death and destruction. Here in America, we can't imagine what it's like to live in constant fear like this. But for the people of Israel, it's all very real every single day. Please join me and show the people of Israel you'll help protect them in this time of attack and uncertainty. One of the best ways to do this is by giving to the International Fellowship of Christians and Jews. Your gift today helps provide security essentials like bomb shelters, flak jackets and bulletproof vests, armored security vehicles, ambulances, and so much more. There is no better time to give than right now, during the Passover holiday, when we celebrate Israel's historic deliverance and birth as a nation. Give a special Passover gift today and help protect the people of Israel. Call eight eight eight, four eight eight, IFCJ. That's eight eight eight, four eight eight, four three two five. Or go online at SUPPORTIFCJ dot org, one word, SUPPORTIFCJ dot org.

Speaker 1: Welcome back to the Karol Markowicz Show on iHeartRadio. My guest today is David Zweig.
Speaker 1: David is an investigative journalist and author. His new book, An Abundance of Caution, is out now. Buy it. Hi David, how are you?

Speaker 2: I'm great. How are you doing?

Speaker 1: So nice to have you on. I am a huge fan. You did such amazing work during the pandemic, and I thought you were undervalued. Really, I thought you should have been far more famous. And I hope this book is a giant bestseller, because you are terrific. What made you write this book five years later?

Speaker 2: Well, the process has been going on a long time, so it's only coming out five years later, I would say. What prompted me to start reporting on the pandemic, and then ultimately what led to working on the book, was my own experience. I watched my kids at home, and I described it in the book as withering away, or excuse me, wilting, almost like a plant without sunlight. And something just seemed off about this. It didn't make sense. From early on, we knew from the data that kids were not at high risk. They also were not transmitting the virus more than anyone else.
Speaker 2: Closing schools just didn't seem like the right area for society to be focusing its interventions in order to try to slow the spread of this virus. It seemed like the one area that was perhaps the least sensible to focus on became the area of the most intense focus. And that sort of set me on a path to trying to understand exactly what was happening.

Speaker 1: What do you want people to remember most about that time?

Speaker 2: I would sort of split it into different groups, but I think what's really important is that our society, as everyone knows, is stratified in a lot of different ways. And the thing that I talk about a lot in the book is how the policies impacted everyday Americans, and in particular I focus on children, though it's not exclusively about them. The people who made these policies, in many instances, were leading very different types of lives than millions and millions of Americans.
Speaker 2: I mean, at a base level, they were adults, and an adult is able to do Zoom for work, just like you and I are talking remotely right now, in a way that children cannot. Children are not small adults. But pulling the lens back beyond that primary point, there's the fact that the people who work in the NIH and in public health, and broader professionals in the medical field, are generally quite well off. They generally have a comfortable home to live in, and a lot of them are able to do their work in a manner that doesn't require them to be in person. There's a philosopher of medicine named Eric Winsberg who I interviewed for the book, and he said it's quite telling how well those people did during the pandemic relative to so many others. And I think that's the main thing that I want people to be aware of, or to remember, about that time.
Speaker 2: If you're some corporate attorney or some other white-collar worker and you're like, yeah, it kind of sucked having my kids home from school, but it wasn't that big of a deal, that's true for you and for your family. But you're not thinking about the millions and millions of children who did not have that same circumstance, including, by the way, children who came from well-off backgrounds but maybe had all sorts of different learning challenges or social challenges, where their experience was different. And that's what was lost in so much of the media coverage of what was happening during the pandemic, and I think lost on the experts themselves.

Speaker 1: That's one of the things that impressed me about that time. I would say that for our family, it was an amazing time. We loved having our kids home. It was a cozy, beautiful time, and everything was really nice, and I could see why people leaned into it.
Speaker 1: But I knew that my kids aren't the only kids in the world, and that the things that we were able to provide for them aren't what other families might be able to provide. I grew up pretty poor in Brooklyn, and I knew what people in my old neighborhood were going through. But I did see people, even on X and in media, who didn't even have kids standing up for people who had kids, and I thought that was kind of one of the few positives that I saw. I could think of a bunch off the top of my head. I mean, the Twitter account Redsteeze, Steve Miller, was passionate about reopening schools, and he doesn't have kids. So it was doable. Some people just chose not to.

Speaker 2: That's right. And on the broader point, I spent a lot of the book looking at the science behind this. In part the book is a social critique, trying to understand how society as a whole and the political class make decisions.
Speaker 2: But in a lot of the book I really go deep, layer after layer into the strata of the earth, basically, trying to understand how we think about evidence in a very epistemological kind of way. What does it mean for something to be true or not true? These very abstract questions were a big driver for me, because I think it's fascinating when we really stop to think: well, what is evidence? We kept hearing, and this is a rightly mocked expression now, that you have to follow the science. But what is the science? And science also doesn't tell you what to do. Science may give you information, but that's very different from values. And this kind of brings us back to your first question again. Let's even set aside that the science, the evidence, was incredibly poor on a lot of this. Even if we were to grant that it's accurate, that still doesn't tell politicians what to do. Andrew Cuomo, the former governor of New York, repeatedly kept saying follow the data, you have to follow the evidence, follow the science. But that means nothing.
Speaker 2: It was a nonsensical statement that does not tell you what policies to put in place. The policies are based on what you value. And again, if you're a poor kid in the Bronx, your family's value maybe is for you to be in school, because you are a football player and this is your one chance to possibly get into college. Well, they canceled the football season. Now you're out of luck. And this happened to countless kids. Now, the argument, which is a straw man that you'll hear and that we've heard for years, is, well, it's better than being dead. But what I show in the book, and I don't think people have really shown this before, is how schools had absolutely nothing to do with the overall case rates. And there is a wealth of data, and this is really important, Karol. This is not Monday-morning quarterbacking that I'm doing. This is not after the fact. I tried very hard throughout the book: almost without exception, everything I talk about is evidence that was available at the time.
Speaker 2: It's basically kind of a tick-tock chronology as you're going through that first year, and I show that the evidence was there at the moment, but our health officials and the broader medical establishment ignored or dismissed this evidence that was available.

Speaker 3: Right.

Speaker 1: One of my favorite moments of the follow the science, follow the data era was when schools had opened in New York City. They opened late, late September of twenty twenty, and then in November Bill de Blasio closed them again because the COVID rate, the COVID percentage for the city, had reached a certain percentage. I forget, I think it was three percent. But basically what that meant was that three out of every hundred people who took COVID tests at facilities were testing positive. So if you had a day where two were testing positive, or four, it just made no sense. But then the best part to me was Cuomo got on and he said, first of all, that Bill de Blasio cannot close schools. Only he could close schools, and he could open schools, and he's the only one in charge. But also, he had different data.
Speaker 1: He didn't have the same numbers as Bill de Blasio did, which was like, wait, how is that even possible? How do you guys have different numbers? Stuff like that, I think, went overlooked, even though to me that was a very pivotal point where people's eyes were opened. But even data isn't always the same.

Speaker 2: Yeah, I love that you touch on this point, and I talk about this a lot in the book. Yes, as you noted, different facts and figures are going to come to different people; even different politicians and health officials are going to have different information at any given moment. And it doesn't mean that it's wrong to have some sort of benchmark. We need some sort of parameters to operate within. So I don't necessarily fault officials for saying, hey, this is the line that we're going to go toward. However, they needed to be honest that this was made up.

Speaker 1: That line made no sense, and that line was a fictional line.
Speaker 2: Right, this was made up. And again, I don't think it's necessarily wrong to have some sort of target. Otherwise it's hard to corral society in a certain direction. But you need to be honest, and in fact they were the opposite of honest. They said this was the science. And it's interesting: I show how different areas, different regions within our country, and different countries all had wildly different benchmarks. They were all over the place, because everyone had, one, different data oftentimes, and two, different ways of interpreting it, about what they felt was quote unquote safe. Is it three percent, is it five percent, is it twelve percent of some various rate of prevalence within the community? There's no correct answer to this. Yet over and over it was presented to the public as this very bright red line about what's appropriate in this area, and anyone who disagreed was a complete jerk. You know, what are you, a moron? You're trying to do your own research? But the reality is they had no idea what they were talking about.
Speaker 2: And this is a large part of the book that I talk about: the narrative formation, and how the media very much worked in conjunction with the health authorities, and generally with Democratic politicians. Instead of the classic role, where media is supposed to be skeptical of claims by those in power, whether it's a politician or a health official, they basically just acted as an amplifier. And I give all sorts of examples where you can see how articles in the New York Times and elsewhere kept quoting various people saying things, but they never provided any evidence or any proof behind what they were saying. And this is how the narrative calcified for much of the public: when schools in Europe opened, well, that's because Europe is different. And they made up this long list of contrived reasons. Yet there was never any evidence behind these reasons, and if you questioned them, you're an asshole.

Speaker 1: Yeah, and you want teachers to die, obviously. Well, people would die.
Speaker 3: But again, the media failed in its most basic obligation, which is to actually question what people are telling them, and if they don't question it, then to at least do the research themselves as journalists to find out what the evidence is.

Speaker 2: Yet I give example after example after example of where the most prestigious news outlets in the country just simply acted as a PR arm of many of these officials and their agencies, simply repeating what they said without asking for, or seeking on their own, the actual evidence behind these claims. And what I do in the book, it's essentially a case study, is show one by one by one how each of these claims was false.

Speaker 1: We're going to take a quick break and be right back on the Karol Markowicz Show. So, you know, I was a columnist for many years before the pandemic, and I had some level of success, but that time really increased my prominence. My career definitely took off during those years. I became somebody who was writing about it often, and people enjoyed that and that kind of thing.
306 00:18:59,520 --> 00:19:03,040 Speaker 1: You were, thank you, thank you, but you were more, 307 00:19:03,320 --> 00:19:07,119 Speaker 1: you know, in the mainstream media. Did your career suffer? 308 00:19:07,240 --> 00:19:10,399 Speaker 1: Did you take hits for your coverage? 309 00:19:10,560 --> 00:19:14,440 Speaker 2: No, I don't think my career suffered, and in fact, 310 00:19:14,760 --> 00:19:16,920 Speaker 2: I think I did quite well. 311 00:19:17,080 --> 00:19:19,480 Speaker 1: But I think you did quite well. But I'm just 312 00:19:19,520 --> 00:19:23,080 Speaker 1: wondering if you lost any opportunities for being right too early, 313 00:19:23,280 --> 00:19:25,119 Speaker 1: which is a common thing, you know. 314 00:19:25,560 --> 00:19:31,760 Speaker 2: I would say I was astonished at the sort of 315 00:19:32,240 --> 00:19:35,560 Speaker 2: rejection or dismissal some of the early work I was 316 00:19:35,640 --> 00:19:40,200 Speaker 2: doing by large outlets with whom I had worked with 317 00:19:40,520 --> 00:19:41,159 Speaker 2: in the past. 318 00:19:41,320 --> 00:19:43,480 Speaker 1: This is what I meant it had happened to you. 319 00:19:43,600 --> 00:19:47,520 Speaker 2: Yeah. Look, as a journalist, you should just expect most 320 00:19:47,560 --> 00:19:49,720 Speaker 2: of the things you pitch are going to be rejected. 321 00:19:49,720 --> 00:19:53,000 Speaker 2: No one owes me anything With that said, I know 322 00:19:53,200 --> 00:19:57,160 Speaker 2: enough about what you know, what works and what doesn't work. 323 00:19:57,480 --> 00:19:59,679 Speaker 2: And these are places I had written for in the past, 324 00:20:00,000 --> 00:20:03,480 Speaker 2: and I knew I wasn't pitching, uh, you know, an editorial. 325 00:20:03,720 --> 00:20:05,919 Speaker 2: I wasn't pitching you know, Oh, this is my opinion 326 00:20:05,960 --> 00:20:09,920 Speaker 2: on this. I had a compendium of evidence behind anything. 
327 00:20:09,960 --> 00:20:13,119 Speaker 2: And that's why ultimately I was able to write for 328 00:20:13,359 --> 00:20:16,479 Speaker 2: some of these more mainstream outlets. And it's funny, there 329 00:20:16,480 --> 00:20:19,000 Speaker 2: were a number of other journalists in the sort of 330 00:20:19,080 --> 00:20:22,439 Speaker 2: contrarian space, if you will, who were writing for 331 00:20:22,520 --> 00:20:25,480 Speaker 2: either more right-leaning outlets or just, you know, for 332 00:20:25,560 --> 00:20:28,119 Speaker 2: their Substacks or blogs, and they kept asking me, 333 00:20:28,400 --> 00:20:30,159 Speaker 2: how are you doing this? How are you getting this 334 00:20:30,200 --> 00:20:30,760 Speaker 2: stuff through? 335 00:20:30,960 --> 00:20:33,960 Speaker 1: Yeah. And it was impressive, it really was. I remember, 336 00:20:34,000 --> 00:20:35,800 Speaker 1: I mean, I remember several of your pieces that I 337 00:20:35,800 --> 00:20:37,439 Speaker 1: thought, like, oh, how did you get, you know, the 338 00:20:37,480 --> 00:20:43,200 Speaker 1: Wired piece. Just, you know, Wired, that was... those 339 00:20:43,200 --> 00:20:48,960 Speaker 1: were groundbreaking pieces, and they were in, kind of, again, 340 00:20:49,040 --> 00:20:51,560 Speaker 1: mainstream outlets, which was unique. 341 00:20:52,080 --> 00:20:54,359 Speaker 2: Yeah, and I thought about this a lot, you know, 342 00:20:54,400 --> 00:20:56,280 Speaker 2: in part just on my own, and because so many 343 00:20:56,520 --> 00:20:58,720 Speaker 2: other journalists and people have mentioned this to me and 344 00:20:58,720 --> 00:21:01,040 Speaker 2: asked me about it. And the answer I can come 345 00:21:01,119 --> 00:21:05,840 Speaker 2: up with is that I always came with evidence.
And 346 00:21:05,960 --> 00:21:10,000 Speaker 2: while most editors are not interested in going against the narratives, 347 00:21:10,720 --> 00:21:14,840 Speaker 2: thank God, there are some who actually really care about, well, 348 00:21:14,840 --> 00:21:18,600 Speaker 2: what's the truth? What's going on here? Is this, even 349 00:21:18,800 --> 00:21:22,800 Speaker 2: though this goes against the sort of establishment view, is 350 00:21:22,840 --> 00:21:27,439 Speaker 2: this a well-supported argument? And that's the theme of 351 00:21:27,480 --> 00:21:31,879 Speaker 2: my book, is that we need evidence behind claims that 352 00:21:31,920 --> 00:21:35,360 Speaker 2: we make. And I give lots of fun examples throughout history, 353 00:21:35,880 --> 00:21:38,399 Speaker 2: in particular in medicine, but this happens in a variety 354 00:21:38,400 --> 00:21:40,800 Speaker 2: of fields. But I give lots of examples about how 355 00:21:41,040 --> 00:21:45,000 Speaker 2: our intuition, including the intuition of experts about what 356 00:21:45,119 --> 00:21:49,040 Speaker 2: is true, is often wrong. And this is the foundation 357 00:21:49,119 --> 00:21:51,639 Speaker 2: of what's known as evidence-based medicine, where you have 358 00:21:51,720 --> 00:21:54,880 Speaker 2: this hierarchy of evidence and you have to actually look 359 00:21:54,920 --> 00:21:57,919 Speaker 2: at what do the data show, what is the actual 360 00:21:58,000 --> 00:22:02,439 Speaker 2: evidence out there, and how does that compare versus what 361 00:22:02,520 --> 00:22:05,760 Speaker 2: you think is going to happen. And time and 362 00:22:05,840 --> 00:22:10,120 Speaker 2: time again throughout history, including right up to today, our 363 00:22:10,240 --> 00:22:16,160 Speaker 2: intuition is often wrong, and nevertheless, the media and many 364 00:22:16,200 --> 00:22:19,480 Speaker 2: of the health experts in America kept going with their 365 00:22:19,520 --> 00:22:23,919 Speaker 2: intuition about something, their assumptions, and so for me, I 366 00:22:24,040 --> 00:22:28,199 Speaker 2: found that frustrating and I found it offensive. And the 367 00:22:28,280 --> 00:22:30,199 Speaker 2: thing that I always do with my articles and what 368 00:22:30,240 --> 00:22:32,720 Speaker 2: I try to do in the book is that, 369 00:22:33,240 --> 00:22:35,560 Speaker 2: much to my editor's dismay, I mean, there 370 00:22:35,600 --> 00:22:40,480 Speaker 2: are hundreds of endnotes with citations in the book, hundreds 371 00:22:40,560 --> 00:22:43,280 Speaker 2: upon hundreds of them, because I tried to not make 372 00:22:43,440 --> 00:22:47,280 Speaker 2: any statement without providing a source behind it. Now, I'm 373 00:22:47,320 --> 00:22:50,159 Speaker 2: sure I failed in some circumstances. I'm sure there are 374 00:22:50,200 --> 00:22:54,320 Speaker 2: some mistakes in there, but anyone can see a massive 375 00:22:54,440 --> 00:22:57,439 Speaker 2: effort was made at least to provide evidence, because there 376 00:22:57,520 --> 00:23:00,000 Speaker 2: was no way I was going to criticize everyone else 377 00:23:00,760 --> 00:23:03,959 Speaker 2: for not providing evidence behind their claims and then 378 00:23:03,960 --> 00:23:07,200 Speaker 2: do the same thing myself. And that's sort of the 379 00:23:07,520 --> 00:23:10,879 Speaker 2: kind of, like, meta message of the book: really 380 00:23:10,960 --> 00:23:14,159 Speaker 2: thinking about evidence and about what is true.
And it 381 00:23:14,160 --> 00:23:15,720 Speaker 2: doesn't mean we're always going to be right in the 382 00:23:15,760 --> 00:23:17,399 Speaker 2: way we put it together, but you have to at 383 00:23:17,480 --> 00:23:20,600 Speaker 2: least make an effort. And it was just so troubling 384 00:23:20,920 --> 00:23:23,840 Speaker 2: how the media operated. It's not that they had a source 385 00:23:23,880 --> 00:23:26,360 Speaker 2: but the source was wrong. That happened plenty of times too, 386 00:23:26,400 --> 00:23:28,800 Speaker 2: and again, that happens to me, that happens to everyone. 387 00:23:28,840 --> 00:23:31,679 Speaker 2: Sometimes you just misinterpret a study or you cite the 388 00:23:31,720 --> 00:23:34,320 Speaker 2: wrong thing. That's one thing I try not 389 00:23:34,400 --> 00:23:38,399 Speaker 2: to really go after people for. What's problematic was 390 00:23:38,480 --> 00:23:41,680 Speaker 2: not even making an effort. It's this sort of "experts say." 391 00:23:42,040 --> 00:23:44,400 Speaker 2: I think, was it years ago, Trump or someone used 392 00:23:44,400 --> 00:23:49,600 Speaker 2: to always say, "people are saying, people are saying." Yeah, 393 00:23:49,640 --> 00:23:53,159 Speaker 2: what an ironic twist: that exact same thing that Trump 394 00:23:53,240 --> 00:23:55,359 Speaker 2: was vilified for, "people are saying," you know, he's just 395 00:23:55,440 --> 00:23:58,919 Speaker 2: kind of... that's exactly what the most prestigious media 396 00:23:59,320 --> 00:24:02,800 Speaker 2: did all the time during the pandemic. "Experts say." Yeah, 397 00:24:02,880 --> 00:24:04,800 Speaker 2: they might even quote an expert, but there was no 398 00:24:04,840 --> 00:24:07,159 Speaker 2: evidence behind it. I don't really care what this 399 00:24:07,200 --> 00:24:13,920 Speaker 2: person's opinion is on something, because, actually, within evidence-based medicine, physicians' 400 00:24:13,960 --> 00:24:18,680 Speaker 2: opinions are considered generally the lowest form of evidence possible. 401 00:24:19,040 --> 00:24:22,479 Speaker 2: I don't want your opinion. Show me the evidence, and 402 00:24:22,520 --> 00:24:25,800 Speaker 2: then we can talk about things and then make decisions 403 00:24:25,840 --> 00:24:28,760 Speaker 2: as a society based on our values. 404 00:24:29,000 --> 00:24:31,400 Speaker 1: Right, and there also was the fact that a lot 405 00:24:31,440 --> 00:24:34,480 Speaker 1: of the experts were changing their minds in real time 406 00:24:34,520 --> 00:24:37,840 Speaker 1: and not correcting anything. One of my favorite moments was 407 00:24:37,960 --> 00:24:40,119 Speaker 1: June twenty twenty, and again, I have this all, like, 408 00:24:40,680 --> 00:24:42,199 Speaker 1: I'm sure you do too. And I know a lot 409 00:24:42,240 --> 00:24:45,200 Speaker 1: of people who were in this COVID, you know, fight 410 00:24:45,280 --> 00:24:48,320 Speaker 1: with us. They could recall these times and figures and 411 00:24:48,560 --> 00:24:54,080 Speaker 1: statements so easily. But June twenty twenty... May... sorry, I 412 00:24:54,080 --> 00:24:56,639 Speaker 1: actually, I might have gotten that one wrong. Might be a 413 00:24:56,640 --> 00:25:00,359 Speaker 1: little later, but Anthony Fauci said, as we've been saying 414 00:25:00,480 --> 00:25:03,760 Speaker 1: all along, outdoor masking is not necessary. It was June 415 00:25:03,760 --> 00:25:06,679 Speaker 1: twenty twenty-one, because it was after the vaccines. 416 00:25:06,920 --> 00:25:08,879 Speaker 1: So June twenty twenty-one, he says, as we've been 417 00:25:08,880 --> 00:25:13,119 Speaker 1: saying all along, outdoor masking is unnecessary. And I was like, 418 00:25:13,240 --> 00:25:15,760 Speaker 1: as we've been saying all along? Like, when were you 419 00:25:15,880 --> 00:25:18,399 Speaker 1: saying this? This is the first I'm hearing of it.
But 420 00:25:18,440 --> 00:25:21,920 Speaker 1: my kids are masking in between bites while eating lunch 421 00:25:21,960 --> 00:25:24,440 Speaker 1: on the ground outside of their schools in New York. 422 00:25:24,560 --> 00:25:27,560 Speaker 1: I mean, if you've been saying this, that message has 423 00:25:27,640 --> 00:25:31,680 Speaker 1: not penetrated. So they were changing their minds and their 424 00:25:31,720 --> 00:25:35,600 Speaker 1: opinions and not following the science at all. 425 00:25:35,680 --> 00:25:40,439 Speaker 2: Actually, let's be clear, it's not only fine but appropriate 426 00:25:40,520 --> 00:25:42,719 Speaker 2: to change your opinion as evidence changes. 427 00:25:42,880 --> 00:25:45,159 Speaker 1: Yeah, but be honest about it, right? 428 00:25:45,000 --> 00:25:50,160 Speaker 2: Right, exactly. The "all along"... exactly, yeah. The defense 429 00:25:50,320 --> 00:25:52,960 Speaker 2: that was given over and over was, you know, we're 430 00:25:54,080 --> 00:25:56,639 Speaker 2: building the plane as we fly it, and other metaphors 431 00:25:56,640 --> 00:26:01,600 Speaker 2: such as that. But again, that was in effect a 432 00:26:01,680 --> 00:26:05,359 Speaker 2: lie, because it was just always exculpatory. No matter what 433 00:26:05,480 --> 00:26:08,320 Speaker 2: you did, there was a get-out-of-jail-free 434 00:26:08,359 --> 00:26:11,920 Speaker 2: card that you could use. Oh, we've never experienced this before. 435 00:26:12,160 --> 00:26:14,800 Speaker 2: We never knew what was going on.
But again, the 436 00:26:14,920 --> 00:26:19,520 Speaker 2: thing I show is that the evidence was there from 437 00:26:19,560 --> 00:26:21,960 Speaker 2: the beginning, and I show, as time goes on, the 438 00:26:22,000 --> 00:26:27,640 Speaker 2: evidence that was available, including about masking and outdoor masking; 439 00:26:27,800 --> 00:26:29,919 Speaker 2: there was never any evidence that this was going to 440 00:26:29,960 --> 00:26:32,560 Speaker 2: be successful. But that's not how it was presented to 441 00:26:32,600 --> 00:26:34,760 Speaker 2: the public. One of the things that I'm most proud 442 00:26:34,760 --> 00:26:38,119 Speaker 2: of is I had written a piece about the summer 443 00:26:38,160 --> 00:26:42,400 Speaker 2: camp guidelines from the CDC. Yeah, and you know, I 444 00:26:42,440 --> 00:26:45,000 Speaker 2: saw them and I saw that they included outdoor masking, 445 00:26:45,200 --> 00:26:48,200 Speaker 2: and I was like, immediately, because, how do you even do that? 446 00:26:48,440 --> 00:26:50,639 Speaker 2: I was very familiar with the data, and immediately I 447 00:26:50,680 --> 00:26:52,879 Speaker 2: was like, this makes no sense. And I had a 448 00:26:52,960 --> 00:26:56,040 Speaker 2: number of prominent people, including the editor in chief of 449 00:26:56,119 --> 00:27:00,520 Speaker 2: JAMA Pediatrics, who were like, yes, I 450 00:27:00,560 --> 00:27:02,680 Speaker 2: will talk with you for the article on the record, 451 00:27:02,680 --> 00:27:05,560 Speaker 2: and they called it draconian: this makes no sense. 452 00:27:05,760 --> 00:27:10,879 Speaker 2: Immediately Rochelle Walensky was questioned about this, and very shortly thereafter, 453 00:27:11,760 --> 00:27:16,960 Speaker 2: the CDC rescinded the outdoor mask guidance for summer camps.
454 00:27:17,040 --> 00:27:19,080 Speaker 2: That was a moment that I was proud of, but 455 00:27:19,200 --> 00:27:20,960 Speaker 2: you know, those types of things were few and far 456 00:27:21,040 --> 00:27:26,720 Speaker 2: between to actually affect policy in that manner. And the 457 00:27:27,400 --> 00:27:33,879 Speaker 2: notion of certainty, I think, is another really important thread 458 00:27:34,359 --> 00:27:36,880 Speaker 2: within the book that I talk about a lot. It's 459 00:27:37,080 --> 00:27:40,520 Speaker 2: the way that health officials, and then sort of 460 00:27:40,760 --> 00:27:44,040 Speaker 2: by an extension of that, politicians, and then by 461 00:27:44,080 --> 00:27:46,320 Speaker 2: an extension of that, the media and the sort of 462 00:27:46,320 --> 00:27:50,480 Speaker 2: what we might call elite society overall: the degree 463 00:27:50,520 --> 00:27:56,080 Speaker 2: of confidence and the degree of arrogance and righteousness with 464 00:27:56,200 --> 00:28:01,280 Speaker 2: which these opinions were expressed was incredibly damaging, and we 465 00:28:01,320 --> 00:28:03,800 Speaker 2: can see, you know, the result of that was an 466 00:28:03,960 --> 00:28:08,320 Speaker 2: enormous loss of trust in our public health institutions. Now, 467 00:28:08,359 --> 00:28:10,160 Speaker 2: part of that is, you know, I think people can 468 00:28:10,200 --> 00:28:13,040 Speaker 2: make a reasonable argument to some extent that Trump and 469 00:28:13,040 --> 00:28:15,439 Speaker 2: some others had sort of poisoned the well and this 470 00:28:15,640 --> 00:28:19,320 Speaker 2: was reactive to that. But we are responsible for our 471 00:28:19,400 --> 00:28:24,240 Speaker 2: own actions. Trump didn't make anybody lie.
Trump didn't make 472 00:28:24,600 --> 00:28:28,560 Speaker 2: Fauci or others have a degree of confidence they should 473 00:28:28,600 --> 00:28:32,639 Speaker 2: not have had when they projected these various pronouncements to 474 00:28:32,720 --> 00:28:35,720 Speaker 2: the public about what they should and shouldn't do. And 475 00:28:36,200 --> 00:28:40,120 Speaker 2: it's so important, and I hope it's a lesson that people 476 00:28:40,120 --> 00:28:44,160 Speaker 2: within media, and then more broadly within the sort of 477 00:28:44,400 --> 00:28:48,680 Speaker 2: health establishment, understand, that they need a degree of 478 00:28:48,800 --> 00:28:52,440 Speaker 2: humility, and, I know it goes against someone's instinct, 479 00:28:52,720 --> 00:28:57,320 Speaker 2: but to be persuasive, it's actually better to be honest 480 00:28:57,960 --> 00:29:00,920 Speaker 2: and show nuance and show what you don't know. I mean, 481 00:29:00,960 --> 00:29:03,120 Speaker 2: that's the best rule as a writer, you know. Just 482 00:29:03,120 --> 00:29:04,880 Speaker 2: going back, you asked me about how I got 483 00:29:04,880 --> 00:29:09,280 Speaker 2: these pieces into some of these sort of legacy media outlets. 484 00:29:09,400 --> 00:29:13,520 Speaker 2: And I think, in part, I try to never overstate things, 485 00:29:13,640 --> 00:29:16,680 Speaker 2: and I try to be honest and express nuance and say, look, 486 00:29:16,800 --> 00:29:20,760 Speaker 2: here's the evidence, here's what it shows, here's what 487 00:29:20,800 --> 00:29:23,680 Speaker 2: we know and what we don't know, and ultimately, I 488 00:29:24,560 --> 00:29:28,000 Speaker 2: think people find that most persuasive. You actually can convince 489 00:29:28,040 --> 00:29:32,800 Speaker 2: someone of something more when you don't overstate things.
And 490 00:29:32,880 --> 00:29:36,000 Speaker 2: in fact, the public health, quote unquote, experts did the 491 00:29:36,080 --> 00:29:40,800 Speaker 2: exact opposite of that. Time and again they overstated the 492 00:29:40,840 --> 00:29:45,040 Speaker 2: degree of evidence, they overstated their confidence, and conversely, anyone 493 00:29:45,040 --> 00:29:49,440 Speaker 2: who disagreed was maligned as a fool, as someone who 494 00:29:49,480 --> 00:29:55,240 Speaker 2: was dangerous. And the divergence between the United States and 495 00:29:55,400 --> 00:30:00,920 Speaker 2: Europe in particular regarding schools is so extraordinary, and the 496 00:30:01,080 --> 00:30:07,239 Speaker 2: idea that someone was this Republican crank for wanting to 497 00:30:08,120 --> 00:30:12,920 Speaker 2: model our schools' policy on good old progressive Western Europe 498 00:30:13,280 --> 00:30:16,720 Speaker 2: was absurd. But that's what was claimed over and over again, 499 00:30:16,760 --> 00:30:19,000 Speaker 2: and what people believed. And what I tried to show 500 00:30:19,040 --> 00:30:21,160 Speaker 2: in the book, and I talk about a 501 00:30:21,160 --> 00:30:23,880 Speaker 2: whole variety of countries within Europe and also outside of 502 00:30:23,920 --> 00:30:27,080 Speaker 2: Europe and other places, is that there was never any sort of 503 00:30:27,320 --> 00:30:34,440 Speaker 2: ideological through line between various pandemic policies in different locations. 504 00:30:34,880 --> 00:30:38,160 Speaker 2: So what we were experiencing in the United States was 505 00:30:38,320 --> 00:30:42,080 Speaker 2: very much its own kind of isolated experience. But yet 506 00:30:42,120 --> 00:30:45,880 Speaker 2: people just assumed and extrapolated that this was normal and 507 00:30:45,920 --> 00:30:48,760 Speaker 2: that it made sense that, well, of course the Republicans 508 00:30:49,320 --> 00:30:52,040 Speaker 2: want things to open up for the economy.
No, there 509 00:30:52,080 --> 00:30:55,240 Speaker 2: were plenty of countries with a conservative government that wanted 510 00:30:55,280 --> 00:30:57,120 Speaker 2: everything shut down, which was the opposite, and there were 511 00:30:57,120 --> 00:31:00,400 Speaker 2: plenty of progressive countries where none of this made any sense. 512 00:31:00,560 --> 00:31:04,080 Speaker 2: And the important part is that the public by and 513 00:31:04,160 --> 00:31:08,520 Speaker 2: large was kept misinformed and uninformed about what was going on. 514 00:31:08,560 --> 00:31:10,640 Speaker 2: And I give these sort of case studies within the 515 00:31:10,720 --> 00:31:14,680 Speaker 2: media about how things were framed, about what was happening 516 00:31:14,760 --> 00:31:17,760 Speaker 2: outside the United States, and there was always 517 00:31:18,000 --> 00:31:23,680 Speaker 2: some sort of very soft but nevertheless definitive excuse for 518 00:31:23,880 --> 00:31:26,360 Speaker 2: why such and such was happening and why it couldn't 519 00:31:26,400 --> 00:31:29,000 Speaker 2: happen here. All this was made up, Carol. 520 00:31:29,320 --> 00:31:32,120 Speaker 1: It was insane. What a crazy, crazy time that we 521 00:31:32,200 --> 00:31:35,719 Speaker 1: lived through. So one of the questions I ask all 522 00:31:35,760 --> 00:31:37,880 Speaker 1: of my guests is, what do you worry about? I 523 00:31:37,920 --> 00:31:40,680 Speaker 1: want to bring it back to your book, An Abundance of Caution. 524 00:31:41,720 --> 00:31:44,640 Speaker 1: What do you worry about that you covered in the 525 00:31:44,680 --> 00:31:47,640 Speaker 1: book that you could see resurfacing again in our society? 526 00:31:48,560 --> 00:31:51,240 Speaker 2: Well, I worry about, I guess, just from my own 527 00:31:51,280 --> 00:31:56,000 Speaker 2: background in journalism, I worry about the media still not, 528 00:31:56,360 --> 00:31:59,960 Speaker 2: quote, learning its lessons.
And while I'm thrilled that there's 529 00:32:00,160 --> 00:32:02,000 Speaker 2: a vibrant sort of, whatever, if we want to call 530 00:32:02,000 --> 00:32:06,680 Speaker 2: it, alternative media landscape, that's terrific, Substacks and podcasts, all 531 00:32:06,760 --> 00:32:10,880 Speaker 2: that's great, the legacy media still holds an enormous amount 532 00:32:10,960 --> 00:32:14,400 Speaker 2: of sway and power within our country, in particular with 533 00:32:14,560 --> 00:32:21,200 Speaker 2: decision makers within Washington, DC and elsewhere. And you can 534 00:32:21,240 --> 00:32:23,440 Speaker 2: see it on a whole variety of issues. But when 535 00:32:23,480 --> 00:32:26,320 Speaker 2: you become an expert on one particular issue, it then 536 00:32:26,360 --> 00:32:28,880 Speaker 2: makes you wonder about issues that you're not an expert 537 00:32:28,880 --> 00:32:31,840 Speaker 2: on, when you see, oh my god, the way this 538 00:32:32,000 --> 00:32:36,160 Speaker 2: is framed is completely wrong. And I give a little 539 00:32:36,160 --> 00:32:39,240 Speaker 2: bit of a kind of behind-the-curtain look 540 00:32:39,320 --> 00:32:44,120 Speaker 2: at what happened with some legacy media publications, about how 541 00:32:44,560 --> 00:32:48,160 Speaker 2: the framing of articles was done very 542 00:32:48,200 --> 00:32:54,880 Speaker 2: purposefully to exclude or to manipulate information that they didn't 543 00:32:54,960 --> 00:32:58,000 Speaker 2: want in the piece, that didn't fit the narrative. 544 00:32:58,320 --> 00:33:00,680 Speaker 2: And you can see this happening again now. And as 545 00:33:00,680 --> 00:33:03,959 Speaker 2: long as that keeps happening, I think we're going 546 00:33:04,000 --> 00:33:06,160 Speaker 2: to be in trouble. I think we're destined for yet 547 00:33:06,160 --> 00:33:08,760 Speaker 2: another problem.
You know, in the next crisis, whether it's 548 00:33:08,800 --> 00:33:12,840 Speaker 2: a medical crisis or something else, until the people who 549 00:33:12,840 --> 00:33:16,600 Speaker 2: communicate to the public, who act as that filter, change. Because look, 550 00:33:16,720 --> 00:33:19,360 Speaker 2: most people have regular jobs. They don't have time, right? 551 00:33:20,160 --> 00:33:22,800 Speaker 2: I don't expect regular people that are going to 552 00:33:22,800 --> 00:33:24,560 Speaker 2: work each day, doing whatever it is they do. They 553 00:33:24,600 --> 00:33:27,560 Speaker 2: don't have time to start reading studies. They're relying on 554 00:33:27,760 --> 00:33:31,560 Speaker 2: other people to present this information to them, and it 555 00:33:31,600 --> 00:33:34,440 Speaker 2: needs to be presented in a manner that is far 556 00:33:34,520 --> 00:33:38,720 Speaker 2: more objective than it has been. And it doesn't mean 557 00:33:38,760 --> 00:33:42,000 Speaker 2: that there are mistakes or errors in an article. I 558 00:33:42,000 --> 00:33:45,360 Speaker 2: think that's often kind of the way it's described: that, oh, 559 00:33:45,400 --> 00:33:49,719 Speaker 2: well, this is fact-checked. Well, everything could be true, 560 00:33:50,200 --> 00:33:52,080 Speaker 2: and at the same time the piece can still be 561 00:33:52,120 --> 00:33:54,240 Speaker 2: incredibly misleading, right? 562 00:33:54,320 --> 00:33:56,640 Speaker 1: Yeah, we saw a lot of that. I mean, even 563 00:33:56,880 --> 00:33:59,240 Speaker 1: pieces that were technically true, and a lot of them 564 00:33:59,240 --> 00:34:01,800 Speaker 1: weren't even that. A lot of them didn't have that, as 565 00:34:02,040 --> 00:34:04,480 Speaker 1: there were a lot of errors, a lot of errors. 566 00:34:04,560 --> 00:34:08,360 Speaker 1: So what advice would you give your sixteen-year-old self, 567 00:34:08,560 --> 00:34:10,680 Speaker 1: knowing kind of what you know now?
568 00:34:12,280 --> 00:34:17,640 Speaker 2: I mean, outside of, you know, this narrow discussion, I 569 00:34:17,640 --> 00:34:20,560 Speaker 2: would say it's the same advice I try to give 570 00:34:20,560 --> 00:34:24,400 Speaker 2: myself now, and I wish I had given to myself back then, 571 00:34:25,239 --> 00:34:28,440 Speaker 2: which is to be more courageous. 572 00:34:28,239 --> 00:34:29,320 Speaker 1: You're pretty courageous. 573 00:34:30,440 --> 00:34:31,839 Speaker 2: Well, thank you for saying so. 574 00:34:31,920 --> 00:34:36,000 Speaker 4: But it still feels like it's not enough, to try 575 00:34:36,040 --> 00:34:41,960 Speaker 4: to remove any semblance of concern about failure or about 576 00:34:42,040 --> 00:34:45,560 Speaker 4: what people will think about you, and to really just 577 00:34:45,760 --> 00:34:50,120 Speaker 4: have the courage to go after what you feel is 578 00:34:50,160 --> 00:34:51,480 Speaker 4: the right thing to go after. 579 00:34:53,080 --> 00:34:54,120 Speaker 2: That's really hard to do. 580 00:34:54,800 --> 00:34:59,680 Speaker 1: I like that. Okay, I've loved this conversation. Your 581 00:34:59,719 --> 00:35:03,080 Speaker 1: book is fantastic, An Abundance of Caution. Everyone should go buy it. 582 00:35:03,760 --> 00:35:07,120 Speaker 1: End here with your best tip for my listeners on 583 00:35:07,160 --> 00:35:09,080 Speaker 1: how they can improve their lives. 584 00:35:09,840 --> 00:35:12,239 Speaker 2: Well, you know, there are so many things, so 585 00:35:12,239 --> 00:35:14,200 Speaker 2: many directions to go with this, but the area that 586 00:35:14,239 --> 00:35:16,040 Speaker 2: I think I have some knowledge on is what we've 587 00:35:16,040 --> 00:35:18,560 Speaker 2: been talking about.
And I would say, when you are 588 00:35:18,760 --> 00:35:22,439 Speaker 2: reading an article or watching something on TV or YouTube, whatever, 589 00:35:22,480 --> 00:35:26,759 Speaker 2: when you are presented with information, always ask yourself, is 590 00:35:26,800 --> 00:35:30,080 Speaker 2: this an argument from authority? That means, is this 591 00:35:30,239 --> 00:35:32,440 Speaker 2: just being told to you by someone who has a 592 00:35:32,480 --> 00:35:36,239 Speaker 2: credential, or are they actually providing evidence for what they 593 00:35:36,280 --> 00:35:39,680 Speaker 2: are saying? And if you can follow that, you're going 594 00:35:39,760 --> 00:35:42,960 Speaker 2: to change how you view the world, and you will 595 00:35:42,960 --> 00:35:46,160 Speaker 2: start to see, including from media that you like and 596 00:35:46,200 --> 00:35:49,480 Speaker 2: that you favor, you will start to see how so 597 00:35:49,640 --> 00:35:52,919 Speaker 2: much of what we are told is simply arguments from 598 00:35:52,960 --> 00:35:58,319 Speaker 2: authority rather than the actual true information that you need 599 00:35:58,360 --> 00:36:01,360 Speaker 2: to be told. And that degree of awareness and 600 00:36:01,440 --> 00:36:06,560 Speaker 2: skepticism will make you a more informed person in life. 601 00:36:06,719 --> 00:36:10,160 Speaker 1: He is David Zweig. Check out his work everywhere. Buy 602 00:36:10,239 --> 00:36:12,960 Speaker 1: An Abundance of Caution. Thank you so much for coming on, David. 603 00:36:13,520 --> 00:36:14,719 Speaker 2: Thanks, Carol, this was great. 604 00:36:14,840 --> 00:36:17,479 Speaker 1: Thanks so much for joining us on the Carol Markowitz Show. 605 00:36:17,560 --> 00:36:19,640 Speaker 1: Subscribe wherever you get your podcasts.