Speaker 1: The mainstream media today is out of touch and out of control. We see dishonest reporting meant to push self-serving narratives, cozy relationships between the press and the powerful elites of society, and no accountability when they get it wrong or get caught. This has been going on for a while, but it became apparent when Trump ran for president in twenty sixteen, accelerated and became undeniable during the presidential election and COVID pandemic in twenty twenty, and is now moving at breakneck speed. We've seen collusion and censorship. We've seen American citizens doxxed, intimidated, and silenced. We've seen the Biden administration, big tech, and the mainstream media gaslight the American people, and many Americans are left wondering what to believe anymore. Well, my guest today has spent his entire life in the media, from starting off as a self-described journalism nerd in high school to working at CNN, Fox, NBC, and The Blaze. Steve Krakauer is his name. He's a journalist and media critic, and he's the author of the brand new book Uncovered: How the Media Got Cozy with Power, Abandoned Its Principles, and Lost the People.
Speaker 1: Steve is unbelievably brilliant with this stuff, and he's probably forgotten more about the media industry in this country than I'll ever learn in my lifetime. Steve and I discussed the state of journalism in the media today and how we got here, the dangers and fallout of an unaccountable media establishment, and, most importantly, how we can fix it. So this was a fascinating, fascinating interview, and I hope you enjoy it as much as I did. Without further ado, ladies and gentlemen, my discussion with Steve Krakauer.

Speaker 1: Steve, thank you so much for coming on Battleground and talking with us about your latest book, Uncovered: How the Media Got Cozy with Power, Abandoned Its Principles, and Lost the People. And just for everybody who's watching or listening: I read this book cover to cover.
Speaker 1: I also listened to the audiobook, which, if you're an audiobook person, which a lot of people are in this day and age, is pretty damn awesome, because not only do you get the benefit of actually hearing you read the book itself in your voice, you also hear the interviews with the journalists that you talk to, like Tucker Carlson or Kayleigh McEnany or Ben Smith, Piers Morgan, or one of my all-time favorites, Salena Zito, and many, many more. This is a rigorous book. And what I always used to tell people in the Army, Steve, as a commander, is: don't just come to me bitching, griping, and moaning about something. Come to me with the issue, a discussion, and a recommendation. And what I loved about this book is that not only do you lay out five very clear things that the media is not doing so well, you also give a recommendation on how to fix it.

Speaker 3: I try to. Yeah, that's the goal, because, you know, as I lay out in the book, I think that the media is very broken, the corporate media.
Speaker 3: It was not perfect in the days when I was at CNN, in twenty ten to twenty thirteen, by any means; there were valid criticisms. But it's gotten significantly worse. And I think it's obvious to most people, you know, even people that are apolitical. I live in Texas. Most of the people I interact with on a daily basis are not interested in politics as a hobby, or even in the media as a hobby. They watch it because they want to know what's going on in the world, and they can see it. They can't necessarily see why it happened or how it's happening, but they can feel something fundamentally changed, and they've lost the trust in these places that they maybe had, even ten years ago, to some degree. And so I do try to lay that out. And first of all, I have to say thank you on the audiobook side, because I do.
Speaker 3: It was something that was really important to me that every conversation I had in the book, the twenty-six interviews that I did with people across the spectrum, as you mentioned, a lot of people from Fox News, people from NBC, people from the New York Times and beyond, everything is on the record, first of all, so there are no anonymous sources in there. And not only that, but I recorded all of these conversations, and the audience can hear these people, in their own voices, give all the quotes that they give, in the audiobook, if you choose that route.

Speaker 1: I mean, you've clearly forgotten more about the media than I could ever hope to remember, Steve. But you've done it all. I mean, you've worked for CNN, and now you're an executive producer for Megyn Kelly, and you've done all sorts of stuff in between. It seems like you know everybody.
Speaker 1: And as I mentioned, there are five clear things in this book that you talk about that the media just doesn't do very well: geographic bias, lack of introspection, coziness with power, broken financial incentives and influencers, and anti-speech activism. And we could probably do twenty-four hours on all of this stuff, and I'm so interested in it. But before we get into any of that, how did you get into this? Where are you from? Because one of the things that blew me away about you is learning about all the information that you know about this stuff, Steve. You clearly love and have an affinity for the media. You don't want to burn it all down, right? You clearly see the vital role that the media plays in this country. But what about this profession attracted you in the first place?

Speaker 3: Yeah, I was a journalism nerd from back in high school, to be honest.

Speaker 2: What's a journalism nerd?
Speaker 3: I went to a camp for journalism after my junior year of high school.

Speaker 2: You were one of those guys.

Speaker 3: I was, I was. I mean, I really ate it up. And my first job, when I was in high school, was at my local town paper. It was this little town in New Jersey. We had a weekly paper that came out, and in the summer I covered men's softball leagues; I'd go, and I'd get paid twenty dollars if they ran my article. And yeah, I knew early on I was very interested in journalism. I went to Syracuse University, the Newhouse School, the journalism school there, and really from that moment, you know, fifteen, sixteen years old, I was very interested in it. And at the same time, as I mentioned, I grew up in New Jersey, went to school at Syracuse in New York, and lived in New York City for many years, working at a variety of outlets covering the media. That's where I really got to know a lot of the people in the media.
Speaker 3: I was working at sites like Mediaite, which I helped launch in two thousand and nine, and then working at CNN and then The Blaze. And at The Blaze I ended up moving down to Dallas, where we had a big office, Glenn Beck's company there. They were based in New York and based in Dallas. I moved with my wife in twenty fourteen down to Dallas, so I've been here for nine years now. And frankly, you ask where the genesis of this is: I love journalism, but getting outside of that literal bubble, the New York and DC bubble, opened my eyes in a lot of ways. And after The Blaze, I really stepped outside of the media overall.
Speaker 3: I had my own company, I worked in a marketing agency, and I was viewing this, to me, very strange moment in twenty fifteen, and even going back a little bit earlier, and then twenty sixteen and what happened with Trump, but far beyond Trump too, and trying to diagnose what was happening, because it was so foreign to what I was viewing, to what the people that I encounter were experiencing. And I think about it a lot, Sean: if I was still at CNN, if I still lived in New York, would I have been able to see, frankly, what I was able to see and put into Uncovered? I don't know. I think I may have gotten caught up in a lot of this also. So it was being physically outside of that that, I think, opened my eyes a bit and let me try to diagnose this from a position of love, in some ways tough love, trying to make it better, because I think we do need a strong media; we just don't have it right now.
Speaker 1: It's so clear that you love and respect your profession, and it makes a lot of sense that you were one of those kids that was going to journalism summer camps. So much makes sense now about this book, because it does seem like you really do love the profession and you want to see it improve. And so you were, like, a Blaze guy. You were one of the Blaze farm team, so to speak, because in twenty fourteen, one of my first forays into the media, one of my first experiences with television media, was The Blaze, and it was Real News, and Sexton was there, and Will Cain was there, and Pete Hegseth was there, and so many people. So many people were there, and you were there as well. We must have just missed each other.

Speaker 3: Yeah. So I was there in the sense that I was a producer. My job title at The Blaze was Vice President of Digital Content. I was fully behind the scenes. I worked really closely with people like Buck and Will Cain in particular.
Speaker 3: Basically, my job was, you know, there was loosely television: Real News aired at six p.m. every night on The Blaze. It was a streaming network, and very early in the process. Frankly, I think they were early in what was to come with, like, the Netflixes of the world, the whole business model. But then it was like, well, what can we do to expand our reach through YouTube, through social media? Working with people like Will and Buck, and Tara Setmayer was there, and S.E. Cupp, people that have gone in other directions, let's say, and expanded their personal brands. What else do you want to do? What can we put on YouTube? What can we put on other platforms? So yeah, it was interesting. And, you know, obviously I was at CNN, I worked for Piers Morgan, I worked very closely with Jeff Zucker, and then I was at The Blaze and worked very closely with Glenn Beck. I'm not an overly political person one way or the other. I think it's probably clear I've got personal opinions, but I mean, I'm a registered independent. I don't, you know, see this as an ideological thing.
Speaker 3: I think that it's beyond that, and I try to also look for the best in people. That's another thing: I think there's so much judgment, especially in the corporate media, and I think that generally the average person in America is a good person, and they are not being reflected and served well by a media that has such disdain for them, frankly, on the right or the left.

Speaker 1: I completely agree with you. And so much of this book, the underlying thesis of it all, at least the way it broke down in my mind, is that you talk about the media over the last twenty years or so, from twenty ten up to basically twenty sixteen. And in twenty sixteen somebody came along named Donald Trump that sort of shifted everything, in many ways you talk about in the book. Yeah, the media was probably always Democratic, maybe always leaned left a little bit.
Speaker 1: But when Trump came along, man, it seemed like the media abandoned so many of the principles of what makes the media important to the people, basically speaking truth to power and being a guard dog of our republic and our democracy, for just, like, a rabid focus on Donald Trump, in part because maybe he was good for ratings and that was good for some of these networks and their bottom line. But talk about that shift. Donald Trump was just, like, a paradigm shifter, I guess.

Speaker 3: He was. I think it was in a void that was perhaps already created through social media, through the incentive structure. There was fear of saying the wrong thing, of covering something in a way that, you know, might get you backlash on Twitter. So that already existed. And then, as you mentioned, yeah, now comes Donald Trump down the escalator, and it just completely upsets everything. And I lay this out in Uncovered in chapter five, which I call the Trump Addiction, because I do think it's an addiction.
Speaker 3: I think that there's something that they almost can't control that they're feeling for him, you know, hatred in one way, love in another way. And in trying to diagnose what it is that happened here, it's kind of three things. There absolutely was business on one level of it. Yes, he was great for ratings, he was great for clicks, he was great for subscribers. That's absolutely a part of it. But then, in addition, you know, it's also personal. And I talk about this with so many different people: it's business and personal. Because Donald Trump, you have to understand, was in that media environment. This is a guy who in nineteen eighty-seven writes The Art of the Deal, which is a fascinating book for anyone who wants to understand, here we go again, by the way. I mean, this is all going to be repeated, as we're finding out, so it's never too late to read The Art of the Deal and understand what is happening here. Because he controlled the media to such an extent, he's part of that world.
Speaker 3: This is a guy who was married in two thousand and five, and Jeff Zucker was there, and Katie Couric was there, and Gayle King was there, and Chris Matthews. This is, I mean, you know, The Apprentice was an enormous hit show. He hosted Saturday Night Live in twenty fifteen. So he's part of that world, and then he becomes a turncoat to that world; he exposes that world in a lot of ways. Again, a reminder: Hillary Clinton and Bill Clinton were at that wedding in two thousand and five, and so were all of them as well. So he was very much connected in that way. And then I also think, because you think, well, all that still is not a reason to completely abandon these principles. But what I heard from people who were in these newsrooms during the Trump years was that some people really did believe they were in this existential fight to save democracy, that Donald Trump was this huge threat. And I think it's completely ridiculous. I'm sure a lot of people think it's ridiculous.
Speaker 3: But what I would say to that is, if you really believe this, I mean, let's just grant that that's the case. That's when you need to double down on your principles even more. You know, that's when you need to really be as strong in your, you know, instincts and in your journalistic ability and your integrity, so that you can convince the widest number of people that he's this great threat. But instead they went the other direction. The guardrails were completely off. You know, they said, oh, we have to meet this moment, and so we must now give monologues in our news programs. No, that's going the opposite way, and it just destroyed the trust of the average viewer and reader.

Speaker 1: What do you think created this? I mean, people call it Trump derangement syndrome, and I'm sure that you've heard that. But what I found fascinating is just what you talked about.
Speaker 1: All these people, from George Stephanopoulos to Hillary Clinton and Bill Clinton and Katie Couric, all these people were at his wedding, and prior to Donald Trump running for office, celebrities were lining up around the block to go to his parties and be a part of the world that he created. And another thing that I found fascinating about your book is his relationship with Zucker, right, where Zucker was working at NBC, and how Donald Trump was being propped up prior to running for president, or at least the second time. I think he ran for president one time before that, or whatever, or at least teased it. They were propping him up to be the face of NBC. And so, wait, like, Jeff Zucker and Donald Trump almost created each other, and then they find themselves on opposite ends of the spectrum, when Zucker is at CNN going to war with Donald Trump and Donald Trump is at war with CNN. It was just like, how the hell does that happen?
Speaker 3: Yeah, well, and it's important to remember also the primary process, because they really were not at odds right away in twenty fifteen.

Speaker 2: So true, that's right.

Speaker 3: Yeah, you know, Jeff Zucker got a lot of criticism from the left for the empty podium and just the nonstop coverage of the speeches. And again, I think it's potentially happening again. Jeff Zucker is not there, but CNN and other outlets are certainly devoting all of their oxygen, all of their attention, to Donald Trump right now. But it didn't start that way. And in fact, there was a story that I cite, I believe it was Tucker Carlson that broke the news, that Jeff Zucker was having conversations with Michael Cohen, who at the time was still on the Trump side, about how, when Donald Trump, you know, loses to Hillary Clinton, maybe we'll give him a show on CNN. I mean, that was the mentality. And the belief, which turned out to be wrong, was that he would be easy to beat.
Prop 333 00:16:27,440 --> 00:16:30,480 Speaker 3: him up, make him this great star of the media, 334 00:16:30,560 --> 00:16:33,720 Speaker 3: get all the attention, make him the GOP nominee, and 335 00:16:33,760 --> 00:16:36,440 Speaker 3: then have him nicely lose to Hillary Clinton and go 336 00:16:36,480 --> 00:16:38,080 Speaker 3: off into the sunset, and we can move on with 337 00:16:38,120 --> 00:16:41,280 Speaker 3: our lives. Well, November twenty sixteen came, and there's people, 338 00:16:41,320 --> 00:16:44,680 Speaker 3: as I've reported, crying in the newsroom about this. 339 00:16:44,720 --> 00:16:47,000 Speaker 3: I mean, it was not just that they were sad 340 00:16:47,000 --> 00:16:49,160 Speaker 3: that Donald Trump won. It was that they believed, 341 00:16:49,200 --> 00:16:52,080 Speaker 3: there was a zero percent chance in these people's minds, 342 00:16:52,120 --> 00:16:54,840 Speaker 3: that it was even possible. And it shows how completely 343 00:16:54,920 --> 00:16:57,160 Speaker 3: out of touch they were with the entire mood of 344 00:16:57,160 --> 00:16:59,640 Speaker 3: the country, that they didn't even conceive that it was possible. 345 00:17:00,240 --> 00:17:02,920 Speaker 1: I mean, doesn't that support, like, the 346 00:17:03,000 --> 00:17:06,160 Speaker 1: whole point of geographic bias? Which, by the way, I'm 347 00:17:06,200 --> 00:17:09,280 Speaker 1: from western Pennsylvania, born and raised my entire life. 348 00:17:09,440 --> 00:17:11,520 Speaker 1: I fell into that as well a little bit. I mean, 349 00:17:11,840 --> 00:17:14,960 Speaker 1: in twenty sixteen, I backed Marco Rubio. I liked the 350 00:17:15,040 --> 00:17:17,439 Speaker 1: idea of a younger conservative, the new face of the 351 00:17:17,440 --> 00:17:20,840 Speaker 1: Republican Party, running for president. And if you remember, in 352 00:17:20,880 --> 00:17:23,280 Speaker 1: the South Carolina primary, I was down there with Marco Rubio.
353 00:17:23,560 --> 00:17:26,879 Speaker 1: The dude must have knocked thousands of doors, had an 354 00:17:26,960 --> 00:17:30,480 Speaker 1: unrivaled ground game, and we're pulling out, like, leaving. 355 00:17:30,560 --> 00:17:33,040 Speaker 1: And if you remember back then, like, Marco Rubio had 356 00:17:33,080 --> 00:17:35,480 Speaker 1: to win South Carolina, right, had to, to stay in 357 00:17:35,520 --> 00:17:38,600 Speaker 1: the race or to even have any shot at the nomination. 358 00:17:38,720 --> 00:17:41,639 Speaker 1: And I mean, he pulled out all the stops there, Steve, 359 00:17:41,880 --> 00:17:44,639 Speaker 1: and we're pulling out, you know, as campaign buses do. 360 00:17:44,800 --> 00:17:47,120 Speaker 1: And there are all these staffers and volunteers everywhere, 361 00:17:47,119 --> 00:17:49,480 Speaker 1: like, running into McDonald's or going into gas stations and 362 00:17:49,520 --> 00:17:51,720 Speaker 1: getting drinks or whatever. And I, like, see this, like, 363 00:17:51,720 --> 00:17:53,960 Speaker 1: little Donald Trump headquarters, and I, like, walk in there, 364 00:17:54,000 --> 00:17:56,639 Speaker 1: and I'm like, what's all this about? You know, because 365 00:17:57,040 --> 00:17:59,640 Speaker 1: it didn't matter what Marco Rubio said or did, 366 00:17:59,720 --> 00:18:02,240 Speaker 1: or any Republican that was running against him. It's like 367 00:18:02,280 --> 00:18:04,639 Speaker 1: the Teflon Don thing, right, nothing 368 00:18:04,400 --> 00:18:07,080 Speaker 2: stuck to him. And I walk in there, there's like one 369 00:18:06,960 --> 00:18:10,439 Speaker 1: old dude there with, like, forty iPads, and I'm like, 370 00:18:10,560 --> 00:18:15,199 Speaker 1: what the hell is happening. Clearly I'm missing something. The 371 00:18:15,280 --> 00:18:17,240 Speaker 1: media is missing something.
And then I went back to 372 00:18:17,240 --> 00:18:19,600 Speaker 1: Pennsylvania and drove all around the state and, you know, 373 00:18:19,680 --> 00:18:22,040 Speaker 1: saw signs, and I wrote an article in The Hill about how 374 00:18:22,640 --> 00:18:25,400 Speaker 1: Donald Trump's gonna win, right. And of course I got 375 00:18:25,440 --> 00:18:28,119 Speaker 1: attacked by all of the people that you call the Acela media, 376 00:18:28,160 --> 00:18:30,000 Speaker 1: which, by the way, people in New York City 377 00:18:30,280 --> 00:18:32,439 Speaker 1: take the train to DC and back and forth, and 378 00:18:32,440 --> 00:18:35,480 Speaker 1: so the Acela media is named for the train that they take, the Acela, right. 379 00:18:35,600 --> 00:18:36,240 Speaker 1: And I did it. 380 00:18:36,280 --> 00:18:37,800 Speaker 3: I did it myself many, many times. 381 00:18:37,920 --> 00:18:40,080 Speaker 2: Yeah, so it's a very appropriate name. 382 00:18:40,119 --> 00:18:42,360 Speaker 1: And of course I'm like, oh, there are signs everywhere, which, 383 00:18:42,400 --> 00:18:45,919 Speaker 1: of course, all of that's anecdotal, right, but I'm like, 384 00:18:46,040 --> 00:18:48,320 Speaker 1: there's just something in the air that people were missing. 385 00:18:48,400 --> 00:18:50,120 Speaker 2: I'm like, Trump was gonna win, and I got attacked. 386 00:18:50,160 --> 00:18:53,760 Speaker 1: They're like, that's just bullshit, that's anecdotal crap, like, you 387 00:18:53,760 --> 00:18:54,760 Speaker 1: don't know what you're talking about. 388 00:18:54,800 --> 00:18:57,920 Speaker 2: I'm like, okay, and sure, sure enough, in twenty sixteen 389 00:18:57,920 --> 00:18:58,360 Speaker 2: he won. 390 00:18:58,520 --> 00:19:02,240 Speaker 3: So, I mean, and you know, it's fascinating. I also, 391 00:19:02,640 --> 00:19:05,359 Speaker 3: you know, sure, it's a sample size of 392 00:19:05,760 --> 00:19:08,320 Speaker 3: one or a couple.
But I had those same 393 00:19:08,359 --> 00:19:11,040 Speaker 3: sort of moments that were eye opening to me. 394 00:19:11,160 --> 00:19:13,439 Speaker 3: I had an HVAC repair guy come to my 395 00:19:13,480 --> 00:19:15,840 Speaker 3: house in twenty fifteen. This was before the primary process 396 00:19:15,880 --> 00:19:18,280 Speaker 3: even started. But Donald Trump's running and there's other people running, 397 00:19:18,280 --> 00:19:20,280 Speaker 3: and it was sort of a joke. And he says 398 00:19:20,320 --> 00:19:21,600 Speaker 3: to me, he's like, hey, what do you do? I 399 00:19:21,640 --> 00:19:24,119 Speaker 3: told him I work in media. And he said, what 400 00:19:24,160 --> 00:19:25,600 Speaker 3: do you think about Donald Trump? I was like, you know, 401 00:19:25,640 --> 00:19:27,159 Speaker 3: I don't know, what do you think? He said, Well, 402 00:19:27,200 --> 00:19:28,800 Speaker 3: I'm gonna vote for him. He said, I'm gonna 403 00:19:28,840 --> 00:19:30,760 Speaker 3: vote for him because with one vote, I 404 00:19:30,760 --> 00:19:34,480 Speaker 3: can piss off both parties. That's exactly right. Yeah, that's 405 00:19:34,520 --> 00:19:36,840 Speaker 3: exactly right. And so that was one, and then the 406 00:19:36,880 --> 00:19:39,520 Speaker 3: other one, I went to this dinner party. You know, 407 00:19:39,600 --> 00:19:42,280 Speaker 3: Dallas is a pretty fifty-fifty city, you know, 408 00:19:42,359 --> 00:19:45,359 Speaker 3: red and blue. I went to a dinner party, 409 00:19:45,560 --> 00:19:47,560 Speaker 3: I have a lot of gay friends here, and 410 00:19:47,600 --> 00:19:49,959 Speaker 3: I was sitting next to a gay married couple, and they 411 00:19:50,000 --> 00:19:52,800 Speaker 3: were huge Trump supporters.
These are guys, you know, and 412 00:19:52,800 --> 00:19:54,919 Speaker 3: I was just talking with them about why they 413 00:19:54,960 --> 00:19:56,960 Speaker 3: were excited to 414 00:19:57,000 --> 00:19:59,480 Speaker 3: vote for him, and it was eye opening to 415 00:19:59,520 --> 00:20:01,399 Speaker 3: me because these people are, you know, not 416 00:20:01,480 --> 00:20:04,640 Speaker 3: reflected in your average cable news pundit battle that's going 417 00:20:04,680 --> 00:20:08,560 Speaker 3: on in twenty sixteen. It was completely unreflective of it. 418 00:20:08,840 --> 00:20:11,600 Speaker 3: And you mentioned Salena Zito earlier and the way that 419 00:20:11,640 --> 00:20:14,080 Speaker 3: she really got the mood of the country, and she's 420 00:20:14,119 --> 00:20:15,960 Speaker 3: a great bellwether also for it, because, you know, 421 00:20:16,000 --> 00:20:18,119 Speaker 3: I talked to Salena for the book about how she 422 00:20:18,240 --> 00:20:21,399 Speaker 3: was initially hired by CNN because there was this brief 423 00:20:21,440 --> 00:20:24,119 Speaker 3: moment of introspection after the election that said, wow, we 424 00:20:24,200 --> 00:20:26,600 Speaker 3: really missed something, like, let's get Salena in here, hire 425 00:20:26,640 --> 00:20:29,600 Speaker 3: her as a contributor for four years. And initially, 426 00:20:29,680 --> 00:20:31,439 Speaker 3: you know, she told me, and she's never told the 427 00:20:31,480 --> 00:20:33,679 Speaker 3: story before, that she was put in front of the 428 00:20:33,800 --> 00:20:36,639 Speaker 3: entire newsroom and was interviewed about basically what they missed, 429 00:20:37,000 --> 00:20:39,679 Speaker 3: and she was getting booked on shows. 430 00:20:39,960 --> 00:20:43,360 Speaker 3: But she described how it went from being asked, oh, 431 00:20:43,480 --> 00:20:45,560 Speaker 3: what did the Trump supporters think about this.
That was 432 00:20:45,600 --> 00:20:48,159 Speaker 3: what she was being asked about, to, why do they 433 00:20:48,240 --> 00:20:51,320 Speaker 3: believe that? Why would they believe these lies? And then, 434 00:20:51,440 --> 00:20:53,600 Speaker 3: as she said, I don't know, I'm a reporter. After 435 00:20:53,640 --> 00:20:56,479 Speaker 3: a couple of those, within months, in twenty seventeen, she 436 00:20:56,520 --> 00:20:59,520 Speaker 3: was off the air completely, just sidelined for the entire duration 437 00:20:59,600 --> 00:21:03,800 Speaker 3: of her contract, because that initial burst of introspection immediately went away, 438 00:21:03,840 --> 00:21:07,200 Speaker 3: and instead of just being against Trump, the media turned 439 00:21:07,200 --> 00:21:09,320 Speaker 3: against his supporters and the people that put him there 440 00:21:09,320 --> 00:21:10,000 Speaker 3: in the first place. 441 00:21:10,440 --> 00:21:13,520 Speaker 1: I mean, there's no question about that, right. And this 442 00:21:13,720 --> 00:21:16,160 Speaker 1: is why the whole concept of fake news 443 00:21:16,200 --> 00:21:19,000 Speaker 1: caught on with damn near fifty percent of the country, 444 00:21:19,359 --> 00:21:23,399 Speaker 1: because they felt that. And this sort of parlays into, 445 00:21:23,560 --> 00:21:25,640 Speaker 1: like, just connects perfectly to, what I wanted to talk 446 00:21:25,680 --> 00:21:28,280 Speaker 1: to you about next, the whole idea of coziness 447 00:21:28,280 --> 00:21:32,400 Speaker 1: with power, right. And I just found this so compelling, 448 00:21:32,680 --> 00:21:36,359 Speaker 1: right, because the media, in many ways, 449 00:21:36,440 --> 00:21:40,080 Speaker 1: they almost believe, well, not almost, I 450 00:21:40,119 --> 00:21:41,959 Speaker 1: think in many ways they do believe, that they are 451 00:21:42,040 --> 00:21:45,600 Speaker 1: part of the elite in this country.
They're powerful themselves, 452 00:21:45,600 --> 00:21:47,879 Speaker 1: which of course they are. They're on TV, they have 453 00:21:48,000 --> 00:21:50,800 Speaker 1: massive Twitter followings, they're talking to millions of people every night, 454 00:21:51,280 --> 00:21:54,760 Speaker 1: and they're rubbing elbows with US senators and presidents and 455 00:21:54,880 --> 00:21:57,040 Speaker 1: members of Congress. So, like, of course they have an 456 00:21:57,119 --> 00:22:01,200 Speaker 1: unbelievable amount of power. And obviously, 457 00:22:01,720 --> 00:22:03,600 Speaker 1: like, Donald Trump was one of those people, 458 00:22:03,840 --> 00:22:06,639 Speaker 1: half of these people were at his wedding. And the 459 00:22:06,680 --> 00:22:10,000 Speaker 1: idea that, like, how do you speak truth to 460 00:22:10,119 --> 00:22:13,800 Speaker 1: power when you're in that same group and many 461 00:22:13,840 --> 00:22:15,800 Speaker 1: of the people that you'd be speaking truth to, you're 462 00:22:15,800 --> 00:22:21,520 Speaker 1: friends with? You know? Yeah, it's just unbelievably concerning 463 00:22:21,560 --> 00:22:21,800 Speaker 1: to me. 464 00:22:22,320 --> 00:22:25,920 Speaker 3: I know, I know, and it's not purely about Trump. 465 00:22:25,960 --> 00:22:27,639 Speaker 3: I'll give you just two examples that I give in 466 00:22:27,680 --> 00:22:29,760 Speaker 3: the book when it comes to coziness with power. First 467 00:22:29,760 --> 00:22:33,120 Speaker 3: of all, let's think about twenty twenty and Bernie Sanders. Okay, 468 00:22:33,160 --> 00:22:35,240 Speaker 3: let's go to the other side of the aisle and 469 00:22:35,320 --> 00:22:37,719 Speaker 3: the way that Bernie Sanders was treated.
We know, thanks to what 470 00:22:37,760 --> 00:22:40,720 Speaker 3: we learned in the emails, the DNC essentially worked with 471 00:22:40,800 --> 00:22:44,240 Speaker 3: the Hillary Clinton campaign to undermine Bernie Sanders in twenty sixteen. 472 00:22:44,280 --> 00:22:44,800 Speaker 3: We know that. 473 00:22:44,960 --> 00:22:47,639 Speaker 1: I mean, of course, they basically rigged the process for Hillary. 474 00:22:47,760 --> 00:22:50,080 Speaker 1: I mean absolutely, I hate the word rigged, but it's 475 00:22:50,080 --> 00:22:51,000 Speaker 1: definitely what they did. 476 00:22:51,359 --> 00:22:53,800 Speaker 3: They did it. They worked it in a way that was not 477 00:22:54,160 --> 00:22:57,080 Speaker 3: what the DNC was supposed to do, to undermine Hillary 478 00:22:57,119 --> 00:22:57,800 Speaker 3: Clinton's top competitor. 479 00:22:57,640 --> 00:22:58,840 Speaker 2: No doubt about it. 480 00:22:58,920 --> 00:23:02,640 Speaker 3: In twenty twenty, sure, there was definitely some political wrangling 481 00:23:02,680 --> 00:23:04,640 Speaker 3: that was happening against Bernie Sanders as well. I think 482 00:23:04,640 --> 00:23:08,040 Speaker 3: the Bloomberg candidacy was essentially this kamikaze to take him out. 483 00:23:08,400 --> 00:23:11,400 Speaker 3: And then eventually, as we learned before Super Tuesday, all 484 00:23:11,440 --> 00:23:14,680 Speaker 3: of the top competitors galvanized behind Biden to stop Bernie Sanders. 485 00:23:14,680 --> 00:23:17,200 Speaker 3: So Klobuchar and them, they just dropped out, and then, 486 00:23:17,280 --> 00:23:20,199 Speaker 3: right, so there was still that happening. But it 487 00:23:20,240 --> 00:23:23,040 Speaker 3: was also the media, you know, MSNBC, people like Joy 488 00:23:23,119 --> 00:23:26,840 Speaker 3: Reid and Chris Matthews and others, who, it gets 489 00:23:26,880 --> 00:23:29,280 Speaker 3: lost a lot of the time.
They attacked Bernie Sanders and 490 00:23:29,320 --> 00:23:32,760 Speaker 3: his supporters in a shocking way for a network that 491 00:23:32,880 --> 00:23:34,920 Speaker 3: you would think is supporting whatever 492 00:23:34,920 --> 00:23:37,199 Speaker 3: the lefty candidate of the time is. Because for a 493 00:23:37,240 --> 00:23:39,479 Speaker 3: moment there, he won Nevada and it looked like he 494 00:23:39,520 --> 00:23:41,919 Speaker 3: was cruising to become the nominee, and they could not 495 00:23:42,040 --> 00:23:43,960 Speaker 3: have that. They had to stop that. Also, he was 496 00:23:44,000 --> 00:23:47,280 Speaker 3: this disruption to the establishment in the way Trump was also, 497 00:23:47,359 --> 00:23:50,040 Speaker 3: and so it's not just about Trump. It's 498 00:23:50,040 --> 00:23:52,840 Speaker 3: about any disruption to the power, to the coziness that 499 00:23:52,840 --> 00:23:54,800 Speaker 3: we see there. So that's the first thing. And then, 500 00:23:54,880 --> 00:23:57,320 Speaker 3: out of politics completely, I spent a lot of time 501 00:23:57,359 --> 00:24:00,600 Speaker 3: talking about Jeffrey Epstein in the book, because, talk about it. 502 00:24:00,600 --> 00:24:02,840 Speaker 2: It was fascinating, by the way, it's fascinating stuff. 503 00:24:03,200 --> 00:24:06,040 Speaker 3: Yeah, I mean, you talk about Trump's wedding and the 504 00:24:06,080 --> 00:24:09,560 Speaker 3: guest list there. Well, George Stephanopoulos and Katie Couric and 505 00:24:10,280 --> 00:24:13,760 Speaker 3: a lot of other people met with Jeffrey Epstein 506 00:24:13,880 --> 00:24:17,320 Speaker 3: at a dinner party at his house, not just before 507 00:24:17,359 --> 00:24:20,040 Speaker 3: we knew anything about Jeffrey Epstein, but after he had 508 00:24:20,040 --> 00:24:24,440 Speaker 3: gotten out of jail for pleading guilty to underage prostitution 509 00:24:24,880 --> 00:24:27,040 Speaker 3: and serving time.
I mean, it was sort of in jail. 510 00:24:27,080 --> 00:24:29,359 Speaker 3: He was kind of, like, on house arrest. But that 511 00:24:29,480 --> 00:24:31,320 Speaker 3: was in Florida, and then he moved to New York. 512 00:24:31,600 --> 00:24:33,399 Speaker 3: And then he has a dinner party where all these 513 00:24:33,440 --> 00:24:37,320 Speaker 3: people go, Chelsea Handler, Katie Couric, George Stephanopoulos, and they 514 00:24:37,359 --> 00:24:38,760 Speaker 3: all say it was the one time we met him. 515 00:24:38,760 --> 00:24:41,960 Speaker 3: We didn't know anything about him. Okay, sure, sure, fine, 516 00:24:42,119 --> 00:24:45,240 Speaker 3: but you're in the news business. Katie Couric was at 517 00:24:45,240 --> 00:24:47,600 Speaker 3: CBS, Stephanopoulos was at ABC. You might be a 518 00:24:47,600 --> 00:24:50,199 Speaker 3: little bit interested in this person that you just had 519 00:24:50,200 --> 00:24:52,640 Speaker 3: dinner with, who has a giant picture of Bill Clinton 520 00:24:52,680 --> 00:24:55,040 Speaker 3: in a blue dress on the wall of the house where you just 521 00:24:55,119 --> 00:24:57,439 Speaker 3: had the dinner party. Maybe you might pursue that 522 00:24:57,560 --> 00:25:00,679 Speaker 3: as a journalistic endeavor instead of sitting on this story 523 00:25:00,760 --> 00:25:03,960 Speaker 3: for years and years while these things continued to happen. And 524 00:25:04,119 --> 00:25:07,960 Speaker 3: still, I mean, you know, Jeffrey Epstein then killed himself, 525 00:25:08,040 --> 00:25:10,240 Speaker 3: in theory, in his jail cell, and then the story 526 00:25:10,320 --> 00:25:12,880 Speaker 3: was gone completely. We don't even know the half of it. 527 00:25:13,000 --> 00:25:16,600 Speaker 3: And there's no curiosity among the corporate press. Why? Because 528 00:25:17,160 --> 00:25:20,120 Speaker 3: if we start to uncover stuff, how implicated are they 529 00:25:20,160 --> 00:25:21,240 Speaker 3: in that story as well?
530 00:25:21,680 --> 00:25:23,560 Speaker 1: I mean, it's so true, and not just 531 00:25:23,640 --> 00:25:26,719 Speaker 1: Jeffrey Epstein. I mean, you talk about Harvey Weinstein, right, 532 00:25:26,720 --> 00:25:29,760 Speaker 1: and how a reporter, I can't remember her name, 533 00:25:29,960 --> 00:25:32,439 Speaker 1: had the story about Harvey Weinstein for, what did 534 00:25:32,520 --> 00:25:34,040 Speaker 1: you say, three years prior? 535 00:25:33,840 --> 00:25:37,960 Speaker 3: Sharon, yeah, Sharon Waxman at the New York Times talks 536 00:25:37,960 --> 00:25:40,080 Speaker 3: about how the story got killed. Rich McHugh, who I 537 00:25:40,080 --> 00:25:42,000 Speaker 3: speak to, was at NBC at the time, working with 538 00:25:42,080 --> 00:25:44,879 Speaker 3: Ronan Farrow, and describes in great detail how that story was 539 00:25:45,000 --> 00:25:47,679 Speaker 3: killed by his bosses and how it completely changed the 540 00:25:47,680 --> 00:25:50,080 Speaker 3: way he thinks about the media. Yeah, they 541 00:25:50,119 --> 00:25:51,080 Speaker 3: covered for him then. 542 00:25:51,800 --> 00:25:54,240 Speaker 1: I mean, so. And by the way, you lead 543 00:25:54,359 --> 00:25:57,000 Speaker 1: in the book with the Hunter Biden laptop and 544 00:25:57,040 --> 00:25:58,520 Speaker 1: how, and again, I don't want to put words in 545 00:25:58,520 --> 00:26:00,920 Speaker 1: your mouth, but how you see it as, 546 00:26:01,680 --> 00:26:05,280 Speaker 1: I mean, an egregious example of censorship, bias, call it 547 00:26:05,320 --> 00:26:07,359 Speaker 1: whatever you want, but almost an inflection point for you. 548 00:26:07,400 --> 00:26:09,360 Speaker 2: And you start your book with that story.
549 00:26:10,040 --> 00:26:13,640 Speaker 1: And I mean, as I read that, actually, as 550 00:26:13,640 --> 00:26:16,840 Speaker 1: I finished your book, I thought about that Hunter Biden 551 00:26:16,880 --> 00:26:19,399 Speaker 1: laptop and wondered how much of that was, you know, 552 00:26:20,280 --> 00:26:23,280 Speaker 1: media wanting to be cozy with power, like, hey, thinking 553 00:26:23,320 --> 00:26:25,439 Speaker 1: that maybe Joe Biden would win, and hey, if I 554 00:26:25,640 --> 00:26:30,239 Speaker 1: write about his son and Hunter Biden's laptop and some 555 00:26:30,280 --> 00:26:33,600 Speaker 1: of the horrific things that he did, you know, that are 556 00:26:33,640 --> 00:26:35,240 Speaker 1: right there on that laptop. And by the way, 557 00:26:35,280 --> 00:26:36,840 Speaker 1: you look at some of that stuff on that laptop, 558 00:26:36,840 --> 00:26:39,080 Speaker 1: if you have any common sense or half a brain whatsoever, 559 00:26:39,119 --> 00:26:42,680 Speaker 1: you know that it's real, you know, right? But how 560 00:26:42,720 --> 00:26:46,280 Speaker 1: many people, like, didn't report on that laptop because they thought, 561 00:26:46,320 --> 00:26:48,760 Speaker 1: like, maybe Joe Biden wouldn't allow them into the White 562 00:26:48,760 --> 00:26:50,920 Speaker 1: House press room or allow them to ask a question, 563 00:26:51,040 --> 00:26:53,760 Speaker 1: or they wouldn't be invited to that swanky dinner or fundraiser 564 00:26:53,880 --> 00:26:58,119 Speaker 1: or whatever. And ultimately, I mean, if you look at, 565 00:26:58,240 --> 00:27:00,320 Speaker 1: I mean, there are some polls out there, Steve, that 566 00:27:00,320 --> 00:27:02,359 Speaker 1: said if the Hunter Biden laptop story had been 567 00:27:02,400 --> 00:27:06,040 Speaker 1: completely reported on, like, it might have swayed the election 568 00:27:06,200 --> 00:27:08,000 Speaker 1: by upwards of five percentage points.
569 00:27:08,320 --> 00:27:09,800 Speaker 3: Oh yeah, and it didn't even need that much, right? 570 00:27:09,800 --> 00:27:11,720 Speaker 3: I mean, it was really a couple hundred thousand votes 571 00:27:11,760 --> 00:27:14,000 Speaker 3: in a few states and Trump would have won. 572 00:27:14,000 --> 00:27:17,199 Speaker 3: So yeah, 573 00:27:17,359 --> 00:27:19,200 Speaker 3: we could spend hours talking just about the Hunter Biden 574 00:27:19,280 --> 00:27:21,520 Speaker 3: laptop story. I do think that it was one of 575 00:27:21,520 --> 00:27:26,000 Speaker 3: the most important instances of understanding how far 576 00:27:26,080 --> 00:27:27,760 Speaker 3: our media has fallen, by the way that 577 00:27:27,800 --> 00:27:31,800 Speaker 3: story was handled. Talk about verifying that laptop. Tucker Carlson tells 578 00:27:31,840 --> 00:27:33,120 Speaker 3: me in the book that he knew it was real 579 00:27:33,200 --> 00:27:36,239 Speaker 3: instantly, because he was sent an email between himself and 580 00:27:36,320 --> 00:27:39,200 Speaker 3: Hunter Biden that he knows he wrote. And so he said, oh, okay, 581 00:27:39,240 --> 00:27:41,040 Speaker 3: I mean, how else do you have this email from me to 582 00:27:41,119 --> 00:27:43,600 Speaker 3: Hunter Biden, who he was friendly with at one point. 583 00:27:43,640 --> 00:27:45,800 Speaker 3: And he describes how, you know, the families 584 00:27:45,800 --> 00:27:48,480 Speaker 3: were close at one point before Hunter Biden kind of 585 00:27:48,480 --> 00:27:51,399 Speaker 3: went off the deep end. They proved 586 00:27:51,400 --> 00:27:53,880 Speaker 3: it was real essentially from the moment it came out, 587 00:27:54,240 --> 00:27:56,639 Speaker 3: but there was a real push against it. So what happened? 588 00:27:56,640 --> 00:27:58,359 Speaker 3: What was that push against it?
And as I write 589 00:27:58,400 --> 00:28:00,959 Speaker 3: about in Uncovered, I think it really boils 590 00:28:01,000 --> 00:28:04,080 Speaker 3: down to guilt and fear. The guilt was the way 591 00:28:04,080 --> 00:28:07,920 Speaker 3: that they were attacked, and they felt personally responsible because 592 00:28:07,960 --> 00:28:10,880 Speaker 3: of how they handled Hillary's emails in twenty sixteen. And 593 00:28:11,240 --> 00:28:14,360 Speaker 3: I think they're being too hard on themselves. 594 00:28:14,640 --> 00:28:16,760 Speaker 3: I don't think the media's handling of Hillary Clinton's 595 00:28:16,800 --> 00:28:19,879 Speaker 3: emails elected Donald Trump. I don't think they have that power. 596 00:28:19,920 --> 00:28:22,320 Speaker 3: But they really believed that they did 597 00:28:22,359 --> 00:28:24,280 Speaker 3: something that helped put him in office, so they were 598 00:28:24,280 --> 00:28:26,879 Speaker 3: not going to make that mistake again. And then also fear. 599 00:28:27,359 --> 00:28:30,480 Speaker 3: We saw journalists like Maggie Haberman at the New 600 00:28:30,560 --> 00:28:32,640 Speaker 3: York Times, who tweeted a link to the New York Post 601 00:28:32,680 --> 00:28:35,040 Speaker 3: story as it came out and just sort of questioned 602 00:28:35,040 --> 00:28:37,159 Speaker 3: the sourcing, oh, this seems sort of sketchy, linked to 603 00:28:37,200 --> 00:28:39,920 Speaker 3: the New York Post, and she was attacked for daring 604 00:28:40,000 --> 00:28:42,600 Speaker 3: to just even link to it and question the sourcing. 605 00:28:42,640 --> 00:28:45,280 Speaker 3: She was called Maga Haberman. She was trending on Twitter. 606 00:28:46,200 --> 00:28:46,960 Speaker 2: I remember that.
607 00:28:47,000 --> 00:28:49,320 Speaker 3: Yeah, so when you have that, you know, you have 608 00:28:49,400 --> 00:28:53,280 Speaker 3: to have a level of balls to even go that 609 00:28:53,400 --> 00:28:55,760 Speaker 3: far of just tweeting a link to it. And then 610 00:28:55,920 --> 00:28:58,440 Speaker 3: what we saw was the suppression, right? The New York 611 00:28:58,440 --> 00:29:01,040 Speaker 3: Post was locked out of their Twitter account, and then, 612 00:29:01,120 --> 00:29:04,600 Speaker 3: in an unprecedented move, Twitter made it so that link 613 00:29:04,680 --> 00:29:08,480 Speaker 3: was not shareable publicly or even privately, and there 614 00:29:08,520 --> 00:29:10,880 Speaker 3: was no outrage by the corporate press about that. Yes, 615 00:29:10,920 --> 00:29:13,440 Speaker 3: they eventually did reverse themselves a day or two later, 616 00:29:13,640 --> 00:29:15,760 Speaker 3: but the New York Post remained locked out of their account 617 00:29:15,800 --> 00:29:18,680 Speaker 3: for weeks to come, and there was no outrage by 618 00:29:18,680 --> 00:29:22,000 Speaker 3: the press over that total overreach when it comes to 619 00:29:22,040 --> 00:29:25,440 Speaker 3: suppression and censorship. And that's what started what was to come, 620 00:29:25,480 --> 00:29:28,160 Speaker 3: which is this rise of anti-speech activism. We saw 621 00:29:28,200 --> 00:29:31,160 Speaker 3: it during COVID in so many instances, just this idea 622 00:29:31,240 --> 00:29:34,400 Speaker 3: of, we can't trust the public with this information, so 623 00:29:34,440 --> 00:29:36,400 Speaker 3: we need to just make sure they have less information. 624 00:29:36,760 --> 00:29:38,560 Speaker 3: That is not an ideal of the press.
625 00:29:38,880 --> 00:29:40,640 Speaker 1: And you talk about this in your book, what did 626 00:29:40,680 --> 00:29:46,840 Speaker 1: you call it, being an information maximalist versus an information minimalist, right? 627 00:29:46,560 --> 00:29:49,440 Speaker 1: And I believe that you say you're an information maximalist, 628 00:29:49,480 --> 00:29:52,720 Speaker 1: as am I, because, I mean, ultimately, I think 629 00:29:52,880 --> 00:29:55,520 Speaker 1: that the American people are smart enough to decipher for 630 00:29:56,120 --> 00:29:59,640 Speaker 1: themselves what to do with maybe certain health information, if 631 00:29:59,640 --> 00:30:02,720 Speaker 1: we're talking about COVID, or trying to decipher whether or 632 00:30:02,720 --> 00:30:05,840 Speaker 1: not somebody's legitimately insane. I think the American people are 633 00:30:05,840 --> 00:30:08,480 Speaker 1: smart enough to figure that out for themselves. And in fact, 634 00:30:09,000 --> 00:30:12,080 Speaker 1: I think one can make the argument, Steve, that by 635 00:30:12,160 --> 00:30:15,720 Speaker 1: censoring those people or labeling them disinformation, you actually help 636 00:30:15,760 --> 00:30:21,040 Speaker 1: their platform grow, you know? Yes. And so, 637 00:30:21,920 --> 00:30:24,720 Speaker 1: you talk about in your book, 638 00:30:24,960 --> 00:30:27,320 Speaker 1: like, the thinking about how the censorship of the Hunter 639 00:30:27,360 --> 00:30:30,160 Speaker 1: Biden laptop story is such an egregious example that ultimately 640 00:30:30,200 --> 00:30:33,480 Speaker 1: swayed an election. And you also talk in 641 00:30:33,520 --> 00:30:37,600 Speaker 1: depth about COVID, which also exposed some glaring faults 642 00:30:37,880 --> 00:30:42,480 Speaker 1: within the media, and the whole Cuomo and Cuomo show, yeah, 643 00:30:42,480 --> 00:30:45,000 Speaker 1: and them having an hour in primetime of, you know, 644 00:30:45,080 -->
00:30:47,479 Speaker 1: Andrew Cuomo and Chris Cuomo on the same show, with 645 00:30:47,520 --> 00:30:50,840 Speaker 1: the big, like, Q-tip, and talking about 646 00:30:50,960 --> 00:30:54,240 Speaker 1: testing. And then, where you had, CNN has like this 647 00:30:54,400 --> 00:30:58,000 Speaker 1: hundred-page ethics handbook of, like, how to conduct 648 00:30:58,040 --> 00:31:00,800 Speaker 1: yourself as an on-air personality or a journalist, 649 00:31:00,840 --> 00:31:04,400 Speaker 1: and Zucker himself waived a rule to allow the 650 00:31:04,440 --> 00:31:07,040 Speaker 1: then governor of New York on Chris Cuomo's show, 651 00:31:07,120 --> 00:31:10,360 Speaker 1: to basically goof around for an hour during primetime at 652 00:31:10,360 --> 00:31:12,240 Speaker 1: the height of COVID. Which, of course, this was all 653 00:31:12,280 --> 00:31:14,880 Speaker 1: before, like, we learned that, you know, Andrew Cuomo was 654 00:31:14,920 --> 00:31:17,360 Speaker 1: shunting old folks into nursing homes and they were dying 655 00:31:18,080 --> 00:31:20,080 Speaker 1: to the tune of tens of thousands. I mean, it's 656 00:31:20,080 --> 00:31:22,160 Speaker 1: just a tidal wave of human suffering there. And then he 657 00:31:22,320 --> 00:31:27,520 Speaker 1: was accused of sexual harassment and ultimately resigned. But what's crazy 658 00:31:27,560 --> 00:31:28,960 Speaker 1: to me about that, and I kept thinking, is that, 659 00:31:29,000 --> 00:31:33,880 Speaker 1: like, CNN almost created this scenario to where they had.
660 00:31:34,040 --> 00:31:36,680 Speaker 1: Then, after Andrew Cuomo had to resign, they had an 661 00:31:36,720 --> 00:31:39,320 Speaker 1: hour, like, at, what is it, the seven o'clock time, so leading 662 00:31:39,360 --> 00:31:42,520 Speaker 1: into primetime, where Chris Cuomo wasn't even allowed to talk 663 00:31:42,560 --> 00:31:44,240 Speaker 1: about what was one of the biggest stories in the 664 00:31:44,240 --> 00:31:46,560 Speaker 1: country at that time, and then hand off to Don 665 00:31:46,640 --> 00:31:49,400 Speaker 1: Lemon after the fact, where he then bashed his brother. 666 00:31:49,480 --> 00:31:52,360 Speaker 2: I'm like, this is just all so crazy to me. 667 00:31:52,560 --> 00:31:53,840 Speaker 2: I just, it's just crazy. 668 00:31:54,440 --> 00:31:56,760 Speaker 3: It's a great example of the way just a little 669 00:31:56,800 --> 00:31:59,480 Speaker 3: bit of loosening up the standards can be the 670 00:31:59,520 --> 00:32:02,200 Speaker 3: snowball effect. And I should also point out. 671 00:32:02,080 --> 00:32:04,080 Speaker 2: A great way to say it. It's an interesting thing. 672 00:32:04,440 --> 00:32:06,640 Speaker 3: Yeah, and I should also say, you know, this 673 00:32:06,680 --> 00:32:10,560 Speaker 3: section of Uncovered really highlights something. It's complicated. 674 00:32:10,560 --> 00:32:13,640 Speaker 3: The whole issue of the media is complicated because, frankly, 675 00:32:13,640 --> 00:32:15,720 Speaker 3: some of the best voices, I would say, in that chapter, 676 00:32:15,760 --> 00:32:18,600 Speaker 3: people like David Folkenflik of NPR, Eric Wemple of the 677 00:32:18,640 --> 00:32:21,600 Speaker 3: Washington Post, who were able to call out CNN 678 00:32:21,640 --> 00:32:24,120 Speaker 3: and what they did, are obviously part of the corporate 679 00:32:24,160 --> 00:32:26,400 Speaker 3: media themselves.
So there are some people that I think 680 00:32:26,400 --> 00:32:28,400 Speaker 3: have some introspection about this, and I think you can't 681 00:32:28,640 --> 00:32:30,960 Speaker 3: paint with a broad brush. There's good journalism being done 682 00:32:30,960 --> 00:32:33,640 Speaker 3: every day, and yet they're often undermined by their own colleagues. 683 00:32:33,880 --> 00:32:36,080 Speaker 3: So that's the first thing to say. But yeah, I 684 00:32:36,120 --> 00:32:40,760 Speaker 3: think the Cuomo story was one where it's 685 00:32:40,800 --> 00:32:43,760 Speaker 3: like, in the height of COVID, in March, April, 686 00:32:43,800 --> 00:32:47,640 Speaker 3: May of twenty twenty, there were mistakes being made 687 00:32:47,680 --> 00:32:49,680 Speaker 3: when it comes to journalism and ethics. There were mistakes 688 00:32:49,680 --> 00:32:52,240 Speaker 3: being made when it comes to what was true 689 00:32:52,240 --> 00:32:54,040 Speaker 3: about COVID. And I think all of that can be 690 00:32:54,160 --> 00:32:57,280 Speaker 3: on some level forgiven, because it was this crazy time. 691 00:32:57,360 --> 00:33:01,960 Speaker 3: You know, it was this unprecedented pandemic and no 692 00:33:02,000 --> 00:33:06,400 Speaker 3: one really knew what was going on. But two things happened. One, 693 00:33:06,960 --> 00:33:09,040 Speaker 3: when it comes to, like, the ethics and standards 694 00:33:09,080 --> 00:33:11,600 Speaker 3: of the Cuomo show, it went on forever, 695 00:33:11,720 --> 00:33:14,440 Speaker 3: even long after we learned about the nursing home story. 696 00:33:14,640 --> 00:33:16,640 Speaker 3: And then of course when they decide, oh, 697 00:33:16,720 --> 00:33:19,520 Speaker 3: let's put the standards back on. Now Chris 698 00:33:19,640 --> 00:33:22,200 Speaker 3: can't talk about his brother at all, certainly can't interview him.
699 00:33:23,000 --> 00:33:25,480 Speaker 3: It just makes it so glaring that they're still 700 00:33:25,480 --> 00:33:27,520 Speaker 3: covering it on other parts of the network. And then 701 00:33:27,560 --> 00:33:29,840 Speaker 3: of course Jeff Zucker gets wrapped up in it 702 00:33:29,880 --> 00:33:32,360 Speaker 3: and he gets pushed out. Also, that lawsuit is still ongoing, 703 00:33:32,400 --> 00:33:34,160 Speaker 3: by the way, and it just was kind of the 704 00:33:34,240 --> 00:33:37,840 Speaker 3: nail in the coffin, to be honest, on CNN's credibility issue. 705 00:33:37,760 --> 00:33:38,680 Speaker 2: It was crazy, Steve. 706 00:33:38,920 --> 00:33:40,560 Speaker 1: I was running. I think I was running 707 00:33:40,560 --> 00:33:44,000 Speaker 1: for Congress at that point. And just as a personal anecdote, 708 00:33:44,080 --> 00:33:47,160 Speaker 1: like, I've always been a political outsider, been involved, you know, 709 00:33:47,400 --> 00:33:49,800 Speaker 1: but like I was busy doing my own thing. And 710 00:33:49,880 --> 00:33:52,120 Speaker 1: I always knew, like you assert in the book, the 711 00:33:52,160 --> 00:33:55,080 Speaker 1: media maybe always leaned left, and I was cognizant of that. 712 00:33:55,440 --> 00:33:57,960 Speaker 1: And you know, anytime I watched the news, I was 713 00:33:58,000 --> 00:34:00,680 Speaker 1: cognizant of it. But when I ran, boy, 714 00:34:00,800 --> 00:34:03,800 Speaker 1: I realized, like, holy shit, there's a lot more to 715 00:34:03,840 --> 00:34:08,400 Speaker 1: this than meets the eye. And what I'm thinking 716 00:34:08,440 --> 00:34:11,160 Speaker 1: of right now is Chris Cuomo on a show, like, 717 00:34:11,320 --> 00:34:14,040 Speaker 1: walking up the steps out of quarantine, and I'm 718 00:34:14,080 --> 00:34:17,279 Speaker 1: like, that was completely made up. Like none of 719 00:34:17,320 --> 00:34:20,160 Speaker 1: that was real. Like it was complete bullshit.
And I 720 00:34:20,160 --> 00:34:23,840 Speaker 1: started thinking to myself, Steve, like, I can't 721 00:34:23,840 --> 00:34:26,759 Speaker 1: be the only one feeling this, right? Like, how much 722 00:34:26,840 --> 00:34:29,239 Speaker 1: of this stuff is just performative bullshit? 723 00:34:29,600 --> 00:34:29,799 Speaker 2: Right? 724 00:34:30,239 --> 00:34:32,880 Speaker 1: And I know, by the way, this was happening 725 00:34:32,960 --> 00:34:36,360 Speaker 1: during a pandemic that was very real, in which thousands 726 00:34:36,360 --> 00:34:38,879 Speaker 1: and thousands of people were losing their lives or getting sick, 727 00:34:39,239 --> 00:34:41,440 Speaker 1: or loved ones being forced to die alone. People were 728 00:34:41,440 --> 00:34:44,040 Speaker 1: trying to sift through what's true and what's not. And 729 00:34:44,080 --> 00:34:47,080 Speaker 1: you talk about a loosening of ethical standards, like, 730 00:34:47,600 --> 00:34:52,520 Speaker 1: stuff like that didn't help. It hurt, and real people's 731 00:34:52,560 --> 00:34:54,359 Speaker 1: lives were hurt because of it. You know, when you 732 00:34:54,400 --> 00:34:57,480 Speaker 1: talk about it, like, one example I mentioned was loved ones 733 00:34:57,560 --> 00:34:59,640 Speaker 1: dying alone, but people lost their life savings. 734 00:34:59,640 --> 00:35:01,000 Speaker 2: Businesses were closed. 735 00:35:01,040 --> 00:35:03,680 Speaker 1: And you have Chris Cuomo, from I don't even 736 00:35:03,760 --> 00:35:06,240 Speaker 1: know where, Martha's Vineyard, like, coming up from the basement, 737 00:35:06,280 --> 00:35:09,280 Speaker 1: acting like he's Lazarus rising from the dead or something, 738 00:35:09,560 --> 00:35:12,399 Speaker 1: and then nobody from CNN even says anything about it. 739 00:35:12,440 --> 00:35:14,960 Speaker 1: Like, I can't... how does something 740 00:35:15,000 --> 00:35:15,680 Speaker 1: like that happen?
741 00:35:16,040 --> 00:35:17,680 Speaker 3: I know, I know. I think he was at 742 00:35:17,880 --> 00:35:22,319 Speaker 3: one of his Hamptons houses over there. It's 743 00:35:22,360 --> 00:35:26,280 Speaker 3: just... well, frankly, I think that whole thing, and really 744 00:35:26,360 --> 00:35:30,240 Speaker 3: just the arrogance, the smugness of the Chris Cuomos 745 00:35:30,320 --> 00:35:32,440 Speaker 3: of the world, comes a little bit also from social 746 00:35:32,480 --> 00:35:35,440 Speaker 3: media and buying your own hype. In so many of 747 00:35:35,440 --> 00:35:38,799 Speaker 3: these instances, with the rise of these influencers, these 748 00:35:39,120 --> 00:35:42,439 Speaker 3: media members now think that they're stars in a way, 749 00:35:42,600 --> 00:35:44,879 Speaker 3: and, you know, even for people that are far 750 00:35:45,080 --> 00:35:47,440 Speaker 3: down the ladder but have twenty thousand 751 00:35:47,480 --> 00:35:49,560 Speaker 3: Twitter followers, it can feel like a lot. You feel 752 00:35:49,560 --> 00:35:51,799 Speaker 3: like you got this big platform. You know, in truth, you 753 00:35:51,840 --> 00:35:54,799 Speaker 3: maybe don't in reality. But no, I 754 00:35:54,800 --> 00:35:58,239 Speaker 3: think COVID was just so disastrous. Some 755 00:35:58,280 --> 00:36:00,759 Speaker 3: of these stories, like Chris Cuomo coming out 756 00:36:00,760 --> 00:36:02,600 Speaker 3: of the basement, are, you know, lighter examples, and 757 00:36:02,640 --> 00:36:04,680 Speaker 3: there are other ones that I give in the book. 758 00:36:04,840 --> 00:36:06,920 Speaker 3: But COVID was serious.
You know, this was something that... 759 00:36:07,320 --> 00:36:10,120 Speaker 3: and it's not just about, you know, the initial part 760 00:36:10,120 --> 00:36:12,480 Speaker 3: of it, but like lockdowns, the way lockdowns were covered, 761 00:36:12,520 --> 00:36:15,120 Speaker 3: you know, masks and vaccines, and we could go right 762 00:36:15,160 --> 00:36:19,040 Speaker 3: down the line. In every case, the media essentially chose 763 00:36:19,120 --> 00:36:23,600 Speaker 3: what was the one true scientific expert opinion, the consensus opinion, 764 00:36:24,200 --> 00:36:28,160 Speaker 3: and said anything that deviates from that is dangerous. And 765 00:36:28,200 --> 00:36:31,400 Speaker 3: that's not just wrong... I mean, it's not just 766 00:36:31,440 --> 00:36:34,400 Speaker 3: that they actually got so much wrong, but it's dangerous 767 00:36:34,400 --> 00:36:36,440 Speaker 3: to even have that point of view, the idea that 768 00:36:36,480 --> 00:36:38,799 Speaker 3: you can't have a debate about something, or a discussion, 769 00:36:39,200 --> 00:36:42,000 Speaker 3: a nuanced conversation. These things weren't black and white. We 770 00:36:42,080 --> 00:36:43,799 Speaker 3: knew that at the time, and they knew that at 771 00:36:43,840 --> 00:36:45,719 Speaker 3: the time, but they went with this because they just 772 00:36:45,760 --> 00:36:48,560 Speaker 3: didn't feel that the public could deal with the nuance 773 00:36:48,600 --> 00:36:50,520 Speaker 3: of these issues. I think, I mean, in some ways 774 00:36:50,520 --> 00:36:52,480 Speaker 3: they were completely spun by the Faucis of the world, 775 00:36:52,520 --> 00:36:56,200 Speaker 3: as we would find out, allowing them to suppress 776 00:36:56,239 --> 00:36:56,760 Speaker 3: and censor.
777 00:36:57,120 --> 00:36:59,799 Speaker 1: So, Steve, you wrote about this stuff in your book, 778 00:37:00,320 --> 00:37:04,080 Speaker 1: and it pissed me off so much because, first of all, 779 00:37:04,320 --> 00:37:08,480 Speaker 1: you talk about Fauci, right, and how COVID itself became 780 00:37:09,200 --> 00:37:13,319 Speaker 1: super politicized, and you know, Trump was a part of it, 781 00:37:13,480 --> 00:37:15,760 Speaker 1: you know, and you talk in the book about 782 00:37:15,840 --> 00:37:20,280 Speaker 1: how if Trump supported something, the media, and 783 00:37:19,960 --> 00:37:22,799 Speaker 1: I'm painting with broad strokes here, and I agree with 784 00:37:22,840 --> 00:37:24,520 Speaker 1: you we shouldn't do that, but many 785 00:37:24,600 --> 00:37:27,080 Speaker 1: in the media just ran to oppose 786 00:37:27,160 --> 00:37:30,200 Speaker 1: whatever it is that he said. And you had Fauci, 787 00:37:30,239 --> 00:37:33,560 Speaker 1: who was not necessarily an expert on any of these topics, 788 00:37:33,600 --> 00:37:36,880 Speaker 1: but because of, you know, the time that he spent 789 00:37:36,920 --> 00:37:39,799 Speaker 1: in Washington as part of these bureaucratic institutions, he made 790 00:37:39,880 --> 00:37:42,280 Speaker 1: contacts and got cozy with all these people with power. 791 00:37:43,000 --> 00:37:45,440 Speaker 1: I think one of the people that 792 00:37:45,440 --> 00:37:48,359 Speaker 1: you interviewed said that, more than almost anything else, 793 00:37:48,440 --> 00:37:50,960 Speaker 1: he was just a master at manipulating the media 794 00:37:51,000 --> 00:37:53,080 Speaker 1: and was able to smear so many of these other 795 00:37:53,200 --> 00:37:57,120 Speaker 1: doctors who had opposing positions. And what I kept thinking 796 00:37:57,120 --> 00:38:00,319 Speaker 1: about as I was reading was, what is...
797 00:38:00,280 --> 00:38:01,800 Speaker 2: In it for the media? 798 00:38:01,640 --> 00:38:01,839 Speaker 3: Yeah? 799 00:38:02,560 --> 00:38:04,880 Speaker 2: To do that? All it does is hurt their credibility. 800 00:38:04,920 --> 00:38:07,880 Speaker 1: And ultimately, what really pissed me off about it, Steve, 801 00:38:07,960 --> 00:38:10,560 Speaker 1: is that people's lives were destroyed because of it. 802 00:38:10,880 --> 00:38:12,840 Speaker 2: Yes. And there was no accountability for any... 803 00:38:12,640 --> 00:38:16,000 Speaker 3: Of it, no. Not nearly enough, and not nearly enough 804 00:38:16,000 --> 00:38:19,080 Speaker 3: self-reflection from the media for what they did in 805 00:38:19,560 --> 00:38:20,080 Speaker 3: joining in that. 806 00:38:20,200 --> 00:38:20,239 Speaker 1: No. 807 00:38:20,480 --> 00:38:22,920 Speaker 3: Dr. Jay Bhattacharya is the person that I spoke to 808 00:38:22,960 --> 00:38:25,800 Speaker 3: who describes Fauci as a genius, you know, an evil 809 00:38:25,840 --> 00:38:29,840 Speaker 3: genius if you will. That's someone who understood the way it worked. 810 00:38:29,880 --> 00:38:32,480 Speaker 3: He had built these relationships with the media, 811 00:38:32,800 --> 00:38:35,880 Speaker 3: with those in power in government and the media, for 812 00:38:36,400 --> 00:38:38,560 Speaker 3: decades leading up to the pandemic, and so he was 813 00:38:38,600 --> 00:38:42,719 Speaker 3: well positioned to speak with this great authority, and 814 00:38:42,880 --> 00:38:46,160 Speaker 3: to not just say, I'm the authority, which again was probably wrong, 815 00:38:46,239 --> 00:38:48,319 Speaker 3: but at the very least you could say that, but 816 00:38:48,440 --> 00:38:50,680 Speaker 3: to say anyone else who deviates from what I say 817 00:38:50,760 --> 00:38:53,080 Speaker 3: is dangerous. And Dr. Jay Bhattacharya was one of the 818 00:38:53,120 --> 00:38:56,520 Speaker 3: three experts. This is a Stanford professor and a...
819 00:38:56,480 --> 00:38:58,800 Speaker 2: Doctor, absolutely brilliant, yes. 820 00:38:58,760 --> 00:39:02,200 Speaker 3: Who put together the Great Barrington Declaration. And if anyone's 821 00:39:02,360 --> 00:39:05,200 Speaker 3: even remotely familiar with the Great Barrington Declaration, you may 822 00:39:05,360 --> 00:39:08,680 Speaker 3: think that it was this quack, you know, dangerous paper 823 00:39:08,680 --> 00:39:10,920 Speaker 3: that was put together in order to get people killed, 824 00:39:11,000 --> 00:39:12,839 Speaker 3: you know, let it rip. That was what people would 825 00:39:12,840 --> 00:39:15,600 Speaker 3: describe it as, let COVID just rip through society. That 826 00:39:15,680 --> 00:39:17,319 Speaker 3: was not it at all. It was an argument for 827 00:39:17,360 --> 00:39:20,920 Speaker 3: focused protection of the elderly, not to lock down everyone, 828 00:39:20,960 --> 00:39:22,719 Speaker 3: but to make sure the elderly were taken care of, 829 00:39:23,000 --> 00:39:26,799 Speaker 3: and those who are immunocompromised. That was literally the entire thing. 830 00:39:26,840 --> 00:39:29,600 Speaker 3: They didn't have to be proven right for 831 00:39:29,640 --> 00:39:31,799 Speaker 3: it to be ridiculous how they were treated. But they 832 00:39:31,800 --> 00:39:34,879 Speaker 3: were proven right, as we now know scientifically. And yet 833 00:39:34,960 --> 00:39:37,880 Speaker 3: they were attacked in the media. They were attacked, they 834 00:39:37,880 --> 00:39:42,600 Speaker 3: were censored, and smeared as spreading disinformation and as dangerous 835 00:39:42,840 --> 00:39:44,759 Speaker 3: by the Faucis of the world.
But also from the 836 00:39:44,760 --> 00:39:47,879 Speaker 3: corporate media themselves, from the New York Times. And that 837 00:39:48,120 --> 00:39:51,360 Speaker 3: is something that can't ever get fully corrected, 838 00:39:51,360 --> 00:39:53,520 Speaker 3: because once we've gone down this road, you know, we 839 00:39:53,760 --> 00:39:56,160 Speaker 3: can't go back. But at the very least you'd hope 840 00:39:56,160 --> 00:39:58,440 Speaker 3: for a little bit of introspection from the press for 841 00:39:58,560 --> 00:40:01,719 Speaker 3: how they handled COVID. They got it so wrong and 842 00:40:02,480 --> 00:40:05,960 Speaker 3: just spun what turned out to be completely untrue. 843 00:40:06,560 --> 00:40:08,080 Speaker 2: I mean, it's so true. 844 00:40:08,120 --> 00:40:10,480 Speaker 1: And you talk about the importance of humility in the book, 845 00:40:10,480 --> 00:40:13,560 Speaker 1: and that, you know, 846 00:40:14,080 --> 00:40:17,160 Speaker 1: journalists have a brand now, and that brand is built 847 00:40:17,239 --> 00:40:21,279 Speaker 1: upon social media, and the lack of introspection, 848 00:40:21,520 --> 00:40:23,960 Speaker 1: which comes part and parcel with that, is a 849 00:40:24,080 --> 00:40:26,960 Speaker 1: lack of humility. And I think the American people are 850 00:40:27,080 --> 00:40:29,600 Speaker 1: understanding, even though COVID was a really tough time for 851 00:40:29,640 --> 00:40:31,800 Speaker 1: a lot of people. I mean, you'd be hard pressed 852 00:40:31,800 --> 00:40:34,160 Speaker 1: to find a family in this country that didn't 853 00:40:34,160 --> 00:40:37,560 Speaker 1: have somebody that was very, very ill, or maybe even 854 00:40:37,600 --> 00:40:40,759 Speaker 1: perhaps lost somebody.
Yeah, so a very serious time, but a 855 00:40:40,800 --> 00:40:43,239 Speaker 1: little goes a long way in terms of humility and 856 00:40:43,280 --> 00:40:46,160 Speaker 1: the media just saying, like, hey, look, we're sorry, we 857 00:40:46,160 --> 00:40:48,280 Speaker 1: were trying to sift through this as best we could, 858 00:40:48,719 --> 00:40:52,200 Speaker 1: and mistakes were made along the way, but we will 859 00:40:52,200 --> 00:40:54,359 Speaker 1: be better and try to get better every day. Like, 860 00:40:54,400 --> 00:40:57,040 Speaker 1: that didn't happen, and it should have happened, and it 861 00:40:57,040 --> 00:40:59,160 Speaker 1: would have gone a long way had some people just 862 00:40:59,200 --> 00:40:59,560 Speaker 1: done that. 863 00:40:59,600 --> 00:41:02,680 Speaker 3: Yeah, I think it's never too 864 00:41:02,760 --> 00:41:05,759 Speaker 3: late to admit that you were wrong. I 865 00:41:05,800 --> 00:41:07,160 Speaker 3: agree with you. I think it goes a long way. 866 00:41:07,200 --> 00:41:09,080 Speaker 3: I think there is so much of a sense in the corporate 867 00:41:09,120 --> 00:41:11,600 Speaker 3: media that they don't want to admit when they get 868 00:41:11,600 --> 00:41:14,120 Speaker 3: things wrong, because they think that the audience that they 869 00:41:14,160 --> 00:41:16,000 Speaker 3: have might continue to turn against them. I think that's 870 00:41:16,000 --> 00:41:18,920 Speaker 3: completely wrong. I think that's the totally incorrect approach. They 871 00:41:18,960 --> 00:41:21,920 Speaker 3: would find out, if they would actually go and be accountable, 872 00:41:22,160 --> 00:41:24,640 Speaker 3: that that would ingratiate them to the public, that the public 873 00:41:24,640 --> 00:41:26,480 Speaker 3: would trust them more if they would say when they 874 00:41:26,480 --> 00:41:29,160 Speaker 3: get it wrong.
So I agree with you. But I 875 00:41:29,160 --> 00:41:32,000 Speaker 3: also think, if you ask what this is a symptom of, 876 00:41:32,040 --> 00:41:35,799 Speaker 3: why did this happen, it's that journalism is a 877 00:41:35,880 --> 00:41:38,799 Speaker 3: very weird occupation in the sense that it's not like 878 00:41:39,719 --> 00:41:43,640 Speaker 3: a meritocracy. Okay, you know, if you're talking about, like, plumbers, 879 00:41:44,760 --> 00:41:46,839 Speaker 3: there would be Yelp reviews, and you would see 880 00:41:46,840 --> 00:41:48,799 Speaker 3: the ones that would do a great job would get 881 00:41:48,840 --> 00:41:51,200 Speaker 3: five stars, and the ones that are lazy or are 882 00:41:51,280 --> 00:41:54,120 Speaker 3: not so good would get, you know, pushed down the line, 883 00:41:54,160 --> 00:41:56,600 Speaker 3: and the best ones would rise to the top 884 00:41:56,640 --> 00:41:58,919 Speaker 3: and the bad ones would get weeded out. Journalism doesn't 885 00:41:58,960 --> 00:42:00,480 Speaker 3: work like that at all, you know. In fact, it's 886 00:42:00,480 --> 00:42:03,160 Speaker 3: almost the opposite. A lot of times, the worst journalists 887 00:42:03,200 --> 00:42:06,359 Speaker 3: get rewarded in the biggest ways possible because they're good 888 00:42:06,360 --> 00:42:09,000 Speaker 3: at other aspects of things, whether it's through building brands 889 00:42:09,040 --> 00:42:12,000 Speaker 3: on social media or playing the game in the powerful 890 00:42:12,000 --> 00:42:17,200 Speaker 3: circles. So there is no mechanism to help bring 891 00:42:17,239 --> 00:42:20,239 Speaker 3: the best journalists to the top. And because of that, 892 00:42:20,400 --> 00:42:22,799 Speaker 3: you have a lot of lazy, incompetent people who make 893 00:42:22,880 --> 00:42:24,879 Speaker 3: up these newsrooms, not all of them, but a lot 894 00:42:24,880 --> 00:42:26,600 Speaker 3: of them.
And these people are not going to get 895 00:42:26,600 --> 00:42:29,239 Speaker 3: better, because they can't. They are incapable of getting better. 896 00:42:29,400 --> 00:42:33,160 Speaker 3: And until we actually have bosses who care about what 897 00:42:33,320 --> 00:42:36,920 Speaker 3: journalism actually is, what, you know, it should 898 00:42:36,920 --> 00:42:38,880 Speaker 3: be to serve the public, we're not going to 899 00:42:38,880 --> 00:42:39,439 Speaker 3: get this better. 900 00:42:40,120 --> 00:42:43,200 Speaker 1: I mean, you talk about being lazy and incompetent, and 901 00:42:43,280 --> 00:42:46,120 Speaker 1: you also talk about Jussie Smollett and the hoax that 902 00:42:46,120 --> 00:42:51,440 Speaker 1: that was. And I mean, like the Gorilla TV thing, and 903 00:42:51,600 --> 00:42:53,799 Speaker 1: the idea that President Trump is sitting around in the 904 00:42:53,800 --> 00:42:56,960 Speaker 1: Oval Office watching gorillas fight on a network that 905 00:42:57,080 --> 00:43:01,160 Speaker 1: was just for him, was more believable than the Smollett hoax. 906 00:43:01,280 --> 00:43:03,840 Speaker 1: Like the idea that two MAGA types 907 00:43:03,880 --> 00:43:06,320 Speaker 1: in Chicago, in the middle of a polar vortex, in 908 00:43:06,360 --> 00:43:08,000 Speaker 1: the middle of the night, just so happened to be 909 00:43:08,040 --> 00:43:12,560 Speaker 1: carrying bleach and, like, conducted some horrific attack on Jussie 910 00:43:12,600 --> 00:43:14,719 Speaker 1: Smollett, poured bleach on him and everything else, and 911 00:43:14,760 --> 00:43:16,640 Speaker 1: said this is MAGA country, even though it's probably like 912 00:43:16,640 --> 00:43:20,839 Speaker 1: a D-plus-one-hundred district. Like, anyone could have known. 913 00:43:20,960 --> 00:43:23,920 Speaker 1: All of these people 914 00:43:24,000 --> 00:43:27,279 Speaker 1: exist in the media, from producers to editors, and 915
00:43:28,200 --> 00:43:30,759 Speaker 1: no one thought for a second that, like, maybe this 916 00:43:30,840 --> 00:43:34,400 Speaker 1: isn't real? I guess my question for you is, Steve, 917 00:43:34,480 --> 00:43:37,200 Speaker 1: how the hell does something like that happen, to... 918 00:43:37,200 --> 00:43:37,399 Speaker 2: Where? 919 00:43:37,520 --> 00:43:40,160 Speaker 1: Like, within twenty-four hours he's 920 00:43:40,200 --> 00:43:44,319 Speaker 1: getting a primetime interview with, what, Robin Roberts? Yeah, 921 00:43:44,440 --> 00:43:47,040 Speaker 1: Robin Roberts, who I respect 922 00:43:47,200 --> 00:43:50,040 Speaker 1: and think is great, by the way, but how does 923 00:43:50,080 --> 00:43:53,719 Speaker 1: that happen? And you've got celebrities saying, like, be strong, Jussie. 924 00:43:53,960 --> 00:43:56,160 Speaker 1: Like, everybody in the country seemed to galvanize behind 925 00:43:56,200 --> 00:43:59,080 Speaker 1: what was a completely ridiculous fake story. 926 00:43:59,480 --> 00:44:02,600 Speaker 3: Yeah. I lay this out in the 927 00:44:02,640 --> 00:44:07,760 Speaker 3: chapter about incompetence and laziness, because 928 00:44:07,800 --> 00:44:10,560 Speaker 3: I also think that there's another element to this, and 929 00:44:10,600 --> 00:44:14,239 Speaker 3: that is what happened in this instance.
You know, 930 00:44:14,400 --> 00:44:17,000 Speaker 3: it seems completely ridiculous, like, as you're laying out the 931 00:44:17,000 --> 00:44:19,239 Speaker 3: facts. And people know the Jussie Smollett story 932 00:44:19,239 --> 00:44:21,400 Speaker 3: pretty well, but what you might not remember is, like, 933 00:44:21,440 --> 00:44:23,920 Speaker 3: there were outlets, let's just say, like Entertainment Tonight was 934 00:44:23,960 --> 00:44:25,439 Speaker 3: one of those that I lay out in the book, 935 00:44:25,680 --> 00:44:29,520 Speaker 3: which described it as an alleged homophobic and racist attack, 936 00:44:29,600 --> 00:44:33,040 Speaker 3: and said he was allegedly attacked. And the way that they 937 00:44:33,080 --> 00:44:36,240 Speaker 3: couched that, which was the right thing to do journalistically, 938 00:44:36,920 --> 00:44:40,000 Speaker 3: was then attacked by people like AOC: how dare 939 00:44:40,040 --> 00:44:42,839 Speaker 3: you not call what happened to him racist and homophobic? They 940 00:44:42,840 --> 00:44:45,680 Speaker 3: then, you know, got caught in the Twitter storm. They 941 00:44:45,680 --> 00:44:49,400 Speaker 3: were the bad account of the week. And so 942 00:44:49,560 --> 00:44:51,920 Speaker 3: you start to lay out the incentive structure 943 00:44:52,000 --> 00:44:54,279 Speaker 3: in the media now, because of social media, because of how 944 00:44:54,320 --> 00:44:57,040 Speaker 3: everything is public: it's not just about getting it right 945 00:44:57,120 --> 00:45:00,040 Speaker 3: or wrong, but it's also about what is acceptable and 946 00:45:00,080 --> 00:45:03,360 Speaker 3: what's unacceptable.
And it's very sad, but in a 947 00:45:03,360 --> 00:45:05,640 Speaker 3: lot of these newsrooms, if you cover something in 948 00:45:05,680 --> 00:45:08,279 Speaker 3: a way that is wrong, is incorrect, like so many 949 00:45:08,280 --> 00:45:12,200 Speaker 3: did with Jussie Smollett, but it's acceptable, then you're not 950 00:45:12,239 --> 00:45:14,040 Speaker 3: going to be hit as hard as if you covered 951 00:45:14,080 --> 00:45:16,600 Speaker 3: something in a correct way, but in a way 952 00:45:16,640 --> 00:45:19,040 Speaker 3: that was unacceptable to the, you know, the 953 00:45:19,080 --> 00:45:21,320 Speaker 3: Twitterati, the people that are the loudest 954 00:45:21,400 --> 00:45:24,360 Speaker 3: voices on Twitter. And so that's completely messed up. And 955 00:45:24,360 --> 00:45:26,920 Speaker 3: that's a new phenomenon. That's not something that existed when 956 00:45:26,920 --> 00:45:29,480 Speaker 3: you didn't have social media and people yelling 957 00:45:29,480 --> 00:45:31,200 Speaker 3: at you. You only had people maybe calling into the 958 00:45:31,239 --> 00:45:34,719 Speaker 3: newsroom or sending a letter. It wasn't public. There 959 00:45:34,840 --> 00:45:37,000 Speaker 3: weren't able to be these pressure campaigns that came 960 00:45:37,040 --> 00:45:39,160 Speaker 3: about it. And that's why I think we 961 00:45:39,200 --> 00:45:41,920 Speaker 3: see so much get, you know, incorrectly covered, like the 962 00:45:42,040 --> 00:45:45,040 Speaker 3: Jussie Smollett story. Because why cover it right? 963 00:45:45,280 --> 00:45:47,640 Speaker 3: Why try to get it right, because you're only gonna 964 00:45:47,640 --> 00:45:50,120 Speaker 3: get yelled at by people like AOC. Eh, we'll just 965 00:45:50,200 --> 00:45:51,239 Speaker 3: get it wrong and we'll move on. 966 00:45:53,200 --> 00:45:54,680 Speaker 2: Well, I mean, it's such a great point.
967 00:45:54,719 --> 00:45:56,759 Speaker 1: And I think you talk about in the book, where 968 00:45:56,760 --> 00:46:00,640 Speaker 1: you're interviewing Jimmy Fallon and his producers, how 969 00:46:00,640 --> 00:46:03,799 Speaker 1: much of this is, like, self-created by 970 00:46:04,640 --> 00:46:07,640 Speaker 1: media executives. And I found this fascinating. I actually highlighted 971 00:46:07,640 --> 00:46:10,680 Speaker 1: the quote where he said, we like to do things 972 00:46:10,719 --> 00:46:15,000 Speaker 1: that certain people love, as opposed to something everybody likes. 973 00:46:15,400 --> 00:46:18,000 Speaker 1: So in other words, and this was sort of born 974 00:46:18,040 --> 00:46:21,480 Speaker 1: out of the idea that media companies, like Fox 975 00:46:21,520 --> 00:46:23,759 Speaker 1: News for example, or maybe even perhaps CNN, are owned 976 00:46:23,800 --> 00:46:26,440 Speaker 1: by bigger and bigger conglomerates. So you end up, now 977 00:46:26,760 --> 00:46:28,200 Speaker 1: we're in an environment... 978 00:46:27,840 --> 00:46:32,040 Speaker 2: Where you have, like, media networks... 979 00:46:31,640 --> 00:46:33,960 Speaker 1: Owned by massive, massive companies, and I think you refer 980 00:46:34,000 --> 00:46:36,080 Speaker 1: to them in the book as blobs. To where now 981 00:46:36,080 --> 00:46:39,440 Speaker 1: you also have journalists with small media companies doing podcasts 982 00:46:39,440 --> 00:46:42,560 Speaker 1: and Substacks who have a really strong connection with their audience. 983 00:46:42,600 --> 00:46:46,360 Speaker 1: Unlike the smaller folks, bigger media conglomerates perhaps don't have as 984 00:46:46,360 --> 00:46:48,400 Speaker 1: strong a connection with their audience. So in an attempt to 985 00:46:48,480 --> 00:46:52,000 Speaker 1: actually get a base,
shows like Jimmy Fallon 986 00:46:52,239 --> 00:46:53,960 Speaker 1: and The Tonight Show are going all in with a 987 00:46:54,000 --> 00:46:56,360 Speaker 1: certain demographic or a certain audience to make sure that 988 00:46:56,400 --> 00:46:57,000 Speaker 1: they have a base. 989 00:46:57,040 --> 00:46:58,000 Speaker 2: Do I have that right? 990 00:46:58,360 --> 00:47:01,080 Speaker 3: Yeah, I think that's absolutely right. The way that the 991 00:47:01,520 --> 00:47:06,279 Speaker 3: financial incentive structure has changed so dramatically, this relates to 992 00:47:06,560 --> 00:47:08,840 Speaker 3: people like Jimmy Fallon and entertainment shows. I talk a 993 00:47:08,840 --> 00:47:10,520 Speaker 3: lot about ESPN in the book, and the way 994 00:47:10,520 --> 00:47:13,920 Speaker 3: that's shifted for the big media outlets like CNN and the New 995 00:47:14,000 --> 00:47:16,319 Speaker 3: York Times. You know, there was once a time when 996 00:47:16,480 --> 00:47:18,719 Speaker 3: that business was just going to be printing money. You 997 00:47:18,760 --> 00:47:22,319 Speaker 3: could really go and build a giant audience and try 998 00:47:22,360 --> 00:47:24,120 Speaker 3: to attract as many people as you could, and you'd 999 00:47:24,120 --> 00:47:25,960 Speaker 3: maybe water it down a little bit so that 1000 00:47:26,000 --> 00:47:28,719 Speaker 3: you could just bring in everyone, not offend anyone. 1001 00:47:29,080 --> 00:47:31,759 Speaker 3: And that was the actual mentality.
But you know, when 1002 00:47:31,760 --> 00:47:34,120 Speaker 3: I interviewed Jimmy Fallon and his producers about this, and 1003 00:47:34,239 --> 00:47:36,520 Speaker 3: this was like two thousand and nine, this was early on, 1004 00:47:36,640 --> 00:47:39,920 Speaker 3: but they saw that this was changing, that there was 1005 00:47:40,000 --> 00:47:42,040 Speaker 3: no longer going to be these five or six million 1006 00:47:42,040 --> 00:47:44,279 Speaker 3: people watching at night. No, you could win that time 1007 00:47:44,280 --> 00:47:46,560 Speaker 3: slot with two million or one and a half million, 1008 00:47:46,920 --> 00:47:50,520 Speaker 3: and so you need to appeal to a smaller number of people, 1009 00:47:50,719 --> 00:47:53,000 Speaker 3: but people that love you, people that will have 1010 00:47:53,040 --> 00:47:55,920 Speaker 3: a much greater affinity to you. And so yeah, I 1011 00:47:55,960 --> 00:47:59,120 Speaker 3: think we absolutely saw that with the ESPNs of the world, 1012 00:47:59,280 --> 00:48:01,880 Speaker 3: with CNN, the New York Times. They don't 1013 00:48:01,880 --> 00:48:04,719 Speaker 3: care about alienating people on the right in some capacity, 1014 00:48:04,760 --> 00:48:08,600 Speaker 3: because the pool that they need to pull from is 1015 00:48:09,040 --> 00:48:12,319 Speaker 3: smaller and smaller. They're feeling the quicksand beneath their feet. 1016 00:48:12,320 --> 00:48:14,480 Speaker 3: They're just trying to grab onto the lowest-hanging fruit 1017 00:48:14,719 --> 00:48:16,719 Speaker 3: and hold on for dear life and make the 1018 00:48:16,760 --> 00:48:19,239 Speaker 3: most out of their time before everyone just shifts to 1019 00:48:19,320 --> 00:48:22,000 Speaker 3: streaming or some other platforms. So yeah, I think that's 1020 00:48:22,000 --> 00:48:26,000 Speaker 3: a huge problem, because in some ways institutional media is needed.
1021 00:48:26,000 --> 00:48:29,440 Speaker 3: We need media that has large resources and legal departments 1022 00:48:29,480 --> 00:48:31,520 Speaker 3: to fight the good fight, if they have their 1023 00:48:31,840 --> 00:48:34,160 Speaker 3: hearts in the right place, which obviously they don't. But 1024 00:48:34,480 --> 00:48:37,640 Speaker 3: when you have this current structure, they're going to make 1025 00:48:37,680 --> 00:48:40,200 Speaker 3: these bad business decisions because 1026 00:48:40,200 --> 00:48:43,000 Speaker 3: they no longer have to appeal to the masses. 1027 00:48:43,080 --> 00:48:45,279 Speaker 1: Well, right. And so, like, what I was thinking about is, like, 1028 00:48:45,320 --> 00:48:47,320 Speaker 1: how much of this is self-created? In other words, 1029 00:48:47,520 --> 00:48:50,440 Speaker 1: the media executives going all in with a certain audience, 1030 00:48:50,480 --> 00:48:53,200 Speaker 1: what they do is in effect create a self-licking 1031 00:48:53,200 --> 00:48:56,160 Speaker 1: ice cream cone, where yeah, they might have a really 1032 00:48:56,719 --> 00:48:59,160 Speaker 1: strong audience that they go all in on, but they 1033 00:48:59,200 --> 00:49:03,880 Speaker 1: all end up being, like, ideologically dogmatic and similar. And 1034 00:49:03,920 --> 00:49:05,840 Speaker 1: so what happens is, in a situation like at the 1035 00:49:05,880 --> 00:49:08,440 Speaker 1: New York Times where they run a Tom Cotton op-ed 1036 00:49:08,480 --> 00:49:10,640 Speaker 1: at the height of the Summer of Love, where Tom 1037 00:49:10,680 --> 00:49:13,400 Speaker 1: Cotton is advocating sending troops into cities to quell some 1038 00:49:13,440 --> 00:49:16,440 Speaker 1: of these violent riots.
You have people in the New 1039 00:49:16,520 --> 00:49:21,240 Speaker 1: York Times newsroom who are crying and freaking out 1040 00:49:21,719 --> 00:49:25,320 Speaker 1: and organizing in a way whereby they're firing editors 1041 00:49:25,400 --> 00:49:28,480 Speaker 1: and pulling the op-eds and apologizing to the public. 1042 00:49:28,480 --> 00:49:30,880 Speaker 1: Whereas if they just, like, maybe had a 1043 00:49:30,880 --> 00:49:34,800 Speaker 1: broader audience that was more intellectually diverse, maybe they wouldn't 1044 00:49:34,800 --> 00:49:37,200 Speaker 1: have had a problem with maybe losing the audience that 1045 00:49:37,200 --> 00:49:38,560 Speaker 1: they went all in on in the first place. 1046 00:49:38,600 --> 00:49:40,480 Speaker 2: Does that make sense? That's exactly right. 1047 00:49:40,560 --> 00:49:44,040 Speaker 3: Yeah. Intellectual diversity is what's really missing in a lot 1048 00:49:44,080 --> 00:49:46,759 Speaker 3: of these newsrooms. And I think that it takes 1049 00:49:46,800 --> 00:49:50,120 Speaker 3: pulling from different experiences. You don't even 1050 00:49:50,160 --> 00:49:52,360 Speaker 3: need to necessarily fill it with a bunch of MAGA people. 1051 00:49:52,440 --> 00:49:54,440 Speaker 3: You just have people that are open-minded, and that 1052 00:49:54,560 --> 00:49:57,400 Speaker 3: goes a long way. So yeah, I completely agree with you. 1053 00:49:58,160 --> 00:50:00,840 Speaker 3: I think that the Tom Cotton op-ed story, which I 1054 00:50:00,880 --> 00:50:03,879 Speaker 3: write about in Uncovered in chapter ten, is another one 1055 00:50:03,920 --> 00:50:06,360 Speaker 3: that's really emblematic of the problems with the press. And 1056 00:50:06,360 --> 00:50:09,280 Speaker 3: I talked to Shawn McCreesh, who was in the newsroom 1057 00:50:09,320 --> 00:50:10,880 Speaker 3: at the time.
He's now at New 1058 00:50:10,960 --> 00:50:14,520 Speaker 3: York magazine, and he was laying out that people were, 1059 00:50:16,080 --> 00:50:20,520 Speaker 3: you know, just so emotional about the publication of this op-ed. 1060 00:50:20,360 --> 00:50:22,680 Speaker 1: One dude, like, logged onto, like, one of the chats 1061 00:50:22,680 --> 00:50:24,160 Speaker 1: and was crying or something about it. 1062 00:50:25,880 --> 00:50:26,160 Speaker 2: Together. 1063 00:50:26,760 --> 00:50:29,160 Speaker 3: And why was he crying? Not because it was published, 1064 00:50:29,280 --> 00:50:31,719 Speaker 3: but because his friends wouldn't talk to him because he 1065 00:50:31,800 --> 00:50:33,920 Speaker 3: worked for the New York Times, which had dared 1066 00:50:33,920 --> 00:50:35,360 Speaker 3: to publish this horrible op-ed. 1067 00:50:36,040 --> 00:50:38,840 Speaker 2: I'm sorry, it's, like, absolutely ridiculous. 1068 00:50:38,920 --> 00:50:44,120 Speaker 4: Yes. So, yes, it's insane. What was 1069 00:50:44,200 --> 00:50:47,480 Speaker 4: then done publicly was this pressure campaign 1070 00:50:47,520 --> 00:50:49,319 Speaker 4: being done by the staffers of the New York Times 1071 00:50:49,400 --> 00:50:51,920 Speaker 4: to say publishing this op-ed puts the lives of 1072 00:50:52,000 --> 00:50:54,919 Speaker 4: black New York Times staffers in danger, as if they 1073 00:50:54,880 --> 00:50:57,560 Speaker 3: actually believe publishing an op-ed by a senator puts 1074 00:50:57,719 --> 00:51:00,480 Speaker 3: lives in danger. That's completely ridiculous. That shows how 1075 00:51:00,480 --> 00:51:03,719 Speaker 3: they don't even understand journalism. But you were able to 1076 00:51:03,760 --> 00:51:07,719 Speaker 3: see that this worked.
They got people fired, they 1077 00:51:07,800 --> 00:51:10,759 Speaker 3: got fewer voices like Tom Cotton to be published in 1078 00:51:10,760 --> 00:51:13,719 Speaker 3: the New York Times, or, you know, go even beyond that, 1079 00:51:13,760 --> 00:51:15,759 Speaker 3: Tom Cotton is a United States Senator. How about just 1080 00:51:16,160 --> 00:51:19,359 Speaker 3: anyone else? So they were able to really make this 1081 00:51:19,360 --> 00:51:22,640 Speaker 3: pressure campaign work. And that's a scary moment 1082 00:51:22,719 --> 00:51:25,279 Speaker 3: because the bosses were not able to stand firm in 1083 00:51:25,320 --> 00:51:26,040 Speaker 3: their principles. 1084 00:51:26,080 --> 00:51:28,040 Speaker 1: Well yeah, and they end up getting rid of the 1085 00:51:28,040 --> 00:51:30,799 Speaker 1: New York Times op-ed page and they just go 1086 00:51:30,880 --> 00:51:33,799 Speaker 1: with guest essays to give them a level of diffusion. Hey, 1087 00:51:33,800 --> 00:51:36,200 Speaker 1: this is just a guest, like, we don't actually believe 1088 00:51:36,239 --> 00:51:37,960 Speaker 1: in any of this stuff. Like, what the hell is 1089 00:51:38,000 --> 00:51:38,480 Speaker 1: that about? 1090 00:51:38,520 --> 00:51:41,200 Speaker 2: Like, how? That was one of the most cowardly things 1091 00:51:41,200 --> 00:51:41,920 Speaker 2: I've ever heard. 1092 00:51:42,600 --> 00:51:45,120 Speaker 1: And I mean, this is why this book is fascinating. 1093 00:51:45,160 --> 00:51:48,240 Speaker 1: It's Uncovered. You've got to get it if you're interested 1094 00:51:48,239 --> 00:51:51,239 Speaker 1: in politics. Whether you're a Democrat or Republican, it doesn't matter. 1095 00:51:51,280 --> 00:51:52,200 Speaker 2: If you're interested in 1096 00:51:52,200 --> 00:51:55,400 Speaker 1: why we are where we are right now as a 1097 00:51:55,440 --> 00:51:57,520 Speaker 1: country with regards to a lack of trust in the media.
1098 00:51:57,560 --> 00:51:59,279 Speaker 1: You've got to read this book. Steve's been in the 1099 00:51:59,280 --> 00:52:02,960 Speaker 1: game for a really long time. Okay, well, and you end the interview 1100 00:52:03,040 --> 00:52:05,920 Speaker 1: with, like, what do we do to get the hell 1101 00:52:05,920 --> 00:52:07,640 Speaker 1: out of here? And I'll tell you, like, I really 1102 00:52:07,719 --> 00:52:09,879 Speaker 1: like some of your recommendations. One of the things 1103 00:52:09,920 --> 00:52:11,880 Speaker 1: that Rick Grenell used to talk about as part of 1104 00:52:11,880 --> 00:52:16,720 Speaker 1: the Trump administration is breaking up, you know, concentrated power 1105 00:52:16,880 --> 00:52:18,840 Speaker 1: in Washington. In other words, like, let's 1106 00:52:18,880 --> 00:52:21,400 Speaker 1: take some of these cabinet-level departments, like 1107 00:52:21,440 --> 00:52:24,359 Speaker 1: the Department of Education, and put it in Idaho. Let's 1108 00:52:24,400 --> 00:52:26,839 Speaker 1: take the IRS out of Washington, D.C. and put 1109 00:52:26,840 --> 00:52:29,040 Speaker 1: it in Dallas, Texas. Spread it out so that they're 1110 00:52:29,080 --> 00:52:32,960 Speaker 1: more emblematic of the actual country that they serve. You 1111 00:52:33,080 --> 00:52:36,680 Speaker 1: kind of recommend that with the Acela media in New 1112 00:52:36,760 --> 00:52:38,400 Speaker 1: York City and in Washington, D.C., 1113 00:52:38,600 --> 00:52:39,799 Speaker 2: and not just, like, maybe 1114 00:52:39,600 --> 00:52:43,040 Speaker 1: hiring journalists that live in communities, or if they're not 1115 00:52:43,080 --> 00:52:46,640 Speaker 1: already from there, sending people to actually embed with those communities.
1116 00:52:46,880 --> 00:52:49,320 Speaker 1: And one of the best things I loved about it 1117 00:52:49,440 --> 00:52:52,279 Speaker 1: was finding journalists that, hey, look, 1118 00:52:52,400 --> 00:52:54,719 Speaker 1: maybe they don't have the right educational pedigree. 1119 00:52:54,840 --> 00:52:57,400 Speaker 1: Maybe they didn't go to Harvard School of Journalism or, 1120 00:52:57,440 --> 00:53:00,200 Speaker 1: in your case, Syracuse, but they have an attitude like, 1121 00:53:00,239 --> 00:53:02,880 Speaker 1: I don't give a fuck. Like, I'm covering the truth 1122 00:53:02,960 --> 00:53:06,320 Speaker 1: and I don't care what it costs me. I'm speaking 1123 00:53:06,320 --> 00:53:08,600 Speaker 1: truth to power, and look, I don't care if that 1124 00:53:08,640 --> 00:53:12,440 Speaker 1: person doesn't like me, but the American people, come 1125 00:53:12,520 --> 00:53:14,239 Speaker 1: hell or high water, can believe 1126 00:53:13,960 --> 00:53:17,480 Speaker 2: that this is the actual truth. Yeah, well, I think 1127 00:53:17,520 --> 00:53:18,840 Speaker 2: it's great. I love the idea. 1128 00:53:19,600 --> 00:53:21,880 Speaker 3: Well, I mean, not giving a fuck is, like, really 1129 00:53:21,880 --> 00:53:24,640 Speaker 3: what should be at the top for every journalist, and 1130 00:53:24,680 --> 00:53:26,320 Speaker 3: this is what you need to do as a journalist. 1131 00:53:27,000 --> 00:53:29,719 Speaker 3: People ask me, well, what's the skill set? I say, 1132 00:53:29,760 --> 00:53:31,680 Speaker 3: you know, it's not just about being curious. You need to 1133 00:53:31,680 --> 00:53:34,399 Speaker 3: be nosy. You know, nosy is a negative term. 1134 00:53:34,520 --> 00:53:37,239 Speaker 1: Like Nancy from Stranger Things. You ever see Stranger Things? 1135 00:53:37,280 --> 00:53:40,239 Speaker 1: She's like the young reporter.
She's like you, you're like 1136 00:53:40,480 --> 00:53:43,040 Speaker 1: the male version of Nancy. Like, Nancy's, like, in a 1137 00:53:43,120 --> 00:53:45,560 Speaker 1: newsroom with a bunch of men. They're, like, laughing 1138 00:53:45,600 --> 00:53:47,319 Speaker 1: her off and she's like, I'm not gonna let those 1139 00:53:47,400 --> 00:53:49,279 Speaker 1: men tell me what to do. And she's a hard 1140 00:53:49,320 --> 00:53:53,640 Speaker 1: news journalist, like, she's out there getting the stories. Just, yes, sorry, 1141 00:53:53,800 --> 00:53:55,480 Speaker 1: I mean, just to co-opt what you were saying. 1142 00:53:55,480 --> 00:53:57,399 Speaker 3: But it's, I've got to check things. 1143 00:53:57,480 --> 00:54:00,560 Speaker 2: I don't know, you've never seen Stranger Things until now? 1144 00:54:00,600 --> 00:54:03,600 Speaker 3: It's sad, I've been shamed for this on Twitter. 1145 00:54:03,640 --> 00:54:04,120 Speaker 3: I need to go. 1146 00:54:04,320 --> 00:54:07,319 Speaker 2: There's a journalism substory there, you'd be into it. 1147 00:54:07,680 --> 00:54:10,120 Speaker 3: See, that's why I love Succession, because it's about the 1148 00:54:10,120 --> 00:54:13,520 Speaker 3: media too, you know. I loved just watching that last night. 1149 00:54:14,200 --> 00:54:14,439 Speaker 2: Yeah. 1150 00:54:14,520 --> 00:54:17,359 Speaker 3: No, no, you need people that, it's a mentality. It's 1151 00:54:17,360 --> 00:54:20,840 Speaker 3: not about credentials, it's not even about experience. It's a mentality. 1152 00:54:20,880 --> 00:54:22,480 Speaker 3: And you need to find people.
And that's going to 1153 00:54:22,480 --> 00:54:24,960 Speaker 3: take people that don't care about how many Twitter followers 1154 00:54:25,000 --> 00:54:28,399 Speaker 3: they have or who they're offending, and also not moving them 1155 00:54:28,440 --> 00:54:30,200 Speaker 3: to places like New York and DC where they can 1156 00:54:30,280 --> 00:54:33,160 Speaker 3: just be sanitized from the background that they come from. 1157 00:54:33,360 --> 00:54:36,520 Speaker 3: Put people in places, put newsrooms in places. I completely 1158 00:54:36,520 --> 00:54:39,040 Speaker 3: agree with that, with what Grenell said. I think 1159 00:54:39,040 --> 00:54:42,200 Speaker 3: that goes a long way, and it's also about admitting 1160 00:54:42,200 --> 00:54:44,080 Speaker 3: when you get things wrong. And I think 1161 00:54:44,080 --> 00:54:45,920 Speaker 3: that that's a huge one. But also, you know, I 1162 00:54:46,160 --> 00:54:49,919 Speaker 3: describe how public editors or ombudsmen could go a long way. 1163 00:54:50,040 --> 00:54:52,880 Speaker 3: People whose job is to work in these newsrooms 1164 00:54:53,080 --> 00:54:55,040 Speaker 3: but to be the voice of the public, to be 1165 00:54:55,080 --> 00:54:57,480 Speaker 3: a check on these organizations. This is something that existed 1166 00:54:57,480 --> 00:54:59,960 Speaker 3: at places like The New York Times and The Washington Post 1167 00:55:00,120 --> 00:55:03,080 Speaker 3: and NPR, and they've mostly gone away because of, you know, 1168 00:55:03,120 --> 00:55:05,520 Speaker 3: this idea of, oh, we can't hold ourselves accountable, that 1169 00:55:05,560 --> 00:55:07,440 Speaker 3: might, you know, turn off the audience. No, the 1170 00:55:07,480 --> 00:55:09,640 Speaker 3: audience wants that. It wants someone who you can go 1171 00:55:09,719 --> 00:55:12,280 Speaker 3: to and trust, even if you don't trust the whole organization.
1172 00:55:12,360 --> 00:55:14,400 Speaker 3: That's a step in the right direction to start to 1173 00:55:14,440 --> 00:55:16,719 Speaker 3: win back some of the trust that they've completely, you know, 1174 00:55:16,800 --> 00:55:17,720 Speaker 3: just completely lost. 1175 00:55:18,280 --> 00:55:22,000 Speaker 1: I totally agree. And here you are, right? 1176 00:55:22,360 --> 00:55:25,600 Speaker 1: You're the executive producer of The Megyn Kelly Show, right? 1177 00:55:26,000 --> 00:55:27,640 Speaker 1: And you're still doing that. 1178 00:55:28,120 --> 00:55:28,520 Speaker 2: Yeah. 1179 00:55:28,600 --> 00:55:31,400 Speaker 3: Yeah, I mean, that's the day job for sure. Yeah. 1180 00:55:31,440 --> 00:55:35,000 Speaker 1: Well, it's like sometimes people find themselves in 1181 00:55:35,040 --> 00:55:39,120 Speaker 1: the exact right place. Like, Megyn seems like exactly that 1182 00:55:39,400 --> 00:55:42,919 Speaker 1: kind of a person, right? Where she doesn't really care 1183 00:55:43,040 --> 00:55:46,799 Speaker 1: what corporate media says, doesn't care what liberals or conservatives 1184 00:55:46,800 --> 00:55:49,480 Speaker 1: say. She's out there saying, like, well, DeSantis is 1185 00:55:49,520 --> 00:55:51,399 Speaker 1: afraid to actually come and do an interview with me. 1186 00:55:51,440 --> 00:55:54,000 Speaker 2: Like, by the way, that is awesome. I love that. 1187 00:55:54,640 --> 00:55:58,640 Speaker 1: I love that. America needs more people like you in 1188 00:55:58,719 --> 00:56:01,480 Speaker 1: the media, and like Megyn in the media. But this 1189 00:56:01,520 --> 00:56:03,960 Speaker 1: book is a roadmap if we want to make this 1190 00:56:04,000 --> 00:56:06,319 Speaker 1: system better. If I'm an executive at any one of 1191 00:56:06,360 --> 00:56:10,560 Speaker 1: these news companies, you're my first phone call, Steve. 1192 00:56:10,600 --> 00:56:13,120 Speaker 3: Well, thank you. I appreciate that.
And I will say, Megyn, 1193 00:56:13,160 --> 00:56:15,720 Speaker 3: you know, she's someone who's been in the corporate media at NBC, 1194 00:56:15,840 --> 00:56:17,879 Speaker 3: at Fox News. She knows what it's like much more 1195 00:56:17,880 --> 00:56:19,839 Speaker 3: than even I do, and so, I mean, it's 1196 00:56:20,040 --> 00:56:22,560 Speaker 3: been fantastic working with her. We definitely agree very 1197 00:56:22,680 --> 00:56:24,440 Speaker 3: much on a lot of the problems that exist 1198 00:56:24,800 --> 00:56:26,759 Speaker 3: in the corporate media structure right now. And I would 1199 00:56:26,760 --> 00:56:29,200 Speaker 3: also say, though, I really wrote the book for the 1200 00:56:29,320 --> 00:56:31,600 Speaker 3: public, in the sense that I want to 1201 00:56:31,680 --> 00:56:34,399 Speaker 3: lay out not just what happened over these last five 1202 00:56:34,400 --> 00:56:37,600 Speaker 3: to seven years, but why, and lay out some red flags, 1203 00:56:37,640 --> 00:56:40,160 Speaker 3: some things for you to look for, and say, okay, 1204 00:56:40,200 --> 00:56:42,240 Speaker 3: this is happening again, because it's going to continue 1205 00:56:42,239 --> 00:56:44,480 Speaker 3: to happen. So here's what you should be armed with 1206 00:56:44,640 --> 00:56:47,399 Speaker 3: to know when it happens in the future, and how 1207 00:56:47,440 --> 00:56:49,960 Speaker 3: to be skeptical, not totally distrust the media, but to 1208 00:56:50,000 --> 00:56:52,239 Speaker 3: be skeptical, and what to look for so you know, 1209 00:56:52,239 --> 00:56:55,040 Speaker 3: when it happens again, how you might be 1210 00:56:55,080 --> 00:56:58,080 Speaker 3: getting spun or lied to in this sort of propaganda 1211 00:56:58,080 --> 00:56:59,080 Speaker 3: push that's out there.
1212 00:56:59,600 --> 00:57:02,600 Speaker 1: I love it. Ladies and gentlemen, if you're listening or 1213 00:57:02,600 --> 00:57:06,040 Speaker 1: you're watching, get Uncovered. It's a great book. And 1214 00:57:06,040 --> 00:57:08,200 Speaker 1: Steve, what's next for you? 1215 00:57:08,280 --> 00:57:08,799 Speaker 2: Real quick. 1216 00:57:09,239 --> 00:57:11,439 Speaker 3: Well, thank you. You know, the Megyn Kelly show, 1217 00:57:11,640 --> 00:57:13,359 Speaker 3: that's where we're putting 1218 00:57:13,360 --> 00:57:15,520 Speaker 3: all our energy. I think it's going to be an 1219 00:57:15,560 --> 00:57:17,040 Speaker 3: interesting time. I think we've got a lot to 1220 00:57:17,040 --> 00:57:18,360 Speaker 3: cover this next stretch. 1221 00:57:18,200 --> 00:57:20,440 Speaker 1: She's an ass kicker, man, she's an ass kicker. 1222 00:57:20,800 --> 00:57:23,480 Speaker 1: It's a good place to be, you know. Sure. And 1223 00:57:23,480 --> 00:57:26,320 Speaker 1: so get the book Uncovered. Steve, thank you for 1224 00:57:26,400 --> 00:57:31,280 Speaker 1: joining us for almost an hour. I appreciate your time, man, 1225 00:57:31,440 --> 00:57:34,680 Speaker 1: and good luck with everything. Thanks, good talking. All right, 1226 00:57:34,720 --> 00:57:38,360 Speaker 1: take care, Steve. All right, everybody, that's a wrap. Steve 1227 00:57:38,440 --> 00:57:41,320 Speaker 1: is a pretty amazing guy. He's super smart. His book 1228 00:57:41,400 --> 00:57:42,960 Speaker 1: is great. You've got to run out and get it 1229 00:57:42,960 --> 00:57:46,200 Speaker 1: if you want to understand how we got to where 1230 00:57:46,280 --> 00:57:49,000 Speaker 1: we are in this country today with regards to the media. 1231 00:57:49,800 --> 00:57:53,280 Speaker 1: Read the book. It's just a behind-the-scenes 1232 00:57:53,360 --> 00:57:57,800 Speaker 1: look at everything.
And if you liked what you heard, 1233 00:57:57,960 --> 00:58:02,760 Speaker 1: subscribe to the Battleground podcast wherever you listen to podcasts. It's everywhere. 1234 00:58:03,440 --> 00:58:06,040 Speaker 1: Leave reviews if you feel so inclined. 1235 00:58:06,080 --> 00:58:07,120 Speaker 2: We need them. 1236 00:58:07,640 --> 00:58:10,880 Speaker 1: You can also subscribe to my brand new Rumble channel 1237 00:58:10,960 --> 00:58:13,720 Speaker 1: for exclusive content. I'm also on YouTube as well, at 1238 00:58:13,840 --> 00:58:17,640 Speaker 1: least for the time being, until I get censored or 1239 00:58:17,680 --> 00:58:23,120 Speaker 1: canceled from there. We'll see. Anyways, ladies and gentlemen, thank 1240 00:58:23,160 --> 00:58:26,160 Speaker 1: you so much for listening or watching this. None of 1241 00:58:26,200 --> 00:58:31,200 Speaker 1: this is possible without you. God bless you all, and 1242 00:58:31,280 --> 00:58:56,520 Speaker 1: God bless this amazing country that we live in. Take care,