Speaker 1: On this episode of Newt's World: America's corporate news media is less trusted than ever. According to the most recent Gallup poll, from October twenty twenty-one, trust in mass media is at its second-lowest point ever, with just thirty-six percent of Americans saying they have either a great deal or even a fair amount of trust in the media. And, unsurprisingly, there's a split along party lines: sixty-eight percent of Democrats trusted the media, while only eleven percent of Republicans did. In his new book Uncovered: How the Media Got Cozy with Power, Abandoned Its Principles, and Lost the People, Steve Krakauer gives readers a peek behind the curtain of the media challenges in America today. The book dives deep into some of the most important and egregious examples of the elite censorship collusion racket, like how tech suppression and media fear fueled the New York Post Hunter Biden email debacle before the twenty twenty election.
Speaker 1: Krakauer takes readers inside CNN after the election of President Trump, inside the New York Times after the Tom Cotton op-ed backlash, inside ESPN after the shift away from sports-only coverage, and more, revealing never-before-seen details about the press over the past five years. Here to talk about his new book, I'm really pleased to welcome my guest, Steve Krakauer. He is a terrific journalist; I read his material constantly. He is a media critic who's worked at CNN, Fox News, NBC, and The Blaze. He is the author of the Fourth Watch media newsletter, which I urge you to subscribe to, hosts the Fourth Watch podcast, and is the executive producer of The Megyn Kelly Show. Steve, welcome, and thank you for joining me on Newt's World.

Speaker 2: Thank you so much. It's always great talking with you.

Speaker 1: You know, you've had a remarkable career already. Can you talk a little bit about sort of the path you've taken in journalism?

Speaker 2: Sure, yeah, absolutely. I was an NBC page back in the day.
Speaker 2: I worked at Fox News early on in my career, but then I moved to a couple of outlets, Mediaite and TVNewser, which people that are real media insiders might recognize. I covered the media industry for many years working out of New York City; I was very much part of that media bubble. And then I ended up moving to CNN in twenty eleven, working for Piers Morgan at first and then the network more broadly. So there, in the twenty twelve election, my responsibility was essentially how does television live on social media, on the website, on digital properties. I did that for several years before moving to The Blaze, and physically moving out of New York to Dallas in that role. Then I had my own company for a little bit before I started Fourth Watch in December of twenty nineteen and began working with Megyn Kelly on The Megyn Kelly Show as well. And really the book came out of Fourth Watch, and out of my now-outsider, but former insider, perspective on what the media has been like over the last five to seven years.
Speaker 1: You know, you open your book with a prologue entitled "The Laptop." Why did you decide to start with the Hunter Biden laptop story?

Speaker 2: I really think that it was one of the most important stories of the last seven to eight years in showing not just where the media declined to, but the next step that it took, because I think it really represented the end of the Trump era of journalism. There were some major black eyes in journalism, and real lows in what happened during that era, but the Hunter laptop story was the beginning of something even worse, what I would describe as anti-speech activism: this effort by the media to suppress information. When you think about the media, of all occupations it should be the one that cares the most about freedom of speech, freedom of expression, and a free exchange of ideas. But instead it's gone completely in the opposite direction.
Speaker 2: And so I dug back into what happened in October of twenty twenty, because it's incredible to look at it now: the way that journalists, in partnership with tech platforms and their censors, and, as we now know thanks to the Twitter Files, intel agencies and other government entities, with the corporate media themselves as part of this elite censorship collusion racket, worked together to suppress this story, to plant in people's minds, the audience's minds, that this was likely Russian disinformation, and then years later to say, oh, actually, the laptop is real, and we've covered it. But in the moment, it was the way that they handled it. And what was most concerning to me was that the media's own colleagues at the New York Post were being censored in unprecedented ways. Journalists like Jake Sherman, who I write about, at Politico at the time, and Maggie Haberman of the New York Times: they shared a link to the New York Post story and said, oh, I question the sourcing on this, or, I wonder if the Biden campaign will respond. They weren't even sharing it positively.
Speaker 2: But they were attacked by their own colleagues, and they apologized for daring to link to this story. It was so embarrassing, and the fact that they didn't jump in and come to the defense of the New York Post really was disturbing to me. So I think it represented so much that was wrong with the press, and so much that's continued to get worse when it comes to censorship and suppression.

Speaker 1: It's interesting to me: if you go back and look at Theodore White's The Making of the President 1968, he has an entire chapter on how the press was going wrong, and on the role of the New York Times in particular in leading the press down a trail of bias. I mean, that's almost fifty years ago, and when you read that chapter, it could be today. It's just worse, but it's the same long-term pattern. I grew up in sort of the shadow of World War Two, when people like Edward R. Murrow had really created a sense of trust and a sense of belief that these were courageous folks out there covering the news.
Speaker 1: Watching the decay of the system has been sort of astonishing. What do you think drove this kind of behavior?

Speaker 2: I think that there have always been valid criticisms of the corporate media, the broader mainstream media. In fact, when I was at CNN from twenty ten to twenty thirteen, there were certainly valid criticisms of the network. I worked with a lot of people, and I know you worked there for a time also. When I was there, I would say the majority of people that worked there leaned left; if they voted, they probably voted for Democrats. That bias was always there. But something significant changed in more recent years. I actually trace it to twenty fourteen and what happened in Ferguson. Certainly the Trump primary process, and then again as he became president and what we saw then, played a big role in it. And I would also say some external factors have led to this. I think one of the five main problems with the press over these last few years has been Twitter.
Speaker 2: I think if Twitter didn't exist, we would have gotten a different media during the Trump years and since. There are several things that are really key about what happens with Twitter. You have to think about it for people that don't spend a lot of time on Twitter, and that's most people in this country: just two percent of American adults produce ninety percent of all tweets when it comes to news and politics. I mean, it is such a bubble, and yet that's where the media spends their time. And if they get fifty positive comments or fifty negative comments, it can feel so large, when in reality it's such a small cross-section of the country. But I quote in the book more than two dozen people on the record; everyone I speak to in the book is on the record. That was very important to me.
Speaker 2: And one of the people I talk to is a founder of a media company called TheWrap, and she tells me that she has seen her own journalists cover a story differently, or maybe not cover it at all, because of the fear of the reaction that their coverage might get on Twitter. And that's a huge problem. The other thing that I think has really changed now is that there was a time when objectivity was at least the goal. At the very least, maybe you missed, maybe you made mistakes, you had some bias, but you were trying to get it right; you were trying to be objective. And I quote another person, a longtime New York Times reporter, who goes on the record to say that she has seen younger members of newsrooms across the country who believe that objectivity is akin to white supremacy.
Speaker 2: And when you have a generation of people in newsrooms who believe that, if they start to win out, and I think in the Tom Cotton op-ed situation and others we are seeing the youngest people in the newsrooms become the most powerful, that is going to fundamentally change journalism forever.

Speaker 1: But why do you think that sort of ideological aggressiveness has won, in the sense of your point, which is that once upon a time editors had real power? Now editors cower and whimper in front of twenty-six-year-olds. What led to that?

Speaker 2: I think there are a couple of factors. I think one of the biggest ones is that the business itself has changed. Right? One of the big problems I write about is what I call the broken financial incentive structure. There was a time, and I write about ESPN, I write about the New York Times and CNN, when these were outlets that might have really great years and might have slightly less good years, but their businesses were printing money consistently; they were making profits on top of profits.
Speaker 2: But that has started to change, even if they were doing everything right. The viewing habits, the reading habits, the way people get their news is changing. The way people consume content is changing. And so suddenly, in this void that's created, because they no longer feel like they can simply do what's best for the bottom line, they're starting to make decisions that are based not on what's best for the business but on so many other factors. So that's definitely an element. But I also think it is what can happen on social media. Twitter is public, and so if you can get a certain small number of people, maybe in your newsroom, or activists working with these people on the outside, to exert pressure campaigns on the top levels, they will cower. We have seen it with the Tom Cotton op-ed, where they go and fire editors for no reason other than putting a point of view out there.
Speaker 2: But by labeling that point of view dangerous, and by doing it so publicly on social media, it can really affect the bosses if they do not have the capability to stand strong on their principles, and we're seeing more and more that they don't.

Speaker 1: And how much of that's because the owners won't back them up?

Speaker 2: I think it's a huge factor. Again, I think if the business was still strong, then they would be very quick to just dismiss these lower-level people and say, listen, you want to be a journalist? This is what journalists do. Journalists allow for a free exchange of ideas. But we have seen a strong turn against that. And I think there is something else here too, which is that in addition to an ideological bias, we have seen a geographic bias only getting more severe in recent years.
Speaker 2: This is the first big problem that I write about in Uncovered: we have newsrooms that were always based in New York and DC, but that bubble, what I call the Acela media bubble, is so much more pronounced now and so much more disconnected from the average citizen in this country. The people that run these companies, the journalists at the top of their game, have a general distrust of the public, of their own audience. And because they distrust their audience, they are so much more apt not to give that audience more perspectives or more ideas, because they don't think the audience can do what's right with that information. We saw all of this so much with COVID, which is why I spend so much time on COVID in my book. I think that's a huge problem. And as you mentioned at the beginning, the reaction is that the public trusts these media outlets so much less, and so it's a snowball effect, and who loses is the consumer.

Speaker 1: These people see themselves as warriors for their ideology, not as journalists trying to cover the truth.
Speaker 2: Absolutely, yeah. I think that we see that play out in so many different stories. I think COVID is a great example, though. You could talk about the policies when it comes to lockdowns and masks and vaccines, but let's just take the lab-leak theory as one example of this. Look at the way the people who even dared to say this is a possibility were treated, when the consensus, the medical consensus, the elite media consensus, was that it came from a wet market, that it was of natural origin. And then there were certain people saying, well, there's some evidence that it may have come from the lab in Wuhan. So we've got a couple of different points of view here. And I quote Nate Silver in the book, of ABC and FiveThirtyEight, who has, I think, a very apt quote here. He says that when there's an argument, and there's two sides, and there's evidence on both sides, and there's experts on both sides, but only one side is concerned with policing the discourse.
Speaker 2: Only one side is saying, we think this, and the other side can't even be heard. That's the side you should be skeptical of, and that's also the side that's often wrong. And so we've seen this in story after story, especially in the last three or four years, even post-Trump: the media, the corporate press, wants to suppress certain points of view, certain information, from the public. And the reason they're losing, they're losing in power and they're losing financially and in status, is that we have another group, independent media, through Substack, through podcasts like Joe Rogan's, who are putting out information and reaching an audience that's hungry for at least an alternate point of view, and who can see that they're not being served by the corporate press that used to at least try to serve them.

Speaker 1: The corporate media has declined in popular support; in Gallup's polling, the last time they had over fifty percent saying they trusted them was two thousand and three. They've had less than a majority of the country for twenty years now.
Speaker 1: It's a little harder for me to imagine quite how you lead a free society with that level of distrust. But on the other hand, there's the point you're making, which is that there are all of these alternatives, from talk radio to podcasts to newsletters like yours. I find myself fairly routinely learning more from newsletters than I do from any of the major corporate news media.

Speaker 2: On the corporate media versus more of the independent media, I look at the independent line. I'm a registered independent. For the most part here in Dallas, where I live, I interact mostly with people who are outside the industry, who a lot of times don't center their lives around politics and don't think about politics as a hobby. They would consider themselves independent, and they mirror what I see in these polls. The independent line in Gallup has completely fallen off a cliff. Once, independents were fairly close to the Democratic line on trust in the media; now it's almost identical to the Republican line. It was almost cut in half just since twenty seventeen.
Speaker 2: In another poll I saw, on trust in television news specifically, Republicans trusted television news at just eight percent, and independents trusted television news at just eight percent. So suddenly you've got a country of people, and I would say independents would describe themselves as people in the middle, but also people that are more apolitical and don't center their lives around politics, and they are catching on. Because, Newt, as you say, this goes back: there were reasons to distrust the press, or at least be skeptical, for a very long time. But in more recent years it's become so obvious to the average person. And we see examples now where, because it's playing out publicly, the curtain has been pulled back. The media treats social media, Twitter, like it's their diary, and yet it's public; everyone can see it now. And so it becomes so obvious what's actually happening that even the average person who is not overly political one way or the other, who maybe even wants to trust the media, is finding that they can't.
Speaker 2: And that's really what drove me to write the book. It's trying to be a little bit of a playbook, to say, listen, these are gatekeepers that don't need to be listened to, and here are some tools for you to sniff out when the press is lying to or misleading you in the future, some real red flags to look for. Because even if they had their hearts in the right place, this is not changing anytime soon.

Speaker 1: It's a fascinating process, because I think as long as the militants think that they are invulnerable, they're not going to change; they're just going to grow old. They'll become old militants instead of young militants. But my experiences predate yours. I mean, part of our effectiveness in the sixteen-year project to create the first Republican majority in forty years, one of the real factors, was CNN, which at that time was the only cable news channel. They had a show called Crossfire, and while Crossfire was a little bit biased, it wasn't wildly biased. You really did get a Republican and a Democrat, and you really did have a sort of public relations brawl and a debate.
324 00:17:22,240 --> 00:17:25,719 Speaker 1: CNN in that period, when Turner was still a dominating 325 00:17:25,800 --> 00:17:29,719 Speaker 1: factor, was a very interesting place. But nonetheless, you had 326 00:17:29,760 --> 00:17:33,720 Speaker 1: a sense that there was an adventurous, serious effort to 327 00:17:33,880 --> 00:17:38,440 Speaker 1: do things in a way that somehow changed, and for 328 00:17:38,680 --> 00:17:43,200 Speaker 1: some reason something crystallized in twenty fifteen and sixteen. Whether 329 00:17:43,200 --> 00:17:46,560 Speaker 1: it was because they really liked Hillary and they really 330 00:17:47,320 --> 00:17:51,520 Speaker 1: loathed Trump, or whatever it was, you could see them 331 00:17:51,640 --> 00:17:56,840 Speaker 1: go from sort of being liberal to being so rabidly 332 00:17:56,920 --> 00:18:00,919 Speaker 1: anti-Trump that anything which might hurt him was worth covering. 333 00:18:01,440 --> 00:18:04,480 Speaker 1: And in many ways they created him, because the weight 334 00:18:04,600 --> 00:18:07,480 Speaker 1: of the coverage drowned his opponents. So in the primaries, 335 00:18:08,040 --> 00:18:10,280 Speaker 1: they may have thought they were hurting him, but every 336 00:18:10,320 --> 00:18:12,440 Speaker 1: time they put his name up, the rest of his 337 00:18:12,560 --> 00:18:16,280 Speaker 1: opponents got smaller. Could you see the sea change underway? 338 00:18:16,800 --> 00:18:19,280 Speaker 1: I left CNN in twenty thirteen, and so I was 339 00:18:19,320 --> 00:18:21,000 Speaker 1: a little bit before that. Although I have to say, 340 00:18:21,040 --> 00:18:22,960 Speaker 1: you mentioned Crossfire.
I know you were there at 341 00:18:23,000 --> 00:18:26,240 Speaker 1: the relaunched Crossfire back around that time also, which again, 342 00:18:26,320 --> 00:18:28,880 Speaker 1: it feels like a quaint, almost different time back then, 343 00:18:28,920 --> 00:18:31,240 Speaker 1: and we're talking about less than ten years ago, when 344 00:18:31,440 --> 00:18:34,680 Speaker 1: you could still have dialogue and discourse and disagreement, and 345 00:18:35,000 --> 00:18:38,800 Speaker 1: there wasn't this demonization of, frankly, half the country, which 346 00:18:38,840 --> 00:18:41,639 Speaker 1: is where I think we ended up after that. But 347 00:18:41,840 --> 00:18:44,160 Speaker 1: my read on that, and I'd be curious what your reaction is, Steve. 348 00:18:44,600 --> 00:18:46,920 Speaker 1: My read on what happened to Crossfire is it didn't 349 00:18:46,960 --> 00:18:50,440 Speaker 1: work then, the objective, the sense of building a big audience. 350 00:18:50,800 --> 00:18:53,480 Speaker 1: And I think it's because people were already so deep 351 00:18:53,560 --> 00:18:57,080 Speaker 1: into fighting that having a show dedicated to that kind 352 00:18:57,119 --> 00:19:01,000 Speaker 1: of a debate was no longer particularly interesting, because that's 353 00:19:01,040 --> 00:19:03,480 Speaker 1: what was happening all day long. It was 354 00:19:03,520 --> 00:19:07,280 Speaker 1: a very brave effort to try to find a way 355 00:19:07,320 --> 00:19:10,000 Speaker 1: to actually have a level playing field. But that may 356 00:19:10,080 --> 00:19:12,399 Speaker 1: also be part of what hurt the show, because on 357 00:19:12,640 --> 00:19:15,119 Speaker 1: Fox people did not particularly want a level playing field. 358 00:19:16,280 --> 00:19:19,080 Speaker 1: On MSNBC, they don't want a level playing field. So 359 00:19:19,520 --> 00:19:22,800 Speaker 1: each audience wants their team to win. Yeah, well, maybe 360 00:19:22,840 --> 00:19:24,760 Speaker 1: that's right.
Maybe that's just the way that cable news 361 00:19:24,840 --> 00:19:26,639 Speaker 1: has evolved. But you know, my theory on 362 00:19:26,840 --> 00:19:29,040 Speaker 1: what happened with Trump, which I lay out in Uncovered, 363 00:19:29,520 --> 00:19:31,600 Speaker 1: is that it's really three factors. First of all, it 364 00:19:31,760 --> 00:19:35,080 Speaker 1: was business. Look, he was undoubtedly great for business. We 365 00:19:35,160 --> 00:19:39,280 Speaker 1: saw it in the primary: viewers, readers, clicks. All of that 366 00:19:39,800 --> 00:19:41,639 Speaker 1: was driven by him. I think that was certainly an 367 00:19:41,680 --> 00:19:43,680 Speaker 1: element of it. And then it was personal. It was 368 00:19:43,760 --> 00:19:48,000 Speaker 1: business and personal, because Jeff Zucker was made at NBC 369 00:19:48,280 --> 00:19:50,479 Speaker 1: by Donald Trump. He didn't have a hit show when 370 00:19:50,520 --> 00:19:53,040 Speaker 1: he was the head of NBC Entertainment until The Apprentice 371 00:19:53,080 --> 00:19:56,000 Speaker 1: came along, and he was then intertwined with Donald Trump 372 00:19:56,080 --> 00:19:58,639 Speaker 1: to the point where, without Donald Trump, there is no 373 00:19:58,800 --> 00:20:01,960 Speaker 1: Jeff Zucker. Gayle King and Katie Couric and 374 00:20:02,119 --> 00:20:05,200 Speaker 1: Chris Matthews, these were all at Donald Trump's wedding in 375 00:20:05,240 --> 00:20:07,760 Speaker 1: two thousand and five, not that long ago, and so 376 00:20:08,080 --> 00:20:10,439 Speaker 1: these people, he was part of that world, and then 377 00:20:10,480 --> 00:20:12,159 Speaker 1: he became a turncoat to that world. So I think 378 00:20:12,200 --> 00:20:14,280 Speaker 1: there was a personal side of it also.
But then 379 00:20:14,359 --> 00:20:16,440 Speaker 1: the third factor, which I think is important to note, 380 00:20:16,560 --> 00:20:18,720 Speaker 1: is that there were people at CNN, and I talked 381 00:20:18,720 --> 00:20:21,479 Speaker 1: to them during the Trump years, who truly believed they 382 00:20:21,520 --> 00:20:25,399 Speaker 1: were in this existential fight for democracy against Donald Trump, 383 00:20:25,560 --> 00:20:28,959 Speaker 1: and that they were saving democracy by standing up against him. 384 00:20:29,040 --> 00:20:31,800 Speaker 1: Now I find that to be completely ridiculous. But what 385 00:20:31,920 --> 00:20:33,920 Speaker 1: I would say to them is, if you believe that, 386 00:20:34,359 --> 00:20:36,440 Speaker 1: if you really did believe that, then that's when you 387 00:20:36,480 --> 00:20:40,359 Speaker 1: should double down on your standards and editorial practices. That's 388 00:20:40,400 --> 00:20:42,159 Speaker 1: when you need to, if you want to convince the 389 00:20:42,320 --> 00:20:45,440 Speaker 1: public, his supporters, that he's really this danger, you need 390 00:20:45,480 --> 00:20:47,800 Speaker 1: to be extra sure that you're standing up for your 391 00:20:47,880 --> 00:20:50,960 Speaker 1: journalistic principles. But instead they went the other direction, and 392 00:20:51,000 --> 00:20:53,520 Speaker 1: they decided, okay, this is such a unique threat 393 00:20:53,800 --> 00:20:55,800 Speaker 1: that we need to take the guardrails off completely. 394 00:20:55,880 --> 00:20:58,120 Speaker 1: We needed two sources? Now maybe just one. Maybe 395 00:20:58,680 --> 00:21:00,320 Speaker 1: we can just use The New York Times as a 396 00:21:00,359 --> 00:21:03,080 Speaker 1: source, someone who heard something from Mueller, and let's just 397 00:21:03,200 --> 00:21:05,680 Speaker 1: run with that for twenty-four hours.
They completely abandoned 398 00:21:05,720 --> 00:21:08,199 Speaker 1: their principles in the service of this fight that they 399 00:21:08,240 --> 00:21:10,800 Speaker 1: believed they were in, and that's what led to 400 00:21:10,840 --> 00:21:28,600 Speaker 1: their decline. Also, as you described in the book, a 401 00:21:28,800 --> 00:21:33,560 Speaker 1: precursor was the way Benghazi was essentially either ignored or 402 00:21:33,640 --> 00:21:36,840 Speaker 1: covered up, overwhelmingly, by the media, so that no 403 00:21:36,920 --> 00:21:41,200 Speaker 1: matter what information was coming out about, for example, Ambassador 404 00:21:41,320 --> 00:21:45,119 Speaker 1: Stevens's effort to get more security, warning that they just 405 00:21:45,320 --> 00:21:48,920 Speaker 1: didn't have enough security, which, had it penetrated, 406 00:21:48,960 --> 00:21:53,760 Speaker 1: would have hurt Hillary Clinton pretty dramatically, somehow it was all 407 00:21:53,880 --> 00:21:56,760 Speaker 1: just smothered. It was one of these red flag moments 408 00:21:56,800 --> 00:21:59,160 Speaker 1: for me that I do write about in chapter six 409 00:21:59,240 --> 00:22:01,600 Speaker 1: of Uncovered, which was about my time at CNN, and 410 00:22:01,680 --> 00:22:03,720 Speaker 1: this happened, you know, Benghazi happened when I was there. 411 00:22:04,280 --> 00:22:07,080 Speaker 1: And it's one of these complicated factors with how we 412 00:22:07,160 --> 00:22:09,360 Speaker 1: think about the media, because I have to say, CNN 413 00:22:10,119 --> 00:22:13,400 Speaker 1: broke many of the Benghazi stories.
They broke it. Arwa Damon 414 00:22:13,720 --> 00:22:17,560 Speaker 1: walked into that unsecured compound and found Christopher Stevens's diary 415 00:22:17,840 --> 00:22:20,159 Speaker 1: just lying about, the military having not gone in 416 00:22:20,240 --> 00:22:22,280 Speaker 1: and gotten it, and that's how we learned a lot 417 00:22:22,320 --> 00:22:24,639 Speaker 1: about how he had called for security and it was denied 418 00:22:24,680 --> 00:22:27,560 Speaker 1: by Hillary Clinton and the State Department, and then we broke 419 00:22:27,640 --> 00:22:30,520 Speaker 1: stories about the CIA and how they were told to 420 00:22:30,600 --> 00:22:33,480 Speaker 1: stand down. And so we broke these stories when I 421 00:22:33,560 --> 00:22:36,000 Speaker 1: was at CNN, and then what would happen is I 422 00:22:36,040 --> 00:22:38,640 Speaker 1: would watch the briefing the next day and no one 423 00:22:38,800 --> 00:22:41,479 Speaker 1: would ask Jay Carney about it, not even the CNN 424 00:22:41,600 --> 00:22:43,159 Speaker 1: reporter in that room, and so then it would just 425 00:22:43,240 --> 00:22:45,159 Speaker 1: sort of fizzle out. And that was a 426 00:22:45,240 --> 00:22:48,200 Speaker 1: really big eye-opener for me: here's something that, 427 00:22:48,480 --> 00:22:51,400 Speaker 1: for a hungry journalistic entity, and I'm talking about the entire 428 00:22:51,520 --> 00:22:54,080 Speaker 1: press, not just CNN, but across the board, should be 429 00:22:54,200 --> 00:22:57,280 Speaker 1: a story you want to pursue. This is possible Pulitzer 430 00:22:57,680 --> 00:23:00,800 Speaker 1: material here, and yet that didn't happen. In fact, what 431 00:23:00,960 --> 00:23:03,919 Speaker 1: I found is, look at what happened with Edward Snowden 432 00:23:03,960 --> 00:23:06,520 Speaker 1: and Glenn Greenwald, or some of the stories Matt Taibbi 433 00:23:06,600 --> 00:23:08,640 Speaker 1: wrote for Rolling Stone.
Glenn Greenwald won a Pulitzer 434 00:23:08,680 --> 00:23:10,240 Speaker 1: when he was at The Guardian. Well, all these 435 00:23:10,320 --> 00:23:13,800 Speaker 1: people now have been relegated. They've been excommunicated from the 436 00:23:13,880 --> 00:23:15,960 Speaker 1: corporate press that they were once a part of and 437 00:23:16,119 --> 00:23:19,520 Speaker 1: are now outside, not because they've changed, but because the 438 00:23:19,600 --> 00:23:22,879 Speaker 1: corporate press structure has changed so dramatically that they no 439 00:23:23,040 --> 00:23:25,159 Speaker 1: longer have a place in it. I don't think the 440 00:23:25,359 --> 00:23:30,680 Speaker 1: people in the corporate media allow themselves to be self- 441 00:23:30,760 --> 00:23:35,800 Speaker 1: aware enough to realize how counterproductive they are. I just 442 00:23:35,880 --> 00:23:38,840 Speaker 1: sent a newsletter at Gingrich 360 on the notion 443 00:23:38,960 --> 00:23:43,560 Speaker 1: that the haters are actually helping Trump. Mark Halperin has 444 00:23:43,600 --> 00:23:46,920 Speaker 1: this line he's been using recently: Trump can't come 445 00:23:47,000 --> 00:23:50,680 Speaker 1: back because he never left. Which, you know, if you 446 00:23:50,800 --> 00:23:53,560 Speaker 1: watch the recent coverage, you have all sorts of major 447 00:23:53,680 --> 00:23:57,679 Speaker 1: world things going on, Xi Jinping in Moscow with Putin, 448 00:23:57,760 --> 00:24:00,719 Speaker 1: for example, but whether or not there will be an 449 00:24:00,800 --> 00:24:04,680 Speaker 1: indictment by a district attorney in Manhattan? Oh, a former president 450 00:24:05,200 --> 00:24:09,000 Speaker 1: just absorbs the media at a level that is goofy. 451 00:24:09,720 --> 00:24:12,680 Speaker 1: It's absurd how much media he gets. It is. I 452 00:24:12,760 --> 00:24:14,560 Speaker 1: describe it as an addiction.
I think that they are 453 00:24:14,600 --> 00:24:18,080 Speaker 1: addicted to Donald Trump, and that there are side effects 454 00:24:18,200 --> 00:24:20,000 Speaker 1: when you're addicted to him. And they may have gone 455 00:24:20,000 --> 00:24:22,119 Speaker 1: away a little bit, but they need their fix. I 456 00:24:22,200 --> 00:24:24,680 Speaker 1: think that they will easily go back and give in to 457 00:24:24,720 --> 00:24:27,720 Speaker 1: that addiction again as he begins his run now 458 00:24:27,760 --> 00:24:30,800 Speaker 1: for twenty twenty four. Frankly, there's a few topics that 459 00:24:30,960 --> 00:24:32,879 Speaker 1: I didn't really get to dig into as much as 460 00:24:32,920 --> 00:24:35,080 Speaker 1: I would have liked in Uncovered. One of those is 461 00:24:35,160 --> 00:24:38,320 Speaker 1: January sixth, and I do think that January sixth became 462 00:24:38,400 --> 00:24:41,200 Speaker 1: this stand-in for Trump. Trump was gone, but then 463 00:24:41,240 --> 00:24:43,560 Speaker 1: they could find a way to talk about him again, 464 00:24:43,600 --> 00:24:46,480 Speaker 1: because with January sixth you had a way of 465 00:24:47,240 --> 00:24:49,960 Speaker 1: demonizing half the country, which is basically what I believe 466 00:24:50,119 --> 00:24:53,320 Speaker 1: the coverage of January sixth became: in the service of 467 00:24:53,400 --> 00:24:55,960 Speaker 1: demonizing Donald Trump and making it all about him. The 468 00:24:56,040 --> 00:25:00,119 Speaker 1: whole January sixth committee show, that was produced by an 469 00:25:00,119 --> 00:25:03,240 Speaker 1: ABC News executive, I mean, it was completely ridiculous, and 470 00:25:03,560 --> 00:25:05,520 Speaker 1: that was like their holdover. They could get their fix 471 00:25:05,960 --> 00:25:09,040 Speaker 1: through January sixth until Donald Trump came back into the spotlight.
472 00:25:09,119 --> 00:25:11,320 Speaker 1: And yeah, he's back again, and they are just more 473 00:25:11,359 --> 00:25:14,399 Speaker 1: than happy to do exactly what they did in twenty fifteen. Frankly, 474 00:25:14,480 --> 00:25:16,520 Speaker 1: I believe, because they'd love for him to be the 475 00:25:16,600 --> 00:25:19,240 Speaker 1: nominee again, because, just like in twenty sixteen, they believe 476 00:25:19,280 --> 00:25:21,560 Speaker 1: he'll be easy to beat, and he very easily could 477 00:25:21,600 --> 00:25:24,640 Speaker 1: win again. You would think by now they would begin 478 00:25:24,720 --> 00:25:28,800 Speaker 1: to figure out that this guy is a remarkably dangerous candidate, 479 00:25:29,320 --> 00:25:32,040 Speaker 1: and I mean dangerous in the sense that he can win, 480 00:25:32,880 --> 00:25:36,440 Speaker 1: and so they sort of cheerfully prop him up. And 481 00:25:36,640 --> 00:25:39,520 Speaker 1: he doesn't mind that you're attacking him as long as 482 00:25:39,600 --> 00:25:42,120 Speaker 1: he's the center of attention. He actually says, I think 483 00:25:42,160 --> 00:25:44,800 Speaker 1: it's in his first book, The Art of the Deal, 484 00:25:45,320 --> 00:25:49,920 Speaker 1: that he bought this building which had a really beautiful facade, 485 00:25:50,440 --> 00:25:52,720 Speaker 1: and he wanted to keep the facade, and announced he 486 00:25:52,800 --> 00:25:56,600 Speaker 1: wanted to keep it, and then the engineers looked at 487 00:25:56,640 --> 00:25:58,399 Speaker 1: what it would cost, and it turned out to be 488 00:25:58,600 --> 00:26:02,320 Speaker 1: millions of dollars. So he gave up and announced that, 489 00:26:02,400 --> 00:26:04,040 Speaker 1: you know, I really would love to keep it, but 490 00:26:04,119 --> 00:26:06,520 Speaker 1: I can't afford it. Well, The New York Times went 491 00:26:06,600 --> 00:26:09,800 Speaker 1: after him on page one because he was destroying it.
492 00:26:11,000 --> 00:26:13,920 Speaker 1: But he said what he learned was, within two days 493 00:26:14,600 --> 00:26:18,200 Speaker 1: the condominiums in that building all sold out, because 494 00:26:18,520 --> 00:26:21,960 Speaker 1: people suddenly learned what he was building. I think 495 00:26:22,000 --> 00:26:24,440 Speaker 1: that was one of his early lessons: just keep 496 00:26:24,520 --> 00:26:27,560 Speaker 1: writing about me, I don't care. Exactly. It's so funny 497 00:26:27,560 --> 00:26:29,679 Speaker 1: you say that. During the campaign, I read The Art 498 00:26:29,720 --> 00:26:31,600 Speaker 1: of the Deal, this book he wrote in nineteen eighty seven, 499 00:26:31,640 --> 00:26:33,560 Speaker 1: and I think that if the media had read the book, 500 00:26:34,000 --> 00:26:35,840 Speaker 1: I think they would have understood what was going on 501 00:26:36,000 --> 00:26:37,920 Speaker 1: a lot better. They would not have been played in 502 00:26:37,960 --> 00:26:40,639 Speaker 1: the way that they were, because they totally fed into it. 503 00:26:40,840 --> 00:26:42,359 Speaker 1: He told another story in the book which I just 504 00:26:42,440 --> 00:26:45,080 Speaker 1: think is so representative of what happens there, of another 505 00:26:45,160 --> 00:26:48,040 Speaker 1: project that was delayed, and they weren't able to get 506 00:26:48,040 --> 00:26:50,240 Speaker 1: anything done, but he had some investors coming, and so 507 00:26:50,400 --> 00:26:52,640 Speaker 1: what he told the dump trucks and the other equipment 508 00:26:52,760 --> 00:26:55,479 Speaker 1: was: just drive around and move some dirt from one 509 00:26:55,520 --> 00:26:57,280 Speaker 1: place to the other, and just look 510 00:26:57,280 --> 00:26:59,840 Speaker 1: like you're working.
And by doing that, by 511 00:27:00,160 --> 00:27:02,960 Speaker 1: looking like things were happening, he was able to get coverage that 512 00:27:03,080 --> 00:27:05,359 Speaker 1: then led to more investment, and move on. And I mean, 513 00:27:05,400 --> 00:27:08,760 Speaker 1: he's just very transparent about that, and he totally plays 514 00:27:08,760 --> 00:27:10,840 Speaker 1: the media in ways that they are in some ways 515 00:27:10,880 --> 00:27:14,680 Speaker 1: willing participants in, because they benefit financially from it. Also, I'm 516 00:27:14,760 --> 00:27:16,680 Speaker 1: sure if we get into the thick of it and 517 00:27:16,800 --> 00:27:19,280 Speaker 1: Donald Trump's there every day, they will get the ratings 518 00:27:19,320 --> 00:27:21,800 Speaker 1: they have been craving, and they will get the clicks 519 00:27:21,800 --> 00:27:24,200 Speaker 1: that they have lost, and the subscribers The Washington Post 520 00:27:24,240 --> 00:27:27,280 Speaker 1: has been hemorrhaging. So it will happen again. It's going 521 00:27:27,359 --> 00:27:30,560 Speaker 1: to be just another redux of what happened in twenty sixteen, 522 00:27:30,560 --> 00:27:32,640 Speaker 1: it feels like. I want to thank you. I think 523 00:27:33,359 --> 00:27:38,520 Speaker 1: you may be the best overall critic of modern media 524 00:27:38,680 --> 00:27:41,040 Speaker 1: in the entire country. I know I'll want to 525 00:27:41,080 --> 00:27:44,200 Speaker 1: continue reading your material. I hope you're going to continue 526 00:27:44,200 --> 00:27:47,879 Speaker 1: to be as open, out there challenging everybody. You're one of 527 00:27:47,920 --> 00:27:50,920 Speaker 1: the handful of people I look to and think, you 528 00:27:51,040 --> 00:27:53,960 Speaker 1: actually try to pursue the facts, and you try to 529 00:27:54,000 --> 00:27:56,560 Speaker 1: bring things out into the open. So I'm going to thank 530 00:27:56,640 --> 00:28:00,119 Speaker 1: you for joining me.
I really applaud your efforts. I 531 00:28:00,200 --> 00:28:03,720 Speaker 1: think what you're doing is an extraordinarily important act of citizenship, 532 00:28:04,160 --> 00:28:07,400 Speaker 1: and I urge everybody to both sign up for your newsletter, 533 00:28:07,480 --> 00:28:10,480 Speaker 1: but I also urge them to get your new book, Uncovered: 534 00:28:10,800 --> 00:28:14,480 Speaker 1: How the Media Got Cozy with Power, Abandoned Its Principles, 535 00:28:14,760 --> 00:28:16,840 Speaker 1: and Lost the People. We're going to have it on 536 00:28:16,880 --> 00:28:19,720 Speaker 1: our show page so listeners can order a copy. So, Steve, 537 00:28:20,160 --> 00:28:22,399 Speaker 1: thank you for joining me on Newt's World. Thank you 538 00:28:22,440 --> 00:28:26,200 Speaker 1: so much. I really appreciate it. Thank you to my 539 00:28:26,280 --> 00:28:28,560 Speaker 1: guest Steve Krakauer. You can get a link to buy 540 00:28:28,680 --> 00:28:32,280 Speaker 1: his new book Uncovered on our show page at newtsworld 541 00:28:32,320 --> 00:28:36,560 Speaker 1: dot com. Newt's World is produced by Gingrich 360 and iHeartMedia. 542 00:28:36,880 --> 00:28:40,720 Speaker 1: Our executive producer is Guernsey Sloan. Our producer is Rebecca Howell, 543 00:28:41,000 --> 00:28:44,000 Speaker 1: and our researcher is Rachel Peterson. The artwork for the 544 00:28:44,040 --> 00:28:47,440 Speaker 1: show was created by Steve Penley. Special thanks to the team 545 00:28:47,480 --> 00:28:51,040 Speaker 1: at Gingrich 360. If you've been enjoying Newt's World, I hope 546 00:28:51,040 --> 00:28:53,680 Speaker 1: you'll go to Apple Podcasts and both rate us with 547 00:28:53,800 --> 00:28:57,040 Speaker 1: five stars and give us a review, so others can 548 00:28:57,160 --> 00:29:00,200 Speaker 1: learn what it's all about.
Right now, listeners of Newt's 549 00:29:00,240 --> 00:29:03,440 Speaker 1: World can sign up for my three free weekly columns 550 00:29:03,640 --> 00:29:07,680 Speaker 1: at Gingrich three sixty dot com slash newsletter. I'm Newt 551 00:29:07,760 --> 00:29:09,680 Speaker 1: Gingrich. This is Newt's World.