Tudor Dixon: You're listening to the Tudor Dixon Podcast on the Clay and Buck Podcast Network. Hello, and welcome to the Tudor Dixon Podcast. I'm Tudor Dixon, and I'm excited to have you join me today. We've got an interesting show for you, because it's something that affects all of you, no matter who you are or how you live. Today we're covering the media itself. My guest considers the media the fourth estate of government. So yes, if you weren't aware, there is a group operating as a fourth branch of our government: the media. It's criticized by both sides of the aisle, but it takes someone who's actually been on the inside, and on both sides of the inside for that matter, to really break it down for us. Let's get right to it, because there is certainly a lot to cover here. My guest today is a journalist and media critic who has worked at CNN, Fox News, NBC, and The Blaze, so that's quite an array of media. He authors the Fourth Watch media newsletter, hosts the Fourth Watch podcast, and is the executive producer of The Megyn Kelly Show: Steve Krakauer, author of Uncovered. Welcome to the podcast.

Steve Krakauer: Hey, Tudor, great to be here.

Tudor Dixon: It is great to have you. Now, in this new book that you have, Uncovered, you're not trying to burn down the media. You're just telling us what's really going on behind the scenes, right?

Steve Krakauer: Absolutely. Yeah. Look, I'm someone who believes that, as you mentioned, the media is the Fourth Estate. We need a strong press, an institutional press. I think we have lots of strong independent media out there, which is great, and with the rise of podcasts and Substack and YouTube there are a lot of opportunities, and I'm glad for that. I think that more choice is better for the American people. But in a perfect world, we have a really strong institutional press as well, one that speaks truth to power, that holds power accountable, that isn't cozy with power but instead is really a check on power. We don't have that. And I can say, you know, you mentioned CNN. I was at CNN twenty ten to twenty thirteen.
It was not that long ago, but it feels like ages ago, because really in the subsequent years, twenty fourteen, twenty fifteen, it just completely went off the rails, not just CNN but the entire corporate media, over the last seven years. There was valid criticism before, but something fundamental has changed, and so I think it was really important to document what happened and why it happened, and really to tell the readers, the audience, and potential news consumers in the future: here are the red flags, here's what you need to look for, because this is what happened, and now you know what to look for down the road when it happens again, because it will.

Tudor Dixon: I've heard the news described as infotainment, and I want to get to that, because we now have a twenty-four-hour news cycle. That's something that only started in the last twenty years or so, I'd say. I mean, when I was a kid, I remember my parents watching the evening news, and that was where they got their news.
When you have twenty-four hours of news, doesn't there have to be some amount of shows and commentary, and not necessarily journalism, going on all of the time in those twenty-four-hour news cycles?

Steve Krakauer: Absolutely. I think the incentive structure has changed also. So yes, even when I was there, you know, ten, fifteen years ago, there was this twenty-four-hour news cycle, and yes, things move fast. Get something wrong? Oh, it's gone. By the next day, we're onto the next story. There was lots more entertainment, whatever the incentives driving it: oh, this is fun, this will get ratings, this will get clicks. There's a lot of that happening. But now, with the introduction of social media and particularly Twitter, these things move within hours. Something that's bubbling up for an hour can be gone, you know, a little bit later, and that really leaves room for what I call glance journalism.
We get an initial story about lots of different things, whether it's political or cultural, or just some random story that you find out in the news, and then it's gone. And the way it was originally covered, which oftentimes, whether it's a mistake or whether it's intentional, is wrong, has left the audience with a bad impression, and then maybe down the road it gets corrected. We see it on a big scale, like with the Hunter Biden laptop. We see it on a small scale with so many stories that I lay out in Uncovered, and that really, really hurts the audience, because they are not served by a media that is so interested in getting that initial coverage, that initial burst, and then moving on to the next thing.

Tudor Dixon: Well, can't this really impact the entire country? Because, I mean, you bring up the Hunter Biden laptop, and I kind of want to go through that for just a second. This comes out, we call this the October Surprise, right? This is potentially going to change the election. The October Surprise gets gobbled up and pulled off of all of the media lines. Everybody says you can't talk about it. Not only does it get removed; if you talk about it, you are demonized. This is bad. You can't talk about this. But now fast forward two, two and a half years later, and we see President Xi meeting with Putin, talking about what they're going to do to make "peace" in the world, and I have to use air quotes when I say that, because this is really looking like world domination. And you have a president in the United States who, according to this laptop, is quite possibly compromised by the Chinese government. What does that mean, when you see media take a turn like that?

Steve Krakauer: It's a really important moment. I think it was a bit of a turning point. I write a lot in the book about the Trump era, and a lot of the mistakes that were made, a lot of the problems, why they happened, twenty fifteen to twenty twenty.
But I really do think that that Hunter Biden laptop story was the end of that era and the beginning of an even worse era that we've seen since then, which I think is really built around anti-speech activism, censorship, and the press, the media, the ones who should be about free speech and free expression, being part of the collusion racket between tech platforms, the censors at the center of this, government agencies, intel agencies, and the media themselves, all working together. Because yes, as you say, there was a time, and it was unprecedented: you could not share a link to the New York Post, and that was completely ridiculous. We now know, thanks to the Twitter Files, why that happened. But as I detail in the book, I really went back and looked at exactly what happened. Journalists would share this story from the New York Post and say, oh, I question the sourcing on this, or, I wonder if the Biden campaign will respond to this, and they were locked out of their Twitter accounts because they shared it.
And then, rather than saying, this is completely ridiculous, the New York Post are our colleagues, no, they deleted their tweets. They apologized for daring to link to this story. They were pilloried. They were trending on Twitter as MAGA journalists, these New York Times reporters, just for sharing a link to it. That was an embarrassing moment, and it really began this drift towards censorship, whether it's political, whether it's related to COVID and this kind of consensus, we can't go against the science consensus, which obviously we now know was completely wrong. They were in on it, and that hurts Americans. I mean, we need a press that cares about the free exchange of ideas.

Tudor Dixon: There are people who would say part of this is Trump's fault because he went after the media so hard. But on the flip side of that, I think most people had this overwhelming trust in the media before Trump was president, and then he started to say, hey, wait, this is happening, this is happening, and everybody said, no, he's nuts, he's nuts. And then, all of a sudden, that actually started to come true. Yes, they were spying on him. Remember that? When they said, oh, he's crazy, he thinks they're spying on him. Yes, they are. I mean, going through a campaign myself, I saw similar things. They would put something out, like, to your point, you said a story can go out and then the next day it disappears, or it can be wrong and there's nobody correcting it. I mean, we had that happen multiple times, and it really shaped the entire campaign in the state of Michigan. I had an interview where I was asked, what do you think about if a fourteen-year-old ended up getting pregnant? And I said, that's the perfect example of why it would be so dangerous to eliminate parental consent.
That journalist cut that, took out why I said it, and put it together as: "That's the perfect example. A life is a life. She has to have that baby." And this was a total lie. It framed the entire campaign in Michigan, and ultimately I believe that is what allowed her to win the state back. And look at the state of Michigan today. We have no businesses coming here. We're giving money to the Chinese Communist Party. We've just found out that by giving money to this company, they are going to have a grassroots organization in the state of Michigan that will be beholden to the Chinese Communist Party and helping the Chinese Communist Party in the state of Michigan. These are really shocking discoveries that we're having. And this is all because, really, I believe the media is able to create a story that's not there and burn people down. What's your take on that?

Steve Krakauer: Yeah, I think that's a great example, because it's a sign of how things have changed. In the old days, and I mean the old days like fifteen, twenty years ago, I can tell you a lot of the media leaned left.
When they voted, they voted for Democrats; that existed. But there was also this sense that, unlike any other occupation, you are supposed to essentially hide your personal feelings for the good of the job that you're doing. Be objective. That's the goal. And so yes, they're fighting against their own personal biases, but they knew that telling the story the right way, telling it correctly, was the only thing that mattered. You get it right, you're doing a good job. You get it wrong, you're doing a bad job. That was all that mattered. But that has completely changed. Now it is no longer about right and wrong, about getting it correct or not getting it correct. It's about narrative, and it's about what's acceptable versus what's unacceptable. Because you just gave that example of the journalist: if they gave it the full context, if they actually told what actually happened in that quote, they might get attacked by their peers on Twitter, for example, for daring to give a nuanced take on what your answer was.
Those are the incentives now, and I can tell you, getting yelled at by fifty people on Twitter, if you're not really self-aware that this is just a small bubble of people, can feel like a lot. And so in Uncovered, I actually talked to the founder of TheWrap, a media outlet, who says she has seen her own reporters move away from a story, or cover a story in a different way, or not cover it at all, because of the fear of the backlash they might get on Twitter. I mean, that is a really scary time right now, and that means that, yes, we now get narrative, we now get what's acceptable, over what's actually correct and what's actually right.

Tudor Dixon: That is bizarre to me, the power of Twitter, and I think that we've seen that change over time as well. You talk about influencers in your book. We're seeing this now on the DeSantis versus Trump side, these influencers coming in. This is a very small world. I think the majority of my friends, who want nothing to do with politics or just simply don't pay attention to it, have no idea what's happening on Twitter. But that Twitter narrative can influence the media, and so I think it seems like a bigger space than it is, because there can be things that happen on Twitter that become news. So how do you break through the influencers? Because they are clearly biased, but they are creating news. So how do you break through that and make sure that the news you're receiving is actual news?

Steve Krakauer: Yeah, it's a challenge, because, as you mentioned, first of all, the journalists are sort of influencers themselves. They treat Twitter like it's their personal diary. And yet it's one of the reasons I actually like Twitter and appreciate it, because we can now see, for the first time, behind the curtain. We can see what they're really thinking. Even if some of their journalism is good, we actually know what they're really thinking, not necessarily even about the topics they're covering, but just in general. And it's embarrassing. I mean, in any other occupation, if the accountant who does your taxes was tweeting constantly about January sixth being the next 9/11, you would think, something's wrong with this person. But with journalists, it's just par for the course. Now it's part of their whole M.O. And that's really embarrassing. But I do think, so, we can't rely on the media outlets themselves to self-correct here. You know, the gatekeepers are not going to get better anytime soon. So I think what you have to do is be self-aware of what's happening, where these things are bubbling up, and know, and I have a stat in the book, that two percent of all Americans account for ninety percent of the tweets when it relates to politics and news. I mean, that is a minor, minor figure, and so that's not what's important. What's important is finding outlets that you trust, probably at this point on the independent side, finding people that you trust, and relying on them to give it to you straight.
Because, yeah, the influencers on Twitter, the people amplifying just the small voices there, that is not going to move the needle, and it's not reflective of the country either.

Tudor Dixon: You talk about not wanting to eliminate people from the conversation; you want to add folks, because legacy media seems to be going in a certain direction. Sometimes it's hard to tell. We certainly saw that during COVID. We saw these reporters coming out and, really, I mean, they were demonizing people. If you decided you didn't want the vaccine, or if you didn't want to wear a mask, it wasn't just that you had a different medical opinion for yourself; you were a murderer. I mean, they were out there saying things that were just horrendous. And it's interesting to me, because recently my daughter had COVID, and she'd never had it before. She's thirteen, and we tested her, and I said, you've got it, and she burst into hysterical tears. I can't have this.
I can't have this. And, you know, I hadn't thought about what the narrative over the last two years has done to kids, because she immediately felt like, there's something wrong with me, and I can't let anybody know, and I can't be this person who could do this to other people. And I said, you know, calm down. You're going to be fine. You just have to stay home for a few days and you'll get through this, and we'll be fine. But the fear that this created, I feel like that goes directly back to the way the media talked about it, and the discourse that was created among parents and schools and just communities of people who said, I'm hearing this from a trusted source. But then, you know, two years down the road, we find out, wow, actually we're not really sure if masks did anything. The vaccines weren't as effective as we thought they were going to be. Staying home wasn't effective for kids, because they didn't end up learning anything. What is your advice to people when they're looking at these situations and going, I really, I don't know?
I want more people in this conversation, but are some of the new people coming in clouding it too? Because the anger, I mean, it's become a lot of rage. When you look at the new voices that come in, they're mad, and there's the clickbait, because the madder you are, the more radical you are, the more people say, yeah, that's affirming what I'm saying, so I'm going to follow that person.

Steve Krakauer: Yeah. I spent a lot of time on COVID in Uncovered, because I do think it was this crucial moment. Certain stories I write about in the book you would think are kind of funny and silly, but this was something that really mattered to people's lives. And I understand, March, April, May twenty twenty, the media getting certain things wrong, not understanding it. This was a totally new situation. I understand that there were some mistakes made. But the problem was, A, they were never corrected, but also, as you mentioned, they completely excluded certain people and voices from the conversation. And not just yours or mine, but Dr. Jay Bhattacharya, who I talked to in the book.
This is a Stanford medical doctor who was one of the authors of the Great Barrington Declaration, which simply argued that we should have some focused protection of the elderly, that we shouldn't lock down everyone, and that we should focus on that. It was an alternate opinion, and he was demonized not just by the Dr. Faucis of the world, which we've now learned from leaked emails, but by the press themselves, who were treating him as if he was a quack and a dangerous person who was going to get everybody killed. And we see this, as you mentioned, with every single story. On lockdowns, masks, vaccines, the people who were called crazy conspiracy theorists have essentially turned out to be right. But it doesn't even matter; even if they were still wrong, this is a time when we need conversation. This is a time when we need debate and dialogue and nuance. And there was this sense of almost paternalism: we can't tell you the nuance of this, because we don't know if you'll do the right thing with that information. That's really dangerous.
And I 332 00:16:53,040 --> 00:16:55,920 Speaker 1: quote in the book Nate Silver from ABC, who was one 333 00:16:55,960 --> 00:16:58,600 Speaker 1: of the mainstream journalists I actually really respect, particularly on 334 00:16:58,680 --> 00:17:01,240 Speaker 1: COVID, who said this referring to the lab leak, 335 00:17:01,320 --> 00:17:03,520 Speaker 1: but I think it relates to all of these COVID 336 00:17:03,560 --> 00:17:07,119 Speaker 1: stories, lockdowns, schools, where he says, if there's two sides 337 00:17:07,119 --> 00:17:10,200 Speaker 1: of an argument, and there's information on both sides, and 338 00:17:10,240 --> 00:17:12,840 Speaker 1: there's evidence on both sides, and there's experts on both sides, 339 00:17:12,880 --> 00:17:16,520 Speaker 1: but only one side wants to lock down the conversation, 340 00:17:16,640 --> 00:17:20,040 Speaker 1: wants to police the discourse, that side is usually wrong. 341 00:17:20,560 --> 00:17:22,920 Speaker 1: So that's what I usually look for. If there's two sides of an 342 00:17:23,000 --> 00:17:26,000 Speaker 1: argument going on and you're saying, let's shut the other 343 00:17:26,000 --> 00:17:28,400 Speaker 1: side up, that side is something you should be very, 344 00:17:28,480 --> 00:17:32,520 Speaker 1: very cautious about. Obviously, we look at COVID and we think, well, 345 00:17:32,560 --> 00:17:35,720 Speaker 1: I think that you saw the breakdown and the lack 346 00:17:35,760 --> 00:17:38,720 Speaker 1: of trust in the media once you started to have 347 00:17:38,840 --> 00:17:40,840 Speaker 1: the president come out and say fake news and all 348 00:17:40,840 --> 00:17:43,679 Speaker 1: of these things, and people started to get very, very concerned. 349 00:17:44,000 --> 00:17:45,960 Speaker 1: But that was really only one side. The other side 350 00:17:46,040 --> 00:17:48,399 Speaker 1: was like, no, no, no, we're right, we're right.
But 351 00:17:48,520 --> 00:17:52,000 Speaker 1: then you did see people start to peel off during COVID, 352 00:17:52,160 --> 00:17:54,840 Speaker 1: and on both sides. I think people started to peel off. 353 00:17:55,200 --> 00:17:59,000 Speaker 1: But what about the violence that has ensued? And I 354 00:17:59,040 --> 00:18:04,199 Speaker 1: really do believe that there is some culpability on the 355 00:18:04,240 --> 00:18:07,000 Speaker 1: part of the media when they are saying, yeah, go 356 00:18:07,040 --> 00:18:09,959 Speaker 1: out and protest, and then on the flip side, well, 357 00:18:10,000 --> 00:18:12,440 Speaker 1: if you're part of this group and you're protesting, then 358 00:18:12,920 --> 00:18:16,520 Speaker 1: you should be, you know, attacked, and you should be 359 00:18:16,680 --> 00:18:18,960 Speaker 1: pushed aside and you should be put in jail. And 360 00:18:19,000 --> 00:18:25,119 Speaker 1: this has created, I guess I would say, a real breakdown. 361 00:18:25,440 --> 00:18:27,800 Speaker 1: You see a breakdown in the country, you see a breakdown 362 00:18:27,840 --> 00:18:32,399 Speaker 1: in relations among people, whether you're conservative or liberal, or 363 00:18:32,440 --> 00:18:36,520 Speaker 1: wherever you live. How do you think that the media 364 00:18:36,560 --> 00:18:40,280 Speaker 1: should be handling those situations? Yeah, they're certainly not doing 365 00:18:40,320 --> 00:18:42,159 Speaker 1: a good job right now. I think that 366 00:18:42,280 --> 00:18:43,840 Speaker 1: this is one of those areas where you see the 367 00:18:43,880 --> 00:18:47,879 Speaker 1: hypocrisy so vast. I mean, I don't spend a lot 368 00:18:47,880 --> 00:18:50,199 Speaker 1: of time on January sixth in the book, but frankly, 369 00:18:50,440 --> 00:18:53,040 Speaker 1: I think there's a whole book to be written about this.
370 00:18:53,280 --> 00:18:56,520 Speaker 1: The media and how they cover what happened on January sixth, 371 00:18:56,520 --> 00:18:59,480 Speaker 1: and what's happened in the years since. Their obsession with it, 372 00:18:59,520 --> 00:19:01,760 Speaker 1: and not just with the event, and not just in 373 00:19:01,760 --> 00:19:04,760 Speaker 1: how it relates to President Trump, but in how the 374 00:19:04,840 --> 00:19:08,679 Speaker 1: media themselves, the corporate press, has turned a bad riot 375 00:19:08,840 --> 00:19:12,399 Speaker 1: into an attack on half the country. They have successfully 376 00:19:12,400 --> 00:19:15,359 Speaker 1: done that, and I think it is so obvious and 377 00:19:15,440 --> 00:19:17,960 Speaker 1: so wrong. And yes, you compare that to the way 378 00:19:17,960 --> 00:19:20,159 Speaker 1: that they covered, like, the social justice protests and the 379 00:19:20,240 --> 00:19:22,679 Speaker 1: riot offshoots, which I do write about in chapter one, 380 00:19:22,720 --> 00:19:24,520 Speaker 1: because I actually think this is where it all started. 381 00:19:25,880 --> 00:19:29,000 Speaker 1: I go back to twenty fourteen and Ferguson and the 382 00:19:29,040 --> 00:19:32,080 Speaker 1: way that we learned about that story. You know, hands up, 383 00:19:32,080 --> 00:19:34,600 Speaker 1: don't shoot, and that was what Michael Brown said to 384 00:19:34,680 --> 00:19:37,439 Speaker 1: Darren Wilson before he was shot and murdered. That was 385 00:19:37,480 --> 00:19:39,880 Speaker 1: all untrue. I mean, not just the fact that he 386 00:19:39,920 --> 00:19:42,440 Speaker 1: was shot and murdered, because Darren Wilson was 387 00:19:42,480 --> 00:19:45,200 Speaker 1: completely cleared. No racial animus from that person. This was 388 00:19:45,400 --> 00:19:48,360 Speaker 1: all confirmed a year later by the Obama Justice Department. 389 00:19:48,640 --> 00:19:51,160 Speaker 1: But the words hands up, don't shoot were never even said.
390 00:19:51,440 --> 00:19:54,560 Speaker 1: That was confirmed by multiple witnesses, and yet the media 391 00:19:54,640 --> 00:19:56,760 Speaker 1: never went back and corrected their coverage about it. No, 392 00:19:57,080 --> 00:20:00,560 Speaker 1: instead they said, hands up, don't shoot, it doesn't matter 393 00:20:00,560 --> 00:20:02,119 Speaker 1: really if it was said or not. It became this 394 00:20:02,280 --> 00:20:06,720 Speaker 1: rallying cry for a new movement. I mean, their deference 395 00:20:06,760 --> 00:20:09,439 Speaker 1: to the narrative over the facts, that's where I think 396 00:20:09,480 --> 00:20:11,280 Speaker 1: it all began, and we see it with the social 397 00:20:11,320 --> 00:20:15,159 Speaker 1: justice protests of twenty twenty and beyond. That's a 398 00:20:15,240 --> 00:20:17,720 Speaker 1: real problem. And that's also why I think that, as 399 00:20:17,720 --> 00:20:20,000 Speaker 1: I lay out in Uncovered, here's what happened, but here's 400 00:20:20,000 --> 00:20:21,760 Speaker 1: what to look for in the future, because this is going 401 00:20:21,800 --> 00:20:24,720 Speaker 1: to continue to happen. Let's take a quick commercial break. 402 00:20:24,800 --> 00:20:31,880 Speaker 1: We'll continue next on the Tutor Dixon podcast. There does 403 00:20:31,920 --> 00:20:37,840 Speaker 1: seem to be a defense of certain people. I guess when 404 00:20:37,920 --> 00:20:42,000 Speaker 1: people have been accused of crimes, there's been a defense. 405 00:20:42,040 --> 00:20:46,240 Speaker 1: I would even pick on the Breonna Taylor story, because 406 00:20:46,720 --> 00:20:50,640 Speaker 1: nobody really gave the backstory of what that was. Here, 407 00:20:50,720 --> 00:20:54,360 Speaker 1: she was the financial operator in a drug organization. 408 00:20:54,920 --> 00:20:59,320 Speaker 1: They had been selling drugs all through Louisville, Kentucky. She 409 00:20:59,359 --> 00:21:02,760 Speaker 1: had had a body found in her vehicle.
There were 410 00:21:02,800 --> 00:21:07,200 Speaker 1: probably multiple overdose deaths that had been connected to her organization, 411 00:21:07,680 --> 00:21:10,120 Speaker 1: and they were trying to bust these people, who were 412 00:21:10,160 --> 00:21:16,280 Speaker 1: actually incredibly dangerous folks in the city of Louisville, Kentucky. 413 00:21:16,720 --> 00:21:19,080 Speaker 1: That was another one where really we never heard the 414 00:21:19,119 --> 00:21:23,040 Speaker 1: backstory and how dangerous that group really was. Why do 415 00:21:23,080 --> 00:21:24,920 Speaker 1: you think it is that we're not hearing the full 416 00:21:24,960 --> 00:21:27,560 Speaker 1: story in certain cases? I think there's a couple of reasons. 417 00:21:27,600 --> 00:21:29,960 Speaker 1: I think that on one level, there's a general 418 00:21:30,040 --> 00:21:33,040 Speaker 1: laziness with a lot of the media, and incompetence. 419 00:21:33,280 --> 00:21:35,679 Speaker 1: I think that they would rather tell a story 420 00:21:35,760 --> 00:21:39,520 Speaker 1: in the easiest way possible and not tell a nuanced 421 00:21:39,520 --> 00:21:43,359 Speaker 1: and complicated story that requires more work. So I do 422 00:21:43,400 --> 00:21:44,960 Speaker 1: think that there's an element of that, but with that one, 423 00:21:45,080 --> 00:21:47,400 Speaker 1: Breonna Taylor, or a lot of these cases, 424 00:21:47,880 --> 00:21:51,679 Speaker 1: I do think that there is a sense of distrust 425 00:21:51,680 --> 00:21:54,080 Speaker 1: in the public. You know, we talk a lot about 426 00:21:54,080 --> 00:21:56,240 Speaker 1: the trust that the public has in the media, and 427 00:21:56,320 --> 00:21:58,920 Speaker 1: as you mentioned, every poll shows it at all-time lows, 428 00:21:59,480 --> 00:22:02,640 Speaker 1: not just on the right, but among independents.
I saw 429 00:22:02,720 --> 00:22:04,520 Speaker 1: a stat only a couple months ago about trust in TV 430 00:22:04,640 --> 00:22:08,120 Speaker 1: news. Eight percent of Republicans trust 431 00:22:08,160 --> 00:22:11,240 Speaker 1: TV news and eight percent of independents trust TV news. 432 00:22:11,280 --> 00:22:12,840 Speaker 1: I mean, that's where we are right now in the country. 433 00:22:13,040 --> 00:22:15,359 Speaker 1: But part of it is a reaction to the 434 00:22:15,359 --> 00:22:18,160 Speaker 1: trust that the media has in the public. Because, yes, 435 00:22:18,200 --> 00:22:20,760 Speaker 1: if the media trusted their own audience more, if the 436 00:22:20,800 --> 00:22:23,919 Speaker 1: corporate press said, I believe my audience is smart and 437 00:22:23,960 --> 00:22:27,760 Speaker 1: can understand that these stories are complicated and nuanced and 438 00:22:27,840 --> 00:22:30,120 Speaker 1: not so cut and dry and black and white, they 439 00:22:30,160 --> 00:22:32,840 Speaker 1: would give a lot more credence to telling the full picture. 440 00:22:33,160 --> 00:22:35,800 Speaker 1: But instead they don't. They just say, we need to 441 00:22:35,800 --> 00:22:37,960 Speaker 1: just tell it in the easiest way possible, in a 442 00:22:38,000 --> 00:22:40,159 Speaker 1: way that's not going to get us Twitter backlash, and 443 00:22:40,200 --> 00:22:42,560 Speaker 1: they move on to the next thing. I'd say it's 444 00:22:42,600 --> 00:22:46,200 Speaker 1: hard on both sides, because you see one side is saying, well, 445 00:22:46,200 --> 00:22:48,720 Speaker 1: we believe this, one side is saying we believe that, 446 00:22:49,040 --> 00:22:53,000 Speaker 1: and there's misinformation that comes out regardless. There's always going 447 00:22:53,080 --> 00:22:55,680 Speaker 1: to be, because you learn.
I mean, part of journalism 448 00:22:55,760 --> 00:22:58,000 Speaker 1: is learning the story as you go, and so that's 449 00:22:58,000 --> 00:22:59,800 Speaker 1: the part where I think that a lot of us 450 00:22:59,800 --> 00:23:01,680 Speaker 1: have said, well, once you've learned the story, you haven't 451 00:23:01,680 --> 00:23:03,720 Speaker 1: come back and said, there's more to this, we want 452 00:23:03,760 --> 00:23:06,160 Speaker 1: to expand on it. That's the part that I think 453 00:23:06,200 --> 00:23:10,080 Speaker 1: that we've been missing. And honestly, campaigning for a year 454 00:23:10,080 --> 00:23:12,440 Speaker 1: and a half of my life, I saw this on 455 00:23:12,600 --> 00:23:15,439 Speaker 1: the Republican side. I was a Republican candidate, so I 456 00:23:15,440 --> 00:23:17,159 Speaker 1: saw this where people would come up to me and say, well, 457 00:23:17,359 --> 00:23:20,080 Speaker 1: what about this and this? And I would have to say, no, 458 00:23:20,400 --> 00:23:23,240 Speaker 1: you know, that didn't actually happen. Let me kind of 459 00:23:23,280 --> 00:23:26,480 Speaker 1: break this down for you. And some people would, you know, 460 00:23:26,560 --> 00:23:29,159 Speaker 1: understand, or would say they're going to go research it. 461 00:23:29,359 --> 00:23:33,280 Speaker 1: And some people were so bought into that story, it 462 00:23:33,560 --> 00:23:36,159 Speaker 1: was affirmed in their heart, you know, they believed this, 463 00:23:36,280 --> 00:23:38,280 Speaker 1: it was affirmed, and they went forward with it. And 464 00:23:38,359 --> 00:23:41,480 Speaker 1: I mean, I think that's human nature. But obviously it's 465 00:23:41,520 --> 00:23:44,200 Speaker 1: always nice to have folks like you who have been 466 00:23:44,240 --> 00:23:47,399 Speaker 1: behind the scenes and been on both sides to say, yeah, 467 00:23:47,440 --> 00:23:49,760 Speaker 1: this is hard.
But I think part of it is 468 00:23:49,800 --> 00:23:53,840 Speaker 1: hard because you are human, and, like you said, 469 00:23:54,040 --> 00:23:57,520 Speaker 1: these people have always had some bias, but they've been 470 00:23:57,560 --> 00:24:00,199 Speaker 1: trained to try to keep that out. But really, at 471 00:24:00,200 --> 00:24:02,280 Speaker 1: the end of the day, you're asking humans to report 472 00:24:02,280 --> 00:24:04,760 Speaker 1: on stories, and that happens, right? It does. Yeah. And 473 00:24:05,080 --> 00:24:07,240 Speaker 1: I do think there's two things at play. First of all, 474 00:24:07,240 --> 00:24:10,040 Speaker 1: I understand the inclination of people on both sides right 475 00:24:10,080 --> 00:24:13,800 Speaker 1: now to be so distrustful, and particularly, I 476 00:24:13,840 --> 00:24:17,879 Speaker 1: think, on the right, distrustful of the supposedly objective media, 477 00:24:17,960 --> 00:24:20,280 Speaker 1: that they dig their heels in even more. I think 478 00:24:20,320 --> 00:24:23,520 Speaker 1: it's a natural reaction that comes when they've 479 00:24:23,560 --> 00:24:26,480 Speaker 1: been spun so much. But I write that misinformation 480 00:24:26,920 --> 00:24:29,359 Speaker 1: is the tax we pay for freedom, you know. 481 00:24:29,480 --> 00:24:32,320 Speaker 1: I think it's become this dirty word. But yeah, if 482 00:24:32,400 --> 00:24:35,400 Speaker 1: we have the freedom of the exchange of ideas, we 483 00:24:35,400 --> 00:24:38,560 Speaker 1: don't need only the true information 484 00:24:38,640 --> 00:24:41,040 Speaker 1: that's been fact-checked by thousands of people, because, first 485 00:24:41,040 --> 00:24:43,240 Speaker 1: of all, that's probably gonna be wrong anyway. But second 486 00:24:43,240 --> 00:24:45,320 Speaker 1: of all, this is part of the discourse.
487 00:24:45,359 --> 00:24:48,040 Speaker 1: It's part of just being an American, is we get 488 00:24:48,080 --> 00:24:50,560 Speaker 1: to have these kinds of conversations, and sometimes it's right. 489 00:24:50,680 --> 00:24:53,000 Speaker 1: Usually it is, hopefully it is, but sometimes it's not, 490 00:24:53,040 --> 00:24:55,040 Speaker 1: and that's okay, we'll correct it. That's how it 491 00:24:55,040 --> 00:24:57,200 Speaker 1: should be. But that's not, obviously, the way the media 492 00:24:57,200 --> 00:24:59,040 Speaker 1: treats it now. But the other thing about it is, 493 00:24:59,080 --> 00:25:02,280 Speaker 1: objectivity is no longer the goal in a lot of 494 00:25:02,280 --> 00:25:05,240 Speaker 1: these situations. I quote a New York Times reporter, very 495 00:25:05,280 --> 00:25:07,119 Speaker 1: candid with me. Everyone is on the record in the 496 00:25:07,119 --> 00:25:10,320 Speaker 1: book, in Uncovered, and she tells me that for some people, 497 00:25:10,359 --> 00:25:14,560 Speaker 1: the young journalists, objectivity is akin to white supremacy. And 498 00:25:14,640 --> 00:25:17,520 Speaker 1: so when you have that in newsrooms now, that is 499 00:25:17,520 --> 00:25:19,400 Speaker 1: a huge problem, and that's going to portend a lot 500 00:25:19,400 --> 00:25:21,439 Speaker 1: of issues down the road. If we let the people 501 00:25:21,680 --> 00:25:25,320 Speaker 1: that believe objectivity is white supremacy win out and get 502 00:25:25,400 --> 00:25:27,880 Speaker 1: jobs and start to control these newsrooms, that's where we're 503 00:25:27,880 --> 00:25:30,280 Speaker 1: going to end up. But I love what you 504 00:25:30,400 --> 00:25:35,680 Speaker 1: said about how misinformation can be demonized, because that is really 505 00:25:36,040 --> 00:25:38,040 Speaker 1: what America is. That you can go out and you 506 00:25:38,040 --> 00:25:42,320 Speaker 1: can speak, and that you can have that discourse and conversation.
507 00:25:42,359 --> 00:25:45,239 Speaker 1: And we've gotten so far away from conversation. And I 508 00:25:45,280 --> 00:25:48,919 Speaker 1: think that initially, I believe that social media was created 509 00:25:48,960 --> 00:25:53,080 Speaker 1: for conversation, and it sort of has destroyed the ability 510 00:25:53,160 --> 00:25:55,000 Speaker 1: to talk to one another. That's why I love the 511 00:25:55,000 --> 00:25:57,159 Speaker 1: fact that I get to do this podcast, and I 512 00:25:57,240 --> 00:25:59,200 Speaker 1: feel really honored that I get to be a part 513 00:25:59,240 --> 00:26:01,640 Speaker 1: of the Clay and Buck Network so that I can get 514 00:26:01,640 --> 00:26:04,840 Speaker 1: out there and talk to folks like you who can say, look, 515 00:26:04,880 --> 00:26:08,040 Speaker 1: it's not perfect, it's never going to be perfect, but 516 00:26:08,119 --> 00:26:11,480 Speaker 1: it is American, and that is as perfect as you 517 00:26:11,480 --> 00:26:14,040 Speaker 1: can get, I believe, in this world. So I appreciate 518 00:26:14,080 --> 00:26:15,840 Speaker 1: what you had to say. Tell people where they can 519 00:26:15,840 --> 00:26:17,520 Speaker 1: get Uncovered. Thanks, Tutor. Yeah, I agree with you one 520 00:26:17,600 --> 00:26:20,600 Speaker 1: hundred percent. You can find Uncovered at ReadUncovered dot com, 521 00:26:20,600 --> 00:26:23,480 Speaker 1: ReadUncovered dot com. You can find the book, the audiobook; 522 00:26:23,520 --> 00:26:25,879 Speaker 1: you can hear from the people themselves, lots of people, 523 00:26:25,920 --> 00:26:29,840 Speaker 1: Tucker Carlson, others at Fox, other mainstream journalists, actually saying 524 00:26:29,840 --> 00:26:32,000 Speaker 1: their quotes in their words as well. That's what I 525 00:26:32,040 --> 00:26:34,240 Speaker 1: love too, because, like you said, you have 526 00:26:34,280 --> 00:26:37,199 Speaker 1: New York Times reporters in there. You're talking to both sides.
527 00:26:37,280 --> 00:26:40,640 Speaker 1: This is not one-sided. People get to see what 528 00:26:40,840 --> 00:26:44,520 Speaker 1: the behind the scenes of the media really is. Yeah, absolutely, 529 00:26:44,520 --> 00:26:47,040 Speaker 1: that was the goal. Two dozen people from all 530 00:26:47,080 --> 00:26:49,320 Speaker 1: across the industry. There are some good ones, even at 531 00:26:49,359 --> 00:26:52,400 Speaker 1: these outlets. They often get overshadowed by the loudest, most 532 00:26:52,400 --> 00:26:55,440 Speaker 1: annoying voices on Twitter in their organizations. But there 533 00:26:55,440 --> 00:26:57,040 Speaker 1: are some good ones, and they've got some good ideas 534 00:26:57,040 --> 00:27:00,560 Speaker 1: on how we can fix this. So the message is: even 535 00:27:00,560 --> 00:27:03,600 Speaker 1: though we get frustrated with the media, we are blessed 536 00:27:03,680 --> 00:27:06,840 Speaker 1: to have a free press in the United States. Steve Krakauer, 537 00:27:07,160 --> 00:27:09,639 Speaker 1: thank you so much, author of Uncovered. Go out and 538 00:27:09,640 --> 00:27:11,679 Speaker 1: get the book, because you're going to learn a lot, 539 00:27:11,880 --> 00:27:13,800 Speaker 1: and you might be a little less angry after you 540 00:27:13,840 --> 00:27:15,560 Speaker 1: read it, or maybe a little more angry, I 541 00:27:15,600 --> 00:27:18,480 Speaker 1: don't know, but thank you so much for being here today. 542 00:27:18,960 --> 00:27:21,280 Speaker 1: Appreciate it. And thank you all for joining me on 543 00:27:21,440 --> 00:27:24,040 Speaker 1: the Tutor Dixon Podcast. For this episode and others, go 544 00:27:24,119 --> 00:27:27,639 Speaker 1: to tutordixonpodcast dot com. You can subscribe right 545 00:27:27,680 --> 00:27:29,679 Speaker 1: there, and make sure you join me next time on 546 00:27:29,760 --> 00:27:31,520 Speaker 1: the Tutor Dixon Podcast. Have a great day.