Speaker 1: Good morning, peeps, and welcome to Woke AF Daily with me, your girl, Danielle Moody, recording from the Home Bunker. Folks, it is not often that there are things in the news that are good. Right? We marinate in a lot of misery up in these parts, and so when something good finally happens, I want to elevate it and I want to start with it, which is that finally there seems to be somebody that is awake at the Department of Justice. Why do I say that? Because the four officers that were involved in the murder of Breonna Taylor are being charged with federal crimes. It has taken way too fucking long to get to this place, to arrest the officers that stormed into Breonna Taylor's home. I have so much going on: I've got Brittney Griner coming up in my mind, I've got Breonna Taylor in my head. But with Breonna, people were in the streets. People have been in the streets. People have been asking, begging, fighting over the officers that barged into her home, broke into her home in the middle of the night while she was sleeping, and killed her in her sleep.
Speaker 1: Right? We watched the trial where it was, oh, no one's at fault here. Police just get to walk in, kill somebody, walk out, and no one is held accountable. Yes, the family did receive millions of dollars in her settlement, but what does that do when your twenty-something-year-old daughter is murdered in her sleep for no other reason than being in the right place at the wrong time, and, again, no fucking accountability? Well, the Department of Justice has finally stepped in, and the Department of Justice has finally brought charges against the four officers that were involved in her murder. Several of them were dismissed from the force, back in twenty twenty and twenty twenty-one. But we know what happens in this blue fraternity: you know, you can kill somebody in one town, one state, and get to go and move over to the next place and kill some more people. Right? That's how it works. And so the federal government finally deciding to bring charges, after a long, long, aggravated wait, is something that I will applaud, and we will continue to watch and see what happens.
Speaker 1: But this is what accountability looks like. I just wish it came quicker. Speaking of accountability: Alex Jones's piece-of-shit ass and his stupid fucking lawyers. Well, I don't know if they're stupid, or if they were just sending over all of the contents of his phone from the last two years to opposing counsel because they were then going to try and push for a mistrial, or if they are really that dumb. But after, I think, a little over an hour of deliberation, the jury has come back with four million dollars, so far, in damages for the family that brought Alex Jones to trial. And let me say this: Sandy Hook happened ten years ago. Their child, like all of the children of Sandy Hook, would be a teenager now; they would be in high school now. And this man is only now going to trial for years of fucking lies, conspiracy theories, putting their lives at stake because of his rabid, fucking hopped-up, KKK-bullshit, white-supremacy, frail-white-male base that would take his words and go and try and attack those families.
Speaker 1: So far, four million dollars. They're asking for a hundred and fifty million, and we will see what the sum total comes out to. But I gotta say this, and I said this on Twitter the other day: do you know that that motherfucker is making eight hundred thousand dollars a day selling tubs of survival food and, like, gun paraphernalia? Who knew that white supremacy and male fragility were so fucking lucrative? My god, did I go into the wrong line of work, when all I had to do was slap a sticker on a tub of Kraft macaroni and cheese, say that it's survival food, and sell it to a bunch of idiots hunkered down in their, like, gun sheds. I had no idea. Now to terrible news, which is that of Brittney Griner. Brittney Griner was found guilty in the sham trial that Russia put together and was sentenced to nine years in prison, with criminal intent, because, sure, you have remnants of marijuana in a vape that you have a prescription for from a doctor, and that's criminal intent.
Speaker 1: I mean, Russia is a fucking terrorist state, and I just wish that they were pulled out of every single fucking G7, G8, every single type of group. Like, can we just stop legitimizing the country of Russia? Can we stop legitimizing Putin? Can we just give them whatever fucking criminals they want back, so that we can get our people home? Can we stop with the bullshit, like we think that we're punishing Russia for something, and just give them whatever prisoners they want in exchange for the Americans that they're holding hostage? Right? And, you know, I'll say this: there was an artist who, after they heard the sentencing come down, said, you know, I guarantee you that if this were Taylor fucking Swift, if this were, you know, any white NFL star or white NBA star or a white superstar, America would have done more, right from the fucking jump. There would have been campaigns, and there would have been this, that, and the other thing. But for Brittney Griner, for some reason, we were all told not to march, to stay silent, and that, oh, America was going to do the work.
Speaker 1: But it wasn't until people got in the streets and kept putting her name in their mouths that this federal government, this administration, decided to act. So, you know, what I will tell people is that, like, this is not fucking done, right? And I heeded the initial warnings of, you know, we'll do more harm than good if we bring attention to this. Fuck that. Brittney Griner should be home, right? Brittney Griner should not be a fucking hostage of Russia on some trumped-up bullshit. As a matter of fact, Brittney Griner should never have had to go to Russia in the first fucking place, to play ball in the offseason to be able to provide for her family, because the WNBA should be paying on par with what the fucking NBA pays. How about: women deserve to be paid the same. And I know that the women's soccer league just fucking won their case, right, the same case that Serena Williams and Venus Williams brought against tennis. If Brittney Griner had been treated like the NBA players, she wouldn't have been fucking abroad. This case makes me sick.
Speaker 1: It makes me sick to my stomach to think about a Black, queer woman being held in fucking Russia, in a country that hates Black people and hates gay people, and she embodies both. So if you are a praying person, please pray for her. If you are a person that will call your representatives and say that you want pressure, right, on this fucking administration to do everything and anything that they can, please do that. This should never have come to pass. My heart is broken for her and her family, and I hope that years will not pass, and that Brittney's body and mind will not be broken by this barbaric fucking country, before she's able to come home. Coming up next, my dear friends, is a conversation that I think is really interesting, with Kaivan Shroff, who we've had on the show before, and who has done a capstone research project at Harvard on what he deems the trifecta that is happening in journalism and modern media right now, and on where the breakdown has come, in terms of the Fourth Estate turning into nothing more than reality TV filled with stars that want to line their own pockets.
Speaker 1: So we get into that conversation with our friend Kaivan Shroff, coming up next.

Speaker 1: Hey, I'm David Plotz of Slate's Political Gabfest. As another election season accelerates, it can be tricky to sort through all the noise and the news. Each week on the Gabfest, John Dickerson, Emily Bazelon, and I decipher the headlines, break down the races, and tell you what issues really matter. We do not always agree, we definitely do not always agree, but we always deliver thoughtful debate, and we always have a good time. So subscribe to Slate's Political Gabfest. New episodes every Thursday.

Speaker 1: Get a behind-the-scenes look at Comedy Central's The Daily Show on Beyond the Scenes, an original podcast from The Daily Show with Trevor Noah. Every week, host Roy Wood Jr. goes deeper with the notable guests and experts from the Emmy Award-winning series. Together, they use comedy to tackle current topics, from gentrification to gun laws, and take a closer look at how and why these topics matter. Listen to Beyond the Scenes from The Daily Show with Trevor Noah on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Speaker 1: New episodes every Tuesday. Folks, I am very happy to welcome back to Woke AF Daily Kaivan Shroff, who is a Democratic commentator and has done an amazing capstone, I believe at Harvard, so no dummy here, on the trifecta of bad incentives facing modern journalism. And I think that it's such a wonderful time to have this conversation, because the question that I keep asking is: has anybody learned anything over the last four years? Did anybody learn anything during the Trump era? So, Kaivan, talk to us about this capstone and why you decided to make this the topic.

Speaker 1: Absolutely. So I was on a joint law and policy program at Harvard, and our challenge was to pick an extensive research topic that posed a public problem today, and I couldn't think of a more appropriate topic.
Speaker 1: I personally do a lot on social media, on Twitter, and I'm getting into TikTok, as I know you are as well. Yes. And I think one thing that we've seen is that, just as there's been public backlash in the market when certain businesses, you know, engage in unethical, inappropriate behaviors, because consumers have finally learned the power that they have together when they organize like that on Twitter, to boycott or something like that, there's also been similar backlash to journalists who have really dropped the ball in these recent years, again and again and again. And I think at first there was this question of whether it was out of incompetence or a lack of ability.
Speaker 1: I mean, I do think there's a people problem, for sure, but I think that there's also a structural problem, where the incentives facing journalists today have totally shifted priorities away from truth-telling and informing the American people and toward these three, you know, overarching incentives, what I call the trifecta of bad incentives, and that is social media fame, cable news hits, and book deals. And all three of those incentives work together to, first of all, draw people into the profession that are more interested in being an influencer than a journalist, and then to push really problematic behaviors, I think, to achieve those ends, at the cost of truth-telling and sort of maintaining a reputation with the public. So that is sort of the, you know, the project.

Speaker 1: It's, you know... because when you think about journalism, right, like, the advent of journalism and the importance behind it is that it was considered the Fourth Estate, right? It was considered, you know, this throughway for the American people to be able to better understand the world around them, how their government works, how society works, and their function within it. And what has happened, right, by virtue... and I remember doing a podcast about this, I would say probably eight years ago, or maybe ten years ago. I was watching the rise of reality TV and, at the same time, watching the shift that was happening with news, right? And I said, there isn't going to be this clear divide anymore between these reality-TV antics, right, and what is supposed to be actual news, what is supposed to be journalism. And so did you see that confluence as well, like, with how we see TV these days and caricatures of people?

Speaker 1: One hundred percent.
Speaker 1: And I think last time I was on the show, I was mentioning, you know, I think, like, I look to platforms like RuPaul's Drag Race and The Housewives, and all those reality shows are absolutely sort of leading on how to effectively build an audience and communicate, and you can use those tools for good and evil. And we're seeing it done, you know, for evil by a lot on, certainly, the right. But I think we're also seeing journalists engage like this. And Taylor Lorenz, who's, like, a prominent reporter at the Washington Post that covers influencers, has said, you know, that she sees media headed toward being much more distributed personalities, instead of sort of these mainstream networks that we're so used to. But what comes with that is a whole lot of subjectivity that, you know, sort of obscures objective fact and truthfulness.
Speaker 1: And I think what we're seeing from mainstream prominent journalists is that they are accelerating us to that outcome way before we are ready for it, you know. And I think, so, on the social media prong of this trifecta, there's three behaviors that I think we see, and, you know, anyone that's online, I think, can identify at least one or two prominent reporters that stand out. But there's hundreds of people engaging this way, right? So there's the blurring of lines between professional and personal social media, which makes sense, because a lot of times these reporters are verified because they're a New York Times reporter. They have hundreds of thousands of followers because of their role, and then they're pivoting to tweet every take that would never get by an editor, that they have no experience or background to be commenting on. But it's sort of elevated, because it's on this larger platform with hundreds of thousands of followers that people are following for the news, not for Dave Weigel's, you know, racist jokes or, you know, bad food takes, those types of things.
Speaker 1: I think, you know, that was one episode that stands out, but everybody's seeing that. Every week we're seeing it, whether it's a Maggie Haberman episode or Dave Weigel, or, you know, I'm sure, you know, more conservative people felt that way about Jim Acosta. But every day we're seeing these media figures center themselves instead of just focusing on communicating important information to the American people. That's one. Then, two, right, there's this direct backlash between journalists that get called out by people like you and me and regular news consumers, and then they're fighting in public, suddenly, with people that are supposed to, you know, the next day, read their reporting and feel it's objective and trustworthy and truthful. And it's incredibly unprofessional, but again, it's moving us faster on that timeline towards partisan news. And then, finally, you know, I think we all see different reporters try to go viral. I don't know if this has happened to you, but I've had a lot of people, you know, try to DM me their stories and things like that, which, again, like, if it's good work, I'm happy to amplify it.
Speaker 1: I'm on Twitter all day, so I'm seeing it. But there is that sort of questionable issue of, like, why are journalists in the DMs of, you know, social media people, asking them to plug their reporting? Again, it creates a perverse incentive there when you're then going to cover that person, or something like that.

Speaker 1: So, you know, it's funny, because, for me, you know, people have always confused me and said that I was a journalist, and I'm not, right? Like, I'm actually very clear about the fact that I did not go to school for journalism. I'm not a journalist. I'm actually an opinionator.
Speaker 1: And there is a difference, right? Like, I come from a place that is not neutral, right? And I try, for the people that listen to me, that read, you know, my work, like, I try and make that very clear, because I want folks to understand that there is a line, there was a line anyway, between what was considered opinion, right, like you being asked to write a piece from your vantage point, versus somebody who is employed by a Washington Post or a New York Times, where they're not employed as an opinion writer, they're employed as a reporter. And, you know, so I just want, again, for folks who are not necessarily in the minutiae, to understand the distinctions between those things.

Speaker 1: Yeah. And then, you know, sort of, so the next player, of course, and these all sort of operate together, but the order doesn't really matter, is you want to have a lot of social media followers, because that's going to get you on cable news and get you a book deal, and a lot of these journalists want a book deal.
Speaker 1: That's, like, considered sort of, like, you know, a major career goal. And especially when you go to the White House press corps, you know, you're sort of immediately incentivized to create this reality television, as you noted, because the minute you do that, you're going to be covered across network news; they might invite you on a show to talk about the viral moment. And so, again, you know, in the Trump era, it's a little hard to disaggregate what was appropriate journalism standing up to a really corrupt leader, and there were some great moments, really brave moments, from journalists. But, you know, then we see under Biden somebody like, you know, Doocy there, who literally just exists to troll, create viral moments, get Twitter traction. And what I found really concerning is that there's a lot of buy-in, I think, across the industry for this approach. Erik Wemple, who's, you know, again, a prominent media critic for The Washington Post, basically said of Doocy's trolling, you know, that's his job, to get, you know, to elicit sort of anger and reaction from President Biden. And I actually just fundamentally disagree.
I don't think that 316 00:19:38,640 --> 00:19:41,560 Speaker 1: the American people are sitting there watching these sort of 317 00:19:41,600 --> 00:19:44,840 Speaker 1: exchanges hoping for drama that you have reality TV if 318 00:19:44,840 --> 00:19:48,560 Speaker 1: you want to watch that. People want rectation, and so 319 00:19:48,840 --> 00:19:51,560 Speaker 1: the incentive to go fight with Biden or catch him 320 00:19:51,600 --> 00:19:54,800 Speaker 1: off guard, or you know, any leader that's not actually 321 00:19:55,160 --> 00:19:58,800 Speaker 1: prioritizing truth, you know, asking those follow up questions they 322 00:19:58,840 --> 00:20:00,679 Speaker 1: never seem to ask to any buddy when they come 323 00:20:00,720 --> 00:20:03,280 Speaker 1: on their show that's about truth seeking, you know, So 324 00:20:03,400 --> 00:20:06,320 Speaker 1: I think that's another incentive. And then, of course the 325 00:20:06,359 --> 00:20:09,840 Speaker 1: book deals as we've all seen, not only is there 326 00:20:09,840 --> 00:20:12,440 Speaker 1: the issue of self promotion, where if I'm following Jake 327 00:20:12,520 --> 00:20:14,720 Speaker 1: Chapper because I want to know breaking news because he 328 00:20:14,760 --> 00:20:17,080 Speaker 1: has special access, then I got to see him plug 329 00:20:17,080 --> 00:20:20,520 Speaker 1: his book every fifth tweet, like or plug a Collie's book, right, 330 00:20:20,560 --> 00:20:23,359 Speaker 1: Like that's like, that's sort of a perverse incentive because 331 00:20:23,480 --> 00:20:26,680 Speaker 1: we know, as we just default know that salespeople are 332 00:20:26,720 --> 00:20:29,040 Speaker 1: not objective, and we want our journalists to be objective. 
333 00:20:29,080 --> 00:20:31,280 Speaker 1: So if you're hawking a book to me every five seconds, 334 00:20:31,480 --> 00:20:34,760 Speaker 1: of course I'm going to think, well, maybe you're, you know, 335 00:20:34,840 --> 00:20:37,520 Speaker 1: positioning yourself, whether on what you cover or how you 336 00:20:37,600 --> 00:20:40,719 Speaker 1: cover it, to sell this book more than, you know, 337 00:20:40,800 --> 00:20:43,320 Speaker 1: to tell the truth to the American people. And because, 338 00:20:43,359 --> 00:20:45,800 Speaker 1: again, these things are all interrelated, we see people saving 339 00:20:45,800 --> 00:20:49,000 Speaker 1: those scoops, whether it's Maggie Haberman or Bob Woodward or, 340 00:20:49,080 --> 00:20:52,120 Speaker 1: you know, those two New York Times reporters. Oh yeah, Mark, right, right. 341 00:20:52,640 --> 00:20:56,160 Speaker 1: Bob Woodward couldn't let us all know about COVID, right, 342 00:20:56,840 --> 00:20:59,520 Speaker 1: and what Donald Trump had said about COVID in December 343 00:20:59,560 --> 00:21:03,359 Speaker 1: twenty nineteen when he was interviewing him for his book Fear. 344 00:21:03,440 --> 00:21:06,600 Speaker 1: He would hold that information when Donald Trump told him, oh, 345 00:21:06,720 --> 00:21:09,440 Speaker 1: this is some really bad stuff. It's worse than the flu. 346 00:21:09,840 --> 00:21:13,320 Speaker 1: It's airborne. And he would hold that until he was 347 00:21:13,359 --> 00:21:16,200 Speaker 1: ready to release his book months later. I mean literally, it's 348 00:21:16,280 --> 00:21:19,720 Speaker 1: not too dramatic to say lives could have, you know, been 349 00:21:19,760 --> 00:21:23,639 Speaker 1: saved because of that. Yeah. Absolutely. And again, like, why 350 00:21:23,680 --> 00:21:25,240 Speaker 1: are they doing that?
Because you need to have something 351 00:21:25,280 --> 00:21:27,560 Speaker 1: newsworthy so that when you go on cable news months 352 00:21:27,640 --> 00:21:30,159 Speaker 1: later, after you've submitted your book, you have something that 353 00:21:30,200 --> 00:21:32,080 Speaker 1: you know is going to drive the news cycle, because 354 00:21:32,119 --> 00:21:34,680 Speaker 1: even though not that many people are watching cable news, relatively, 355 00:21:34,880 --> 00:21:36,720 Speaker 1: that is still what drives a lot of book sales. 356 00:21:36,920 --> 00:21:39,680 Speaker 1: So it's all related, and again, no part of 357 00:21:39,720 --> 00:21:42,920 Speaker 1: that puzzle is informing the American people, you know, at 358 00:21:42,960 --> 00:21:46,840 Speaker 1: this crisis moment for democracy and a crisis moment, of 359 00:21:46,840 --> 00:21:49,280 Speaker 1: course, relatedly, for the media industry, which, you know, I 360 00:21:49,320 --> 00:21:52,840 Speaker 1: think is sort of imploding, as we're seeing.
So let 361 00:21:52,840 --> 00:21:56,320 Speaker 1: me ask you this, because it's something that 362 00:21:56,400 --> 00:21:59,680 Speaker 1: I have found very troubling throughout my 363 00:21:59,760 --> 00:22:03,600 Speaker 1: career in media, which is, you know, we look at 364 00:22:03,600 --> 00:22:07,000 Speaker 1: the types of stories, and I'm particularly talking about cable 365 00:22:07,040 --> 00:22:08,720 Speaker 1: news right now. So we look at the types of 366 00:22:08,720 --> 00:22:11,520 Speaker 1: stories that make it into segments, that make it into shows, 367 00:22:12,040 --> 00:22:15,720 Speaker 1: and, you know, I know by virtue of being inside 368 00:22:15,760 --> 00:22:19,000 Speaker 1: these places that these newsrooms do not look like me, right? 369 00:22:19,200 --> 00:22:24,200 Speaker 1: That they are still filled with just sprinklings of people 370 00:22:24,240 --> 00:22:29,880 Speaker 1: of color and a sprinkling of women or non-binary folks, 371 00:22:30,000 --> 00:22:32,480 Speaker 1: like a sprinkling. It is mostly still white men that 372 00:22:32,520 --> 00:22:35,880 Speaker 1: get to make news decisions. And because they come from 373 00:22:35,960 --> 00:22:40,600 Speaker 1: this vantage point, though, of quote unquote neutrality, there has 374 00:22:40,680 --> 00:22:46,399 Speaker 1: been this discrimination towards people of color, towards queer people, 375 00:22:46,600 --> 00:22:50,520 Speaker 1: towards women, about us not being able to be quote 376 00:22:50,560 --> 00:22:54,760 Speaker 1: unquote neutral or objective, because we embody some of the 377 00:22:54,880 --> 00:22:59,600 Speaker 1: same demographics as the stories that we're covering.
What do 378 00:22:59,640 --> 00:23:03,159 Speaker 1: you say to that? Because to me, I'm like, there's 379 00:23:03,160 --> 00:23:07,960 Speaker 1: a difference between being objective, being neutral, right, and also 380 00:23:08,160 --> 00:23:11,200 Speaker 1: just being honest. And the fact that white men have 381 00:23:12,040 --> 00:23:14,840 Speaker 1: always been the ones who are said to be the 382 00:23:14,880 --> 00:23:18,199 Speaker 1: ones that can be neutral. Absolutely. It's such a 383 00:23:18,200 --> 00:23:20,560 Speaker 1: good point, and I think, like, to your point, you know, 384 00:23:21,080 --> 00:23:24,080 Speaker 1: people that defend the status quo often like to, you know, 385 00:23:24,280 --> 00:23:26,760 Speaker 1: sort of do that under the guise of being neutral. 386 00:23:26,840 --> 00:23:29,920 Speaker 1: But the status quo is not neutral. It has all 387 00:23:29,920 --> 00:23:32,440 Speaker 1: the history behind it, right, that led up to that moment, 388 00:23:32,680 --> 00:23:34,800 Speaker 1: to all the points you just made. I think one 389 00:23:34,880 --> 00:23:37,119 Speaker 1: interesting thing: in my research, I sort of started 390 00:23:37,119 --> 00:23:39,000 Speaker 1: off with a deep dive into the history of journalism 391 00:23:39,000 --> 00:23:41,240 Speaker 1: in America and, you know, traced it to how we 392 00:23:41,280 --> 00:23:44,159 Speaker 1: got to this moment, and what you bring up comes 393 00:23:44,240 --> 00:23:47,320 Speaker 1: up in sort of, you know, recent decades of journalists 394 00:23:47,400 --> 00:23:50,200 Speaker 1: understanding the pitfalls of, you know, it's impossible to actually 395 00:23:50,200 --> 00:23:52,399 Speaker 1: be objective, and of course you're coming from a perspective, 396 00:23:52,600 --> 00:23:55,640 Speaker 1: and often that perspective, we know, is, you know, disproportionately 397 00:23:55,680 --> 00:23:59,160 Speaker 1: white and male.
But that doesn't mean that we 398 00:23:59,200 --> 00:24:02,680 Speaker 1: throw out the project of objective truth-telling altogether, 399 00:24:03,080 --> 00:24:05,680 Speaker 1: because I think the added value the journalists should be 400 00:24:05,720 --> 00:24:09,040 Speaker 1: bringing is not their sort of subjective, "this is my 401 00:24:09,080 --> 00:24:12,399 Speaker 1: background experience" that I'm sort of superimposing on the news; 402 00:24:12,560 --> 00:24:15,919 Speaker 1: it should be: I'm somebody that's spent ten years 403 00:24:16,000 --> 00:24:19,600 Speaker 1: covering Capitol Hill. I know, within the bounds of reason, 404 00:24:19,840 --> 00:24:23,520 Speaker 1: this is normal behavior. And I'm using that objective experience, 405 00:24:23,560 --> 00:24:26,760 Speaker 1: that's personal too, because it's my experience, to say this 406 00:24:26,800 --> 00:24:30,239 Speaker 1: was outrageous, or to say this was a huge deviation, right? Like, 407 00:24:30,280 --> 00:24:32,440 Speaker 1: those are the types of ways that you can sort 408 00:24:32,440 --> 00:24:35,840 Speaker 1: of still be objective and of course bring that experience in, 409 00:24:35,880 --> 00:24:38,560 Speaker 1: but it's the useful, relevant experience and not, you know, 410 00:24:38,800 --> 00:24:42,240 Speaker 1: random, inappropriate experience. So I think that's a huge issue. 411 00:24:42,359 --> 00:24:45,919 Speaker 1: And I think again, like, not solving that diversity issue 412 00:24:46,200 --> 00:24:51,399 Speaker 1: is so huge, because it's been years now, and, you know, 413 00:24:51,440 --> 00:24:54,960 Speaker 1: I still see people complaining about, like, oh, 414 00:24:54,960 --> 00:24:57,440 Speaker 1: a young white man's never going to 415 00:24:57,480 --> 00:25:00,600 Speaker 1: get a book published again.
It's like, I don't, right? 416 00:25:00,640 --> 00:25:02,440 Speaker 1: Like, I don't know if you've seen those viral tweets 417 00:25:02,440 --> 00:25:05,280 Speaker 1: from people, like this, like, crying about how their 418 00:25:05,359 --> 00:25:08,320 Speaker 1: editors are like, oh yeah, we'll never give this a chance. Like, 419 00:25:08,320 --> 00:25:10,400 Speaker 1: I don't think that's objectively true. If you go look 420 00:25:10,440 --> 00:25:12,520 Speaker 1: at the New York Times bestseller lists, right, I don't think 421 00:25:12,560 --> 00:25:16,520 Speaker 1: there's a lot of evidence, first. Yeah, yeah. That white 422 00:25:16,520 --> 00:25:19,199 Speaker 1: men are hurting. Same in journalism, you know. Like, I 423 00:25:19,280 --> 00:25:21,919 Speaker 1: was able to speak with some prominent reporters for this work, 424 00:25:22,000 --> 00:25:24,639 Speaker 1: and they say that they care a lot about the 425 00:25:24,680 --> 00:25:29,280 Speaker 1: issues of diversity and inclusion. Everything we see going forward 426 00:25:29,320 --> 00:25:31,400 Speaker 1: doesn't seem to match that. If you ask those people 427 00:25:31,440 --> 00:25:33,679 Speaker 1: a follow-up and say, okay, so what procedures do 428 00:25:33,680 --> 00:25:35,520 Speaker 1: you have in place, they don't have the answer. 429 00:25:36,160 --> 00:25:38,280 Speaker 1: They're looking at it with that in mind, I guess, 430 00:25:38,400 --> 00:25:41,200 Speaker 1: but they don't have the lived experience necessarily to do 431 00:25:41,240 --> 00:25:43,000 Speaker 1: a good job. And we've seen that over and over 432 00:25:43,160 --> 00:25:45,800 Speaker 1: in the coverage of Vice President Harris, who I think 433 00:25:45,800 --> 00:25:50,520 Speaker 1: has been extraordinarily disrespected, treated with a double standard, and 434 00:25:50,560 --> 00:25:53,040 Speaker 1: it's almost just normal.
And again, because you have this 435 00:25:53,080 --> 00:25:56,479 Speaker 1: critical mass of media leadership that is not diverse, they 436 00:25:56,520 --> 00:25:59,400 Speaker 1: all defend each other. So they're never gonna say, yes, 437 00:25:59,440 --> 00:26:02,080 Speaker 1: we mistreated Kamala Harris. They're never gonna say, yes, 438 00:26:02,119 --> 00:26:06,080 Speaker 1: we mistreated Hillary Clinton. You know, five years later, you'll 439 00:26:06,080 --> 00:26:08,399 Speaker 1: get one or two hot takes that nobody reads that 440 00:26:08,440 --> 00:26:10,479 Speaker 1: are like, oh, you know, actually, I was a journalist at 441 00:26:10,520 --> 00:26:12,520 Speaker 1: the New York Times and I covered the Clinton campaign, 442 00:26:12,520 --> 00:26:14,760 Speaker 1: and we did a terrible job. That never has 443 00:26:14,800 --> 00:26:18,200 Speaker 1: the same effect as, you know, the initial coverage and mistreatment. 444 00:26:18,359 --> 00:26:23,639 Speaker 1: So I'd look to, you know, places like theGrio, 445 00:26:23,760 --> 00:26:27,359 Speaker 1: for example. Even just Brittney Griner, right? Like, where's the follow-up 446 00:26:27,440 --> 00:26:30,359 Speaker 1: on that story, for every week that, you know, 447 00:26:30,400 --> 00:26:32,240 Speaker 1: we've been waiting? Now that they've, you know, had the 448 00:26:32,280 --> 00:26:35,200 Speaker 1: sentencing and trial and everything, it's getting coverage today. But 449 00:26:35,440 --> 00:26:38,240 Speaker 1: I find it hard to believe that, you know, somebody 450 00:26:38,280 --> 00:26:40,840 Speaker 1: of a different identity that was in a similar situation 451 00:26:40,920 --> 00:26:45,280 Speaker 1: would not have been driving totally different, much more persistent 452 00:26:45,320 --> 00:26:47,760 Speaker 1: media narratives up until this moment. And do we know 453 00:26:47,760 --> 00:26:50,120 Speaker 1: if that affects the outcome?
You know, I mean, hard 454 00:26:50,160 --> 00:26:53,920 Speaker 1: to say. You know. One of the things, too, that 455 00:26:53,960 --> 00:26:57,120 Speaker 1: I find really troubling is, I mean, we always 456 00:26:57,200 --> 00:27:00,359 Speaker 1: will see a double standard, a triple standard, depending on 457 00:27:00,480 --> 00:27:04,280 Speaker 1: who is being covered and who is doing the covering 458 00:27:04,359 --> 00:27:07,960 Speaker 1: of that story. But what I find really troubling now, 459 00:27:08,240 --> 00:27:11,600 Speaker 1: which I think is the purpose of your work, is 460 00:27:11,640 --> 00:27:14,400 Speaker 1: that we are in crisis. You said it earlier, like, 461 00:27:14,640 --> 00:27:19,360 Speaker 1: this country, this world is in crisis, and the media 462 00:27:19,880 --> 00:27:24,240 Speaker 1: is not doing anything to educate people, to use 463 00:27:24,280 --> 00:27:28,399 Speaker 1: their mouthpieces to be able to inform the 464 00:27:28,480 --> 00:27:31,760 Speaker 1: public at these times of crisis, whether we're talking about 465 00:27:31,800 --> 00:27:35,480 Speaker 1: climate change, our democracy and the rise of authoritarianism in 466 00:27:35,520 --> 00:27:38,800 Speaker 1: this country, or we're talking about abortion, any of these things. 467 00:27:38,800 --> 00:27:43,040 Speaker 1: And so what do you think? Like, is there an 468 00:27:43,080 --> 00:27:47,840 Speaker 1: answer here, or is this just going to, honestly, Kaivan, 469 00:27:47,960 --> 00:27:51,040 Speaker 1: continue to just get worse, you know?
So I 470 00:27:51,080 --> 00:27:53,480 Speaker 1: do address, you know, a couple of different solutions in 471 00:27:53,520 --> 00:27:56,560 Speaker 1: the paper, and obviously one is more public funding of journalism, 472 00:27:56,600 --> 00:27:58,800 Speaker 1: which, I think the idea of, you know, it's a 473 00:27:58,800 --> 00:28:02,840 Speaker 1: hard thing to execute, but internationally there are some successful models, 474 00:28:02,880 --> 00:28:04,679 Speaker 1: and I think the idea there would be to have 475 00:28:04,840 --> 00:28:08,480 Speaker 1: a credible competitor to CNN and these other outlets that 476 00:28:08,520 --> 00:28:11,960 Speaker 1: are engaging in reckless behavior, whether that's an NPR or 477 00:28:12,040 --> 00:28:14,040 Speaker 1: something like that, which, you know, doesn't get a whole 478 00:28:14,040 --> 00:28:17,720 Speaker 1: lot of funding, NPR, compared to things like that, the 479 00:28:17,720 --> 00:28:20,280 Speaker 1: way it's structured. But I think actually something that's even 480 00:28:20,440 --> 00:28:23,800 Speaker 1: more necessary and more structural, so of course harder to do, 481 00:28:24,400 --> 00:28:27,399 Speaker 1: is, every important industry we have, whether it's, you know, 482 00:28:27,560 --> 00:28:29,800 Speaker 1: being a lawyer, being a doctor, and I would put, 483 00:28:29,800 --> 00:28:32,720 Speaker 1: of course, being a journalist, you know, at that level, 484 00:28:33,000 --> 00:28:34,960 Speaker 1: you know, these are people that should have a skill set. 485 00:28:35,000 --> 00:28:37,160 Speaker 1: There are things that I should know, right? Like, I 486 00:28:37,200 --> 00:28:39,760 Speaker 1: know you took chemistry if you are a doctor, but 487 00:28:39,880 --> 00:28:43,240 Speaker 1: I don't know that you took ethics or sociology or history 488 00:28:43,560 --> 00:28:45,400 Speaker 1: or any of those things.
If you're a New York 489 00:28:45,400 --> 00:28:48,560 Speaker 1: Times journalist. And I think, you know, one of those issues, 490 00:28:48,720 --> 00:28:51,840 Speaker 1: there's this sort of catch-22 of, okay, so is 491 00:28:51,840 --> 00:28:54,880 Speaker 1: the answer more credentialing and more school, and should people 492 00:28:54,880 --> 00:28:57,480 Speaker 1: go to journalism school? But of course all those institutions 493 00:28:57,520 --> 00:29:02,360 Speaker 1: are historically exclusionary and expensive and hard to access. And 494 00:29:02,400 --> 00:29:04,400 Speaker 1: at the same time, does that mean we should just 495 00:29:04,640 --> 00:29:07,440 Speaker 1: not regulate this hugely important industry? You know, like, the 496 00:29:07,520 --> 00:29:11,160 Speaker 1: law is a self-regulating industry. Essentially you have the 497 00:29:11,200 --> 00:29:14,000 Speaker 1: bar association, right? Like, what if there was a sort 498 00:29:14,000 --> 00:29:17,520 Speaker 1: of more formal structure where you had, you know, some 499 00:29:17,560 --> 00:29:20,440 Speaker 1: type of council that thought through policies and put out 500 00:29:20,480 --> 00:29:23,720 Speaker 1: best practices, whether they were actionable or not? And what if, 501 00:29:23,760 --> 00:29:27,000 Speaker 1: you know, it was more mainstream to go to journalism school? 502 00:29:27,040 --> 00:29:29,600 Speaker 1: And maybe these are hard topics, so I'm not saying 503 00:29:29,600 --> 00:29:32,080 Speaker 1: there's a right or wrong answer. How should journalists behave online? 504 00:29:32,080 --> 00:29:33,800 Speaker 1: I think we've seen some bad examples of what not 505 00:29:33,920 --> 00:29:37,080 Speaker 1: to do, but at least give people that are going 506 00:29:37,080 --> 00:29:39,600 Speaker 1: to be in this important profession the chance to debate that. 507 00:29:39,720 --> 00:29:42,280 Speaker 1: And here's maybe one theory of how to engage online. Here's another.
508 00:29:42,440 --> 00:29:44,600 Speaker 1: At least then I'm more thoughtful and not just tweeting out 509 00:29:44,600 --> 00:29:46,800 Speaker 1: every meme that I see on Reddit, you know, with 510 00:29:46,880 --> 00:29:50,080 Speaker 1: the platform of the Washington Post political news desk behind 511 00:29:50,160 --> 00:29:54,280 Speaker 1: me, and often, you know, contributing to misinformation when I'm 512 00:29:54,280 --> 00:29:56,640 Speaker 1: doing that, because, you know, as funny as these little 513 00:29:56,680 --> 00:29:59,040 Speaker 1: insider jokes are among, you know, maybe people like us 514 00:29:59,080 --> 00:30:02,280 Speaker 1: that follow, like, the, you know, minutiae of media, whatever, 515 00:30:02,440 --> 00:30:04,520 Speaker 1: most people have no idea what those people are talking 516 00:30:04,520 --> 00:30:06,400 Speaker 1: about or what the joke is, you know. Like, they're 517 00:30:06,400 --> 00:30:08,880 Speaker 1: not getting it, right? It's just confusing people that want to 518 00:30:08,960 --> 00:30:14,040 Speaker 1: learn and want information. So yeah. Well, Kaivan, I think 519 00:30:14,040 --> 00:30:17,600 Speaker 1: that the work that you have done is really important. 520 00:30:17,640 --> 00:30:20,840 Speaker 1: I think that these are conversations that we just are not having, 521 00:30:21,080 --> 00:30:23,800 Speaker 1: and I don't, you know... I hope that now, but 522 00:30:23,880 --> 00:30:26,160 Speaker 1: again, I'm still not sure.
I hope that now, though, 523 00:30:26,320 --> 00:30:29,800 Speaker 1: people realize the importance of the media and what it 524 00:30:29,880 --> 00:30:33,200 Speaker 1: was supposed to be and what it has become, and, 525 00:30:33,360 --> 00:30:37,240 Speaker 1: at the very least, with our eyeballs, which provide the 526 00:30:37,360 --> 00:30:41,000 Speaker 1: ratings and the money, you know, begin to show these 527 00:30:41,040 --> 00:30:44,240 Speaker 1: networks and outlets what it is that we actually want, 528 00:30:44,360 --> 00:30:47,959 Speaker 1: because it is going to be viewer-based and reader-based, 529 00:30:48,080 --> 00:30:50,520 Speaker 1: right? Like, it isn't going to be these people deciding, 530 00:30:50,520 --> 00:30:53,120 Speaker 1: you know what, I don't want that six-figure book deal, 531 00:30:53,360 --> 00:30:56,000 Speaker 1: you know what, I don't want all these hundred thousand followers. 532 00:30:56,040 --> 00:30:59,960 Speaker 1: It is going to be about people demanding the type 533 00:31:00,040 --> 00:31:02,920 Speaker 1: of information and news that they need; otherwise they're going 534 00:31:02,960 --> 00:31:05,720 Speaker 1: to lose viewers. Absolutely. And I think, like, you know, 535 00:31:06,320 --> 00:31:07,920 Speaker 1: how I end sort of my research, because I'm like, 536 00:31:07,960 --> 00:31:09,560 Speaker 1: these are a bunch of solutions that are really hard 537 00:31:09,600 --> 00:31:11,800 Speaker 1: and probably won't happen, but the one thing we can 538 00:31:11,880 --> 00:31:14,160 Speaker 1: do right now is what you just said, which is 539 00:31:14,480 --> 00:31:17,960 Speaker 1: educate people about this problem, call it out, and frame it.
That's 540 00:31:17,960 --> 00:31:20,240 Speaker 1: why I thought this project was important to me, because 541 00:31:20,240 --> 00:31:23,479 Speaker 1: I think so much of education has been, you know, 542 00:31:23,520 --> 00:31:27,200 Speaker 1: just adding terms and codifying and, you know, creating a 543 00:31:27,240 --> 00:31:29,840 Speaker 1: body that people can, like, reference and build off of, 544 00:31:29,960 --> 00:31:32,240 Speaker 1: even if it's so intuitive that, you know, as a consumer, 545 00:31:32,240 --> 00:31:34,520 Speaker 1: I think so many of your listeners will have already 546 00:31:34,560 --> 00:31:37,240 Speaker 1: identified a lot of these issues themselves, but sort of 547 00:31:37,240 --> 00:31:40,160 Speaker 1: adding vocabulary and creating a framework by which I want 548 00:31:40,160 --> 00:31:42,600 Speaker 1: people to recognize, you know, like, that is a journalist 549 00:31:42,720 --> 00:31:44,880 Speaker 1: right now who's telling me something, or that is 550 00:31:44,880 --> 00:31:47,000 Speaker 1: a journalist right now who's just designed this tweet to 551 00:31:47,000 --> 00:31:49,240 Speaker 1: try to get, you know, engagement and go viral. Or, 552 00:31:49,280 --> 00:31:52,240 Speaker 1: I just saw this journalist, like, you know, saving a 553 00:31:52,360 --> 00:31:54,600 Speaker 1: scoop for their cable news hit that I should 554 00:31:54,600 --> 00:31:56,400 Speaker 1: have known three months ago. And call it out, 555 00:31:56,400 --> 00:31:58,560 Speaker 1: and don't reward that behavior, and don't buy that book, 556 00:31:58,760 --> 00:32:03,560 Speaker 1: and tell your friends. Yep, yep. Kaivan, I hope 557 00:32:03,560 --> 00:32:05,560 Speaker 1: that you will come back and join us again soon. 558 00:32:05,680 --> 00:32:08,360 Speaker 1: Appreciate you. Thank you so much, and I will keep 559 00:32:08,400 --> 00:32:18,200 Speaker 1: following along on TikTok.
That is it for 560 00:32:18,320 --> 00:32:22,280 Speaker 1: me today, dear friends. Woke AF, as always: power 561 00:32:22,320 --> 00:32:26,320 Speaker 1: to the people, and to all the people, power. Get woke, 562 00:32:26,640 --> 00:32:30,840 Speaker 1: dear God, get woke, and stay woke as fuck. 563 00:32:36,200 --> 00:32:38,560 Speaker 1: Get a behind-the-scenes look at Comedy Central's The 564 00:32:38,640 --> 00:32:41,760 Speaker 1: Daily Show on Beyond the Scenes, an original podcast from 565 00:32:41,760 --> 00:32:44,360 Speaker 1: The Daily Show with Trevor Noah. Every week, host Roy 566 00:32:44,360 --> 00:32:47,120 Speaker 1: Wood Jr. goes deeper with the notable guests and experts 567 00:32:47,160 --> 00:32:50,040 Speaker 1: from the Emmy Award-winning series. Together, they use comedy 568 00:32:50,080 --> 00:32:53,080 Speaker 1: to tackle current topics, from gentrification to gun laws, and 569 00:32:53,080 --> 00:32:55,640 Speaker 1: take a closer look at how and why these topics matter. 570 00:32:55,800 --> 00:32:58,000 Speaker 1: Listen to Beyond the Scenes from The Daily Show with 571 00:32:58,040 --> 00:33:01,360 Speaker 1: Trevor Noah on the iHeartRadio app, Apple Podcasts, or wherever 572 00:33:01,400 --> 00:33:04,200 Speaker 1: you get your podcasts. New episodes every Tuesday.