1 00:00:01,760 --> 00:00:02,880 Speaker 1: Also media. 2 00:00:06,519 --> 00:00:09,240 Speaker 2: We had a very funny introduction. It was really good. 3 00:00:09,640 --> 00:00:13,920 Speaker 1: Yeah, it referenced our company sexual harassment protocols. It was hilarious. 4 00:00:14,440 --> 00:00:16,360 Speaker 1: You're never gonna hear it. We weren't recording. 5 00:00:17,000 --> 00:00:19,720 Speaker 2: I was recording, so you can hear my section. Okay. Yeah, 6 00:00:19,840 --> 00:00:23,439 Speaker 2: if you can just accept what Garrison said without context 7 00:00:23,520 --> 00:00:29,840 Speaker 2: and we'll open with that, that'll be great. Yeah. That 8 00:00:29,960 --> 00:00:33,760 Speaker 2: is a classic Robert Evans intro. You just did it. 9 00:00:34,320 --> 00:0036,000 Speaker 3: I feel like it always comes from inside. 10 00:00:36,600 --> 00:00:41,199 Speaker 1: Welcome to It Could Happen Here, a podcast about journalistic objectivity, 11 00:00:41,320 --> 00:00:45,040 Speaker 1: that's right, a thing that we've just demonstrated perfectly. 12 00:00:46,040 --> 00:00:50,640 Speaker 2: Yeah, yeah, that's the professional media class. So let's have 13 00:00:50,680 --> 00:00:53,280 Speaker 2: a little talk about media objectivity, right. It's been a 14 00:00:53,320 --> 00:00:58,000 Speaker 2: major tenet of traditional legacy media that they must remain unbiased. 15 00:00:58,880 --> 00:01:01,440 Speaker 2: This hasn't always been the case in the United States, 16 00:01:01,480 --> 00:01:05,199 Speaker 2: right? We used to have explicitly partisan news sources, which we 17 00:01:05,360 --> 00:01:08,320 Speaker 2: have now with Fox News, I guess. But that's why 18 00:01:08,360 --> 00:01:10,320 Speaker 2: you have newspapers like, I think Saint Louis has a 19 00:01:10,319 --> 00:01:13,399 Speaker 2: Saint Louis Democrat, or the So-and-So Republican, like, 20 00:01:13,440 --> 00:01:17,560 Speaker 2: they would be very explicitly a partisan newspaper.
It's 21 00:01:17,640 --> 00:01:20,800 Speaker 2: only really when journalism sort of took on this strong 22 00:01:20,959 --> 00:01:24,120 Speaker 2: professional, and I mean professional here in terms of like 23 00:01:24,240 --> 00:01:28,360 Speaker 2: the professions, right, like law, accounting, jobs that are associated 24 00:01:28,360 --> 00:01:32,959 Speaker 2: with university education and a class identity, that it started 25 00:01:33,000 --> 00:01:36,679 Speaker 2: to assert this kind of... it's an attempt to appear 26 00:01:36,760 --> 00:01:40,039 Speaker 2: rational and scientific in its methodologies, right. And one of 27 00:01:40,080 --> 00:01:43,039 Speaker 2: the ways that journalism did this was to talk about objectivity. 28 00:01:43,560 --> 00:01:46,720 Speaker 2: I should indicate here that objectivity is supposed to be 29 00:01:46,800 --> 00:01:51,360 Speaker 2: a means of verifying information, right? Like, that we should 30 00:01:51,360 --> 00:01:53,520 Speaker 2: objectively check that what we have written is correct. 31 00:01:53,720 --> 00:01:56,440 Speaker 1: The example I always give is that if I'm in 32 00:01:56,480 --> 00:01:59,680 Speaker 1: a protest scene where there's a clash between Proud Boys 33 00:02:00,040 --> 00:02:02,800 Speaker 1: and, you know, a group of leftists, and, you know, 34 00:02:02,840 --> 00:02:04,640 Speaker 1: someone on the left pulls out a can of mace 35 00:02:04,760 --> 00:02:08,079 Speaker 1: and sprays it first, that's objectively what happened. Now, that 36 00:02:08,080 --> 00:02:11,880 Speaker 1: doesn't mean that that's the only thing I report.
For example, 37 00:02:11,880 --> 00:02:14,400 Speaker 1: if the person they maced is somebody who has been 38 00:02:14,600 --> 00:02:17,800 Speaker 1: like harassing those individuals online for weeks, or has been 39 00:02:17,840 --> 00:02:20,600 Speaker 1: doxing them, or assaulted them at previous events, like, all of 40 00:02:20,600 --> 00:02:23,240 Speaker 1: that is like relevant context, but it doesn't change what 41 00:02:23,360 --> 00:02:27,440 Speaker 1: objectively happened in that instant, right. Like, it's not on 42 00:02:27,680 --> 00:02:30,720 Speaker 1: me to pretend that I think these sides are equal, 43 00:02:31,080 --> 00:02:34,200 Speaker 1: but it is on me to accurately report, like, what happens. Yes. 44 00:02:34,639 --> 00:02:36,960 Speaker 1: And I think one of the one of the areas 45 00:02:36,960 --> 00:02:39,520 Speaker 1: in which a lot of people, especially when we were 46 00:02:39,520 --> 00:02:42,799 Speaker 1: talking about, like, you know, situations like this, a lot 47 00:02:42,800 --> 00:02:45,200 Speaker 1: of folks in kind of legacy media get stuff wrong 48 00:02:45,320 --> 00:02:47,519 Speaker 1: is they think that all that matters is what happens 49 00:02:47,560 --> 00:02:51,040 Speaker 1: in that moment, right, and that what happened previously, what's happened 50 00:02:51,080 --> 00:02:53,760 Speaker 1: in other engagements, what's happened, like, over, you know, the 51 00:02:53,840 --> 00:02:56,360 Speaker 1: last two or three years or however long the conflict's 52 00:02:56,560 --> 00:02:58,920 Speaker 1: been going on in that city, is immaterial. Like, all that 53 00:02:58,960 --> 00:03:01,120 Speaker 1: matters is what happened in the second when that reporter 54 00:03:01,280 --> 00:03:04,680 Speaker 1: was on scene. And if you're if you're thinking that way, 55 00:03:04,880 --> 00:03:07,360 Speaker 1: you're going to miss more than someone who comes in 56 00:03:07,639 --> 00:03:09,720 Speaker 1: with just an outright bias, you know.
57 00:03:10,080 --> 00:03:13,280 Speaker 2: Yeah. And like I think very often it's seen as 58 00:03:13,360 --> 00:03:15,320 Speaker 2: kind of, instead of being like a value of the 59 00:03:15,360 --> 00:03:17,680 Speaker 2: outlet in the way it verifies information, it's seen as 60 00:03:17,680 --> 00:03:22,120 Speaker 2: being a personal kind of like quality that journalists should 61 00:03:22,200 --> 00:03:25,119 Speaker 2: have in every aspect of their lives. Yeah. Like, I'm 62 00:03:25,160 --> 00:03:27,440 Speaker 2: aware that at some of the big legacy broadsheets in 63 00:03:27,480 --> 00:03:31,440 Speaker 2: the US, like, you can't attend a protest unless you 64 00:03:31,480 --> 00:03:33,079 Speaker 2: are covering the protest, right. 65 00:03:33,680 --> 00:03:36,120 Speaker 1: And there's even that famous case of that journalist being like, 66 00:03:36,160 --> 00:03:37,800 Speaker 1: I don't vote because I think that that would be 67 00:03:37,800 --> 00:03:39,720 Speaker 1: a violation of, like, my objectivity. 68 00:03:39,880 --> 00:03:41,640 Speaker 2: Yeah, yeah, yeah, I remember that. I'd forgotten 69 00:03:41,640 --> 00:03:42,520 Speaker 2: about that one. 70 00:03:42,640 --> 00:03:45,640 Speaker 1: Like, you're allowed to have opinions, that's just not supposed 71 00:03:45,680 --> 00:03:48,040 Speaker 1: to be the entire basis of your reporting, you know. 72 00:03:48,080 --> 00:03:51,960 Speaker 2: Exactly. Yeah. Like, and I think sometimes, because people 73 00:03:51,960 --> 00:03:55,080 Speaker 2: always do have opinions, right, the opinions that are 74 00:03:55,120 --> 00:03:59,160 Speaker 2: conceived of as neutral and the ones that are conceived 75 00:03:59,200 --> 00:04:04,160 Speaker 2: of as being subjective are very telling, right. Like, the 76 00:04:04,240 --> 00:04:08,160 Speaker 2: media for a long time has been the domain of educated, 77 00:04:08,200 --> 00:04:10,800 Speaker 2: older white men, like people like me.
I guess I'm 78 00:04:10,800 --> 00:04:13,480 Speaker 2: not old, but getting that way. And it also has 79 00:04:13,520 --> 00:04:15,840 Speaker 2: been the domain of, like, capital and the state, right, 80 00:04:15,880 --> 00:04:20,640 Speaker 2: like, Jeff Bezos owns several newspapers. Pro-market biases, 81 00:04:20,720 --> 00:04:25,040 Speaker 2: pro-capitalism biases, pro-state biases, those are not really 82 00:04:25,080 --> 00:04:28,320 Speaker 2: investigated much in the media in the way that other 83 00:04:29,080 --> 00:04:33,880 Speaker 2: biases might be, right. It's also created this idea that 84 00:04:34,320 --> 00:04:36,960 Speaker 2: the media always needs to shoot for the middle in 85 00:04:37,040 --> 00:04:39,839 Speaker 2: any given discussion, which I kind of want to investigate 86 00:04:39,880 --> 00:04:44,279 Speaker 2: a bit. When Donald Trump says something which is overt, 87 00:04:44,320 --> 00:04:46,640 Speaker 2: like, Donald Trump has said things which are nativist, right, 88 00:04:46,720 --> 00:04:49,720 Speaker 2: nativism is a form of racism, Donald Trump therefore has 89 00:04:49,760 --> 00:04:52,960 Speaker 2: said racist shit. The way that this is far too 90 00:04:53,000 --> 00:04:56,480 Speaker 2: often treated in the legacy media is "is it racist, 91 00:04:56,480 --> 00:04:59,039 Speaker 2: this shit that Donald Trump said?" Correct. Or, like, "maybe we 92 00:04:59,080 --> 00:05:02,800 Speaker 2: should consider this racist thing that so-and-so has said," right, 93 00:05:02,880 --> 00:05:06,040 Speaker 2: rather than: this shit is racist, Donald Trump has said 94 00:05:06,120 --> 00:05:08,240 Speaker 2: shit that is racist, as have other members of the 95 00:05:08,320 --> 00:05:13,839 Speaker 2: Republican Party.
All this serves to do is, when we 96 00:05:13,880 --> 00:05:19,080 Speaker 2: have a topic and the people in Congress anchor themselves 97 00:05:19,120 --> 00:05:21,839 Speaker 2: on the very far right of what is acceptable discourse, the 98 00:05:21,880 --> 00:05:24,919 Speaker 2: media then moves discourse to the right so that position 99 00:05:25,000 --> 00:05:27,200 Speaker 2: is in the center, right. It serves to ratchet that 100 00:05:27,320 --> 00:05:30,359 Speaker 2: Overton window to the right. I'm demonstrating this for my 101 00:05:30,360 --> 00:05:33,080 Speaker 2: colleagues with hand signals, which of course only two of 102 00:05:33,160 --> 00:05:36,520 Speaker 2: the hundreds of thousands of people listening to me will 103 00:05:36,560 --> 00:05:39,040 Speaker 2: be able to see. That's the right way to do our podcast. 104 00:05:39,640 --> 00:05:43,920 Speaker 3: Yeah, it was a very compelling mime of a ratchet, 105 00:05:44,640 --> 00:05:47,240 Speaker 3: like, it looked like you basically were doing it. I 106 00:05:47,279 --> 00:05:47,960 Speaker 3: could not tell. 107 00:05:48,400 --> 00:05:51,080 Speaker 1: I couldn't tell the difference. No, that's why we call 108 00:05:51,160 --> 00:05:52,799 Speaker 1: you Ratchet Straps. 109 00:05:54,720 --> 00:05:57,160 Speaker 2: Get me a ratchet, Jimmy. Yeah, this podcast is 110 00:05:57,200 --> 00:06:00,000 Speaker 2: sponsored by Invisible Ratchet. Now, is it time to pivot 111 00:06:00,120 --> 00:06:02,200 Speaker 2: to ads? It's not time to pivot to ads yet.
112 00:06:02,880 --> 00:06:05,919 Speaker 2: I think we should talk about the way other professions 113 00:06:05,960 --> 00:06:09,679 Speaker 2: concerned with the truth deal with this topic, right, because 114 00:06:10,440 --> 00:06:14,280 Speaker 2: journalism is pretty much unique in considering objectivity something that 115 00:06:14,960 --> 00:06:18,560 Speaker 2: we as individuals have to embody in every action that 116 00:06:18,600 --> 00:06:20,800 Speaker 2: we take. And I guess the most relevant one would 117 00:06:20,839 --> 00:06:24,479 Speaker 2: be academia, which is something else I am unfortunate enough 118 00:06:24,480 --> 00:06:27,240 Speaker 2: to have participated in for far too much of my 119 00:06:27,240 --> 00:06:31,599 Speaker 2: adult life. So academia, still not great, but we have 120 00:06:31,720 --> 00:06:37,200 Speaker 2: accepted that everyone is biased in academia, right. We rely on, 121 00:06:37,560 --> 00:06:40,800 Speaker 2: among many other things, something called standpoint theory, right, which 122 00:06:40,800 --> 00:06:43,800 Speaker 2: is a cornerstone of modern feminist thought. Most of you 123 00:06:43,800 --> 00:06:45,360 Speaker 2: will be aware of it, even if you're not aware 124 00:06:45,360 --> 00:06:47,440 Speaker 2: of it. Basically, it holds that we see the world 125 00:06:47,440 --> 00:06:53,200 Speaker 2: differently based on where we see it from. Our gender, sexuality, race, ethnicity, experience, age, 126 00:06:53,240 --> 00:06:55,520 Speaker 2: and a million other things impact the truth we know 127 00:06:55,600 --> 00:06:58,520 Speaker 2: and the world we experience. And standpoint theory posits that 128 00:06:58,560 --> 00:07:02,160 Speaker 2: perhaps people not from a certain setting may have valuable 129 00:07:02,200 --> 00:07:05,240 Speaker 2: insights into it, right. So, sometimes the outsider perspective is 130 00:07:05,279 --> 00:07:08,159 Speaker 2: a valuable one.
But also people from that setting may 131 00:07:08,200 --> 00:07:12,040 Speaker 2: see things outsiders may not see, and we have to 132 00:07:12,080 --> 00:07:15,800 Speaker 2: acknowledge those biases, right, and then continue to tell the truth. 133 00:07:16,640 --> 00:07:18,720 Speaker 2: How do we tell the truth in academia? We do 134 00:07:18,760 --> 00:07:23,080 Speaker 2: something called peer review. Peer review is bad. Peer review 135 00:07:23,560 --> 00:07:29,400 Speaker 2: strongly reinforces the status quo, right. I will give one example. 136 00:07:29,760 --> 00:07:32,640 Speaker 2: I once had a journal article, right, for a history journal, 137 00:07:32,880 --> 00:07:35,440 Speaker 2: killed in peer review. The piece was about the nineteen 138 00:07:35,480 --> 00:07:39,200 Speaker 2: oh nine Tour of Catalonia, which was a bicycle competition, 139 00:07:39,240 --> 00:07:41,480 Speaker 2: for those of you who aren't familiar. It was killed 140 00:07:41,520 --> 00:07:46,720 Speaker 2: because my media analysis didn't mention television coverage. Television 141 00:07:47,000 --> 00:07:49,320 Speaker 2: was only crudely invented in the nineteen twenties, and 142 00:07:49,360 --> 00:07:51,760 Speaker 2: didn't become widely available until the nineteen forties, right. Like, 143 00:07:51,840 --> 00:07:56,559 Speaker 2: this is not a reasonable objection. Nonetheless, someone was able 144 00:07:56,600 --> 00:07:58,520 Speaker 2: to kill my piece because of it, because that's how 145 00:07:58,520 --> 00:08:00,640 Speaker 2: peer review works, right. People 146 00:08:00,680 --> 00:08:02,880 Speaker 2: who are in positions of power can kill your shit 147 00:08:02,920 --> 00:08:04,640 Speaker 2: if they want to, and they can give the most 148 00:08:04,680 --> 00:08:08,200 Speaker 2: ludicrous reason. That is how peer review, among other things, 149 00:08:08,240 --> 00:08:12,200 Speaker 2: reinforces the status quo, right.
The other thing that we 150 00:08:12,400 --> 00:08:16,720 Speaker 2: do in academia is we declare our conflicts of interest, and 151 00:08:16,760 --> 00:08:19,160 Speaker 2: this is something we don't do in journalism, right. Like, 152 00:08:19,600 --> 00:08:22,200 Speaker 2: outlets may have a conflict of interest policy. But again, 153 00:08:22,360 --> 00:08:27,080 Speaker 2: like, conflicts of interest aren't explicitly declared in a piece, 154 00:08:27,520 --> 00:08:29,960 Speaker 2: like, you wouldn't see... sometimes NPR does it. 155 00:08:30,040 --> 00:08:33,320 Speaker 1: Actually, yeah, I mean a number of outlets do declare, like, 156 00:08:33,400 --> 00:08:36,760 Speaker 1: for example, this outlet is owned by someone who has 157 00:08:36,760 --> 00:08:39,079 Speaker 1: a financial interest in the company we're reporting on, or 158 00:08:39,120 --> 00:08:39,720 Speaker 1: something like that. 159 00:08:40,000 --> 00:08:42,840 Speaker 3: Yeah, if the Washington Post is doing a story about 160 00:08:43,160 --> 00:08:47,320 Speaker 3: Jeff Bezos or Amazon, yeah, usually they will say at 161 00:08:47,400 --> 00:08:49,959 Speaker 3: the bottom or the top that the paper is owned 162 00:08:49,960 --> 00:08:51,720 Speaker 3: by said figure. 163 00:08:52,280 --> 00:08:55,600 Speaker 2: Yeah. Where it becomes more murky is, like, sometimes people 164 00:08:55,640 --> 00:08:58,720 Speaker 2: have a financial interest, or if something is your beat, right, 165 00:08:58,920 --> 00:09:01,520 Speaker 2: you may have other financial interests within that beat. 166 00:09:01,880 --> 00:09:04,600 Speaker 1: Well, and there's there's the very common case of people, 167 00:09:05,080 --> 00:09:08,720 Speaker 1: especially now within kind of the Substack journalism, being 168 00:09:08,800 --> 00:09:12,240 Speaker 1: like friends and social with people that they are reporting 169 00:09:12,280 --> 00:09:14,480 Speaker 1: on and not disclosing that to their wider audience.
170 00:09:14,920 --> 00:09:18,280 Speaker 2: Yeah, like access journalism more generally, right. Yeah, like, the 171 00:09:18,320 --> 00:09:20,520 Speaker 2: way I got this piece was by being invited to 172 00:09:20,600 --> 00:09:22,360 Speaker 2: the drinks party, and if I say anything unkind 173 00:09:22,400 --> 00:09:25,560 Speaker 2: about this person, I won't be invited to the drinks party. Yeah. 174 00:09:25,800 --> 00:09:28,280 Speaker 2: Or simply the conflict of interest that is presented by: 175 00:09:29,000 --> 00:09:32,000 Speaker 2: the more ludicrous my headline, the more people will click 176 00:09:32,040 --> 00:09:33,960 Speaker 2: on this website, and the more time they will spend 177 00:09:34,000 --> 00:09:36,160 Speaker 2: on the page, and the more ad revenue they might generate. 178 00:09:36,360 --> 00:09:39,719 Speaker 1: Yeah. And that's really the largest issue with modern journalism, 179 00:09:39,800 --> 00:09:43,480 Speaker 1: is that that kind of determines almost everything for an outlet, 180 00:09:43,600 --> 00:09:46,480 Speaker 1: is like what's what's going to get clicks, what's going 181 00:09:46,480 --> 00:09:49,480 Speaker 1: to rile people up as much as possible. And that 182 00:09:49,679 --> 00:09:51,960 Speaker 1: doesn't count as a financial interest, right? Like, 183 00:09:52,040 --> 00:09:54,480 Speaker 1: the fact that the outlet has a vested financial interest in 184 00:09:54,600 --> 00:09:57,000 Speaker 1: keeping you on the page as often and as long 185 00:09:57,040 --> 00:10:00,840 Speaker 1: as possible doesn't count as, like, a conflict of interest in 186 00:10:00,880 --> 00:10:04,720 Speaker 1: any way. And that's kind of one of the fundamental issues.
Whereas, 187 00:10:04,840 --> 00:10:07,439 Speaker 1: like, a lot of times, a lot of outlets won't let, 188 00:10:07,679 --> 00:10:10,839 Speaker 1: for example, a black journalist report on a black man 189 00:10:10,880 --> 00:10:13,120 Speaker 1: being murdered by the police, right, because they see that 190 00:10:13,160 --> 00:10:16,199 Speaker 1: as, like, an inherent conflict of interest. And the gap 191 00:10:16,280 --> 00:10:18,880 Speaker 1: between those two things is where a lot of the 192 00:10:18,920 --> 00:10:21,040 Speaker 1: real problems, a lot of the worst problems in modern 193 00:10:21,080 --> 00:10:21,880 Speaker 1: journalism arise. 194 00:10:22,360 --> 00:10:26,319 Speaker 2: Yeah, talking of problems, we need to pivot to ads. Sure. 195 00:10:36,960 --> 00:10:40,840 Speaker 2: All right, we are back. Part of this also manifests 196 00:10:40,840 --> 00:10:44,600 Speaker 2: in, like, journalists being supposed to not have any individual 197 00:10:44,679 --> 00:10:47,679 Speaker 2: opinions about anything, even if it's irrelevant to their beat. 198 00:10:48,280 --> 00:10:51,720 Speaker 2: This has been the case for a lot of people 199 00:10:51,800 --> 00:10:55,480 Speaker 2: regarding the genocide of Palestinian people, right. Like, you could 200 00:10:55,480 --> 00:10:58,720 Speaker 2: be the weekend editor, you could write about brunch, and 201 00:10:59,320 --> 00:11:02,800 Speaker 2: if you work at certain outlets, you are, like, under 202 00:11:02,840 --> 00:11:05,280 Speaker 2: pain of losing your job, not allowed to post that 203 00:11:05,360 --> 00:11:08,640 Speaker 2: what is happening in Gaza is a genocide, not allowed to 204 00:11:08,760 --> 00:11:11,920 Speaker 2: take a stance on these issues, right. And that is bad.
205 00:11:12,280 --> 00:11:16,160 Speaker 2: Like, journalists are human beings too, and it's ridiculous to 206 00:11:16,520 --> 00:11:20,360 Speaker 2: suggest that we shouldn't or can't have opinions on 207 00:11:20,400 --> 00:11:22,640 Speaker 2: these things and still do good reporting, right? We can. 208 00:11:22,720 --> 00:11:24,680 Speaker 2: We just have to make sure that the reporting itself 209 00:11:25,280 --> 00:11:31,040 Speaker 2: is accurate. Sometimes what this leads to is, like... 210 00:11:31,080 --> 00:11:32,920 Speaker 2: I guess, like, Robert, you spoke about it, 211 00:11:32,960 --> 00:11:36,360 Speaker 2: like, the inherent conflict of interest that, like, traffic 212 00:11:36,400 --> 00:11:40,600 Speaker 2: on a website presents for journalism. Another, like, inherent issue 213 00:11:40,720 --> 00:11:43,880 Speaker 2: is that, like, every source is seen as biased, right. 214 00:11:43,920 --> 00:11:45,760 Speaker 2: Like you said, like, black folks might not be allowed 215 00:11:45,800 --> 00:11:48,240 Speaker 2: to report on black men being shot by the cops, 216 00:11:48,960 --> 00:11:52,120 Speaker 2: except state sources, which are far too often seen as 217 00:11:52,160 --> 00:11:55,040 Speaker 2: speaking the verbatim truth, right? "Well, this is what the 218 00:11:55,040 --> 00:11:58,040 Speaker 2: police said." Yes. Yeah. That is how we get... I 219 00:11:58,040 --> 00:11:59,840 Speaker 2: guess a pretty good example of this, I'll link to 220 00:11:59,880 --> 00:12:02,200 Speaker 2: it in the show notes, is a piece I wrote 221 00:12:03,000 --> 00:12:08,920 Speaker 2: five years ago, I think, about police officers overdosing on fentanyl.
222 00:12:09,400 --> 00:12:11,160 Speaker 2: Some of you will be familiar with this, some of 223 00:12:11,160 --> 00:12:14,319 Speaker 2: you will not, but it is not possible to overdose 224 00:12:14,480 --> 00:12:19,360 Speaker 2: on fentanyl just from being in its presence, like, in 225 00:12:19,400 --> 00:12:22,120 Speaker 2: an outdoor area next to a thing that has fentanyl 226 00:12:22,120 --> 00:12:24,800 Speaker 2: in it. The piece I wrote dealt with the San 227 00:12:24,840 --> 00:12:28,959 Speaker 2: Diego Union-Tribune, who... I mean, this was a spectacular instance, 228 00:12:28,960 --> 00:12:33,040 Speaker 2: I guess, of journalists, like, serving as police stenographers. What 229 00:12:33,120 --> 00:12:35,840 Speaker 2: happened here is that the police had produced an edited 230 00:12:35,960 --> 00:12:39,920 Speaker 2: video with, like, music and shit of this supposed overdose, 231 00:12:40,000 --> 00:12:43,440 Speaker 2: right, of a young cop who was, like... I don't 232 00:12:43,440 --> 00:12:45,679 Speaker 2: know what they call it. He's, like, an apprentice with an 233 00:12:45,679 --> 00:12:48,960 Speaker 2: older cop, like, with a more experienced cop, and 234 00:12:48,960 --> 00:12:51,839 Speaker 2: they were going around doing cop stuff. They found some stuff, 235 00:12:51,840 --> 00:12:55,240 Speaker 2: they tested it for fentanyl, and this guy collapses, the 236 00:12:55,320 --> 00:12:58,280 Speaker 2: younger cop, and the older cop gives him several Narcans. Isn't 237 00:12:58,320 --> 00:13:02,439 Speaker 2: that just wasting some? Yeah, no, just, like, I think 238 00:13:02,480 --> 00:13:05,520 Speaker 2: there was one instance where someone received seven Narcans, 239 00:13:05,880 --> 00:13:11,079 Speaker 2: which, like... like, that's a threat to your fucking nasal 240 00:13:11,080 --> 00:13:14,280 Speaker 2: integrity if nothing else. Yeah, if Narcan doesn't work the 241 00:13:14,280 --> 00:13:15,839 Speaker 2: first time, like, I...
242 00:13:15,800 --> 00:13:18,560 Speaker 1: I mean, people do sometimes... often, especially, like, with 243 00:13:18,640 --> 00:13:21,400 Speaker 1: serious ODs, they'll often put people, like, in the hospital 244 00:13:21,400 --> 00:13:24,680 Speaker 1: on drips, but you would have to take a massive dose, 245 00:13:24,840 --> 00:13:26,960 Speaker 1: not just be near fucking... 246 00:13:26,720 --> 00:13:29,960 Speaker 2: Fentanyl. Yeah, yeah. To, like... I think in this instance, 247 00:13:30,000 --> 00:13:32,280 Speaker 2: like, they were outside testing it in, like, the boot 248 00:13:32,280 --> 00:13:35,760 Speaker 2: of a car. Like, it's a ludicrous thing. And, like, 249 00:13:36,440 --> 00:13:38,560 Speaker 2: it would be good if they familiarized themselves with some 250 00:13:38,640 --> 00:13:42,679 Speaker 2: of... with what an overdose looks like, right. Yeah. And 251 00:13:42,720 --> 00:13:43,360 Speaker 2: I mean... 252 00:13:43,559 --> 00:13:46,560 Speaker 1: If they weren't cops, I'd respect the desire to, like, 253 00:13:46,880 --> 00:13:48,880 Speaker 1: commit time theft from work, because I think that's what a 254 00:13:48,880 --> 00:13:50,920 Speaker 1: lot of this is. It's like, oh shit, if I 255 00:13:50,960 --> 00:13:54,839 Speaker 1: have an overdose, like, I get to stay out of 256 00:13:54,880 --> 00:13:55,920 Speaker 1: work a couple 257 00:13:55,600 --> 00:13:59,040 Speaker 2: of days with pay. Yeah, that's a that's a framing 258 00:13:59,080 --> 00:14:02,760 Speaker 2: I'm amenable to. Unfortunately, they are cops. If 259 00:14:02,760 --> 00:14:07,520 Speaker 2: you're a reporter, though, like, it is absolutely on you to go, oh, 260 00:14:07,559 --> 00:14:09,960 Speaker 2: this person is having an overdose? What are the symptoms of 261 00:14:10,000 --> 00:14:13,040 Speaker 2: an overdose? What does an overdose look like? Should I 262 00:14:13,120 --> 00:14:16,240 Speaker 2: talk to a medical professional?
Or you could just ask 263 00:14:16,760 --> 00:14:20,160 Speaker 2: the police information officer who shared this with you: how 264 00:14:20,200 --> 00:14:23,080 Speaker 2: did you verify this was an overdose? With whom did 265 00:14:23,120 --> 00:14:27,080 Speaker 2: you discuss the toxicology report? In this case, that information 266 00:14:27,160 --> 00:14:29,760 Speaker 2: wasn't available, right. The way I was able to obtain 267 00:14:29,800 --> 00:14:32,840 Speaker 2: that, just to, I guess, provide clarity: first of all, 268 00:14:33,120 --> 00:14:36,400 Speaker 2: I saw the publication, where they didn't mention any fact 269 00:14:36,480 --> 00:14:40,000 Speaker 2: checking that they'd done. You can also PRA the emails 270 00:14:40,200 --> 00:14:42,080 Speaker 2: to the police as well as from the police, right, 271 00:14:42,120 --> 00:14:44,640 Speaker 2: so you can see if other reporters have done fact 272 00:14:44,720 --> 00:14:47,040 Speaker 2: checking that way or have asked any follow-up questions 273 00:14:47,120 --> 00:14:49,400 Speaker 2: that way. Had they done that, they would have found out 274 00:14:49,400 --> 00:14:52,120 Speaker 2: that you can't overdose on fentanyl this way. 275 00:14:52,520 --> 00:14:54,720 Speaker 2: They didn't even try. And, like, both-sidesing this, I 276 00:14:54,720 --> 00:14:57,480 Speaker 2: guess... like, sometimes you'll see outlets doing that now, like, 277 00:14:57,600 --> 00:15:01,320 Speaker 2: "this cop overdosed from fentanyl, but doctors say they can't," 278 00:15:01,680 --> 00:15:05,280 Speaker 2: which I still think is an absolutely 279 00:15:05,320 --> 00:15:09,440 Speaker 2: ludicrous practice, right. That's like saying, "this person tried to fly, 280 00:15:09,720 --> 00:15:12,440 Speaker 2: but, you know, people say gravity will make them 281 00:15:12,480 --> 00:15:15,200 Speaker 2: fall to the ground." Like, one of these things we 282 00:15:15,280 --> 00:15:18,920 Speaker 2: know to be true.
So I guess what I would 283 00:15:18,960 --> 00:15:23,160 Speaker 2: propose we do, instead of this ludicrous practice of, like, 284 00:15:23,200 --> 00:15:26,400 Speaker 2: pretending to be objective about everything all the time, is 285 00:15:27,040 --> 00:15:30,480 Speaker 2: that we are honest about our biases, honest about our 286 00:15:30,480 --> 00:15:33,360 Speaker 2: conflicts of interest, we're honest about, like, our standpoint, and 287 00:15:33,400 --> 00:15:37,480 Speaker 2: then we do reporting which is obviously verifiable, right. And 288 00:15:37,480 --> 00:15:39,440 Speaker 2: that means, like... you'll see that at the end of 289 00:15:39,480 --> 00:15:42,680 Speaker 2: these episodes, right, we share our sources that we used, 290 00:15:43,240 --> 00:15:46,040 Speaker 2: and we try and communicate where we got information from 291 00:15:46,160 --> 00:15:48,320 Speaker 2: and how we got it. And I think we should 292 00:15:48,320 --> 00:15:52,320 Speaker 2: strive for moral clarity in the way we say things 293 00:15:52,360 --> 00:15:55,000 Speaker 2: instead of striving for this middle ground. So, what 294 00:15:55,040 --> 00:15:56,760 Speaker 2: do I mean by moral clarity? I mean saying "the 295 00:15:56,840 --> 00:16:01,560 Speaker 2: cops killed someone," not "officer-involved shooting," right. If you 296 00:16:01,680 --> 00:16:04,600 Speaker 2: work with fucking words and you find yourself writing something 297 00:16:04,640 --> 00:16:09,400 Speaker 2: as convoluted as "officer-involved shooting," then you have strayed 298 00:16:09,440 --> 00:16:14,440 Speaker 2: from the foundational reason for journalism existing. Yeah, you have 299 00:16:14,520 --> 00:16:17,400 Speaker 2: gone beyond God's light. Yeah, yeah, yeah, you live in 300 00:16:17,440 --> 00:16:21,280 Speaker 2: the darkness. There is, I think, a place for fact checkers. 301 00:16:21,800 --> 00:16:26,000 Speaker 2: I think people got a bit carried away with fact checking.
302 00:16:26,280 --> 00:16:28,920 Speaker 2: I don't quite know how to phrase this correctly. I 303 00:16:28,920 --> 00:16:30,720 Speaker 2: had an experience once where I had written a piece. 304 00:16:31,200 --> 00:16:35,080 Speaker 2: The fact checking of that piece centered on the fact 305 00:16:35,080 --> 00:16:37,440 Speaker 2: that I had used the noun "beach chair" to refer 306 00:16:37,520 --> 00:16:42,840 Speaker 2: to this chair. Yes, the fact checker believed that it 307 00:16:42,880 --> 00:16:48,200 Speaker 2: was a lawn chair. To me, that did not impact the 308 00:16:48,280 --> 00:16:52,600 Speaker 2: overall thrust of the piece, right, like, the nature of 309 00:16:52,640 --> 00:16:55,600 Speaker 2: the chair. Unfortunately, that ended up killing the story. We 310 00:16:55,680 --> 00:16:58,239 Speaker 2: ran out of time to go over the court documents 311 00:16:58,360 --> 00:17:00,960 Speaker 2: because of the nature-of-the-chair discussion. And I'm 312 00:17:00,960 --> 00:17:02,280 Speaker 2: not sure that's what we need to do. 313 00:17:03,360 --> 00:17:06,120 Speaker 1: No, I mean, and I think the other and probably 314 00:17:06,240 --> 00:17:09,880 Speaker 1: larger problem with fact checking is fact checking 315 00:17:09,880 --> 00:17:12,040 Speaker 1: in and of itself is, "Ha! I showed that 316 00:17:12,080 --> 00:17:14,160 Speaker 1: they were wrong. I checked the fact." Where it's like, yeah, 317 00:17:14,160 --> 00:17:16,679 Speaker 1: but what they wrote got out to thirty million people 318 00:17:16,720 --> 00:17:19,280 Speaker 1: and your fact check got out to, like, sixty. So 319 00:17:19,320 --> 00:17:21,080 Speaker 1: what you did didn't really matter.
And what we should 320 00:17:21,080 --> 00:17:25,080 Speaker 1: probably be doing is looking at an intervention higher up 321 00:17:25,119 --> 00:17:27,560 Speaker 1: the line to stop the bullshit from getting out, 322 00:17:27,640 --> 00:17:30,720 Speaker 1: rather than being obsessed with "well, I fact checked it." Like, well, 323 00:17:30,760 --> 00:17:33,399 Speaker 1: but that didn't really help, you know. Yeah, right. Just, 324 00:17:33,440 --> 00:17:35,000 Speaker 1: at what point do we give that up? 325 00:17:35,080 --> 00:17:40,719 Speaker 2: It's pointless. Yeah, you are, like, not even a footnote 326 00:17:40,840 --> 00:17:42,280 Speaker 2: to this other thing that this person... 327 00:17:42,440 --> 00:17:45,000 Speaker 1: No, we need... the intervention needs to be happening 328 00:17:45,080 --> 00:17:47,640 Speaker 1: earlier, because the bullshit is still getting out. 329 00:17:48,440 --> 00:17:51,560 Speaker 2: Yeah, absolutely. And this happens... like, we're in this 330 00:17:51,560 --> 00:17:55,399 Speaker 2: bizarre situation where, like, right-wing outlets can say what the 331 00:17:55,440 --> 00:18:00,720 Speaker 2: fuck they want, right. Like, like, we have whole massive 332 00:18:00,800 --> 00:18:03,160 Speaker 2: media empires going in on this idea that the twenty 333 00:18:03,160 --> 00:18:06,960 Speaker 2: twenty election was stolen. Then we have, like, centrist outlets, 334 00:18:07,680 --> 00:18:10,200 Speaker 2: instead of being like, "no, the election wasn't stolen, 335 00:18:10,200 --> 00:18:15,080 Speaker 2: that's bullshit," constantly trying to, like, investigate those claims as 336 00:18:15,119 --> 00:18:18,760 Speaker 2: if they were credible and useful, rather than illustrating why 337 00:18:18,760 --> 00:18:21,199 Speaker 2: they should be dismissed and then moving on, right. Like, 338 00:18:21,320 --> 00:18:25,560 Speaker 2: instead of investigating why this conspiracy is so important.
We 339 00:18:25,600 --> 00:18:28,960 Speaker 2: see that a lot with immigration right now, but we 340 00:18:29,040 --> 00:18:32,199 Speaker 2: saw it a ton in the presidential debates, right? Like, 341 00:18:32,440 --> 00:18:34,040 Speaker 2: it's a good example of what you were saying: JD 342 00:18:34,160 --> 00:18:38,240 Speaker 2: Vance can just lie, and even Donald Trump actually can 343 00:18:38,320 --> 00:18:41,760 Speaker 2: lie about people eating dogs and cats, and it doesn't 344 00:18:41,880 --> 00:18:45,280 Speaker 2: hugely matter if an hour later a news outlet tweets, oh, 345 00:18:45,320 --> 00:18:47,520 Speaker 2: we fact checked him and it's not true. Okay, right, you 346 00:18:47,520 --> 00:18:51,640 Speaker 2: still broadcast to millions of people that Haitian migrants eat 347 00:18:51,680 --> 00:18:54,320 Speaker 2: dogs and cats, and that's not true. And I think 348 00:18:54,359 --> 00:18:57,560 Speaker 2: we need to strive for something that's closer to 349 00:18:57,600 --> 00:18:59,840 Speaker 2: the truth, and closer to fairness, and gives 350 00:18:59,920 --> 00:19:03,440 Speaker 2: us moral clarity, because what we're all doing right now, 351 00:19:03,520 --> 00:19:06,480 Speaker 2: what the legacy media is doing right now, is 352 00:19:06,560 --> 00:19:10,200 Speaker 2: like woefully inadequate to meet the moment. Yeah, I mean 353 00:19:10,200 --> 00:19:11,359 Speaker 2: I agree. 354 00:19:11,920 --> 00:19:14,760 Speaker 1: I think where I don't actually know how to solve 355 00:19:14,840 --> 00:19:19,040 Speaker 1: things is the incentive structure. Yeah, it is so broken. And 356 00:19:19,400 --> 00:19:22,239 Speaker 1: to an extent, all of this talk about objectivity.
And 357 00:19:22,560 --> 00:19:24,199 Speaker 1: when I say that, I mean, like, the talk that 358 00:19:24,600 --> 00:19:28,359 Speaker 1: outlets and editors have about objectivity, is there more than 359 00:19:28,359 --> 00:19:33,520 Speaker 1: anything to obscure the fact that the economics of journalism 360 00:19:33,760 --> 00:19:36,960 Speaker 1: make it almost impossible for it to be anything but 361 00:19:37,119 --> 00:19:41,800 Speaker 1: a willing agent of disinformation. That's the real issue: 362 00:19:42,359 --> 00:19:45,080 Speaker 1: you can have the Washington Post, and you can have 363 00:19:45,119 --> 00:19:49,480 Speaker 1: the New York Times, post good reporting, but a huge 364 00:19:49,520 --> 00:19:53,400 Speaker 1: amount of their income will always come from having columnists 365 00:19:53,520 --> 00:19:57,399 Speaker 1: whose entire job is to piss people off or to 366 00:19:57,760 --> 00:20:01,560 Speaker 1: stoke the egos of people in power. And I don't 367 00:20:01,600 --> 00:20:05,040 Speaker 1: know that the good work those outlets do outweighs the 368 00:20:05,240 --> 00:20:10,680 Speaker 1: crap that they spill into the public discourse, because that's 369 00:20:11,840 --> 00:20:15,320 Speaker 1: what's incentivized. And so I think, to an extent, there's 370 00:20:15,320 --> 00:20:20,639 Speaker 1: almost no point in actually engaging with the objectivity debate 371 00:20:21,000 --> 00:20:22,960 Speaker 1: with the people who are pushing it, because they're not 372 00:20:23,040 --> 00:20:25,879 Speaker 1: pushing it honestly; they're pushing it as a way to 373 00:20:25,920 --> 00:20:28,880 Speaker 1: obscure the fact that they make their money the same 374 00:20:28,920 --> 00:20:33,520 Speaker 1: way Mark Zuckerberg makes his money, which is by spreading fear, anger, 375 00:20:33,720 --> 00:20:34,240 Speaker 1: and doubt. 376 00:20:34,960 --> 00:20:38,560 Speaker 2: Yeah. Yeah, that's the bad op-ed industrial complex.
Like, 377 00:20:39,320 --> 00:20:41,600 Speaker 2: I've been guilty of that, right? You see a fucking 378 00:20:41,640 --> 00:20:44,639 Speaker 2: headline on social media and you're like, that's bullshit, and 379 00:20:44,640 --> 00:20:47,080 Speaker 2: then you click and read, right? I used to, like, 380 00:20:47,119 --> 00:20:48,840 Speaker 2: when I was a little baby, be like, let's engage with 381 00:20:48,840 --> 00:20:51,600 Speaker 2: this, and be like, that's bullshit because, and try 382 00:20:51,600 --> 00:20:54,040 Speaker 2: and write about it somewhere or post it on social media. 383 00:20:54,119 --> 00:20:56,240 Speaker 2: But I have come to realize that in doing that, 384 00:20:56,280 --> 00:20:59,600 Speaker 2: I'm doing exactly what they want me to do, which 385 00:20:59,640 --> 00:21:02,520 Speaker 2: is sending people to their website to click 386 00:21:02,560 --> 00:21:04,680 Speaker 2: on adverts and make them money. So I think 387 00:21:04,680 --> 00:21:06,360 Speaker 2: it's better that we do not do that. But yeah, 388 00:21:06,359 --> 00:21:11,280 Speaker 2: that is the fundamental conceit of journalism right now: how 389 00:21:11,320 --> 00:21:14,159 Speaker 2: it pays the bills is keeping you on that page, 390 00:21:14,160 --> 00:21:15,600 Speaker 2: and the way it keeps you on that page is 391 00:21:15,720 --> 00:21:19,119 Speaker 2: making you angry. There is, like, a model, I think, 392 00:21:19,680 --> 00:21:23,960 Speaker 2: that you see in small community newspapers 393 00:21:24,040 --> 00:21:26,480 Speaker 2: right now, like, I guess, outlets like Left Coast Right 394 00:21:26,520 --> 00:21:31,160 Speaker 2: Watch in California and Oregon, where people genuinely, by 395 00:21:31,280 --> 00:21:34,600 Speaker 2: building trust and telling the truth, gain the support of 396 00:21:34,920 --> 00:21:38,840 Speaker 2: their communities and are financed by them.
But I mean, the 397 00:21:38,960 --> 00:21:42,600 Speaker 2: orders of magnitude of income difference are, like, they're not 398 00:21:42,640 --> 00:21:44,920 Speaker 2: making Washington Post money over at Left Coast Right Watch. 399 00:21:45,160 --> 00:21:49,240 Speaker 2: I know that to be true. So yeah, pretty fucked, 400 00:21:49,800 --> 00:21:53,600 Speaker 2: and it will only get worse, I think, as 401 00:21:54,720 --> 00:21:57,520 Speaker 2: we continue to slide into the post-truth 402 00:21:58,080 --> 00:22:03,560 Speaker 2: fascism world. I can't really see our legacy outlets doing much 403 00:22:03,600 --> 00:22:05,600 Speaker 2: about it if all they're ever going to do is 404 00:22:05,920 --> 00:22:07,520 Speaker 2: strive for the middle ground. 405 00:22:07,200 --> 00:22:12,560 Speaker 1: On this. Well, all right. Okay, everybody, all right, you 406 00:22:12,680 --> 00:22:18,480 Speaker 1: go have a good day. And now, world: It Could Happen 407 00:22:18,240 --> 00:22:20,600 Speaker 4: Here is a production of Cool Zone Media. For more 408 00:22:20,640 --> 00:22:24,120 Speaker 4: podcasts from Cool Zone Media, visit our website coolzonemedia 409 00:22:24,200 --> 00:22:27,000 Speaker 4: dot com, or check us out on the iHeartRadio app, 410 00:22:27,040 --> 00:22:30,600 Speaker 4: Apple Podcasts, or wherever you listen to podcasts. You can 411 00:22:30,680 --> 00:22:33,000 Speaker 4: now find sources for It Could Happen Here listed directly 412 00:22:33,000 --> 00:22:35,320 Speaker 4: in episode descriptions. Thanks for listening.