Speaker 1: Next month, the twenty seventh Conference of the Parties, or COP, the UN climate negotiations conference, will kick off in Cairo, Egypt. Once again, negotiators and politicians from all over the world will join up to talk through what they are willing to do to stave off human extinction. That sounds dramatic, right? But that's kind of what we're talking about here. There will be the usual massive contingent from the fossil fuel industry there as well. Last time, in Glasgow, the fossil fuel industry sent more representatives than any one country did, and there's likely to be more of the same in Egypt. The organizers have, in a much publicized and criticized move, allowed major global polluter Coca-Cola to sponsor the event. Even worse, they've also hired Coca-Cola's publicist, Hill and Knowlton, to do PR for the conference. If you missed season three of Drilled, go back and listen to it. We did an entire episode on Hill and Knowlton founder John Hill and his work for both the oil industry and the tobacco industry, all at the same time. Hill masterminded the strategy of hiring scientists to say tobacco smoking wasn't bad for you, and that the jury was still out on whether it caused cancer. At the very same time, his firm was working for the American Petroleum Institute and some of its member companies. In fact, it was Hill who brought tobacco folks into the American Petroleum Institute. The long-standing relationship between oil and tobacco eventually resulted in the creation of the cigarette filter, a moneymaker for oil and gas companies looking for a new place to sell petrochemicals, and a sleight of hand for the tobacco industry, looking to convince consumers that they could make smoking less harmful by smoking lighter, filtered cigarettes. Hill represented Monsanto around the same time too, so no surprise that the chemical industry embraced a lot of the same tactics. Definitely the folks you want strategizing the messaging for your climate conference. I'm sure they won't be reporting directly back to their clients.
Cue head banging against wall video.

Speaker 2: At any rate.

Speaker 1: COP isn't just a time for official corporate greenwashing. It's also a period of time that tends to see major spikes in climate disinformation. Social media will almost assuredly be flooded with various memes about the elitists at COP, the failures of renewable energy, and that damn frozen windmill picture that makes the rounds every couple of years. A coalition of environmental groups has come together under the banner of Climate Action Against Disinformation to monitor the disinformation that spawns around some of these big inflection points. They initially got organized around COP twenty six in Glasgow, and now, in advance of COP twenty seven in Cairo, they've put together a report full of information aimed at helping journalists and other communicators avoid unintentionally spreading or amplifying mis- and disinformation. The lead author on that report, Connor Gibson, is joining me here today to walk us through it. That's coming up after this quick break.

Speaker 3: A little bit more savviness is needed in newsrooms in the modern era in order to not allow that to happen, because certainly disinformers now know how to exploit that tension, in terms of something like viral sloganeering, which is a concept that the Data & Society Research Institute wrote a report on. You know, basically, Climategate is the example that I use in this report. You know how many major news outlets printed the word Climategate? That is the tagline of the climate change denial movement, and just putting that in your headline over and over again really created a sense among news readers that there was guilt, and that there was fabrication of data, and all of the false accusations that had been made after those climate scientists' emails were hacked and taken out of context and released. And, you know, the fact checking that came after, the investigations and exonerations that came after that, didn't spread.
That didn't spread nearly as widely as the phrase Climategate. It never does. Yeah, exactly, it never does. So I think one of the biggest insurmountable-feeling trends that I looked at in this report is just the economics of the newsroom. You know, when there is a revenue incentive that is based around clicks, shares, most-emailed articles, that precludes the type of tone that is actually needed to stop misinformation from festering. And so the simple capitalist economics of keeping a news outlet functioning actually creates a breeding ground for misinformation as well, and is something that needs to be taken a lot more seriously than it currently is. But unfortunately, because it's about economics, I think that is one of the biggest obstacles that reporters will face when it comes to that tension between the responsibility of a reporter to communicate something accurately without feeding into conspiracy theory or myth, and the responsibility of an editor to keep the newspaper in circulation and financially healthy.

Speaker 2: Oh totally, especially given all the social media stuff too. Like, I see it all the time on Twitter. A lot of times, you know, the article itself will be halfway decent, but the tweet is really misleading.

Speaker 4: Sure. Yeah.

Speaker 3: The research started with a literature review, mostly a review of academic articles, many of which are peer reviewed, some of which are not but are very prescriptive in terms of communications expertise. And that's where First Draft and the Data & Society Research Institute, the Union of Concerned Scientists, and some others have published really helpful information when it comes to just psychology-based communication techniques, perhaps nobody more than academics like Stephan Lewandowsky and John Cook, and a lot of the collaborators on reports like the Debunking Handbook and the Conspiracy Theory Handbook, just really excellent psychology-based communications advice that these academics have published in order to help people navigate the most basic traps.
And this stuff has been around for decades, right? Like Richard Nixon: "I'm not a crook." Everybody hears the word crook. George Lakoff wrote a book, Don't Think of an Elephant!, because the only word that sticks in your brain after you say that is the word elephant. So, you know, a lot of this stuff has been known for a long time. But I poked around at some headlines, you know, from major outlets, and I still see the same mistakes happening. And, you know, a lot of that is not, it's not because people are foolish. It's just that there is an amount of faith in the audience when it comes to an honest journalist and editor writing a report and writing a headline that unfortunately just misses the traps that disinformation relies on in order to spread. And that's got to be really frustrating when you're a professional who writes something that's very intellectually honest, and yet as a result of writing it in an intellectually honest way, it allows misinformation and disinformation to spread. So that's really why we wrote this report. It's why we focused on climate as a topic, because of COP twenty seven coming up in Egypt. And every year when the United Nations climate negotiations happen, there's an inevitable amount of misinformation online and on social media related to climate, and you can predict what some of it will be. It will be, rich people are flying on jets, how ironic, and things like that. There will probably be some outdated imagery circulated, like perhaps frozen wind turbines, or, you know, solar panels failing to meet electricity demand in some country. You know, something that's taken out of context, that usually gets recirculated every few years in a moment like this, in order just to seed a narrative of cynicism, in order to make people feel like this isn't a problem that can be addressed, or this isn't a problem that needs to be addressed.
Also, climate journalists are kind of a good target audience for us here, because the phenomenon of climate change denial, I think, has made climate change reporters a little more sophisticated in understanding these trends, before social media made disinformation such a rampant problem in the last few years.

Speaker 1: I haven't thought about it that way before, but yeah, climate reporters sort of had to deal with the disinformation thing before everybody else did, in some ways.

Speaker 3: And just the relentless nature in which that community of climate change deniers, you know, has stayed organized. Like, they really, and it's the same guys, mostly men, you know, now for twenty, thirty years, that have been at it. You know, it took a long time, but I would say that most of the people who are in the climate reporting world are far more sophisticated now than certainly they were in the nineteen nineties or the two thousands, when that phenomenon was a little underreported, the cast of characters wasn't as widely known. And there are definitely improvements in how that has been covered, in kind of the level of scrutiny that is applied to those folks as messengers, when they have a vested interest or they just have a long history of, at this point, debunked contrarianism. So, you know, we tried to write this report and publish it with acknowledgement that we're not writing it because climate reporters are particularly gullible or anything like that. We think it's one of the more relevant fields with which to understand what communication techniques are required in order to mitigate misinformation and disinformation, as well as a field where some of that understanding is already a little bit more fully developed.

Speaker 1: Yeah, yeah, that's super interesting. Can you talk a little bit about why this report is coming out now, and particularly why it's being pegged to COP?

Speaker 3: Sure. This report has been over a year in the making.
I've been doing research on behalf of Greenpeace, which is my former employer of a decade, until twenty twenty. And Greenpeace, you know, operates in several different offices around the world, and has an interest in misinformation and disinformation. And Greenpeace has partnered with a variety of groups, like Avaaz, as well as a coalition that this report will be published on behalf of, which is Climate Action Against Disinformation. And Climate Action Against Disinformation is a coalition that formed, actually, from work it did a year ago monitoring the previous Conference of the Parties, COP twenty six, in Glasgow, Scotland. And that was a less formal effort, but the same kind of idea of what is happening this year, which is a variety of groups, including the Institute for Strategic Dialogue, Stop Funding Heat, Friends of the Earth, Climate Nexus, Greenpeace. Many organizations are just trying to proactively monitor social media disinformation related to climate change and related to COP twenty seven in Egypt as it's happening, and so they will be publishing daily briefings that will be made available to reporters that are covering it. Trying to observe disinformation and misinformation on social media as it's emerging, in order to kind of sound the alarm and let people know which types of misleading narratives are being seeded in order to undermine the United Nations negotiations this year. The Institute for Strategic Dialogue and a bunch of other organizations published a report earlier this year, in twenty twenty two, called Deny, Deceive, Delay: Documenting and Responding to Climate Disinformation at COP twenty six and Beyond.
And that report, I thought, was a particularly coherent breakdown of trends on social media, who some of the most prominent misinformers and disinformers were, and just, like, kind of tracing who was the original person that put out, you know, the twenty fourteen photo of a de-icing exercise on a windmill, recirculating that into a myth as if it was something that was happening in twenty twenty one.

Speaker 1: Yes, yes, we actually had Jennie King on to, like, walk through that report when it came out. I thought that was really interesting too.

Speaker 4: Right.

Speaker 3: So that's the origin. It's that cluster of organizations that are working to monitor disinformation around COP twenty seven. This report is intended to complement that effort, to give a little bit more prescription, based on the best research that's available right now, for journalists specifically, to assess when to write or to not write about a trend. When is their platform giving more oxygen to a harmful narrative, as opposed to recognizing, oh, I do not have control over the oxygen hose, this is a serious, widespread problem that needs to be covered. But how do I cover it? What techniques can we use in order to help interrupt misinformation without actually helping it grow? And that's really tricky. The research is actually complicated on that, but there is now a more sophisticated understanding, certainly than there was three or four years ago, about what some of the best practices are. And the meat and potatoes of this report, or the tofu and potatoes, is to summarize what some of those best practices are, and also highlight what some of the unknowns and some of the nuances are that are still being researched.

Speaker 1: Okay, can you get into some of the specific examples? Obviously we can't get through all of them, and there are a lot of really good ones in here, but what are some that really jump out to you?

Speaker 4: Sure.
Speaker 3: And I want to say, too, none of this report is intended to shame specific outlets or reporters. It's intended to learn together. And for that reason, I actually included myself, on my blog, doing something that was a mistake, in order to try to reinforce that notion that this isn't about trying to make any particular media outlet look bad, you know, unless they actually did something in bad faith, like Breitbart. Right, of course, of course. Yeah, you know. One example is the hurricane and climate conversation. Science continues to develop. We actually are getting to the point where scientists are more able to estimate how much worse hurricanes are as a result of ocean temperatures being warmer, as a result of sea level rise and what that means for storm surge. You know, that is not something that scientists could do ten years ago. The modeling technology was not at that point, and so the goalposts are shifting as scientific understanding has caught up. And in general, it's still the case that it is difficult to assign, like, a percentage in terms of how much worse any given hurricane is as a result of climate change. But there are very specific factors which scientists clearly understand make hurricanes stronger, make them hit harder, make them cause more damage and suffering, and that is a nuance that can be navigated. We have an example of the Washington Post in this report doing a good job talking about climate change and hurricanes without losing track of a couple things.
One is the nuance between how weather variability is unpredictable, and it's hard to assign any specific amount of, you know, how much worse any given storm is as a result of climate change, but the factors that definitely make hurricanes worse; as well as not losing track of the human suffering, and how it is also not just a climate change story, but an impact-on-humanity story, and not to lose that point, you know, while people are still digging up, you know, possessions from the wreckage.

Speaker 1: Yeah, I love this, this next one: write headlines that omit the disinformation. I think that is something that I see happening all the time, and partly because of this thing that we were talking about before. Like, if something is controversial or trending, then there's a desire to capitalize on that by sticking it in the headline, right? But yeah, talk me through how damaging that is, and also, like, what people should be thinking about doing instead.

Speaker 3: Sure, and this brings us back to the classic, the Richard Nixon "I'm not a crook" example. Everybody heard him basically admitting he was a crook, because that's how psychology works. You don't want to uplift the words that you do not want to stick in people's minds. That's just how our brains work. We know how to conceptualize the thing that's being illustrated through language. We're not going to be illustrating something that's not mentioned. So if you are a communications professional, including journalists, but also advocates, activists, you don't want to be using the language that paints a picture that's going to stick, where the misinformation is the thing that is being illustrated. You know, again, this is not intended to shame anybody. The PolitiFact website kind of does not adhere to best practices when it comes to this. They quote the myth that they are debunking.
So even when they have these really great images, including the pants-on-fire logo, the reality is that they are still quoting the myth at the top of their article. And then oftentimes the next thing is, they use the word no, and then they refute the myth. Again, that's two rounds of uplifting misleading language before getting to the nuances of their fact check. It's also really important for me to say that this is a matter of best practices with regard to the language; it is always worth doing a fact check. So I do not want to make it out like PolitiFact shouldn't exist because they're not always using the best communication technique. It's actually always best to fact check. That's the most important thing that can be done. There was some research in recent years that was a little overconcerned about various backfire effects, and it is turning out that any attempt to fact check or debunk misleading information is worth it, first and foremost. So that's the most important thing: please debunk the information, and spread that as far as possible. That said, there are still best communication practices that can be adhered to, and that's where some of these examples, like uplifting misinformation about climate change on Facebook, or repeating Tucker Carlson's words verbatim, that's not the best practice, because that uplifts the misleading information, that keeps it in circulation, that continues to frame the conversation in terms of misleading language. It is better, as with the example from the Associated Press here, to address in the headline that there is a myth that is circulating before uplifting what the myth and the language surrounding it is.
And a lot of these better fact checks, in terms of the language and the order in which things are addressed, are what is now being called the truth sandwich, in which you first warn the audience that they are about to hear misinformation, and you give them important details on what the topic is, who said it, before you uplift any of the misleading information. And you explicitly warn them that they are about to hear something that's misleading, after they've already heard the context in which it is false, and then you can kind of address the explicit topic of misinformation, before again reasserting the truth and why that information is false. And that's something I first saw, again, from academics like John Cook, who's at the George Mason University Center for Climate Change Communication, in publications like the Debunking Handbook. They've been very clear that, just because of how our brains work, it's really important not to mention the myth first. It's important to mention the truth first, and contextualize where the myth happened, before you address it explicitly, and then to follow up again by reiterating the truth. That's a much more effective way to dislodge misinformation from a human's brain after they've already been exposed to it.

Speaker 4: Yeah, that's so interesting.

Speaker 3: Again, George Lakoff, the author of Don't Think of an Elephant!, I believe he might have coined the term truth sandwich. The technique of the truth sandwich is something that has been around for longer than that, and I'm not sure if John Cook and Stephan Lewandowsky and all the co-authors of the Debunking Handbook were the first to do it, but they're the first people that I saw addressing it in some of their publications, and I've found those reports to be invaluable in terms of explaining communications best practices from a psychological perspective.
Speaker 1: Yeah. Let's talk about the partisan signaling thing, because I also thought this was really interesting, and something that I think, well, A, I'm pretty sure I have made this mistake myself, and I think a lot of journalists have too.

Speaker 3: Great, and I agree, this one's really tricky. Just as a matter of personal preference, like, when I wrote publications for Greenpeace, which is an explicit activist organization, unapologetically, that's what it exists to do, I tried very hard to never write "conservative" or "right wing" in any of my posts. I did not want to signal to an audience that, like, this is only something liberals should care about and conservatives should not care, because that conversation is just a race to the bottom, and it's hyper-prevalent in our media environment here in the United States. But what makes it so tricky is the fact that, when it comes to climate change, Republicans don't care and Democrats do care, and their constituents follow that exact pattern as well, and that is a factor in all of this. So how is a reporter supposed to navigate that, when that is actually the factual reality of the situation? But it does readers a disservice to signal that you should care about something or not as a result of partisan loyalty. And I think that it's something that can be navigated by framing the conversation around something else, and including statements from politicians later in the article, perhaps, to help illustrate the partisan divide, or the massive disparity in how science is accepted or interpreted depending on partisan affiliation. But making that the story, or making that the headline, only serves to increase those divides, when, you know, polling indicates there is a little bit more nuance among the electorate in terms of how they care about climate change.
Support for renewable energy and other policy solutions to climate change is a little higher than you would expect when you just read headlines about Republicans trying to tank climate policy and Democrats trying to pass it.

Speaker 4: Yeah, it's true.

Speaker 1: It's a good point. And also, like, I feel like with climate in particular, part of the disinformation effort has been a concerted effort to politicize it. So to the extent that we can move away from that, it seems like a good thing. Okay, this next one I also found really interesting. I mean, there are lots of reasons to avoid the passive voice, but I had not thought about how it might preclude accountability.

Speaker 4: Yeah, this was one of the most revelatory things.

Speaker 3: So this is an idea that was published in a video by the Union of Concerned Scientists, through a communications expert named Sabrina Joy Stevens, and it's about not writing in the passive voice. So, Stevens, I'm going to quote from what she says in the video. She instructs users: "When we just name disparities and outcomes without naming who and what is responsible for those disparities, we make it seem like a person's identity is responsible for the problem, instead of the people and institutions discriminating against them on that basis." So there is an implication, an unintentional consequence, to glossing over an issue without kind of naming the people who are most impacted by it and the people who are most responsible for making decisions or incentivizing that problematic trend. Pollution is the example that Sabrina Joy Stevens uses in the video. And again, environmental organizations, I think, often fall into this trap just as much as reporters do, where you're trying to do the responsible thing by mentioning, hey, it's communities of color that are most disproportionately subject to polluting infrastructure, which is, you know, responsible for higher rates of chronic illness and preventable death.
But that narrative doesn't include the fact that it's not an accident. Those polluting refineries and facilities are built in communities that are, you know, lower income, majority people of color. Those are decisions that are made on purpose by executives, by politicians, by officials who have the power to permit them, and excluding that from the story does a disservice that is akin to victim blaming. And it's really important for journalists, I think, to feel empowered to be able to put that in print without it being seen as an activist move. It's actually just part of the reality, and something worth including: reporting that it's not an accident that these things impact different groups disproportionately, that that is by design. And it's okay to put that in print, because that's just the reality of the situation. We don't really need any more examples or data to understand that's how this happens.

Speaker 1: It's an interesting one to think about.

Speaker 3: And it just requires such self-awareness of your own writing. I think that's why it struck me, you know. I was like, how many times have I done this, by kind of not stating something explicitly, or just following the norms of, you know, how I've seen reporting and writing done, from both the media as well as from advocacy groups? Maybe something that can be done is, in the final editing stage, doing a scan intentionally for: are there points in this article where the passive voice is being used that actually leave a lot unsaid, that preclude accountability? And in my life, I think...

Speaker 1: Part of that is sort of norms in the media too, that very much, I don't know, started to shift away from accountability in, like, the thirties and forties, and have never really gone back. You're sort of conditioned to not make accountability statements in a sort of straightforward way. You're just sort of describing the situation, versus assigning agency or blame to anyone.
That is very much sort of how a lot of newsrooms encourage people to write, even though, in general, the passive voice is something that editors like to edit out of pieces for grammar reasons.

Speaker 3: Yes. I think that's also, again, this is just a function of how newsrooms operate. The need for brevity competes with the need for nuance, and, yeah, that can actually unintentionally lead to more inaccurate reporting.

Speaker 1: Yeah, I mean, sometimes it's even driven by a desire to not seem opinionated or biased, which, I mean, this is a conversation that's been going on for years now, around the sort of myth of objectivity in media, and how, in fact, it often emphasizes a particular bias in the interest of avoiding any sort of opinion or bias. This one was really interesting to me, because I'm like, oh, I definitely see this a lot, and I also have seen and had editors actually encourage that type of construction, because you're not supposed to ascribe intentionality to anyone.

Speaker 3: Sure. That's got to be one of the trickiest things about best communication practices for journalists and editors when it comes to misinformation. And even that's a point where best practices in this report might seem to contradict each other. Like, I'm saying some research indicates it precludes accountability to write in the passive voice, and I'm also saying, please avoid partisan signaling. There's an irony there, where Republicans are more obstructive on climate change, and yet this guide says, don't just write about Republicans and what Republicans don't do and what Democrats do. There's an irony there. There's a lot of nuance here that's actually very, very hard to navigate if you're a journalist or a newsroom. And I think that's why this is so important, too, is there's not a lot of time to sit and think about this. If you're a journalist, you're on a deadline, for story after story after story. You don't want to get scooped.
You want to do an informative piece without taking too much time away from the next responsibility. And that's, you know, the competing interests, the finances of a newsroom, and the time constraints on a journalist or an editor are also major factors here. So I'm hoping that a report like this can help start conversations between journalists and editors, and anybody else in the journalism profession, about, you know, what are the next steps in terms of best practices to navigate some of this stuff, because some of it is seemingly contradictory. I don't think it is, necessarily, but, like, more conversation is needed to figure out what journalistic norms are necessary now that social media misinformation has interrupted previous best practices that are now irrelevant, and learned how to exploit them in order to generate coverage for something that shouldn't be covered, or in order to let misinformation go viral before it is debunked, in a way that makes less impact than the viral misinformation in the first place.

Speaker 1: Yeah, I think the idea of inoculation is one that I know I've talked to John Cook about before too, but I think it's really interesting and important. How do you think about inoculation in general, and how successful have some of these tactics been?

Speaker 3: I'm not sure how to measure if inoculation is successful, which I think makes it frustrating in terms of knowing how to prioritize it. If you do a good job inoculating against disinformation, the disinformation just doesn't really take hold, right? So it's much easier for us to look back in time and say, here's a narrative that was not inoculated against, and it really took off, like the false blaming of wind power for the Texas freeze disaster in February twenty twenty one. Very obvious example. Most of Texas's electricity generation infrastructure that winter was thermal power, mostly gas and nuclear.
That's the majority of what failed, and yet some politicians falsely blamed wind turbines for the lapse of the grid in twenty twenty one, a deadly disaster, as well as the misleading images that circulated online, like frozen wind turbines in Sweden many years previous, where helicopters were actually performing a maintenance exercise, de-icing them. You know, that had nothing to do with Texas. They're completely different regions of the world. Texas did not weatherize any of their electricity generation infrastructure, not just wind turbines, but also the gas and nuclear plants, which froze. So that's an example, I think, where inoculation could have gone a long way. But it's complex to anticipate. You can't anticipate a disaster like that, necessarily, or when it's going to happen, and that means there was no priority for journalists just to start writing articles to make sure people understand the composition of Texas's grid, and, you know, things that could have inoculated against it. That said, the context that we're in right now, we're coming up toward the twenty seventh United Nations climate change negotiations, there are certain predictable pieces of misinformation that will circulate, including the elitism, the irony of using jet fuel to go to a climate conference. Those are arguments that, out of context, are very easy for people to scoff at and become cynical about. And I think when there's a major event that is predictable that's coming up, that is a good time for journalists to have a think about writing some inoculation pieces: the predictable misinformation that will likely circulate in the next few weeks, and why it is not true, the cherry picking that's required in order to make it sound reasonable in the minds of people who don't spend all day paying attention to these trends.
So inoculation is a tricky one, because there's not necessarily a direct revenue incentive to write articles that get out ahead of misinformation that is only theoretical. You know, this information might circulate. That's a much harder thing, I think, to get past an editor's desk than a disinformation trend that just happened and had widespread impact.

Speaker 1: Right, right, that makes a lot of sense. Awesome. All right, well, we will definitely share a link to the whole report in the show notes so people can check that out. I appreciate you walking me through it. It's a handy resource. I feel like, you know, I'm not coming at this cold, but there were definitely things in here that I hadn't thought about that are super helpful.

Speaker 4: Thanks so much, Amy.

Speaker 1: That's it for this week. Thanks for listening, and we'll see you next time. Drilled is an original Critical Frequency production. The show was created and reported by me, Amy Westervelt. Original music and mixing and mastering for this episode by Peter Duff. Our artwork is by Matthew Fleming. For ad-free episodes and bonus content, you can sign up for our Patreon at patreon dot com slash Drilled.