1 00:00:09,520 --> 00:00:13,680 Speaker 1: Welcome back to Drilled. I'm Amy Westervelt. A new report 2 00:00:13,760 --> 00:00:18,439 Speaker 1: was released recently from the Institute for Strategic Dialogue and 3 00:00:18,560 --> 00:00:24,120 Speaker 1: the twenty plus member Coalition Climate Action Against Disinformation. It's 4 00:00:24,160 --> 00:00:29,280 Speaker 1: called Deny, Deceive and Delay, and it documents and responds 5 00:00:29,360 --> 00:00:34,280 Speaker 1: to climate disinformation from the COP twenty six summit to today. 6 00:00:34,880 --> 00:00:40,360 Speaker 1: It also offers some suggestions for countering disinformation, policies that 7 00:00:40,440 --> 00:00:45,200 Speaker 1: could help and messaging frameworks that might be helpful. Today, 8 00:00:45,240 --> 00:00:48,240 Speaker 1: I'm joined by Jenny King, who was the lead author 9 00:00:48,400 --> 00:00:51,839 Speaker 1: on the report, to walk us through some of the 10 00:00:51,920 --> 00:00:54,120 Speaker 1: findings and what to make of them. 11 00:00:54,960 --> 00:01:22,000 Speaker 2: That conversation's coming up right after this quick break. 12 00:01:11,080 --> 00:01:14,440 Speaker 3: So I'm curious to hear maybe a little bit about 13 00:01:14,480 --> 00:01:19,440 Speaker 3: the process first, how you started looking at this issue 14 00:01:19,800 --> 00:01:23,080 Speaker 3: and how you kind of honed in on these super 15 00:01:23,080 --> 00:01:25,279 Speaker 3: spreader accounts in the first place. 16 00:01:26,240 --> 00:01:31,520 Speaker 4: Sure, my organization ISD has traditionally looked at the evolution 17 00:01:31,640 --> 00:01:35,000 Speaker 4: of extremist and conspiracist ideologies and the roles that they 18 00:01:35,000 --> 00:01:39,560 Speaker 4: are playing in undermining democratic processes and norms and ultimately 19 00:01:39,640 --> 00:01:43,160 Speaker 4: fueling hate speech and violence. 
And over the years we've 20 00:01:43,200 --> 00:01:46,240 Speaker 4: developed a very large portfolio in lots of different vectors 21 00:01:46,240 --> 00:01:50,320 Speaker 4: of disinformation, because obviously, disinformation is a key tool used 22 00:01:50,360 --> 00:01:53,880 Speaker 4: by both extremist and conspiracist movements and also used in 23 00:01:53,880 --> 00:01:57,480 Speaker 4: the effort to drive polarization in a number of societies, 24 00:01:57,840 --> 00:02:00,440 Speaker 4: and so working on climate change might seem like a 25 00:02:00,440 --> 00:02:04,279 Speaker 4: slightly left field choice. But the reason why we became 26 00:02:04,320 --> 00:02:07,240 Speaker 4: so interested and decided to build out an entirely new 27 00:02:07,280 --> 00:02:10,560 Speaker 4: portfolio of work in this space is because we noticed 28 00:02:10,560 --> 00:02:13,000 Speaker 4: that a lot of the communities we have been monitoring 29 00:02:13,000 --> 00:02:16,720 Speaker 4: for a number of years were beginning to espouse anti-30 00:02:16,840 --> 00:02:21,040 Speaker 4: environmental stances, and that there seemed to be a growing 31 00:02:21,160 --> 00:02:26,520 Speaker 4: confluence between the kinds of anti-government, anti-elite, broad 32 00:02:26,720 --> 00:02:29,960 Speaker 4: culture wars, and identity politics framing that you would find 33 00:02:30,000 --> 00:02:34,360 Speaker 4: in other really important issue sets like migration or sexual and 34 00:02:34,400 --> 00:02:38,720 Speaker 4: reproductive health rights or hate speech and incitement to violence, and 35 00:02:38,760 --> 00:02:42,200 Speaker 4: what was happening around climate change, and that to us 36 00:02:42,280 --> 00:02:46,320 Speaker 4: felt like an extremely concerning shift in the evolution of 37 00:02:46,400 --> 00:02:49,640 Speaker 4: rhetoric around this particular issue, and so we decided to 38 00:02:49,639 --> 00:02:52,280 Speaker 4: get into the weeds of why that was happening, who 39 
00:02:52,280 --> 00:02:54,480 Speaker 4: it was being driven by, where it was taking place 40 00:02:54,520 --> 00:02:59,079 Speaker 4: across both social media and legacy media outlets, and more importantly, 41 00:02:59,080 --> 00:03:01,520 Speaker 4: what could be done about it. And over the past 42 00:03:01,600 --> 00:03:05,079 Speaker 4: eighteen months or so, our efforts have then broadened out 43 00:03:05,160 --> 00:03:09,280 Speaker 4: into the formalization of this coalition, Climate Action Against Disinformation, 44 00:03:09,840 --> 00:03:14,200 Speaker 4: which involves over twenty organizations globally that work either in 45 00:03:14,360 --> 00:03:18,800 Speaker 4: the counter-disinformation space or in the climate science, climate policy, 46 00:03:18,840 --> 00:03:22,040 Speaker 4: and climate advocacy spaces, and the combination of those 47 00:03:22,040 --> 00:03:25,160 Speaker 4: two sets of expertise has given us a really unique 48 00:03:25,280 --> 00:03:28,919 Speaker 4: lens in trying to understand what are the tactics at play, 49 00:03:29,680 --> 00:03:33,400 Speaker 4: not anymore really to deny that climate change exists, but 50 00:03:33,520 --> 00:03:37,480 Speaker 4: to make sure that meaningful policy is never implemented, either 51 00:03:37,520 --> 00:03:39,840 Speaker 4: at the domestic or the multilateral level. 52 00:03:40,400 --> 00:03:43,480 Speaker 3: Do you think that there's any indication that this is 53 00:03:43,640 --> 00:03:46,880 Speaker 3: part of the sort of pivot to eco-fascism. 
54 00:03:47,320 --> 00:03:50,200 Speaker 4: You are certainly seeing the growth in eco-fascist rhetoric, 55 00:03:50,320 --> 00:03:53,119 Speaker 4: or at least the mainstreaming and the normalization of those 56 00:03:53,240 --> 00:03:56,280 Speaker 4: kinds of ideologies, and of course that fits with the 57 00:03:56,280 --> 00:04:01,360 Speaker 4: fact that racist rhetoric and particularly xenophobic and anti-migrant 58 00:04:01,400 --> 00:04:05,640 Speaker 4: rhetoric has become so mainlined in a lot of political discourse, 59 00:04:05,720 --> 00:04:09,680 Speaker 4: particularly across Europe, across North America, across Asia Pacific. And 60 00:04:10,160 --> 00:04:13,720 Speaker 4: it's really worth noting that both the Christchurch shooter 61 00:04:14,160 --> 00:04:19,160 Speaker 4: in New Zealand, who committed a mass atrocity against Muslim citizens, 62 00:04:19,600 --> 00:04:24,200 Speaker 4: and the recent mass shooter in Buffalo, who murdered a 63 00:04:24,240 --> 00:04:27,159 Speaker 4: number of African American citizens, that both of them, in 64 00:04:27,200 --> 00:04:32,719 Speaker 4: their online manifestos, explicitly associated their own personal stances with 65 00:04:33,040 --> 00:04:35,719 Speaker 4: eco-fascism and the idea that the only way to 66 00:04:35,800 --> 00:04:41,800 Speaker 4: solve environmental issues is by reducing population size, stopping the 67 00:04:42,279 --> 00:04:46,040 Speaker 4: spread of certain communities around the globe, and also linking 68 00:04:46,080 --> 00:04:50,279 Speaker 4: that to other extremely dangerous and harmful conspiracies like the 69 00:04:50,279 --> 00:04:53,200 Speaker 4: Great Replacement, which claims that there is a white 70 00:04:53,240 --> 00:04:57,640 Speaker 4: genocide being enacted by some shadowy cabal of elites around 71 00:04:57,640 --> 00:05:01,640 Speaker 4: the world. It's certainly a growing body of content. 
I 72 00:05:01,680 --> 00:05:04,960 Speaker 4: wouldn't say that it's necessarily the narrative which is driving 73 00:05:05,080 --> 00:05:09,560 Speaker 4: most of the high traction discussions across social media, but yes, 74 00:05:09,640 --> 00:05:13,279 Speaker 4: I would say that far right political movements in general 75 00:05:13,680 --> 00:05:17,839 Speaker 4: are certainly engaging with environmentalism in many cases through the 76 00:05:17,920 --> 00:05:24,080 Speaker 4: lens of their other ideologies, which are, you know, exclusionary, nativist, isolationist, etc. 77 00:05:24,920 --> 00:05:31,359 Speaker 3: Right right, Okay. So I'm curious about these sixteen accounts 78 00:05:31,360 --> 00:05:34,320 Speaker 3: that you highlight in the report, what you were looking for 79 00:05:34,839 --> 00:05:37,320 Speaker 3: that made it clear that, okay, these people are really 80 00:05:37,400 --> 00:05:39,719 Speaker 3: like central hubs of this stuff. 81 00:05:40,640 --> 00:05:43,120 Speaker 4: So what became very acute to us as a new 82 00:05:43,200 --> 00:05:46,440 Speaker 4: trend is the fact that a large amount of online 83 00:05:46,480 --> 00:05:49,320 Speaker 4: discourse is being driven by actors who we are referring 84 00:05:49,360 --> 00:05:52,479 Speaker 4: to as non-climate influencers. And by that what I 85 00:05:52,520 --> 00:05:55,680 Speaker 4: mean is that these are accounts or individuals for whom 86 00:05:55,720 --> 00:05:57,960 Speaker 4: climate has never been a key part of either their 87 00:05:58,000 --> 00:06:00,640 Speaker 4: brand or their public outputs in the past. So they're 88 00:06:00,640 --> 00:06:03,200 Speaker 4: not the historic deniers that have been active in this 89 00:06:03,279 --> 00:06:06,520 Speaker 4: space and determined to maintain the status quo for decades. 
90 00:06:06,800 --> 00:06:10,600 Speaker 4: These are people for whom environmentalism as part of their 91 00:06:10,640 --> 00:06:15,800 Speaker 4: public platform is potentially relatively new, and that rather than 92 00:06:15,839 --> 00:06:19,240 Speaker 4: being a preoccupation in and of itself or them having 93 00:06:19,240 --> 00:06:22,560 Speaker 4: a particular interest in that as an issue, it has 94 00:06:22,600 --> 00:06:28,239 Speaker 4: become symbolic or indicative for them of broader societal trends, 95 00:06:28,279 --> 00:06:32,599 Speaker 4: in particular the woke agenda and identity politics, and therefore 96 00:06:32,640 --> 00:06:35,360 Speaker 4: it provides a very useful vector for them to get 97 00:06:35,640 --> 00:06:41,160 Speaker 4: other worldviews, other incendiary and sensationalist ideas into the mainstream 98 00:06:41,320 --> 00:06:45,320 Speaker 4: and to galvanize an audience against climate action. What we 99 00:06:45,440 --> 00:06:48,159 Speaker 4: began to see is if you looked at keywords that 100 00:06:48,480 --> 00:06:52,440 Speaker 4: very clearly situated climate within that culture wars frame, it 101 00:06:52,600 --> 00:06:57,320 Speaker 4: was people like Jordan B. 
Peterson, who boasts a followership 102 00:06:57,360 --> 00:06:59,920 Speaker 4: of over ten million across his digital footprint if you 103 00:07:00,120 --> 00:07:04,159 Speaker 4: combine his Facebook and Twitter, his Reddit and his TikTok, et cetera, 104 00:07:04,600 --> 00:07:07,400 Speaker 4: or other people that broadly sit in what they often refer 105 00:07:07,440 --> 00:07:11,320 Speaker 4: to as the quote unquote intellectual dark web, that 106 00:07:11,480 --> 00:07:14,320 Speaker 4: suddenly they had moved into the climate space, and that 107 00:07:14,400 --> 00:07:18,560 Speaker 4: they clearly saw it as being a very useful framing 108 00:07:18,640 --> 00:07:21,960 Speaker 4: mechanism for the discussions that they wanted to have about 109 00:07:22,080 --> 00:07:26,480 Speaker 4: power and state institutions and infringements on civil liberties and 110 00:07:26,720 --> 00:07:28,440 Speaker 4: other perceived grievances. 111 00:07:29,680 --> 00:07:33,440 Speaker 3: So I know that there's been a debate within the 112 00:07:33,880 --> 00:07:37,040 Speaker 3: climate movement for quite a while about whether or not 113 00:07:37,760 --> 00:07:42,560 Speaker 3: climate should be connected to other social justice issues or 114 00:07:42,600 --> 00:07:46,640 Speaker 3: this idea of like, look, climate is an intersectional issue. 115 00:07:47,120 --> 00:07:52,480 Speaker 3: You don't necessarily get climate crisis without massive power 116 00:07:52,600 --> 00:07:56,520 Speaker 3: imbalance that precedes it in all these other ways. And 117 00:07:57,240 --> 00:07:59,440 Speaker 3: I don't know how people can sort of take this 118 00:07:59,600 --> 00:08:03,480 Speaker 3: information and use it without being like, Okay, 119 00:08:03,520 --> 00:08:06,200 Speaker 3: we need to start talking about those things because the 120 00:08:06,280 --> 00:08:07,480 Speaker 3: right is going to weaponize it. 
121 00:08:08,080 --> 00:08:10,640 Speaker 4: Yeah, it's a really valid point, and I think to 122 00:08:10,640 --> 00:08:12,920 Speaker 4: some extent, you know, the way that you're seeing it 123 00:08:13,080 --> 00:08:15,640 Speaker 4: being co-opted and weaponized within the culture wars in 124 00:08:15,680 --> 00:08:18,680 Speaker 4: a more far right media ecosystem is sort of the 125 00:08:18,720 --> 00:08:21,200 Speaker 4: reverse side of the coin of the climate justice movement, 126 00:08:21,640 --> 00:08:25,560 Speaker 4: which is doing absolutely essential work in pointing out the 127 00:08:25,600 --> 00:08:29,520 Speaker 4: intersectionality and the inequities in our approach to the climate response, 128 00:08:29,960 --> 00:08:32,839 Speaker 4: whether that's within countries or between the global North and 129 00:08:32,880 --> 00:08:36,320 Speaker 4: the global South, etc. My own personal take on this 130 00:08:36,480 --> 00:08:39,240 Speaker 4: is that it's absolutely not about saying, oh, we need 131 00:08:39,280 --> 00:08:43,440 Speaker 4: to disentangle climate from other contentious issues because they're holding 132 00:08:43,520 --> 00:08:46,360 Speaker 4: climate back and we want to be as far removed as possible 133 00:08:46,520 --> 00:08:50,199 Speaker 4: from these other major battles of our time, like reproductive 134 00:08:50,200 --> 00:08:53,840 Speaker 4: health rights or migration. It's more about pointing out the 135 00:08:53,960 --> 00:08:58,440 Speaker 4: cynical and in many ways disingenuous use of moral panic 136 00:08:58,920 --> 00:09:03,760 Speaker 4: in other areas to drive opposition to climate action. 
So 137 00:09:03,840 --> 00:09:06,800 Speaker 4: it's about pointing out and being really explicit in saying 138 00:09:07,360 --> 00:09:14,440 Speaker 4: these actors are using racist rhetoric around Black Lives Matter, 139 00:09:15,240 --> 00:09:18,600 Speaker 4: or the enormous moral panic that we're seeing in society 140 00:09:18,800 --> 00:09:23,880 Speaker 4: around the trans community, as a point of entry to 141 00:09:24,040 --> 00:09:28,760 Speaker 4: mobilize outrage and grievance in relation to climate change, right? 142 00:09:28,960 --> 00:09:31,400 Speaker 4: And that is a deliberate tactic and it's part of 143 00:09:31,400 --> 00:09:33,720 Speaker 4: their playbook, and it needs to be consciously challenged. 144 00:09:35,040 --> 00:09:37,240 Speaker 3: I wanted to ask you about the free speech stuff 145 00:09:37,240 --> 00:09:39,440 Speaker 3: because I feel like this comes up all the time, 146 00:09:39,520 --> 00:09:43,000 Speaker 3: and I really appreciated how it's dealt with in this report, 147 00:09:43,200 --> 00:09:47,600 Speaker 3: because I know in the US in particular, really any 148 00:09:47,679 --> 00:09:55,319 Speaker 3: time you try to say anything about how disinformation needs 149 00:09:55,360 --> 00:09:58,160 Speaker 3: to be dealt with, you immediately sort of get this, Well, 150 00:09:58,200 --> 00:10:01,240 Speaker 3: we don't want to censor people, we don't want to 151 00:10:01,280 --> 00:10:03,600 Speaker 3: infringe on free speech rights and all of that. 
But 152 00:10:03,720 --> 00:10:06,600 Speaker 3: I know this media exception thing shows up even in 153 00:10:06,679 --> 00:10:09,640 Speaker 3: really tiny ways, like when Twitter tried to say they 154 00:10:09,679 --> 00:10:14,679 Speaker 3: weren't going to allow political ads, the oil companies immediately 155 00:10:14,760 --> 00:10:18,160 Speaker 3: just started working with media outlets to get their ads 156 00:10:18,320 --> 00:10:22,200 Speaker 3: kind of smuggled in through media accounts. Anyway, So yeah, 157 00:10:22,200 --> 00:10:25,840 Speaker 3: I'm curious, like what kind of pushback you see 158 00:10:26,160 --> 00:10:28,160 Speaker 3: using this kind of free speech argument. 159 00:10:28,559 --> 00:10:32,000 Speaker 4: I fully anticipate what the pushback on 160 00:10:32,040 --> 00:10:34,640 Speaker 4: the horizon will be. Any time that you try and draw 161 00:10:34,720 --> 00:10:39,840 Speaker 4: parameters around what is and isn't mis- or disinformation, you are always 162 00:10:39,880 --> 00:10:41,760 Speaker 4: going to have a body of people that claim this 163 00:10:41,840 --> 00:10:46,640 Speaker 4: is about censoring certain opinions from public discourse, and I 164 00:10:46,679 --> 00:10:50,600 Speaker 4: hope that the report is very clear in saying that firstly, 165 00:10:51,000 --> 00:10:54,960 Speaker 4: this is not necessarily about content removal or even deplatforming, 166 00:10:55,360 --> 00:10:58,320 Speaker 4: and that is very often the first line of attack 167 00:10:58,480 --> 00:11:01,160 Speaker 4: that you get from actors in this space who are 168 00:11:01,160 --> 00:11:04,679 Speaker 4: exercised about cancel culture and who try to distract from 169 00:11:04,679 --> 00:11:07,280 Speaker 4: the entire conversation by saying, oh, they're trying to read 170 00:11:07,360 --> 00:11:09,840 Speaker 4: everybody that they disagree with out of the Internet. 
We 171 00:11:09,960 --> 00:11:12,320 Speaker 4: don't advocate for that at any point in this report, 172 00:11:12,640 --> 00:11:16,920 Speaker 4: except where accounts are actively violating the platform's terms of service, 173 00:11:17,160 --> 00:11:19,760 Speaker 4: which they define and they are perfectly within their rights 174 00:11:19,760 --> 00:11:23,680 Speaker 4: to do as private companies, and we do encourage that 175 00:11:23,720 --> 00:11:26,440 Speaker 4: if they're going to have those community guidelines and terms 176 00:11:26,440 --> 00:11:29,400 Speaker 4: of service in place, they should be enforced. Otherwise they 177 00:11:29,440 --> 00:11:32,720 Speaker 4: are somewhat meaningless and don't produce the kind of value 178 00:11:32,800 --> 00:11:37,040 Speaker 4: sets and architecture for their products and services that they intend. 179 00:11:37,679 --> 00:11:40,200 Speaker 4: But beyond that, what we're trying to emphasize is that 180 00:11:40,240 --> 00:11:43,560 Speaker 4: there are so many mechanisms at our disposal, both as 181 00:11:43,559 --> 00:11:48,240 Speaker 4: a society, as regulators and policymakers, and as tech companies, 182 00:11:48,600 --> 00:11:51,559 Speaker 4: that can help to reduce the prominence and the impact 183 00:11:51,720 --> 00:11:55,680 Speaker 4: of these patently misleading and false claims without telling people 184 00:11:55,760 --> 00:11:57,880 Speaker 4: you're not allowed to have these opinions and you're not 185 00:11:57,920 --> 00:12:00,600 Speaker 4: allowed to say them anywhere on social media. 
To me, 186 00:12:00,760 --> 00:12:05,240 Speaker 4: it's a false dichotomy, and it's used deliberately to make 187 00:12:05,320 --> 00:12:08,400 Speaker 4: this a question about censorship rather than a question about 188 00:12:09,080 --> 00:12:14,439 Speaker 4: algorithmic amplification, about the gaming of social media, about the monetization 189 00:12:14,960 --> 00:12:18,080 Speaker 4: of mis- and disinformation, about the opaqueness of the ad 190 00:12:18,080 --> 00:12:21,920 Speaker 4: tech industry, and all these other solvable problems. And I 191 00:12:21,960 --> 00:12:25,680 Speaker 4: guess the neatest way to summarize how I feel about this, 192 00:12:25,760 --> 00:12:27,720 Speaker 4: and I guess what is maybe more of a European 193 00:12:27,760 --> 00:12:30,920 Speaker 4: approach enshrined in things like the Digital Services Act, is 194 00:12:30,920 --> 00:12:34,920 Speaker 4: that freedom of speech is a fundamental pillar of liberal 195 00:12:34,920 --> 00:12:37,800 Speaker 4: democracies and we should preserve it at all costs. And 196 00:12:37,840 --> 00:12:41,880 Speaker 4: of course America has historically taken a more absolutist view 197 00:12:41,960 --> 00:12:45,679 Speaker 4: or purist view of what free speech entails. But free 198 00:12:45,679 --> 00:12:49,600 Speaker 4: speech is not the same as freedom of reach, right? 199 00:12:50,000 --> 00:12:55,720 Speaker 4: And your own ability to espouse an opinion, however extreme, however 200 00:12:55,840 --> 00:13:01,120 Speaker 4: unpalatable, is totally different from being able to mobilize the 201 00:13:01,160 --> 00:13:04,920 Speaker 4: biggest megaphones that exist in the modern world to get 202 00:13:04,960 --> 00:13:08,480 Speaker 4: those opinions out to millions or hundreds of millions of people. 203 00:13:09,080 --> 00:13:11,480 Speaker 4: And I think that we need to draw a clearer 204 00:13:11,520 --> 00:13:13,480 Speaker 4: delineation between those two things. 
205 00:13:14,040 --> 00:13:18,480 Speaker 3: There's something happening here where corporations have for a long 206 00:13:18,520 --> 00:13:23,240 Speaker 3: time been on this path of continuously broadening their free 207 00:13:23,240 --> 00:13:26,840 Speaker 3: speech rights, and that there's another kind of effort effectively 208 00:13:27,000 --> 00:13:30,800 Speaker 3: arguing to blur the line between fraud and free speech 209 00:13:31,200 --> 00:13:34,640 Speaker 3: in a very concerning way. I'm curious what you think 210 00:13:34,679 --> 00:13:40,040 Speaker 3: about how this plays into some of the organizations or 211 00:13:40,120 --> 00:13:46,440 Speaker 3: corporations that might be funding and supporting disinformation, given that 212 00:13:46,480 --> 00:13:49,440 Speaker 3: they kind of operate in a slightly different legal framework. 213 00:13:49,960 --> 00:13:53,480 Speaker 4: There are two elements of the way that corporate entities 214 00:13:53,720 --> 00:13:56,640 Speaker 4: seem to be interacting with this issue across the digital 215 00:13:56,679 --> 00:14:00,720 Speaker 4: ecosystem and not just social media platforms. The first is 216 00:14:00,720 --> 00:14:03,920 Speaker 4: the ability of what we would call carbon majors, so 217 00:14:03,960 --> 00:14:06,720 Speaker 4: these are the one hundred companies that are responsible for 218 00:14:06,840 --> 00:14:11,000 Speaker 4: around seventy percent of historic carbon emissions. Their ability to 219 00:14:11,200 --> 00:14:15,880 Speaker 4: use products and services online to spread their greenwashing narratives 220 00:14:16,280 --> 00:14:19,280 Speaker 4: and to mislead the public about what the viable solutions 221 00:14:19,320 --> 00:14:22,400 Speaker 4: to, let's say, mitigation and adaptation of climate change are 222 00:14:22,680 --> 00:14:26,760 Speaker 4: going forward. 
One of our partners, called Eco-Bot.Net, did a 223 00:14:26,800 --> 00:14:29,480 Speaker 4: study in the first nine months of twenty twenty one, 224 00:14:29,800 --> 00:14:33,440 Speaker 4: and they found that just sixteen companies had posted seventeen 225 00:14:33,600 --> 00:14:37,480 Speaker 4: hundred adverts to Facebook, which had garnered one hundred and 226 00:14:37,520 --> 00:14:41,520 Speaker 4: fifty million impressions and more importantly, had generated nearly five 227 00:14:41,600 --> 00:14:44,560 Speaker 4: million dollars worth of revenue for the parent company Meta. 228 00:14:44,600 --> 00:14:47,240 Speaker 4: And that business model needs to be called out and 229 00:14:47,280 --> 00:14:49,840 Speaker 4: it needs to be undermined. We should not continue to 230 00:14:49,880 --> 00:14:53,360 Speaker 4: allow actors who we know have acted in bad faith, and 231 00:14:53,440 --> 00:14:58,080 Speaker 4: indeed who are being asked to testify at congressional hearings 232 00:14:58,480 --> 00:15:01,280 Speaker 4: in the US or hopefully in the hearings in the 233 00:15:01,320 --> 00:15:05,000 Speaker 4: EU on their historic role in causing issues in the 234 00:15:05,000 --> 00:15:09,440 Speaker 4: information space and delaying action and progress on climate. We 235 00:15:09,440 --> 00:15:13,880 Speaker 4: shouldn't allow them to continue spreading content with impunity and 236 00:15:14,360 --> 00:15:17,560 Speaker 4: using the huge amounts of financial resources at their disposal 237 00:15:17,920 --> 00:15:21,880 Speaker 4: to get undue oxygen. But that applies also to their 238 00:15:21,960 --> 00:15:25,640 Speaker 4: known front groups and lobbyists, many of whom have very 239 00:15:25,640 --> 00:15:28,960 Speaker 4: clear affiliations with industry and who have been exposed or 240 00:15:29,000 --> 00:15:32,680 Speaker 4: detailed by investigative journalists to date. 
The other side of 241 00:15:32,720 --> 00:15:36,920 Speaker 4: this is the monetization that exists through the broader ad 242 00:15:36,960 --> 00:15:40,280 Speaker 4: tech system online and the fact that many outlets who 243 00:15:40,320 --> 00:15:44,280 Speaker 4: have a well known track record of spreading mis- and 244 00:15:44,440 --> 00:15:48,480 Speaker 4: or disinformation or active propaganda and hate speech are still 245 00:15:48,520 --> 00:15:51,520 Speaker 4: able to generate revenue through adverts that appear on their 246 00:15:51,520 --> 00:15:57,000 Speaker 4: websites or on their channels, and historically brands have claimed 247 00:15:57,040 --> 00:16:00,320 Speaker 4: that they were ignorant of that fact or of where 248 00:16:00,360 --> 00:16:02,880 Speaker 4: their adverts were appearing. So you know, if you had 249 00:16:03,200 --> 00:16:07,800 Speaker 4: something from, to take a completely random example, Unilever 250 00:16:08,320 --> 00:16:11,640 Speaker 4: that was appearing next to a Breitbart article calling 251 00:16:11,640 --> 00:16:14,600 Speaker 4: climate change a hoax, they might have said, because we 252 00:16:14,720 --> 00:16:18,000 Speaker 4: have so little sense of what happens in between us 253 00:16:18,120 --> 00:16:20,320 Speaker 4: using an ad tech system and where the final product 254 00:16:20,440 --> 00:16:23,880 Speaker 4: ends up, we didn't know that this was happening. I 255 00:16:23,920 --> 00:16:27,080 Speaker 4: think now it's probably fair to say that it's either 256 00:16:27,160 --> 00:16:30,880 Speaker 4: wilful or woeful ignorance on behalf of the private sector, 257 00:16:31,120 --> 00:16:32,840 Speaker 4: and that there is also a lot of lobbying and 258 00:16:32,920 --> 00:16:36,280 Speaker 4: advocacy that can and is being done by organizations like 259 00:16:36,400 --> 00:16:40,800 Speaker 4: Check My Ads, the Conscious Advertising Network, Sleeping Giants. 
That 260 00:16:40,960 --> 00:16:46,720 Speaker 4: are saying, you hold enormous leverage, financial leverage in this system, 261 00:16:47,080 --> 00:16:50,960 Speaker 4: and if you are more stringent in the parameters of 262 00:16:51,000 --> 00:16:53,480 Speaker 4: where you do and do not want your adverts to 263 00:16:53,520 --> 00:16:56,160 Speaker 4: appear and what you think is acceptable, you have the 264 00:16:56,160 --> 00:16:59,960 Speaker 4: opportunity to pull the rug out from under this entire 265 00:17:00,040 --> 00:17:05,960 Speaker 4: business model of disinformation and propaganda. And there was 266 00:17:06,000 --> 00:17:08,960 Speaker 4: one study done last year which estimated that two point 267 00:17:09,000 --> 00:17:12,800 Speaker 4: six billion dollars a year is spent advertising on some 268 00:17:12,840 --> 00:17:16,280 Speaker 4: of the worst perpetrators of mis- and disinformation. That's not sustainable 269 00:17:16,359 --> 00:17:16,920 Speaker 4: going forward. 270 00:17:18,400 --> 00:17:22,040 Speaker 3: That's really interesting. I know, we definitely saw this culture 271 00:17:22,080 --> 00:17:27,639 Speaker 3: war messaging around gas prices and the Russia Ukraine war 272 00:17:27,920 --> 00:17:31,000 Speaker 3: here in the US. In that InfluenceMap study, I 273 00:17:31,000 --> 00:17:33,200 Speaker 3: think they found that one of the top three messages 274 00:17:33,320 --> 00:17:36,560 Speaker 3: was this like, you know, climate policy is something that 275 00:17:36,640 --> 00:17:39,760 Speaker 3: only elite, woke liberals want and it's the thing that's 276 00:17:39,840 --> 00:17:42,680 Speaker 3: driving up gas prices and whatever. 
But I'm wondering if 277 00:17:42,720 --> 00:17:47,800 Speaker 3: you have a sense of when exactly that messaging framework 278 00:17:47,960 --> 00:17:51,119 Speaker 3: started to appear and if there was any kind of 279 00:17:52,080 --> 00:17:55,680 Speaker 3: catalyst for it, or if it just sort of has 280 00:17:55,760 --> 00:18:00,160 Speaker 3: slowly started to bubble up as a dominant message. 281 00:18:00,280 --> 00:18:04,080 Speaker 4: I think over probably the last decade, we have seen 282 00:18:04,400 --> 00:18:08,359 Speaker 4: a broader societal and global level shift from what's often 283 00:18:08,400 --> 00:18:12,920 Speaker 4: referred to as issue-based polarization, so we hold polar 284 00:18:13,760 --> 00:18:18,359 Speaker 4: opinions or stances on a particular issue, through to affective 285 00:18:18,600 --> 00:18:22,439 Speaker 4: or identity-based polarization, which is we might actually agree 286 00:18:22,600 --> 00:18:25,600 Speaker 4: on the topic, but because I call myself X and 287 00:18:25,640 --> 00:18:29,040 Speaker 4: you call yourself Y, and we associate with different tribes, 288 00:18:29,160 --> 00:18:32,960 Speaker 4: I automatically view myself in opposition and antagonism to you. 289 00:18:33,760 --> 00:18:37,040 Speaker 4: So I think that that's something which is well beyond climate. 290 00:18:37,160 --> 00:18:40,679 Speaker 4: It's applying to almost every aspect of public life, but 291 00:18:41,040 --> 00:18:45,080 Speaker 4: has unfortunately also seeped into and ended up dominating the 292 00:18:45,119 --> 00:18:48,760 Speaker 4: discussion around environmental issues. 
ISD has only been doing 293 00:18:48,880 --> 00:18:51,800 Speaker 4: research in this space for eighteen months. But one really 294 00:18:51,800 --> 00:18:54,919 Speaker 4: acute example that I can give which shows how there 295 00:18:55,000 --> 00:18:58,800 Speaker 4: is a lot of opportunism by adversarial actors or those 296 00:18:58,840 --> 00:19:02,080 Speaker 4: who oppose climate action, which is to piggyback on whatever 297 00:19:02,119 --> 00:19:04,720 Speaker 4: else is happening in the news cycle in order to 298 00:19:04,760 --> 00:19:08,600 Speaker 4: insert their pre-existing worldviews and stances. And one 299 00:19:08,600 --> 00:19:10,560 Speaker 4: that we saw in the last couple of years is 300 00:19:10,560 --> 00:19:13,720 Speaker 4: that right from the beginning of the COVID nineteen pandemic, 301 00:19:14,000 --> 00:19:16,919 Speaker 4: in March twenty twenty, there were a number of 302 00:19:16,960 --> 00:19:20,240 Speaker 4: individuals who were trying to land the idea of climate 303 00:19:20,320 --> 00:19:25,000 Speaker 4: lockdown as a conspiracy, including a couple of actors associated 304 00:19:25,040 --> 00:19:27,520 Speaker 4: with the Heartland Institute, which I'm sure many of your 305 00:19:27,520 --> 00:19:30,639 Speaker 4: listeners will unfortunately be aware of as the central and 306 00:19:30,800 --> 00:19:34,199 Speaker 4: historic hub of climate denial and climate skepticism, both in the 307 00:19:34,280 --> 00:19:35,640 Speaker 4: US and transnationally. 308 00:19:36,320 --> 00:19:40,040 Speaker 3: Yeah, I remember seeing like Steve Milloy, for example. 309 00:19:40,280 --> 00:19:43,240 Speaker 4: So Steve Milloy, you know, was really committed. He was 310 00:19:43,280 --> 00:19:45,719 Speaker 4: putting out tweets right from March twenty twenty. And 311 00:19:46,520 --> 00:19:49,440 Speaker 4: unfortunately for Steve Milloy, he does not boast a great 312 00:19:49,560 --> 00:19:51,359 Speaker 4: organic reach on Twitter. 
313 00:19:51,440 --> 00:19:53,879 Speaker 3: I'm going to make him cry when he listens to this. 314 00:19:55,520 --> 00:19:58,280 Speaker 4: Sorry, but the data speaks for itself. He 315 00:19:58,359 --> 00:20:02,760 Speaker 4: was getting two or three retweets. However, by September 316 00:20:02,800 --> 00:20:06,080 Speaker 4: twenty twenty, an article was released by a very 317 00:20:06,160 --> 00:20:11,280 Speaker 4: renowned economist called Mariana Mazzucato, which was looking at environmental 318 00:20:11,280 --> 00:20:14,280 Speaker 4: issues in the context of COVID nineteen and saying: the 319 00:20:14,359 --> 00:20:18,520 Speaker 4: kind of mobilized, systemic global response that we've had around 320 00:20:18,560 --> 00:20:21,399 Speaker 4: this pandemic, why are we not replicating that sense of 321 00:20:21,520 --> 00:20:25,080 Speaker 4: urgency for climate change, which poses an existential threat? And 322 00:20:25,119 --> 00:20:28,439 Speaker 4: the article mentioned the language around climate lockdown, but not 323 00:20:28,560 --> 00:20:31,919 Speaker 4: as something that in any way should be advocated. In fact, 324 00:20:31,960 --> 00:20:34,240 Speaker 4: what the article was saying is, if we don't take 325 00:20:34,320 --> 00:20:37,480 Speaker 4: the necessary mitigation and adaptation steps, we may end up 326 00:20:37,480 --> 00:20:39,879 Speaker 4: in a position where we don't have other options, like 327 00:20:39,960 --> 00:20:42,880 Speaker 4: we've had with the pandemic.
And this was the point 328 00:20:42,920 --> 00:20:46,080 Speaker 4: where the conspiracy went from being something on the fringe 329 00:20:46,359 --> 00:20:49,080 Speaker 4: that was really not gaining traction and was absolutely failing 330 00:20:49,080 --> 00:20:52,480 Speaker 4: to become a phenomenon into something that was turbocharged into 331 00:20:52,480 --> 00:20:57,240 Speaker 4: the mainstream and has now become a central conspiracy in 332 00:20:57,280 --> 00:21:01,359 Speaker 4: a number of anti-climate movements. And the reason, in 333 00:21:01,440 --> 00:21:04,919 Speaker 4: part, is because so many of these conspiracies rely on 334 00:21:05,000 --> 00:21:09,320 Speaker 4: a perceived reactionary dynamic with the mainstream media. So they 335 00:21:09,359 --> 00:21:12,960 Speaker 4: saw this as perfect evidence. This provided the fodder or 336 00:21:13,000 --> 00:21:15,560 Speaker 4: the grist to the mill for them to say, see, look, 337 00:21:15,600 --> 00:21:18,800 Speaker 4: they've said the quiet part out loud. This pandemic is 338 00:21:18,840 --> 00:21:21,960 Speaker 4: just a test bed for the kind of green tyranny 339 00:21:22,119 --> 00:21:24,840 Speaker 4: that is just looming on the horizon. And if you 340 00:21:24,960 --> 00:21:27,200 Speaker 4: think that the kinds of infringements on your civil liberties 341 00:21:27,200 --> 00:21:29,760 Speaker 4: that are happening now are bad.
You should see what's 342 00:21:29,760 --> 00:21:33,280 Speaker 4: going to happen when the Great Reset happens for climate change. 343 00:21:33,680 --> 00:21:37,640 Speaker 4: And what was so depressing to watch happen in real time, 344 00:21:37,800 --> 00:21:40,680 Speaker 4: and ISD have produced a whole report that really forensically 345 00:21:40,760 --> 00:21:43,520 Speaker 4: tracks the evolution of this conspiracy, is the way that 346 00:21:43,600 --> 00:21:46,639 Speaker 4: it moved then from being on the fringes, driven by 347 00:21:46,640 --> 00:21:49,040 Speaker 4: a small set of actors, to being taken up by 348 00:21:49,040 --> 00:21:53,600 Speaker 4: the right-wing media ecosystem, Fox News, etc. And where 349 00:21:53,600 --> 00:21:56,840 Speaker 4: it eventually ended up was in the QAnon Telegram groups 350 00:21:56,960 --> 00:22:00,240 Speaker 4: and the anti-vaccination groups and the anti-lockdown 351 00:22:00,240 --> 00:22:05,280 Speaker 4: Telegram groups, because it absolutely fit the same rhetorical and 352 00:22:05,400 --> 00:22:08,440 Speaker 4: narrative frame, which is, you know, there are a group 353 00:22:08,480 --> 00:22:11,159 Speaker 4: of people who are trying to take away your fundamental 354 00:22:11,200 --> 00:22:15,679 Speaker 4: freedoms and change your life beyond all recognition. And it 355 00:22:15,680 --> 00:22:18,520 Speaker 4: doesn't matter whether it's around public health or around environment. 356 00:22:18,960 --> 00:22:21,800 Speaker 4: This needs to be opposed at all costs. And so 357 00:22:22,240 --> 00:22:26,440 Speaker 4: it was such a neatly articulated case study of piggybacking 358 00:22:26,480 --> 00:22:29,280 Speaker 4: on the news and exploiting the fact that people have 359 00:22:29,680 --> 00:22:32,880 Speaker 4: real and genuine trauma related to the pandemic.
It's 360 00:22:32,880 --> 00:22:38,359 Speaker 4: been an unbelievably distressing and difficult time for people around 361 00:22:38,400 --> 00:22:41,919 Speaker 4: the globe, and lots of people do have legitimate grievances 362 00:22:42,000 --> 00:22:44,760 Speaker 4: around the ways that their governments have managed this pandemic 363 00:22:44,800 --> 00:22:47,520 Speaker 4: and the kinds of measures that have been instituted, and 364 00:22:47,560 --> 00:22:51,840 Speaker 4: they have cynically used that trauma to suddenly turn people 365 00:22:51,920 --> 00:22:54,600 Speaker 4: against climate change as an issue and against climate action 366 00:22:54,640 --> 00:22:57,400 Speaker 4: as a policy platform. 367 00:22:57,400 --> 00:23:01,320 Speaker 3: It's so interesting, just the flow of information and the 368 00:23:01,359 --> 00:23:04,159 Speaker 3: way that it moves. I have seen a tendency to 369 00:23:04,680 --> 00:23:08,800 Speaker 3: silo climate disinformation out as like a separate thing, 370 00:23:08,920 --> 00:23:11,919 Speaker 3: even from the government here. It's like they'll have all 371 00:23:11,960 --> 00:23:15,959 Speaker 3: these hearings about disinformation in general that's very focused on 372 00:23:16,080 --> 00:23:20,199 Speaker 3: like electoral politics and social media platforms and whatever, and 373 00:23:20,240 --> 00:23:24,000 Speaker 3: then they'll have this separate side thing for climate disinformation. 374 00:23:24,160 --> 00:23:27,119 Speaker 3: I'm just curious what you think about that. Why so 375 00:23:27,240 --> 00:23:31,400 Speaker 3: many of the folks who have focused on disinformation kind 376 00:23:31,440 --> 00:23:34,400 Speaker 3: of see climate as a whole separate thing.
377 00:23:35,000 --> 00:23:38,119 Speaker 4: I think one of the reasons is that, up until now, 378 00:23:38,400 --> 00:23:42,119 Speaker 4: there has been a very poorly defined taxonomy of harm 379 00:23:42,160 --> 00:23:45,000 Speaker 4: for climate disinformation. What I mean by that is that 380 00:23:45,040 --> 00:23:49,440 Speaker 4: in other areas, like public health or electoral integrity or, 381 00:23:49,520 --> 00:23:52,760 Speaker 4: you know, extreme hate speech, there is a clear through line, 382 00:23:52,840 --> 00:23:55,840 Speaker 4: or a clearer through line, between a piece of content 383 00:23:55,880 --> 00:23:58,280 Speaker 4: online and the imminent harm that it can pose in 384 00:23:58,320 --> 00:24:02,480 Speaker 4: the real world. If somebody spreads information about a supposed 385 00:24:02,560 --> 00:24:05,040 Speaker 4: cure for COVID nineteen and people take it and end 386 00:24:05,119 --> 00:24:07,399 Speaker 4: up in A&E, that poses a real, present 387 00:24:07,600 --> 00:24:11,560 Speaker 4: danger for individuals and for society. If people spread false 388 00:24:11,560 --> 00:24:15,120 Speaker 4: and misleading claims about an election being stolen, it can 389 00:24:15,240 --> 00:24:18,480 Speaker 4: lead to an insurrection at the Capitol, and so on 390 00:24:18,560 --> 00:24:22,040 Speaker 4: and so forth. And where bad actors in the climate 391 00:24:22,080 --> 00:24:26,080 Speaker 4: space have been very clever is in keeping it abstract 392 00:24:26,200 --> 00:24:29,160 Speaker 4: enough that it feels like it can't be acted upon 393 00:24:29,440 --> 00:24:33,000 Speaker 4: because it doesn't have that through line to real world harm.
394 00:24:33,400 --> 00:24:35,640 Speaker 4: But I don't think that we can justify that anymore, 395 00:24:35,720 --> 00:24:39,399 Speaker 4: because the IPCC themselves, for the very first time this year, 396 00:24:39,720 --> 00:24:42,680 Speaker 4: in their three-thousand-plus-page report on climate change 397 00:24:42,680 --> 00:24:49,840 Speaker 4: mitigation, explicitly said that misleading content, including from industry 398 00:24:49,880 --> 00:24:54,000 Speaker 4: actors and those with vested interests, had actively provided a 399 00:24:54,040 --> 00:24:56,679 Speaker 4: block to achieving climate action in line with the 400 00:24:56,680 --> 00:25:00,119 Speaker 4: goals of the Paris Agreement and with scientific consensus. So 401 00:25:00,520 --> 00:25:03,399 Speaker 4: if you view, as we all should based on the 402 00:25:03,440 --> 00:25:08,159 Speaker 4: scientific evidence, climate change as an ever-present threat 403 00:25:08,200 --> 00:25:12,120 Speaker 4: that is indeed already having devastating impacts for communities all around 404 00:25:12,160 --> 00:25:14,480 Speaker 4: the world, including in America (you know, we're not just 405 00:25:14,520 --> 00:25:17,800 Speaker 4: talking about the Global South here, we're talking about communities 406 00:25:17,840 --> 00:25:21,640 Speaker 4: on your doorstep), then we should be taking this content seriously.
407 00:25:21,800 --> 00:25:23,639 Speaker 4: So I think that's one of the reasons why it 408 00:25:23,800 --> 00:25:26,679 Speaker 4: ends up being separated slightly: disinformation has been 409 00:25:26,720 --> 00:25:29,159 Speaker 4: hard enough to mobilize a systemic response around in the 410 00:25:29,200 --> 00:25:31,639 Speaker 4: first place, and people have been trying to focus on 411 00:25:31,680 --> 00:25:34,680 Speaker 4: what they see as, quote unquote, the lowest hanging fruit, 412 00:25:35,000 --> 00:25:38,320 Speaker 4: which is content that clearly has the potential for incitement 413 00:25:38,359 --> 00:25:42,480 Speaker 4: to violence or causing harm at a personal and a societal level. 414 00:25:43,200 --> 00:25:46,040 Speaker 4: The other reason is that I just don't think anyone 415 00:25:46,119 --> 00:25:49,280 Speaker 4: was talking about this issue in a mainstream way as 416 00:25:49,359 --> 00:25:53,440 Speaker 4: part of the climate response until maybe a year ago. 417 00:25:53,600 --> 00:25:56,520 Speaker 4: And I'm relatively new to this space, and I'm not 418 00:25:56,560 --> 00:25:59,359 Speaker 4: saying that climate organizations have not been banging the drum 419 00:25:59,640 --> 00:26:04,080 Speaker 4: about denialism since the eighties, but more that it wasn't 420 00:26:04,119 --> 00:26:07,440 Speaker 4: necessarily a mainstream media conversation or something that was being 421 00:26:07,480 --> 00:26:10,760 Speaker 4: talked about within the political architecture. And indeed we've heard 422 00:26:10,800 --> 00:26:14,560 Speaker 4: that from stakeholders and organizations like the UNFCCC, who 423 00:26:14,680 --> 00:26:18,040 Speaker 4: run the climate summits, saying, you know, this really 424 00:26:18,200 --> 00:26:22,360 Speaker 4: was not on our radar until a year ago.
That's wild, 425 00:26:22,720 --> 00:26:26,040 Speaker 4: and we're so glad that it is now, because of 426 00:26:26,040 --> 00:26:28,399 Speaker 4: course we were kind of obliquely aware of the problem, 427 00:26:28,440 --> 00:26:29,639 Speaker 4: and we know that there have been people that have 428 00:26:29,680 --> 00:26:32,840 Speaker 4: been trying to push this, but actively drawing the link 429 00:26:33,119 --> 00:26:38,439 Speaker 4: between the information ecosystem, public mandates, and the ability to 430 00:26:38,560 --> 00:26:45,320 Speaker 4: achieve strong nationally determined contributions and multilateral climate agreements, I'm 431 00:26:45,359 --> 00:26:48,879 Speaker 4: not sure that that equation had ever been completed. 432 00:26:49,960 --> 00:26:52,800 Speaker 3: Well, I think also, I mean, just speaking from my 433 00:26:52,920 --> 00:26:57,680 Speaker 3: own experience, the mainstream media is very hesitant to cover 434 00:26:57,800 --> 00:27:01,520 Speaker 3: this because they've been so complicit in it. I 435 00:27:01,560 --> 00:27:03,680 Speaker 3: wrote something for the Washington Post a few years 436 00:27:03,720 --> 00:27:07,000 Speaker 3: ago about how the fossil fuel industry had a real 437 00:27:07,080 --> 00:27:11,560 Speaker 3: hand in creating false equivalence in general across the board, 438 00:27:11,680 --> 00:27:17,200 Speaker 3: and my editor just sort of surgically removed any reference 439 00:27:17,240 --> 00:27:20,439 Speaker 3: to anything that the Post themselves had also done in 440 00:27:20,480 --> 00:27:22,760 Speaker 3: the past. So I don't even know if it's like 441 00:27:23,400 --> 00:27:27,760 Speaker 3: conscious or intentional, but they're very hesitant to run anything 442 00:27:28,080 --> 00:27:32,240 Speaker 3: that, you know, might implicate some of their own coverage.
443 00:27:32,560 --> 00:27:36,720 Speaker 4: There is also a lot of cognitive dissonance or disparity 444 00:27:36,760 --> 00:27:39,520 Speaker 4: within publications to this very day. Yes, I know, you 445 00:27:39,560 --> 00:27:45,040 Speaker 4: can have outlets whose main editorial stance is absolutely recognizing 446 00:27:45,040 --> 00:27:48,000 Speaker 4: the climate crisis, trying to put out nuanced journalism that 447 00:27:48,080 --> 00:27:50,840 Speaker 4: talks about the level of urgency and the kinds of 448 00:27:51,160 --> 00:27:53,560 Speaker 4: good faith debates that need to happen in order to 449 00:27:53,600 --> 00:27:57,080 Speaker 4: implement climate policy that is appropriate for a given country, 450 00:27:57,119 --> 00:27:59,760 Speaker 4: who at the same time will allow the fossil fuel industry 451 00:27:59,800 --> 00:28:03,600 Speaker 4: to take out full-page adverts, or, and this is 452 00:28:03,680 --> 00:28:07,399 Speaker 4: really, really key, will allow climate deniers to post things 453 00:28:07,440 --> 00:28:09,840 Speaker 4: in their editorial and op-ed pages, right? And 454 00:28:09,880 --> 00:28:13,480 Speaker 4: the amount of disparity that you see between breaking news 455 00:28:13,520 --> 00:28:17,800 Speaker 4: coverage or sort of general reporting versus the op-ed 456 00:28:17,840 --> 00:28:23,760 Speaker 4: pages is absolutely staggering. My colleague Phil has been absolutely 457 00:28:23,800 --> 00:28:26,560 Speaker 4: dogged on this. The man is a machine. He has 458 00:28:26,600 --> 00:28:29,520 Speaker 4: a running spreadsheet that looks at every single op-ed 459 00:28:29,640 --> 00:28:32,600 Speaker 4: on climate that has run in the Wall Street Journal 460 00:28:32,680 --> 00:28:36,359 Speaker 4: from the nineteen nineties to now.
And the figure is 461 00:28:36,400 --> 00:28:38,800 Speaker 4: something like, I apologize if I slightly misquote this, but 462 00:28:39,000 --> 00:28:41,840 Speaker 4: in the ballpark of: there have only ever been around 463 00:28:41,960 --> 00:28:46,120 Speaker 4: ten op-eds in that entire time that actually acknowledged 464 00:28:46,400 --> 00:28:50,320 Speaker 4: the link between fossil fuels and climate change as a phenomenon. 465 00:28:50,920 --> 00:28:53,640 Speaker 3: Wow, I wish I could say I'm surprised. You know, 466 00:28:53,680 --> 00:28:56,479 Speaker 3: the sad thing about the Wall Street Journal too, is 467 00:28:56,480 --> 00:29:00,360 Speaker 3: that pre Rupert Murdoch, they were like the biggest pain 468 00:29:00,440 --> 00:29:03,600 Speaker 3: in the oil industry's ass. I mean, Mobil Oil in 469 00:29:03,680 --> 00:29:07,600 Speaker 3: the eighties actually banned the Wall Street Journal from getting 470 00:29:07,600 --> 00:29:10,480 Speaker 3: their press releases or talking to any of their executives 471 00:29:10,560 --> 00:29:14,080 Speaker 3: because they were so mad about how critical the Wall 472 00:29:14,080 --> 00:29:16,480 Speaker 3: Street Journal was being of the oil industry. 473 00:29:17,000 --> 00:29:19,640 Speaker 4: And now, you know, at COP twenty six, what did 474 00:29:19,640 --> 00:29:22,800 Speaker 4: they do? They gave Bjorn Lomborg his own special column, 475 00:29:23,120 --> 00:29:27,880 Speaker 4: and even in that column there was an editor's 476 00:29:27,920 --> 00:29:32,600 Speaker 4: note that talked about how Bjorn Lomborg was a neutral 477 00:29:32,640 --> 00:29:37,280 Speaker 4: adjudicator in this discussion and was providing data-based journalism 478 00:29:37,600 --> 00:29:41,000 Speaker 4: in order to help the public navigate this really complex topic.
479 00:29:41,280 --> 00:29:44,360 Speaker 4: I mean, you just despair, because the thing is 480 00:29:44,400 --> 00:29:46,880 Speaker 4: they are seen as a credible outlet, and, you know, 481 00:29:46,920 --> 00:29:49,600 Speaker 4: in so many other aspects of their reporting, they are. 482 00:29:49,920 --> 00:29:52,360 Speaker 4: The work that Jeff Horwitz and his team have done 483 00:29:52,520 --> 00:29:56,800 Speaker 4: on social media platform accountability and the Facebook Files is absolutely 484 00:29:56,840 --> 00:29:59,959 Speaker 4: extraordinary and very relevant to the report that we've just released. 485 00:30:00,000 --> 00:30:01,920 Speaker 4: You know, we cite them a number of times, but 486 00:30:02,000 --> 00:30:04,440 Speaker 4: we also have an entire section of the report that 487 00:30:04,480 --> 00:30:06,600 Speaker 4: profiles the Wall Street Journal and the role they're playing 488 00:30:06,600 --> 00:30:10,520 Speaker 4: in climate skepticism. So it's not consistent. 489 00:30:10,520 --> 00:30:13,680 Speaker 3: Both Bjorn Lomborg and Michael Shellenberger have this thing, and 490 00:30:13,800 --> 00:30:19,240 Speaker 3: the Canadian Greenpeace guy Patrick Moore. Well, to me, 491 00:30:19,320 --> 00:30:21,160 Speaker 3: I'm like, oh my god, this is just right out 492 00:30:21,200 --> 00:30:25,440 Speaker 3: of the right-wing playbook too, of like the reformed environmentalist. 493 00:30:25,680 --> 00:30:28,400 Speaker 3: They love a Black man who hates civil rights too, 494 00:30:28,680 --> 00:30:32,560 Speaker 3: or a woman who's anti-abortion. But in climate in particular, 495 00:30:33,280 --> 00:30:37,360 Speaker 3: it does seem to lend this kind of credibility.
And 496 00:30:37,400 --> 00:30:40,720 Speaker 3: I think that the message of like, oh, sure it's 497 00:30:40,800 --> 00:30:44,000 Speaker 3: a thing, but it's not that bad, is way more 498 00:30:44,000 --> 00:30:46,880 Speaker 3: harmful than saying, oh, it's not happening at all. 499 00:30:47,280 --> 00:30:51,840 Speaker 4: Of course. And think how successful that is and how 500 00:30:52,560 --> 00:30:54,680 Speaker 4: powerful that can be as an argument, for a couple 501 00:30:54,720 --> 00:30:58,400 Speaker 4: of different reasons. The first is that these people absolutely 502 00:30:58,480 --> 00:31:02,040 Speaker 4: lean into and create their brands around the idea that 503 00:31:02,080 --> 00:31:06,440 Speaker 4: they are sober, rationalist intellectuals and that they don't come 504 00:31:06,520 --> 00:31:09,040 Speaker 4: with any sort of agenda on climate, but that they 505 00:31:09,080 --> 00:31:13,400 Speaker 4: are fiscal pragmatists, which is very often the kind of language that 506 00:31:13,440 --> 00:31:16,440 Speaker 4: you'll see being used, and that they are neutral commentators 507 00:31:16,440 --> 00:31:19,640 Speaker 4: on what is happening in the climate policy space, which 508 00:31:19,680 --> 00:31:22,200 Speaker 4: is somewhat difficult to parse when you look at the 509 00:31:22,200 --> 00:31:25,640 Speaker 4: fact that they also run organizations called things like the 510 00:31:25,680 --> 00:31:29,680 Speaker 4: CO2 Coalition, which claim that carbon should be celebrated 511 00:31:29,760 --> 00:31:32,760 Speaker 4: because it enhances our lives and has improved the environment. 512 00:31:32,480 --> 00:31:33,520 Speaker 3: Grows more plants.
513 00:31:33,840 --> 00:31:38,800 Speaker 5: Yeah, it's also really interesting to see that this veneer 514 00:31:38,880 --> 00:31:42,800 Speaker 5: of academic or scientific credibility, which is touted by 515 00:31:42,880 --> 00:31:44,760 Speaker 5: all three of the names that you've mentioned, but Patrick 516 00:31:44,800 --> 00:31:48,800 Speaker 5: Moore in particular always harks back to his previous affiliation 517 00:31:48,880 --> 00:31:49,560 Speaker 5: with Greenpeace. 518 00:31:49,880 --> 00:31:53,680 Speaker 4: But Greenpeace themselves have put out an absolutely unequivocal statement 519 00:31:54,040 --> 00:31:57,040 Speaker 4: that both fact-checks the role 520 00:31:57,120 --> 00:32:00,600 Speaker 4: that Moore played within the organization decades ago and also 521 00:32:01,200 --> 00:32:05,000 Speaker 4: completely distances themselves from his current stances and outputs 522 00:32:05,000 --> 00:32:07,920 Speaker 4: on climate. And yet, you know, he relies on the 523 00:32:07,920 --> 00:32:10,400 Speaker 4: fact that people won't do their due diligence, and all 524 00:32:10,400 --> 00:32:12,480 Speaker 4: they'll see is: oh, this is a person who says 525 00:32:12,560 --> 00:32:14,720 Speaker 4: he used to be the co-founder of Greenpeace, so 526 00:32:14,760 --> 00:32:18,320 Speaker 4: I should trust what he says. It's a very savvy 527 00:32:19,040 --> 00:32:21,640 Speaker 4: way of gaming the media landscape. It allows them to 528 00:32:21,680 --> 00:32:26,000 Speaker 4: get continually platformed in the mainstream media, which is infuriating 529 00:32:26,040 --> 00:32:28,960 Speaker 4: in the extreme, because I don't know how many more 530 00:32:28,960 --> 00:32:32,640 Speaker 4: times we need to document their financial or other affiliations 531 00:32:32,720 --> 00:32:39,120 Speaker 4: with historic polluters, denialists, and other unsavory and unpalatable characters.
But 532 00:32:39,400 --> 00:32:42,200 Speaker 4: clearly that effort is not over. What we need to 533 00:32:42,240 --> 00:32:45,920 Speaker 4: remove is basically the social license for these people to 534 00:32:46,080 --> 00:32:50,760 Speaker 4: use or try and claim association with green movements when they 535 00:32:50,800 --> 00:32:55,200 Speaker 4: are entirely going against the current scientific consensus. And 536 00:32:55,360 --> 00:32:59,120 Speaker 4: bodies like the IPCC and the UNFCCC, 537 00:32:59,480 --> 00:33:03,040 Speaker 4: I will emphasize, are bodies that almost every country around 538 00:33:03,040 --> 00:33:05,600 Speaker 4: the world has signed onto and who have to approve 539 00:33:06,040 --> 00:33:08,680 Speaker 4: the reports and the documents that come out. So these 540 00:33:08,720 --> 00:33:14,080 Speaker 4: are not isolated bodies associated with one particular country. And so, 541 00:33:14,680 --> 00:33:17,959 Speaker 4: why are we still treating these people as if they 542 00:33:18,000 --> 00:33:22,840 Speaker 4: are credible spokesmen? They're not. And every time that they 543 00:33:22,880 --> 00:33:25,480 Speaker 4: appear in public life, we need to have some sort 544 00:33:25,520 --> 00:33:28,440 Speaker 4: of counter effort that just re-emphasizes: look what these 545 00:33:28,440 --> 00:33:32,040 Speaker 4: people were saying ten years ago. They were denialists. They 546 00:33:32,080 --> 00:33:34,720 Speaker 4: now realize that denialism is not a very useful tactic 547 00:33:34,760 --> 00:33:38,000 Speaker 4: in public life, so they've shifted to delayism. Don't take 548 00:33:38,040 --> 00:33:41,000 Speaker 4: them at their word.
These people are committed to maintaining 549 00:33:41,040 --> 00:33:44,320 Speaker 4: the status quo, and they will use whatever argument seems 550 00:33:44,320 --> 00:33:46,640 Speaker 4: to have traction at a particular point in time, and 551 00:33:46,680 --> 00:33:50,240 Speaker 4: they are lurking on the margins constantly for those points 552 00:33:50,240 --> 00:33:53,360 Speaker 4: of entry provided by, as I said before, the news 553 00:33:53,360 --> 00:33:57,200 Speaker 4: cycle or the particular hot button issue or cause célèbre 554 00:33:57,240 --> 00:34:00,720 Speaker 4: at any given time to remake in a different form 555 00:34:00,800 --> 00:34:04,040 Speaker 4: the same arguments that they've been using for fifteen years. 556 00:34:04,480 --> 00:34:07,600 Speaker 3: In Shellenberger's case, and now I'm seeing Marc Morano 557 00:34:07,720 --> 00:34:11,920 Speaker 3: do this too, this thing of calling themselves journalists now 558 00:34:12,880 --> 00:34:16,160 Speaker 3: as a way to legitimize themselves. And I wonder if 559 00:34:16,200 --> 00:34:18,879 Speaker 3: that's something that you've seen either amongst some of these 560 00:34:19,000 --> 00:34:21,919 Speaker 3: other actors or in other realms of disinformation. 561 00:34:23,120 --> 00:34:25,440 Speaker 4: Yes, absolutely. And this is part of the reason why 562 00:34:26,160 --> 00:34:30,480 Speaker 4: we cautioned against these all-out media exemptions being enshrined 563 00:34:30,520 --> 00:34:34,480 Speaker 4: in tech regulation, because very often the definitions of what 564 00:34:34,680 --> 00:34:38,320 Speaker 4: is and isn't journalism are quite blurred, and some of the policy 565 00:34:38,360 --> 00:34:40,920 Speaker 4: that we are seeing being debated, for example, the Online Safety
566 00:34:40,960 --> 00:34:43,919 Speaker 4: Bill in the UK, I think could pose a large 567 00:34:44,000 --> 00:34:48,040 Speaker 4: number of issues, in people calling themselves citizen journalists and 568 00:34:48,160 --> 00:34:52,200 Speaker 4: therefore, under the wording of these policies, claiming exemption from 569 00:34:52,320 --> 00:34:55,840 Speaker 4: any kind of content moderation or punitive action against their 570 00:34:55,880 --> 00:34:59,960 Speaker 4: accounts when they violate terms of service. So again, 571 00:35:00,280 --> 00:35:03,000 Speaker 4: it's a tactic, partly as a way of laundering their 572 00:35:03,040 --> 00:35:06,160 Speaker 4: image and making them seem more credible, but also it's 573 00:35:06,200 --> 00:35:08,600 Speaker 4: kind of casting their minds ahead to: how can we 574 00:35:08,640 --> 00:35:11,640 Speaker 4: avoid any scrutiny for what we're doing in the future, 575 00:35:12,040 --> 00:35:14,719 Speaker 4: by using the language of being a media outlet or 576 00:35:14,719 --> 00:35:17,800 Speaker 4: being a journalist to escape repercussions for our actions. 577 00:35:18,400 --> 00:35:21,279 Speaker 3: That's so interesting. I do sometimes wonder if some of 578 00:35:21,320 --> 00:35:25,920 Speaker 3: these contrarians, do they even have a driving political ideology, 579 00:35:26,000 --> 00:35:28,360 Speaker 3: or is this just like a handy way to make money? 580 00:35:29,320 --> 00:35:33,480 Speaker 4: That's kind of the million dollar question, isn't it? I'm 581 00:35:33,480 --> 00:35:35,520 Speaker 4: not sure I have the answer.
I feel as though 582 00:35:35,520 --> 00:35:37,600 Speaker 4: if you look at some of these actors, it's difficult 583 00:35:37,760 --> 00:35:41,040 Speaker 4: to make the argument that they are not entirely cynical 584 00:35:41,600 --> 00:35:44,920 Speaker 4: and just using whatever tools are at their disposal to 585 00:35:45,600 --> 00:35:49,080 Speaker 4: create the most incendiary and sensationalist content, drive as much 586 00:35:49,080 --> 00:35:51,680 Speaker 4: traffic as possible to their channels and to their products 587 00:35:51,719 --> 00:35:54,600 Speaker 4: and their services, and therefore generate as much revenue as possible. 588 00:35:54,760 --> 00:35:56,840 Speaker 4: But I do also believe that some of these people 589 00:35:56,880 --> 00:35:59,919 Speaker 4: have completely drunk the Kool-Aid, whether it's their own 590 00:36:00,200 --> 00:36:04,400 Speaker 4: or others', and do honestly believe that climate change is, I 591 00:36:04,440 --> 00:36:07,080 Speaker 4: don't know, not a problem, or that the fossil fuel 592 00:36:07,080 --> 00:36:11,560 Speaker 4: industry is going to solve it all with unsubstantiated technologies 593 00:36:11,600 --> 00:36:13,560 Speaker 4: like carbon capture and storage, and that none of us 594 00:36:13,600 --> 00:36:16,120 Speaker 4: need to worry, because if we leave the free market 595 00:36:16,120 --> 00:36:18,799 Speaker 4: to solve the issue, we'll be fine. 596 00:36:23,520 --> 00:36:26,520 Speaker 1: That's it for this time. Thanks for listening, and we'll 597 00:36:26,560 --> 00:36:50,480 Speaker 1: see you next week.