Speaker 1: Hello, and welcome back to Drilled. I'm Amy Westervelt. This season, we are going chapter by chapter through a great new book that pulls together all of the social science that we have so far, peer-reviewed social science, on climate obstruction globally. It's called Climate Obstruction: A Global Assessment. It's been pulled together by the Climate Social Science Network at Brown, which includes hundreds of social scientists working all over the world on trying to understand this issue. Today we are digging into a subject that is near and dear to my heart: the role of PR and media in climate obstruction. And to do that, I'm joined by two people who have been guests on this show before, Melissa Aronczyk at Rutgers University and Max Boykoff from the University of Colorado Boulder. We had a great conversation about the role that media plays both in helping to shape the public's understanding of climate, and therefore the role it can play in obstruction, especially if it is targeted by bad-faith actors, which it often is. It is a super interesting conversation. I hope you enjoy it as much as I did.
Speaker 1: That's coming up right after this quick break.

Speaker 2: My name is Max Boykoff. I'm a professor here in environmental studies and a fellow in the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder in the US.

Speaker 3: I'm Melissa Aronczyk. I'm a professor in the School of Communication and Information at Rutgers University in New Jersey.

Speaker 1: So I've probably talked to both of you about this separately before, but it does seem to me that the media often avoids responsibility for climate obstruction, or for playing any kind of a role in it. I think you show really clearly here how it's kind of been used as a tool. So I want to start by having you explain how the weaponization of this particular journalistic norm, balanced reporting, has contributed to the disparity between scientific consensus and the public's understanding of climate change.

Speaker 2: Yeah, I'm happy to start us off, and if you'll allow me to back out just a little bit to answer that question.
Speaker 2: And I think if we look back through time, we can track this. Our group, the Media and Climate Change Observatory, has done this, where we've tracked media coverage of climate change over time. And it really came into public consciousness and into the news starting in the late nineteen eighties, and when that was happening, it was a new story to tell, and there were a lot of different journalists and actors trying to make sense of what was going on and how to tell these stories. As time went on, though, the late nineteen eighties saw the establishment of the IPCC, the Intergovernmental Panel on Climate Change. There was increasing understanding within the scientific community that humans contribute to climate change. And so from that period of time, journalism, as we've tracked it through various studies, had tried to make sense of that.
Speaker 2: And as they did that, and as they started to draw on expert voices and others to give them a sense of where science was on human contributions to climate change, journalists leaned back into their journalistic norms, including the journalistic norm of balanced reporting, where you take one perspective, get another, and help the reader or the viewer or the listener make sense of what's going on. And so in some early research that I had done with my brother, Jules Boykoff, who's at Pacific University in political science, we had just taken a look at how US media had been covering it from that time in the late nineteen eighties through, at that point, two thousand and two, when we finished up the study. And while there had been that convergent agreement in the scientific community, and that had been communicated clearly through Intergovernmental Panel on Climate Change assessments, we had found that the journalistic community had continued to tell this quote-unquote balanced story, and in so doing had perpetuated an informational bias and actually skewed the conversations in the public arena.
Speaker 2: And you know, part of the reason that we make those claims, and others have also been looking at this over time, is that, quite frankly, people don't pick up the peer-reviewed literature or the IPCC assessment reports. They rely on news coverage to help make sense of what's going on around them, and so news becomes this important bridge and this powerful driver of public conversation. There has been a learning going on, a maturation, or just sort of an integration of new understanding within the journalistic community. So we had put out that study, and it actually gained traction. It ended up in Al Gore's An Inconvenient Truth, just briefly mentioned, so it got a lot more attention, and some journalists were very eager to think carefully and recalibrate how they were telling these stories through their own reporting. Others, you know, weren't quite as receptive to that messaging and critique. But nonetheless, you know, we did some follow-up work years later and found that that coverage had improved, when we looked at it with another collection of researchers here, led by Lucy McAllister.
Speaker 2: When we looked at it again in twenty twenty-one, we found that there had been tremendous improvements in the US press, along with UK, Australian, Canadian, and New Zealand newspapers. But it still isn't one hundred percent. And so, you know, with the way we went about it, the unit of analysis being each newspaper article, we can still see there's this room for improvement. If somebody's picking up the paper and reading a very influential piece that isn't accurate, that is still falling, whether unwittingly or wittingly, into this sort of quote-unquote balanced reporting, then they themselves start to get a skewed view, still to this very day.

Speaker 1: Yeah, I don't know if I've ever told either, or maybe both, of you this story. But I had this experience directly. I was a stringer at the Washington Post for a while, and because of where I lived, that meant covering a lot of wildfires. I covered a fire once, and the Cal Fire chief gave me this very simple explanation for why fires were burning for so much longer now than they had.
Speaker 1: And this was, I mean, this wasn't that long ago. This was probably like twenty fifteen-ish. And he said, you know, are you from California? I said yes. And he said, well, you probably remember when you were growing up that you would go to bed at night and the fire would be twenty percent contained, and you'd wake up in the morning and it would be like sixty percent contained, or at least fifty. And I said, yeah, that was the norm, you know. And he said, yeah, that's how it used to work, because it would get cool and humidity would increase at night. And now that doesn't happen anymore. So these fires are burning at the same intensity twenty-four seven, and that's why they're getting so big, and that's climate change. And I was like, wow, what a great, simple way to understand that. And I put it in the story, and my editor, who at the time was the national editor of the Washington Post, suggested taking out the part where he said that's climate change, because, his comment was, this is a wildfire story, not a politics story.
Speaker 1: I had been a climate reporter for a long time, so I knew enough to kind of push back on that and say, well, you know, that's an industry talking point. There's nothing political about what he's saying. And he did listen to me and let me keep it in. But the average person that's just getting sent off to, you know, cover a wildfire is not necessarily going to feel confident, you know, saying, oh, you're wrong, it's not political. But, like, you know, I don't think that that guy was being hounded by Exxon every day. It was just so ingrained in him. I think of it every time I think about this, because it was such a simple example of the way that that stuff creeps in.

Speaker 2: Yeah.

Speaker 1: Yeah. Anyway, obviously the media is not coming up with this stuff on their own. What entities are working to serve up misleading climate narratives to the media, or ensure that people who are spokespeople for the industry, or who carry water for them and their ideas, are given a voice in the media?
Speaker 3: Well, are you asking, Amy, about public relations firms out there working for fossil fuels?

Speaker 1: Yeah, like the broad ecosystem of people that are approaching the media with these narratives or talking points. I would think of PR firms, maybe internal PR folks at companies. Think tanks, I think, are doing a fair bit, you know. That whole ecosystem.

Speaker 3: What does it look like? Well, one thing we need to think about is the network of influence. That's really something that has taken us a while to really get our thinking around. And I think, you know, we're still finding connections that we didn't really know existed, which is kind of amazing. At least since the nineteen seventies, we have started to see connections among not just media platforms and people who set the agenda in the media, but think tanks, research centers, private donors, lobbyists and lawyers, PR firms. We're starting to get better and better evidence of this network, and both journalists and researchers have uncovered just how far back these relationships of denial go.
Speaker 3: But I do think it's also important to remember that all of those groups I just mentioned operate in society. And really, you know, if we just again zoom out from fossil fuels per se and just look at our entire consumer society, the way our lifestyles are set up right now, the status quo is one in which fossil fuels play a very big role, and it's very hard for people to imagine alternatives, especially when we don't see a lot of presentation of those alternatives in public communication. So it's a very vast and, really, it has been for a long time, a very intractable problem that we've been trying to work through.

Speaker 2: Yes, definitely. I mean, we can draw links directly through the structures, from who owns a lot of the corporate media.
Speaker 2: But then also, as you mentioned, Melissa, just the PR, the way that these firms are so influential, the way they play roles in the network, developing strategy. We really get immersed, if you're watching commercial television, in all kinds of marketing and imagery and assertions that can at times be seen as greenwashing. But then on top of that, you know, at the very core, we're primed to fall into this, because we've got a lot of other things that we want to focus in on. We don't have to worry about it. And so when this sort of apparatus flows through, courses through the veins of our lives, we take it. We're ready for it. We just say, all right, let's focus in on other things. And so this apparatus gives us that opportunity to not have to face it.

Speaker 1: Yeah. You mentioned four key contrarian viewpoints that are amplified by the media, and I wonder if I could have you name those and talk about how you, or the research that you were looking at, honed in on those four.
Speaker 3: One really influential source for thinking about these four viewpoints, these kinds of arguments for climate delay, comes from the scholar William Lamb and collaborators. It's in the academic journal Global Sustainability, and it's really become a go-to source for a lot of us. They have this fantastic chart which summarizes the arguments, so it's very useful. So the first argument for climate delay is to suggest that someone else should take action first, so, in other words, redirecting responsibility for the problem away from themselves, which in this case would often be a fossil fuel company, onto somebody else. So the classic one is suggesting that consumers themselves, individuals themselves, should take actions to address climate change, you know, recycle, for instance. The second argument has to do with saying that we can't mitigate climate change; it's simply not possible. You have to just kind of give in to the fact that it's never going to happen. And you've seen a lot of arguments like that, especially by climate deniers, saying, you know, change is just impossible.
Speaker 3: We can't change our current way of life, we can't do that. Or saying it's basically a catastrophe, and, you know, doom and gloom, apocalypse, there's nothing to be done. There's a third argument, which is actually a little more insidious, where fossil fuel companies will push non-transformative solutions; that's how Lamb and his collaborators put it. And, you know, this one's insidious because it seems like solutions are just around the corner, but in fact these solutions are very problematic and often really backwards. One would be to say future technologies are going to solve the problem, so, you know, we don't have to worry now, because in the future we'll have AI solving climate change. These kinds of very vague, always future-focused ideas for how the problem will be solved. Another one is to say you can't restrict or regulate companies, because then they won't be able to come up with solutions for climate change, so you have to kind of give them time to figure it out. Yet again, this future-facing solutionism.
Speaker 3: Again, these are problematic because, back to what Max was saying a moment ago, they seem to make us feel better, and they make us feel like we don't have to do anything, that, you know, the companies have it. It's kind of the opposite of the redirect-responsibility frame, anyway. And then the last one is about emphasizing all of the problems with the solutions that are being proposed that are viable. So, for instance, you've got a solution in mind which involves social justice, but, you know, that means that vulnerable members of our society might have to bear the brunt of the solutions, so we can't do that. We need solutions that work for everyone; otherwise we shouldn't have solutions at all. So these discourse types, to be fancy, show this tendency among actors of disinformation to actually move away from overt denial, away from saying climate change isn't happening, it's not real, and toward more subtle tactics to just undo or delegitimate climate solutions, by saying that they're not going to work for a variety of reasons.
Speaker 2: This could be a good opportunity just to point out that Melissa and I were leading on this chapter, where we were essentially pulling together a lot of the current research to help us understand how this discourse is being steered through news, social media, advertising, public relations. And we led on this chapter, but Trav Cone, Maanelas, and Hannah Morris and Chris Russell also contributed to this. So it was a great opportunity for us to pull together this kind of research from William Lamb and colleagues and others to help provide a bit more of a cogent picture.

Speaker 1: Yeah, yeah, I love that study. The graphic is so simple, but it's so helpful. Okay, so you mentioned social media there. Obviously, the media itself has gone through quite a few pretty big transitions over the last couple of decades. How have those transitions contributed to the ability of corporations and PR firms and these other entities that you've mentioned to really kind of hijack the news even more, or push even legacy media to kind of amplify these sorts of talking points?
Speaker 3: Well, it will be no surprise to anyone if I say that in the Western hemisphere, at least in the global North, our media are in crisis, especially news media, and even more in trouble are legacy news media such as radio and newspapers, which are dealing with economic and political crises as well as extreme political polarization, which has shattered the sense among a lot of the public that our news outlets provide fact-based and reliable information. That's kind of the current picture. But it has been quite a while now that oil and gas producers and their collaborators have actively worked with both legacy news and social media platforms, as well as PR firms and advertising agencies, to promote disinformation that obstructs climate policy. Max talked about the news balance, but we've also seen how oil and gas companies do source-building with journalists to promote their industry point of view in news coverage. But then, of course, we also have social media, where there's less of an intermediary or gatekeeping function, so oil and gas companies can pretty much post whatever they want.
Speaker 3: Not to mention blogs, instant messaging. And because our mainstream media is under corporate control, and because their business model needs advertising revenue to stay afloat, it has also become harder and harder for news to remain separate from promotional content. And, you know, there's even more we could say. We talk about this in the chapter for the book. We have these newer kinds of threats with emerging digital technologies and social media, because now disinformation campaigns can operate at a massive scale and scope; they can target certain groups much more precisely. You have technology companies and their platforms that let powerful actors shape information in ways that lack accountability. And on top of everything else, we have the explosion of artificial intelligence, which is changing by the day and is incredibly difficult to detect and predict. Not to mention the environmental costs of AI use: the energy consumption generated by AI, the waste generated by AI systems, especially data centers, which contribute to greenhouse gas emissions and environmental pollution.
315 00:19:52,320 --> 00:19:54,040 Speaker 3: So we've got like a I was going to say 316 00:19:54,040 --> 00:19:56,160 Speaker 3: a triple threat, but I think this is like an 317 00:19:56,520 --> 00:19:57,440 Speaker 3: octuple threat. 318 00:19:57,520 --> 00:20:01,320 Speaker 2: Yeah, there are definitely several dynamics that and they're pointing 319 00:20:01,320 --> 00:20:04,639 Speaker 2: in discouraging directions. It means you've pointed out, Melissa, that 320 00:20:05,240 --> 00:20:09,040 Speaker 2: there's been increasing consolidation. Some of the media mergers and 321 00:20:09,119 --> 00:20:11,240 Speaker 2: acquisitions and everything that are going on just kind of 322 00:20:11,240 --> 00:20:15,280 Speaker 2: feel beyond us as everyday people. Skydance is merging with Paramount, 323 00:20:15,359 --> 00:20:17,200 Speaker 2: and what does that mean for Colbert and what does 324 00:20:17,200 --> 00:20:19,719 Speaker 2: that mean for the news? I get it's getting just 325 00:20:20,640 --> 00:20:23,800 Speaker 2: bigger and bigger all the time. And as we look around, 326 00:20:23,880 --> 00:20:28,040 Speaker 2: we can see that our local independent news are strained. 327 00:20:28,320 --> 00:20:30,800 Speaker 2: Just some of the recent decisions from the Trump administration 328 00:20:31,760 --> 00:20:34,480 Speaker 2: have put even further strain on these news sources. And 329 00:20:34,520 --> 00:20:37,840 Speaker 2: so we've across this country, across the world, we've got 330 00:20:38,040 --> 00:20:41,840 Speaker 2: many news deserts. So people are increasingly reliant on these 331 00:20:41,880 --> 00:20:45,399 Speaker 2: corporate media sources to make sense of what's going on 332 00:20:45,520 --> 00:20:47,720 Speaker 2: in the world, and that's through legacy media. 
As Melissa's 333 00:20:47,760 --> 00:20:51,119 Speaker 2: pointed out, once you get into social media, it gets even 334 00:20:51,160 --> 00:20:55,639 Speaker 2: more confusing with algorithms and the blending. Even some of 335 00:20:55,680 --> 00:20:58,320 Speaker 2: your work, Amy, has found its way into our chapter 336 00:20:58,400 --> 00:21:01,960 Speaker 2: where you've pointed out this confusion that can take place 337 00:21:02,000 --> 00:21:04,760 Speaker 2: between straight news reporting and some advertising that's going on 338 00:21:05,560 --> 00:21:10,240 Speaker 2: from these carbon-based industry players. And so while there 339 00:21:10,240 --> 00:21:14,440 Speaker 2: are definitely a lot of dynamics at play, it's discouraging 340 00:21:14,480 --> 00:21:15,080 Speaker 2: and challenging. 341 00:21:15,320 --> 00:21:18,360 Speaker 1: Yeah, I wonder if you've looked at or if there's 342 00:21:18,400 --> 00:21:20,879 Speaker 1: any research yet, or if it's maybe too early on 343 00:21:20,920 --> 00:21:26,480 Speaker 1: this around the way that influencers are sort of replacing reporters, 344 00:21:27,000 --> 00:21:29,520 Speaker 1: Like I see this all the time, even I have 345 00:21:29,560 --> 00:21:31,640 Speaker 1: an intern right now who was. 346 00:21:31,640 --> 00:21:34,240 Speaker 3: Like, do you know this guy blah blah blah. 347 00:21:34,280 --> 00:21:36,520 Speaker 1: I get all my news from him, and I looked 348 00:21:36,600 --> 00:21:39,200 Speaker 1: him up and he's like one of these people who 349 00:21:39,240 --> 00:21:44,160 Speaker 1: sort of reads headlines in videos, which is all well 350 00:21:44,160 --> 00:21:47,760 Speaker 1: and good, except that, like, if those people end up 351 00:21:47,760 --> 00:21:50,000 Speaker 1: getting all the funding and all the audience, then there 352 00:21:50,040 --> 00:21:53,879 Speaker 1: won't be any reporters writing the stuff for them to 353 00:21:53,920 --> 00:21:59,560 Speaker 1: read anymore.
I just see this huge shift in like 354 00:22:00,080 --> 00:22:06,600 Speaker 1: the credibility that people are ascribing to influencers for what 355 00:22:06,720 --> 00:22:08,719 Speaker 1: appears to be no rhyme or reason, and it's not 356 00:22:08,760 --> 00:22:10,840 Speaker 1: like these people are like, hey, listen to me, I'm 357 00:22:10,920 --> 00:22:14,080 Speaker 1: very credible for these reasons. It's just like some random 358 00:22:14,119 --> 00:22:15,200 Speaker 1: guy anyway. 359 00:22:15,359 --> 00:22:15,840 Speaker 3: I don't know. 360 00:22:15,800 --> 00:22:18,000 Speaker 1: If you guys have looked at that, but it's freaking 361 00:22:18,080 --> 00:22:18,360 Speaker 1: me out. 362 00:22:18,680 --> 00:22:18,879 Speaker 4: You know. 363 00:22:19,040 --> 00:22:22,520 Speaker 3: That problem first became really clear to me when I 364 00:22:22,600 --> 00:22:26,159 Speaker 3: was teaching an undergraduate class on media and politics a 365 00:22:26,200 --> 00:22:29,639 Speaker 3: few years ago and we had this assignment that I 366 00:22:29,680 --> 00:22:33,040 Speaker 3: gave them where I said the students had to identify 367 00:22:33,080 --> 00:22:36,800 Speaker 3: the source of the media they were looking at, and 368 00:22:37,160 --> 00:22:40,480 Speaker 3: their answers were things like Facebook or well, probably not, 369 00:22:40,680 --> 00:22:44,200 Speaker 3: because nobody under the age of twenty five looks at Facebook, 370 00:22:44,200 --> 00:22:48,000 Speaker 3: but you know, social media were listed as the source 371 00:22:48,080 --> 00:22:50,720 Speaker 3: for these students, and I said, no, no, I don't 372 00:22:50,800 --> 00:22:53,840 Speaker 3: mean where you saw it. I mean the original source. 373 00:22:54,440 --> 00:22:56,720 Speaker 3: And they really didn't know what I was talking about.
374 00:22:57,040 --> 00:23:01,000 Speaker 3: And that's when it hit me that the way our 375 00:23:01,119 --> 00:23:05,359 Speaker 3: media platforms are organized, the way we access information online 376 00:23:06,280 --> 00:23:10,080 Speaker 3: is really not the way people of my generation 377 00:23:10,480 --> 00:23:15,199 Speaker 3: used to access or understand it. So the source is not 378 00:23:15,600 --> 00:23:20,960 Speaker 3: nearly as relevant to younger people as the content itself 379 00:23:21,000 --> 00:23:23,959 Speaker 3: and the way it's presented. So that you know, if 380 00:23:24,000 --> 00:23:27,040 Speaker 3: we're talking about a space where influencers come to have 381 00:23:27,119 --> 00:23:31,240 Speaker 3: a lot more credibility, that really does have to do 382 00:23:31,359 --> 00:23:34,120 Speaker 3: with what's getting put in front of people, and that 383 00:23:34,240 --> 00:23:37,600 Speaker 3: in itself has to do with how algorithms work, and 384 00:23:37,640 --> 00:23:42,000 Speaker 3: how prior usage patterns work, and how the location of 385 00:23:42,040 --> 00:23:44,560 Speaker 3: the individual works. We know about all of these data 386 00:23:44,560 --> 00:23:47,280 Speaker 3: points that inform what kind of information is put in 387 00:23:47,280 --> 00:23:51,679 Speaker 3: front of people, and so an influencer who might have caught 388 00:23:51,720 --> 00:23:55,240 Speaker 3: your attention at some other time for some other completely 389 00:23:55,240 --> 00:23:59,480 Speaker 3: different topic suddenly seems like somebody much more reliable and 390 00:23:59,520 --> 00:24:03,000 Speaker 3: trustworthy than a source you don't know. And that 391 00:24:03,440 --> 00:24:08,480 Speaker 3: proximity in time and space to you is really problematic 392 00:24:08,520 --> 00:24:14,800 Speaker 3: when it comes to information about complex, scientific and highly 393 00:24:14,840 --> 00:24:16,480 Speaker 3: polarized issues like climate change.
394 00:24:17,320 --> 00:24:18,040 Speaker 1: Yeah. 395 00:24:18,440 --> 00:24:21,800 Speaker 2: Yeah, yeah, I'm finding that too with students I work 396 00:24:21,840 --> 00:24:25,040 Speaker 2: with, with my kids. As they're getting older, you know, they'll say, oh, 397 00:24:25,440 --> 00:24:30,600 Speaker 2: same news sources, news sources, YouTube, news sources, TikTok, right, 398 00:24:30,720 --> 00:24:34,240 Speaker 2: And so my experiences have been similar. And that's a 399 00:24:34,280 --> 00:24:37,639 Speaker 2: demographic shift that we ought to be talking more about. 400 00:24:38,000 --> 00:24:40,840 Speaker 2: But on top of that, I'd say, relevant to you know, 401 00:24:40,880 --> 00:24:43,080 Speaker 2: what we're talking about here as well, is that there's 402 00:24:43,119 --> 00:24:47,280 Speaker 2: been documentation that now oil companies have been hiring these 403 00:24:47,280 --> 00:24:52,160 Speaker 2: TikTok influencers to court young people. Yeah, they are the pathways forward, 404 00:24:52,200 --> 00:24:56,480 Speaker 2: and so the savvy PR firms, advertising firms, carbon based 405 00:24:56,520 --> 00:25:00,920 Speaker 2: industry actors themselves are getting into this. So what 406 00:25:00,960 --> 00:25:05,119 Speaker 2: we're sensing together is definitely something that's deliberate and that 407 00:25:05,200 --> 00:25:06,040 Speaker 2: is going on. 408 00:25:06,800 --> 00:25:07,240 Speaker 3: That's right. 409 00:25:07,600 --> 00:25:12,240 Speaker 1: Yeah, you had some really really interesting references to new 410 00:25:12,280 --> 00:25:17,639 Speaker 1: research around this kind of persistent intersection between a certain 411 00:25:17,680 --> 00:25:22,800 Speaker 1: type of conservative identity and climate contrarianism.
I know, the 412 00:25:22,800 --> 00:25:24,680 Speaker 1: first paper I ever read on that that I still 413 00:25:24,720 --> 00:25:26,960 Speaker 1: love because I think it's the funniest title ever is 414 00:25:27,000 --> 00:25:30,399 Speaker 1: the Cool Dudes paper from McCright and Dunlap. 415 00:25:31,080 --> 00:25:33,200 Speaker 3: But you have some really new stuff that's really interesting. 416 00:25:33,280 --> 00:25:36,280 Speaker 1: I would love to have you guys just unpack that 417 00:25:36,560 --> 00:25:41,440 Speaker 1: and why this is such a persistent connection between this 418 00:25:41,480 --> 00:25:43,920 Speaker 1: particular identity and climate contrarianism. 419 00:25:46,480 --> 00:25:50,359 Speaker 2: We do get into some of the contours of what's 420 00:25:50,400 --> 00:25:55,199 Speaker 2: been developing more recently, new research in that area. There's a 421 00:25:55,240 --> 00:25:59,040 Speaker 2: whole group of work by Riley Dunlap and Aaron McCright 422 00:25:59,080 --> 00:26:02,800 Speaker 2: that's been really pathbreaking in the early days 423 00:26:02,800 --> 00:26:07,080 Speaker 2: of getting a handle on who are these climate contrarians, 424 00:26:07,160 --> 00:26:11,800 Speaker 2: what are their motives? How are they effectively obstructing climate action, 425 00:26:12,040 --> 00:26:15,399 Speaker 2: climate policy action, greater engagement. And when you refer to 426 00:26:15,400 --> 00:26:19,919 Speaker 2: that, Amy, the Cool Dudes piece just refers to maybe 427 00:26:20,160 --> 00:26:24,920 Speaker 2: colloquially older white guys who are very influential in these spaces. 428 00:26:25,200 --> 00:26:28,679 Speaker 2: And some of them, you know, are microblogging on Twitter 429 00:26:28,920 --> 00:26:32,040 Speaker 2: X, even Bluesky, and Mastodon for a 430 00:26:32,080 --> 00:26:35,879 Speaker 2: little bit there.
They're, you know, sometimes maybe just sitting 431 00:26:35,920 --> 00:26:39,840 Speaker 2: in their basements in their comfortable suburban homes. And these 432 00:26:39,840 --> 00:26:42,560 Speaker 2: folks demographically may not be at the forefront of climate 433 00:26:42,560 --> 00:26:46,400 Speaker 2: impacts like many others. And so there's been research that's 434 00:26:46,440 --> 00:26:49,000 Speaker 2: followed on theirs. I've done some myself where I've gone 435 00:26:49,040 --> 00:26:52,600 Speaker 2: to the Heartland Institute meetings about a decade apart and 436 00:26:52,760 --> 00:26:54,520 Speaker 2: sought to make sense of who they are, what are 437 00:26:54,560 --> 00:26:57,280 Speaker 2: their motivations. So that finds its way into our chapter. 438 00:26:57,840 --> 00:27:00,720 Speaker 2: Another kind of layer of this is that we've 439 00:27:00,760 --> 00:27:03,320 Speaker 2: documented in this chapter a lot of good research that's 440 00:27:03,359 --> 00:27:06,600 Speaker 2: been done that's been tracking how it is much more 441 00:27:06,640 --> 00:27:12,280 Speaker 2: evident and influential within certain countries with ties to the fossil 442 00:27:12,280 --> 00:27:17,000 Speaker 2: fuel industry and reliance on fossil fuels. Matthew Hornsey and 443 00:27:17,080 --> 00:27:20,040 Speaker 2: colleagues have done some really good work that compares and 444 00:27:20,080 --> 00:27:24,199 Speaker 2: contrasts these discourses across countries.
So there's places like the 445 00:27:24,280 --> 00:27:26,320 Speaker 2: United States, which is the belly of the beast in a 446 00:27:26,359 --> 00:27:31,600 Speaker 2: certain way, the UK, Australia, certain elements of Brazil around 447 00:27:31,640 --> 00:27:36,160 Speaker 2: food consumption and fossil fuels, they have much more vocal 448 00:27:36,440 --> 00:27:43,160 Speaker 2: and I guess legible, audible climate contrarian groups and collections, which 449 00:27:43,160 --> 00:27:47,359 Speaker 2: can be, you know, cacophonous at times, but the overall 450 00:27:48,080 --> 00:27:52,840 Speaker 2: goals of distracting, delaying, denying the facts and the evidence 451 00:27:52,880 --> 00:27:55,399 Speaker 2: around this, there are through lines there. 452 00:27:55,720 --> 00:27:59,080 Speaker 1: Yeah, totally. I didn't realize there was actual research on 453 00:27:59,119 --> 00:28:02,240 Speaker 1: this that amongst the developed countries or more developed countries, 454 00:28:02,280 --> 00:28:05,040 Speaker 1: that the UK and the US really stand out as 455 00:28:05,080 --> 00:28:08,919 Speaker 1: being places where media really continues to give a platform 456 00:28:09,000 --> 00:28:11,840 Speaker 1: to climate skepticism. So again, I'd love to have you 457 00:28:11,880 --> 00:28:15,160 Speaker 1: kind of talk about the research on that and share 458 00:28:15,200 --> 00:28:17,199 Speaker 1: your thoughts on why that might be. 459 00:28:17,880 --> 00:28:21,480 Speaker 3: I do think a big part of that just you know, again, 460 00:28:21,480 --> 00:28:23,840 Speaker 3: I think we have to understand the background or the 461 00:28:23,920 --> 00:28:26,960 Speaker 3: structural or systemic issue here.
So I do think a 462 00:28:26,960 --> 00:28:30,520 Speaker 3: big part of that has to do with the ownership 463 00:28:30,600 --> 00:28:34,199 Speaker 3: structure of so many news outlets in the UK and 464 00:28:34,240 --> 00:28:37,399 Speaker 3: the US, and so the question of who owns 465 00:28:37,440 --> 00:28:41,760 Speaker 3: the media becomes really salient here. But another issue that 466 00:28:42,040 --> 00:28:44,840 Speaker 3: affects the way we think about how the UK and 467 00:28:44,880 --> 00:28:48,320 Speaker 3: the US stand out as giving a platform for skepticism 468 00:28:48,400 --> 00:28:51,240 Speaker 3: is simply that there's a lot more research out there 469 00:28:51,400 --> 00:28:54,520 Speaker 3: about the UK and the US than about other parts 470 00:28:54,560 --> 00:28:57,480 Speaker 3: of the world. So it's not just that American and 471 00:28:57,520 --> 00:29:01,880 Speaker 3: British media platforms enable skepticism, which they do, but that 472 00:29:01,960 --> 00:29:04,680 Speaker 3: we know much more about it there than we do 473 00:29:04,880 --> 00:29:07,560 Speaker 3: about what's going on in other places. And I really 474 00:29:07,640 --> 00:29:10,920 Speaker 3: want to underline that as a very big challenge when 475 00:29:11,080 --> 00:29:14,720 Speaker 3: you're writing, as we did, a contribution to 476 00:29:14,760 --> 00:29:18,239 Speaker 3: this book, a Global Assessment of Climate Obstruction.
477 00:29:18,800 --> 00:29:21,440 Speaker 3: I think one takeaway that we had from working on 478 00:29:21,480 --> 00:29:26,000 Speaker 3: this chapter was that we really need better and more 479 00:29:26,040 --> 00:29:29,960 Speaker 3: sustained research all over the world to recognize how some 480 00:29:30,040 --> 00:29:35,400 Speaker 3: of the climate skeptic frames and platforms and styles are 481 00:29:35,440 --> 00:29:38,479 Speaker 3: being exported to other parts of the world, or how 482 00:29:38,960 --> 00:29:41,400 Speaker 3: various kinds of climate skepticism are being imported from other parts of the world. 483 00:29:41,640 --> 00:29:44,160 Speaker 3: But we just don't know 484 00:29:44,200 --> 00:29:46,280 Speaker 3: as much as we need to in some parts of 485 00:29:46,280 --> 00:29:46,680 Speaker 3: the world. 486 00:29:46,920 --> 00:29:54,760 Speaker 1: Yeah, we've been working more and more with reporters in Uganda, Tanzania, Mozambique, 487 00:29:54,880 --> 00:29:57,920 Speaker 1: and especially in Uganda and Tanzania. I was really surprised 488 00:29:57,960 --> 00:30:01,640 Speaker 1: at how much it had been like really drilled into 489 00:30:01,680 --> 00:30:05,280 Speaker 1: them the like very old school false equivalence thing, you 490 00:30:05,320 --> 00:30:09,560 Speaker 1: know of, like if you quote a scientist about climate change, 491 00:30:09,560 --> 00:30:12,240 Speaker 1: then you have to talk to I don't know, like 492 00:30:12,320 --> 00:30:17,320 Speaker 1: the government person in charge of petroleum contractors too.
493 00:30:18,320 --> 00:30:20,120 Speaker 3: I was just going to mention something that one of 494 00:30:20,120 --> 00:30:23,520 Speaker 3: our co-authors, Myanna Lahsen, has been really focusing 495 00:30:23,520 --> 00:30:26,720 Speaker 3: on that relates to this question, which is about how 496 00:30:27,640 --> 00:30:32,400 Speaker 3: emerging analyses of climate coverage in places like Brazil might 497 00:30:32,520 --> 00:30:36,560 Speaker 3: make few references to climate skepticism. But that could be 498 00:30:36,560 --> 00:30:40,200 Speaker 3: because they're not focusing on fossil fuel emissions, but they 499 00:30:40,240 --> 00:30:44,240 Speaker 3: are focusing on things like carbon pollution from agriculture or 500 00:30:44,280 --> 00:30:47,800 Speaker 3: other land use practices. So I think that's just something 501 00:30:47,840 --> 00:30:51,760 Speaker 3: else to keep in mind when we're talking about climate skepticism. 502 00:30:51,840 --> 00:30:56,320 Speaker 3: Sometimes certain stories might fall outside of the 503 00:30:56,360 --> 00:31:01,080 Speaker 3: purview of research on climate skepticism, but it is climate skepticism. 504 00:31:01,120 --> 00:31:03,440 Speaker 3: It just might look different in the news piece. 505 00:31:04,680 --> 00:31:10,400 Speaker 2: Yeah, Melissa, you make great points, and just referencing Myanna, 506 00:31:10,440 --> 00:31:13,960 Speaker 2: she's an example; you know, she has been 507 00:31:14,040 --> 00:31:16,880 Speaker 2: really influential in helping tell these stories out of Brazil. 508 00:31:17,600 --> 00:31:21,600 Speaker 2: And we need many more researchers like Myanna and others 509 00:31:21,680 --> 00:31:24,920 Speaker 2: to come together so that we can tell these stories 510 00:31:25,320 --> 00:31:29,080 Speaker 2: more comprehensively.
So this book itself, you know, our chapter 511 00:31:29,160 --> 00:31:32,720 Speaker 2: contribution to it, as a larger project, is trying to take 512 00:31:32,720 --> 00:31:36,800 Speaker 2: a step forward in more systematically understanding how these dynamics 513 00:31:36,840 --> 00:31:40,040 Speaker 2: play out in various times and spaces. But 514 00:31:41,120 --> 00:31:45,720 Speaker 2: it is stunning, you know, given the amount of funding 515 00:31:46,120 --> 00:31:50,320 Speaker 2: and capacity that there is within these PR agencies, within advertising, 516 00:31:50,360 --> 00:31:53,800 Speaker 2: within news reporting that shapes the stories that we get 517 00:31:53,840 --> 00:31:57,600 Speaker 2: on a daily basis, that we don't have a larger, 518 00:31:57,840 --> 00:32:01,840 Speaker 2: more coordinated effort to understand all these dynamics from a 519 00:32:01,840 --> 00:32:05,400 Speaker 2: social sciences perspective, from the humanities, and how that relates to 520 00:32:05,440 --> 00:32:07,000 Speaker 2: the natural physical sciences. 521 00:32:07,360 --> 00:32:11,000 Speaker 1: The most recent IPCC report was the first to include 522 00:32:11,040 --> 00:32:14,959 Speaker 1: social sciences and the first to really note this stuff 523 00:32:15,000 --> 00:32:19,360 Speaker 1: as a huge blocker to policy action. And Max, I 524 00:32:19,400 --> 00:32:23,000 Speaker 1: know I've talked to you about this before, so I 525 00:32:23,000 --> 00:32:25,320 Speaker 1: won't have you repeat like an hour's worth of stuff. 526 00:32:25,320 --> 00:32:27,880 Speaker 1: But yeah, I just am curious, like, why do you 527 00:32:27,880 --> 00:32:32,200 Speaker 1: think it took so long for the IPCC to start 528 00:32:32,240 --> 00:32:35,000 Speaker 1: looking at this as part of the problem. 529 00:32:35,520 --> 00:32:39,240 Speaker 2: Yeah, I think it is a good question.
As we've 530 00:32:39,320 --> 00:32:42,959 Speaker 2: grown to understand more about climate change, it's not a 531 00:32:43,000 --> 00:32:46,880 Speaker 2: single issue. We're increasingly recognizing that it's a set of 532 00:32:46,880 --> 00:32:51,440 Speaker 2: intersecting challenges that flow through every aspect of our lives, 533 00:32:51,520 --> 00:32:55,120 Speaker 2: meeting our livelihood needs and everything else, and in so doing, 534 00:32:55,160 --> 00:32:59,600 Speaker 2: we need to understand it from not just physical, biological perspectives, 535 00:32:59,640 --> 00:33:04,920 Speaker 2: but human behavior perspectives. And there have been social scientists 536 00:33:04,920 --> 00:33:07,160 Speaker 2: that have come on board and been a part of 537 00:33:07,160 --> 00:33:10,400 Speaker 2: some of the previous IPCC reports, but this really was 538 00:33:10,520 --> 00:33:12,480 Speaker 2: a big step forward with some of us that were 539 00:33:12,480 --> 00:33:16,120 Speaker 2: invited along. So I was a contributing author to that 540 00:33:16,800 --> 00:33:20,400 Speaker 2: most recent IPCC report. And so when you ask that question, 541 00:33:20,480 --> 00:33:22,240 Speaker 2: I mean, I think the kind of simple answer is 542 00:33:22,840 --> 00:33:25,920 Speaker 2: because there were folks like me included to 543 00:33:25,960 --> 00:33:30,680 Speaker 2: bring the very thoroughly researched work that we're talking about 544 00:33:30,680 --> 00:33:33,920 Speaker 2: in this chapter and this larger book into the pages 545 00:33:33,960 --> 00:33:36,680 Speaker 2: of the IPCC.
But I think a bigger part of 546 00:33:36,680 --> 00:33:40,560 Speaker 2: that is just that more folks from different disciplines, seeing 547 00:33:40,560 --> 00:33:43,880 Speaker 2: this as an integrative, interdisciplinary set of challenges, are being 548 00:33:43,960 --> 00:33:47,400 Speaker 2: invited in to help provide those insights through the thoroughly 549 00:33:47,480 --> 00:33:50,840 Speaker 2: researched work that's going on out there, and so it 550 00:33:50,880 --> 00:33:53,600 Speaker 2: is a step forward. You know, they're selecting authors for 551 00:33:53,680 --> 00:33:56,760 Speaker 2: the next one. We could possibly debate for a full 552 00:33:56,800 --> 00:34:00,440 Speaker 2: hour about the wisdom of another big assessment report years 553 00:34:00,440 --> 00:34:02,640 Speaker 2: out, while we need to do a lot in the interim. 554 00:34:03,160 --> 00:34:05,880 Speaker 2: But having said that, they're selecting authors for the next one, 555 00:34:05,920 --> 00:34:10,680 Speaker 2: I hope that continues to flow into these pages. And 556 00:34:11,320 --> 00:34:13,640 Speaker 2: the work that we talk about in the chapter that 557 00:34:13,680 --> 00:34:17,759 Speaker 2: we capture that's in the larger reports themselves actually didn't 558 00:34:17,800 --> 00:34:21,719 Speaker 2: make it into the summary for policy makers, and that too, 559 00:34:21,719 --> 00:34:23,480 Speaker 2: we could talk about for quite a while. But there's 560 00:34:23,480 --> 00:34:25,080 Speaker 2: still a lot of work to do to make this 561 00:34:25,200 --> 00:34:28,759 Speaker 2: much more legible and to make this a larger conversation 562 00:34:28,920 --> 00:34:29,880 Speaker 2: through the IPCC. 563 00:34:30,120 --> 00:34:32,080 Speaker 1: Okay, Melissa, I feel like this is going to be 564 00:34:32,120 --> 00:34:35,080 Speaker 1: your best question, which is about the PR firms. 565 00:34:35,120 --> 00:34:35,680 Speaker 3: I love this.
566 00:34:35,640 --> 00:34:40,040 Speaker 1: Description of them as sort of the glue joining carbon 567 00:34:40,080 --> 00:34:44,880 Speaker 1: based industries, media organizations, and economic sectors using coalitions, campaigns, 568 00:34:44,880 --> 00:34:48,320 Speaker 1: and other coordinated processes to question the efficacy of science, 569 00:34:48,440 --> 00:34:53,600 Speaker 1: news and institutions engaging climate related issues. Again, this is 570 00:34:53,640 --> 00:34:57,560 Speaker 1: one I know you could spend many hours talking about. 571 00:34:58,719 --> 00:34:59,960 Speaker 1: Can I have you give a little bit of an 572 00:35:00,080 --> 00:35:03,640 Speaker 1: overview of the role that PR firms play here. If 573 00:35:03,680 --> 00:35:05,919 Speaker 1: you want to talk about advertising firms too, that's fine. 574 00:35:05,960 --> 00:35:08,040 Speaker 1: Maybe the difference between them, because I feel like they 575 00:35:08,080 --> 00:35:11,200 Speaker 1: get conflated a lot. Maybe there's not that much difference 576 00:35:11,239 --> 00:35:13,719 Speaker 1: between them anymore. I don't know, anyway, over to you. 577 00:35:14,360 --> 00:35:18,600 Speaker 3: So, the most important thing I learned when I started 578 00:35:18,920 --> 00:35:22,000 Speaker 3: researching the role of ad agencies and PR firms in 579 00:35:22,120 --> 00:35:27,600 Speaker 3: climate disinformation is that these firms are not just mouthpieces 580 00:35:27,600 --> 00:35:31,279 Speaker 3: for their fossil fuel clients. They are coming up with 581 00:35:31,320 --> 00:35:35,799 Speaker 3: the strategies, they are creating messages, they are developing the 582 00:35:35,840 --> 00:35:40,440 Speaker 3: relationships with news media and other organizations who are friendly 583 00:35:40,480 --> 00:35:44,840 Speaker 3: to their cause.
For a long time, climate researchers only 584 00:35:44,880 --> 00:35:49,759 Speaker 3: focused on the fossil fuel companies themselves, and that was 585 00:35:50,040 --> 00:35:54,160 Speaker 3: relevant and necessary. So we now have a lot more 586 00:35:54,239 --> 00:35:57,719 Speaker 3: information about what Exxon does, or what Shell does, or 587 00:35:57,760 --> 00:36:02,800 Speaker 3: what BP does. But lately journalists and researchers have started 588 00:36:02,840 --> 00:36:07,160 Speaker 3: paying a lot more attention to the consultants and strategists 589 00:36:07,239 --> 00:36:10,080 Speaker 3: who come up with the positioning for these fossil fuel 590 00:36:10,120 --> 00:36:14,920 Speaker 3: companies and who play key roles in managing these companies' 591 00:36:15,040 --> 00:36:19,799 Speaker 3: media image and their marketing and strategic plans. I do 592 00:36:19,840 --> 00:36:22,759 Speaker 3: want to mention that there is a very important pioneer 593 00:36:22,840 --> 00:36:25,880 Speaker 3: of this kind of research whose name is John Stauber 594 00:36:26,080 --> 00:36:30,040 Speaker 3: of the Center for Media and Democracy in Wisconsin. He 595 00:36:30,160 --> 00:36:32,680 Speaker 3: founded that center back in nineteen ninety three, and in 596 00:36:32,760 --> 00:36:35,879 Speaker 3: nineteen ninety five he wrote this book which I think 597 00:36:35,880 --> 00:36:39,160 Speaker 3: has the best name that any book like this could have, 598 00:36:39,200 --> 00:36:42,359 Speaker 3: which is called Toxic Sludge Is Good for You, which 599 00:36:42,480 --> 00:36:47,680 Speaker 3: was an exposé of the PR firms involved in environmental disinformation. 600 00:36:48,400 --> 00:36:51,000 Speaker 3: And I mentioned that because that really was a very 601 00:36:51,040 --> 00:36:53,719 Speaker 3: important guide that came out at a time when most 602 00:36:53,760 --> 00:36:56,279 Speaker 3: of us were looking the other way.
It really has 603 00:36:56,320 --> 00:37:00,600 Speaker 3: helped later researchers and journalists think about this. So a 604 00:37:00,640 --> 00:37:04,000 Speaker 3: second major finding that we have about the role of 605 00:37:04,239 --> 00:37:08,879 Speaker 3: PR firms and ad agencies in climate disinformation is that 606 00:37:08,920 --> 00:37:13,240 Speaker 3: these firms are embedded in a very wide ecosystem of influence. 607 00:37:13,400 --> 00:37:16,640 Speaker 3: We mentioned this earlier in our conversation, but it's just 608 00:37:16,719 --> 00:37:20,240 Speaker 3: important to remember PR firms and ad agencies are doing 609 00:37:20,320 --> 00:37:24,040 Speaker 3: work for not only fossil fuel companies, but also their 610 00:37:24,120 --> 00:37:29,959 Speaker 3: trade associations, industry councils and science advisory councils, for think 611 00:37:30,000 --> 00:37:37,080 Speaker 3: tanks and research institutes, ENGOs, foundations, chambers of commerce, organizational boards. 612 00:37:37,280 --> 00:37:42,279 Speaker 3: I mean, the list goes on, and the important thing 613 00:37:42,320 --> 00:37:45,840 Speaker 3: that really came to me is that these firms play 614 00:37:46,040 --> 00:37:50,239 Speaker 3: several roles in keeping this network together. This is the 615 00:37:50,280 --> 00:37:53,920 Speaker 3: glue part. One of the things that these PR firms 616 00:37:54,000 --> 00:37:59,520 Speaker 3: and other consultants do is intelligence gathering across different industries 617 00:38:00,040 --> 00:38:03,919 Speaker 3: as well as inside the environmental community. So you'll have 618 00:38:04,160 --> 00:38:08,680 Speaker 3: PR firms hiring people who used to work at government 619 00:38:08,760 --> 00:38:13,719 Speaker 3: agencies working on environmental issues or from other organizations that 620 00:38:13,840 --> 00:38:19,160 Speaker 3: influence public policy on environment or climate and energy issues.
621 00:38:20,160 --> 00:38:23,600 Speaker 3: Another thing these PR firms will do is conduct industry 622 00:38:23,680 --> 00:38:28,480 Speaker 3: friendly research that helps clients promote their viewpoints in the media. 623 00:38:29,120 --> 00:38:35,320 Speaker 3: So there's a massive production of scientific material, legal material, 624 00:38:35,440 --> 00:38:39,880 Speaker 3: and technical material to circulate in the media. So, like 625 00:38:39,920 --> 00:38:42,680 Speaker 3: you said, Amy, I could really go on for several 626 00:38:42,719 --> 00:38:45,520 Speaker 3: more hours about this, but I just really want to 627 00:38:45,719 --> 00:38:49,879 Speaker 3: communicate that these firms are not just about spin. There's 628 00:38:49,920 --> 00:38:53,880 Speaker 3: so much more going on and they deserve all of 629 00:38:53,920 --> 00:38:56,080 Speaker 3: the attention that we've been trying to give to them 630 00:38:56,080 --> 00:38:57,080 Speaker 3: over the last few years. 631 00:38:57,280 --> 00:38:59,120 Speaker 2: Yeah, Melissa has written up a lot of great research 632 00:38:59,160 --> 00:39:02,040 Speaker 2: on this in a book that is worth checking out 633 00:39:02,080 --> 00:39:05,520 Speaker 2: for all your listeners. Yes, just to maybe animate this 634 00:39:05,640 --> 00:39:09,640 Speaker 2: with a very recent example. You know, Edelman, big influential 635 00:39:10,400 --> 00:39:15,520 Speaker 2: ad agency that's helped promote fossil fuels with various companies 636 00:39:15,560 --> 00:39:18,799 Speaker 2: as clients, has just been hired by COP thirty in 637 00:39:18,880 --> 00:39:21,880 Speaker 2: Brazil to help with the strategy. So these things are happening 638 00:39:21,880 --> 00:39:23,080 Speaker 2: and circulating all the time. 639 00:39:23,600 --> 00:39:28,120 Speaker 1: Yeah, yeah, I know, I was just reading about that.
Okay, 640 00:39:28,160 --> 00:39:30,720 Speaker 1: we talked a little bit about social media and digital 641 00:39:31,000 --> 00:39:34,160 Speaker 1: but can I have you hone in on the digital 642 00:39:34,200 --> 00:39:38,000 Speaker 1: platforms in particular and sort of structurally, what is going 643 00:39:38,040 --> 00:39:42,720 Speaker 1: on with them with respect to spreading disinformation around and 644 00:39:42,760 --> 00:39:44,520 Speaker 1: what are people trying to do. 645 00:39:46,440 --> 00:39:52,200 Speaker 3: So, I mean, yes, digital platforms are very troubling sources 646 00:39:52,239 --> 00:39:57,080 Speaker 3: of climate disinformation. In fact, some have argued that digital 647 00:39:57,120 --> 00:40:02,560 Speaker 3: platforms are actually completely complicit in spreading climate denial because 648 00:40:02,600 --> 00:40:06,840 Speaker 3: they do such a terrible job of monitoring the content 649 00:40:07,640 --> 00:40:10,600 Speaker 3: on their sites in the name of you know, they 650 00:40:10,600 --> 00:40:13,920 Speaker 3: want to avoid so called censorship, or they say they're 651 00:40:13,960 --> 00:40:16,960 Speaker 3: just technology companies and not media companies, so they're not 652 00:40:17,080 --> 00:40:21,040 Speaker 3: responsible for the content on their sites. They also take 653 00:40:21,080 --> 00:40:24,480 Speaker 3: ads from fossil fuel companies. We have one report we 654 00:40:24,560 --> 00:40:28,240 Speaker 3: talk about in the chapter that estimates that Google alone 655 00:40:28,960 --> 00:40:33,760 Speaker 3: received twenty three point seven million dollars between twenty twenty 656 00:40:33,960 --> 00:40:37,520 Speaker 3: and twenty twenty two from the five largest oil companies 657 00:40:37,560 --> 00:40:41,239 Speaker 3: in the world to promote their advertising. 
We can also 658 00:40:41,880 --> 00:40:47,319 Speaker 3: really level our sights at Meta slash Facebook, which has 659 00:40:47,480 --> 00:40:52,680 Speaker 3: overridden its own independent fact checkers for climate science. It 660 00:40:52,719 --> 00:40:57,759 Speaker 3: has also allowed fossil fuel companies to purchase misleading ads. 661 00:40:57,800 --> 00:41:02,239 Speaker 3: And another real problem that is posed by companies like 662 00:41:02,280 --> 00:41:06,720 Speaker 3: Facebook is that they have a data sharing tool called 663 00:41:06,880 --> 00:41:11,000 Speaker 3: CrowdTangle, which is traditionally relied upon by journalists and 664 00:41:11,120 --> 00:41:16,319 Speaker 3: academics to analyze engagement with content on the platform, and 665 00:41:16,520 --> 00:41:20,040 Speaker 3: they keep tweaking the data sharing tool so that it's 666 00:41:20,160 --> 00:41:24,560 Speaker 3: less transparent and less useful for journalists and academics. So 667 00:41:24,920 --> 00:41:29,040 Speaker 3: it's very hard to get a long term view of 668 00:41:29,120 --> 00:41:32,680 Speaker 3: how engagement is changing over time, and it's very difficult 669 00:41:32,760 --> 00:41:37,080 Speaker 3: to understand all the demographics and other data points that 670 00:41:37,200 --> 00:41:42,840 Speaker 3: researchers need. So yeah, it really 671 00:41:42,840 --> 00:41:45,680 Speaker 3: really feels like the Wild West. There's a group called 672 00:41:45,719 --> 00:41:50,440 Speaker 3: Climate Action Against Disinformation, CAAD, which has done a lot 673 00:41:50,440 --> 00:41:54,360 Speaker 3: of really interesting research in this area, and they've talked 674 00:41:54,360 --> 00:42:01,480 Speaker 3: about how really the four big platforms of TikTok, Meta, YouTube, 675 00:42:01,560 --> 00:42:05,640 Speaker 3: and X have become complicit in the spread of climate denial.
676 00:42:06,440 --> 00:42:09,880 Speaker 3: And out of those four, they put X last among 677 00:42:09,920 --> 00:42:15,319 Speaker 3: platforms because of their absolute total absence of policies on 678 00:42:15,480 --> 00:42:21,879 Speaker 3: climate disinformation, failing to effectively enforce whatever policies they do have, 679 00:42:22,520 --> 00:42:24,680 Speaker 3: and total lack of public transparency. 680 00:42:24,920 --> 00:42:30,560 Speaker 2: That's a pretty comprehensive answer. It's nice listening 681 00:42:30,600 --> 00:42:34,080 Speaker 2: to what you've laid out, Melissa. I mean, even if 682 00:42:34,120 --> 00:42:38,839 Speaker 2: we were to say there's no deliberate malintent with all 683 00:42:38,880 --> 00:42:42,919 Speaker 2: these big companies, even just, as you had pointed out, 684 00:42:42,920 --> 00:42:46,600 Speaker 2: the absence of policies on disinformation, the way in 685 00:42:46,680 --> 00:42:49,040 Speaker 2: which, you know, at the start of the year, Zuckerberg 686 00:42:49,080 --> 00:42:51,240 Speaker 2: said that they're getting rid of their fact checking 687 00:42:51,280 --> 00:42:56,080 Speaker 2: policy and just starting some community notes, I 688 00:42:56,120 --> 00:43:00,680 Speaker 2: should say, just the misinformation that proliferates because of 689 00:43:00,719 --> 00:43:05,280 Speaker 2: the influence of these platforms and the lack of intentional fact checking 690 00:43:05,400 --> 00:43:07,200 Speaker 2: amongst them is really troubling. 691 00:43:08,600 --> 00:43:10,879 Speaker 1: I wonder if you both or either of you has 692 00:43:10,960 --> 00:43:17,640 Speaker 1: a response to this very concerning trend of people recasting 693 00:43:17,719 --> 00:43:22,680 Speaker 1: fact checking as censorship, and like how that plays into 694 00:43:22,719 --> 00:43:23,320 Speaker 1: this stuff.
695 00:43:23,840 --> 00:43:26,440 Speaker 2: I may not answer that head on, but I think 696 00:43:26,920 --> 00:43:30,759 Speaker 2: part of how we've then gotten into these places is 697 00:43:30,800 --> 00:43:35,399 Speaker 2: that people like those within the academic research community haven't 698 00:43:35,440 --> 00:43:40,120 Speaker 2: advocated adequately for facts and evidence and truths as they 699 00:43:41,120 --> 00:43:45,040 Speaker 2: come about. And so that is a form of advocacy. 700 00:43:45,080 --> 00:43:47,760 Speaker 2: It's advocacy of a sort. It's not advocating for particular 701 00:43:47,760 --> 00:43:51,920 Speaker 2: policy actions or advocating for certain outcomes. But in the 702 00:43:51,960 --> 00:43:55,600 Speaker 2: absence of that, we've allowed ourselves to devolve into this 703 00:43:55,880 --> 00:44:00,560 Speaker 2: place where fact checking can be seen as, like you say, 704 00:44:00,680 --> 00:44:04,600 Speaker 2: some kind of censorship, some kind of intervention that 705 00:44:04,760 --> 00:44:07,920 Speaker 2: is unwanted and that curtails free speech. I mean, it's 706 00:44:07,920 --> 00:44:09,959 Speaker 2: where things can turn on their head. But I think 707 00:44:10,760 --> 00:44:15,400 Speaker 2: I find myself, and those within our communities, 708 00:44:15,440 --> 00:44:19,520 Speaker 2: somewhat culpable for not advocating more strenuously for facts and evidence and 709 00:44:19,560 --> 00:44:24,839 Speaker 2: for not helping others better understand, you know, 710 00:44:24,920 --> 00:44:28,400 Speaker 2: the landscape of how this matters, how facts matter. 711 00:44:28,760 --> 00:44:31,719 Speaker 3: Maybe I mean we should talk about rapid attribution because 712 00:44:31,719 --> 00:44:33,040 Speaker 3: that's one. 713 00:44:33,280 --> 00:44:35,839 Speaker 1: It does kind of blend into that. Yeah, what 714 00:44:35,960 --> 00:44:39,440 Speaker 1: is rapid attribution?
Let me have you define it, 715 00:44:39,480 --> 00:44:40,920 Speaker 1: and then, like, how could it help with this? 716 00:44:41,800 --> 00:44:47,359 Speaker 3: Yeah. So I heard about this idea through Jill Hopke's work. 717 00:44:47,520 --> 00:44:49,439 Speaker 3: Others have spoken about it, but she has a great 718 00:44:49,480 --> 00:44:54,440 Speaker 3: piece on this topic. The basic idea behind rapid attribution 719 00:44:54,960 --> 00:44:59,280 Speaker 3: is to respond to misleading media content as quickly as possible. 720 00:45:00,080 --> 00:45:03,279 Speaker 3: So this has to do with, of course, this incredible 721 00:45:03,320 --> 00:45:06,479 Speaker 3: speed of the news cycle these days, and the fact 722 00:45:06,520 --> 00:45:09,200 Speaker 3: that our attention is constantly pulled in a million different 723 00:45:09,239 --> 00:45:14,359 Speaker 3: directions online. And if you contrast that with the academic 724 00:45:14,560 --> 00:45:17,520 Speaker 3: research and publication cycle, I mean, there's just no contest. 725 00:45:17,600 --> 00:45:21,640 Speaker 3: Our research and publication cycle is incredibly slow. We 726 00:45:21,680 --> 00:45:25,799 Speaker 3: work in cycles of years, not cycles of minutes. So 727 00:45:25,920 --> 00:45:29,279 Speaker 3: what some researchers are calling for is a monitoring and 728 00:45:29,440 --> 00:45:33,880 Speaker 3: rapid response service to provide facts, to provide accurate information 729 00:45:33,960 --> 00:45:36,920 Speaker 3: to the public while they're thinking about the issue, not 730 00:45:37,200 --> 00:45:40,840 Speaker 3: a year or two later.
And so some have suggested 731 00:45:40,880 --> 00:45:45,799 Speaker 3: that maybe academic researchers can take a page from nonprofit 732 00:45:45,880 --> 00:45:51,480 Speaker 3: projects, or even groups like Max's Observatory, 733 00:45:52,000 --> 00:45:55,600 Speaker 3: by working either more closely with those groups or working 734 00:45:55,640 --> 00:45:59,440 Speaker 3: with other research centers to track and analyze digital media 735 00:45:59,480 --> 00:46:02,960 Speaker 3: and social media disinformation as it happens and provide a 736 00:46:03,000 --> 00:46:05,759 Speaker 3: researcher's lens on the response. 737 00:46:06,080 --> 00:46:08,520 Speaker 2: It reminds me of that old saying, a lie can 738 00:46:08,640 --> 00:46:10,520 Speaker 2: travel around the world while the truth is putting on 739 00:46:10,600 --> 00:46:17,320 Speaker 2: its shoes. These are efforts to try and close 740 00:46:17,360 --> 00:46:22,040 Speaker 2: that feedback loop. They're improving, but even today in, say, 741 00:46:22,080 --> 00:46:25,279 Speaker 2: print news journalism, if something is wrong, they print a 742 00:46:25,400 --> 00:46:29,440 Speaker 2: correction the next day; it's still not adequate. There are 743 00:46:29,480 --> 00:46:31,000 Speaker 2: many more efforts that are going on. 744 00:46:31,640 --> 00:46:35,040 Speaker 1: Yeah, okay, you have this very interesting section to me 745 00:46:35,320 --> 00:46:40,320 Speaker 1: about how these narratives shape not just people's understanding 746 00:46:40,360 --> 00:46:43,680 Speaker 1: of the issues and how urgent they are, but also 747 00:46:44,480 --> 00:46:48,600 Speaker 1: ambition on climate, and I wonder if you could walk 748 00:46:48,640 --> 00:46:51,200 Speaker 1: through that.
I also wanted to ask you about a 749 00:46:51,280 --> 00:46:55,520 Speaker 1: tendency for a lot of reporters who are on the 750 00:46:55,560 --> 00:46:59,120 Speaker 1: climate beat to focus on just events based reporting. I 751 00:46:59,120 --> 00:47:00,839 Speaker 1: feel like that kind of plays into this too. 752 00:47:01,239 --> 00:47:04,080 Speaker 3: I'm going to reframe the question, like the way all good, 753 00:47:04,360 --> 00:47:09,040 Speaker 3: you know, political interviewees who are so media trained do. It's great. Yes, 754 00:47:09,960 --> 00:47:11,680 Speaker 3: so I'll answer a piece of what 755 00:47:11,680 --> 00:47:14,520 Speaker 3: you asked. So you asked about how do some of 756 00:47:14,560 --> 00:47:18,200 Speaker 3: the narratives that we see in the media shape people's 757 00:47:18,200 --> 00:47:21,440 Speaker 3: ambition to do something about climate change. And so here 758 00:47:22,360 --> 00:47:24,840 Speaker 3: there's a variety of climate obstruction that I know really 759 00:47:24,840 --> 00:47:28,439 Speaker 3: well, because I've spent a lot of years studying how 760 00:47:28,560 --> 00:47:32,640 Speaker 3: big brands and other business groups sidestep their role in 761 00:47:32,719 --> 00:47:37,239 Speaker 3: contributing to climate change, and because they generate narratives that 762 00:47:37,360 --> 00:47:43,160 Speaker 3: are very influential, very seductive, but that really are problematic. 763 00:47:43,200 --> 00:47:46,759 Speaker 3: And they do this by promoting that they're at the 764 00:47:46,840 --> 00:47:51,799 Speaker 3: table with us, that they're collaborators, they're consensus makers, or 765 00:47:51,840 --> 00:47:55,160 Speaker 3: that they're willing to compromise.
And in fact, I would 766 00:47:55,160 --> 00:47:58,839 Speaker 3: say that some of the most effective undermining of environmental 767 00:47:58,880 --> 00:48:02,520 Speaker 3: science has come from business leaders who say that their 768 00:48:02,560 --> 00:48:08,800 Speaker 3: company is working alongside scientists or alongside public policy makers 769 00:48:08,840 --> 00:48:12,920 Speaker 3: to solve climate problems. So here we have to come 770 00:48:12,920 --> 00:48:18,560 Speaker 3: back to PR specialists, because their entire raison d'être is to 771 00:48:18,600 --> 00:48:23,600 Speaker 3: promote consensus among public audiences. That's the public in public relations, 772 00:48:23,680 --> 00:48:28,120 Speaker 3: is to relate to the public. They influence people by 773 00:48:28,160 --> 00:48:33,120 Speaker 3: aligning their clients' messages with public values and beliefs. That's 774 00:48:33,160 --> 00:48:37,239 Speaker 3: a classic branding strategy. And it's also about creating legitimacy 775 00:48:37,520 --> 00:48:41,840 Speaker 3: for their client organizations, like fossil fuel companies, and about 776 00:48:41,840 --> 00:48:46,040 Speaker 3: creating trust in their message. So, for example, PR consultants 777 00:48:46,080 --> 00:48:50,880 Speaker 3: will create public private partnerships or sponsorships for their clients 778 00:48:51,160 --> 00:48:55,520 Speaker 3: that make them look good. Shell, for instance, the oil 779 00:48:55,560 --> 00:49:01,360 Speaker 3: company, sponsors youth programs, they sponsor healthcare initiatives, sponsor housing projects, 780 00:49:01,719 --> 00:49:05,440 Speaker 3: and it's hard to point to that and 781 00:49:05,440 --> 00:49:09,000 Speaker 3: say, that's bad, you shouldn't sponsor those projects.
But at 782 00:49:09,000 --> 00:49:11,480 Speaker 3: the same time, if you think about how that acts 783 00:49:11,600 --> 00:49:14,680 Speaker 3: back on the company to make them look good, when 784 00:49:14,719 --> 00:49:20,360 Speaker 3: in fact, alongside those sponsorships, they are also expanding the 785 00:49:20,400 --> 00:49:26,040 Speaker 3: infrastructure for oil production, it becomes really challenging to uphold 786 00:49:26,080 --> 00:49:28,880 Speaker 3: their good image. I could go on, but maybe I 787 00:49:28,920 --> 00:49:30,000 Speaker 3: could leave it at that for now. 788 00:49:30,560 --> 00:49:34,400 Speaker 1: Yeah, that's great. Did you want to add anything in there, Max? 789 00:49:35,000 --> 00:49:37,719 Speaker 2: Well, I think it relates back to some of the 790 00:49:37,800 --> 00:49:40,520 Speaker 2: things we were talking about before. Melissa was talking about 791 00:49:41,040 --> 00:49:43,560 Speaker 2: discourses of climate delay and kind of honing in on 792 00:49:43,640 --> 00:49:48,239 Speaker 2: the four that William Lamb and colleagues had pointed out, 793 00:49:48,239 --> 00:49:51,360 Speaker 2: which we get into in the chapter. And another way in 794 00:49:51,400 --> 00:49:54,880 Speaker 2: which we can see this kind of narrative is through 795 00:49:55,239 --> 00:49:58,000 Speaker 2: the way in which some of the redirection of responsibility 796 00:49:58,000 --> 00:50:02,520 Speaker 2: comes through the individualization of responsibility.
And so some of 797 00:50:02,520 --> 00:50:05,080 Speaker 2: the narratives that take place through the media and through 798 00:50:05,239 --> 00:50:09,160 Speaker 2: everyday lives can come into the notion of carbon footprints, 799 00:50:09,160 --> 00:50:12,040 Speaker 2: which themselves were a creation of the fossil fuel 800 00:50:12,080 --> 00:50:15,920 Speaker 2: industry to redirect attention to the individual rather than larger 801 00:50:16,080 --> 00:50:20,759 Speaker 2: corporate actors who are contributing significantly to these issues. And 802 00:50:20,840 --> 00:50:25,880 Speaker 2: so that individualization can have several different kinds of pushes 803 00:50:25,920 --> 00:50:30,200 Speaker 2: and pulls. It can distract from their responsibilities, but then 804 00:50:30,239 --> 00:50:34,920 Speaker 2: also, you know, very importantly, it can have 805 00:50:35,120 --> 00:50:37,440 Speaker 2: us feel like, oh my gosh, we get overwhelmed with 806 00:50:37,560 --> 00:50:40,480 Speaker 2: the enormity of the challenges that we have before us, 807 00:50:41,080 --> 00:50:43,320 Speaker 2: and so it can push us into these spaces of 808 00:50:43,360 --> 00:50:47,640 Speaker 2: paralysis if we subscribe to this kind of atomized, 809 00:50:47,719 --> 00:50:51,160 Speaker 2: individualized way of acting. And we also, on top of 810 00:50:51,200 --> 00:50:54,719 Speaker 2: that, can tend to, you know, start to name and 811 00:50:54,800 --> 00:50:57,840 Speaker 2: shame and blame one another instead of looking at some 812 00:50:57,960 --> 00:51:02,000 Speaker 2: of the clear facts around us.
And so, you know, 813 00:51:02,040 --> 00:51:05,760 Speaker 2: there's been a lot of work over time, among 814 00:51:05,800 --> 00:51:08,920 Speaker 2: it some important work by Richard Heede, that points 815 00:51:08,920 --> 00:51:12,920 Speaker 2: out that two thirds of industrial greenhouse gas emissions, fossil 816 00:51:13,160 --> 00:51:16,240 Speaker 2: fuel use, methane leaks, and manufacturing have come from ninety 817 00:51:16,640 --> 00:51:20,520 Speaker 2: companies around the world since the dawn of the Industrial Revolution. 818 00:51:20,680 --> 00:51:24,320 Speaker 2: And so this naming and shaming and individualization is both 819 00:51:24,520 --> 00:51:28,959 Speaker 2: a redirection of responsibility and 820 00:51:29,000 --> 00:51:32,080 Speaker 2: a very effective technique of delay and distraction. 821 00:51:32,920 --> 00:51:36,640 Speaker 1: Okay, I want to end with this accountability piece and 822 00:51:36,680 --> 00:51:39,319 Speaker 1: where we're at with accountability measures on this. So we 823 00:51:39,400 --> 00:51:43,080 Speaker 1: talked about rapid attribution, but you mentioned this push for 824 00:51:43,120 --> 00:51:47,960 Speaker 1: a universal definition of climate disinformation in the chapter, and 825 00:51:48,000 --> 00:51:52,800 Speaker 1: then you know the potential for accountability for specific platforms. 826 00:51:53,000 --> 00:51:55,359 Speaker 1: I'm curious to hear from each of you where you 827 00:51:55,440 --> 00:51:58,840 Speaker 1: see the most progress in dealing with this.
828 00:52:00,520 --> 00:52:03,360 Speaker 3: I'll just start by saying that the call for a universal 829 00:52:03,440 --> 00:52:07,760 Speaker 3: definition of climate disinformation, as well as for more accountability 830 00:52:08,000 --> 00:52:11,400 Speaker 3: by media platforms, came out, I believe, in November twenty 831 00:52:11,400 --> 00:52:17,560 Speaker 3: twenty four around the COP twenty nine climate negotiations in Baku, Azerbaijan. 832 00:52:18,239 --> 00:52:24,560 Speaker 3: And this was a large group, a coalition of political groups, nonprofits, 833 00:52:24,680 --> 00:52:28,920 Speaker 3: and so on, that was trying to encourage political leaders around 834 00:52:28,960 --> 00:52:33,360 Speaker 3: the world to acknowledge that there is a major threat 835 00:52:33,480 --> 00:52:37,400 Speaker 3: posed by climate disinformation as an obstacle to climate action, 836 00:52:38,000 --> 00:52:42,239 Speaker 3: and to adopt a definition that will allow regulation and 837 00:52:42,360 --> 00:52:45,400 Speaker 3: other forms of rules to get put in place, to 838 00:52:45,440 --> 00:52:47,720 Speaker 3: be able to say, you know, to point at something 839 00:52:47,760 --> 00:52:51,120 Speaker 3: and say that is climate disinformation based on this definition 840 00:52:51,160 --> 00:52:52,840 Speaker 3: that all of us have agreed upon. I think that 841 00:52:52,960 --> 00:52:55,840 Speaker 3: would be very powerful. And then that third pillar 842 00:52:55,920 --> 00:52:59,120 Speaker 3: is to hold platforms accountable, as you said, Amy, 843 00:52:59,600 --> 00:53:06,680 Speaker 3: to push social media companies, ad tech, publishers to prevent 844 00:53:06,760 --> 00:53:10,160 Speaker 3: the spread of climate disinformation by, again, acknowledging it and 845 00:53:10,520 --> 00:53:13,640 Speaker 3: saying, we're not going to allow this kind of garbage 846 00:53:13,680 --> 00:53:16,719 Speaker 3: to be on our sites.
I would say, 847 00:53:16,800 --> 00:53:19,480 Speaker 3: if I had to, you know, leave with something. Oh, 848 00:53:19,600 --> 00:53:21,719 Speaker 3: but this is really not, I don't intend this to 849 00:53:21,719 --> 00:53:23,239 Speaker 3: be about tooting my own horn. This is just 850 00:53:23,239 --> 00:53:25,640 Speaker 3: something I know more about than other areas we could talk 851 00:53:25,640 --> 00:53:28,840 Speaker 3: about. It's in relation to this idea of adopting 852 00:53:28,840 --> 00:53:31,719 Speaker 3: a universal definition, but it's on the topic of greenwashing, 853 00:53:31,800 --> 00:53:36,240 Speaker 3: which is a subset of climate disinformation. This is something 854 00:53:36,280 --> 00:53:40,120 Speaker 3: that some colleagues and I have been working pretty steadily 855 00:53:40,160 --> 00:53:42,920 Speaker 3: on through the Climate Social Science Network. We have a 856 00:53:42,960 --> 00:53:46,680 Speaker 3: Greenwashing working group and we've just put out our second 857 00:53:46,760 --> 00:53:52,640 Speaker 3: article that attempts to develop a very clear and coherent 858 00:53:52,760 --> 00:53:58,359 Speaker 3: framework to identify greenwashing and to lay out a set 859 00:53:58,400 --> 00:54:01,520 Speaker 3: of criteria that anyone can use. It's not just an 860 00:54:01,600 --> 00:54:04,799 Speaker 3: academic thing. It's really about saying, here are the features. 861 00:54:04,920 --> 00:54:05,160 Speaker 1: Now. 862 00:54:05,400 --> 00:54:08,000 Speaker 3: You know, if you can identify these features in whatever 863 00:54:08,040 --> 00:54:10,759 Speaker 3: you're looking at that you think is greenwashing, and these 864 00:54:10,800 --> 00:54:13,120 Speaker 3: features are there, you can call it greenwashing. You can 865 00:54:13,200 --> 00:54:16,040 Speaker 3: go ahead and say that's greenwashing.
And we're really 866 00:54:16,080 --> 00:54:20,680 Speaker 3: trying to promote greater accountability by all kinds of organizations, 867 00:54:20,760 --> 00:54:24,080 Speaker 3: not just companies, but other groups that advertently or 868 00:54:24,120 --> 00:54:30,280 Speaker 3: inadvertently produce greenwashing material, and just, yeah, getting a definition 869 00:54:30,360 --> 00:54:34,120 Speaker 3: that then will allow us to hold all these groups accountable. And we've 870 00:54:34,160 --> 00:54:37,799 Speaker 3: had some real success with that framework. We've had a 871 00:54:37,800 --> 00:54:43,680 Speaker 3: couple of groups, I know in Canadian cities, some lawmakers 872 00:54:43,719 --> 00:54:49,000 Speaker 3: have used that framework to ask for public buses and 873 00:54:49,360 --> 00:54:54,360 Speaker 3: other transit systems to take down their advertising that is 874 00:54:54,360 --> 00:54:57,040 Speaker 3: greenwashing as a result of our study. So that's just 875 00:54:57,320 --> 00:55:01,800 Speaker 3: one very small contribution to try to just keep 876 00:55:01,840 --> 00:55:05,040 Speaker 3: these different platforms and different outlets accountable. 877 00:55:05,239 --> 00:55:09,640 Speaker 2: That's great. I'm glad to hear those developments because 878 00:55:09,640 --> 00:55:15,920 Speaker 2: they can feed back into news reporting, into pro-engagement advertising, 879 00:55:16,200 --> 00:55:18,880 Speaker 2: and on and on.
I guess I would say that, 880 00:55:18,960 --> 00:55:22,080 Speaker 2: you know, while we have talked about how you know, 881 00:55:22,160 --> 00:55:26,040 Speaker 2: there are strains on our media literacy nowadays, generally my 882 00:55:26,080 --> 00:55:28,680 Speaker 2: experience has been that once you open up these conversations, 883 00:55:28,760 --> 00:55:31,319 Speaker 2: especially with young college aged people that I'm working with 884 00:55:31,360 --> 00:55:35,480 Speaker 2: a lot teaching, that there is a real appetite and 885 00:55:35,520 --> 00:55:39,319 Speaker 2: there's an ability to learn very quickly, and especially young 886 00:55:39,320 --> 00:55:41,279 Speaker 2: people who have been born into the world where we've 887 00:55:41,280 --> 00:55:44,080 Speaker 2: already had this on the public agenda. For those of 888 00:55:44,200 --> 00:55:45,600 Speaker 2: us that are a little bit older, you know, we 889 00:55:45,640 --> 00:55:47,520 Speaker 2: grew up in a world where this wasn't being discussed. 890 00:55:47,520 --> 00:55:50,040 Speaker 2: But many young people now that are in college have 891 00:55:50,160 --> 00:55:52,920 Speaker 2: been exposed to this, been talking about it, and they're 892 00:55:53,040 --> 00:55:57,560 Speaker 2: well positioned to start to move forward with solutions. And 893 00:55:58,320 --> 00:56:00,960 Speaker 2: I know at times we all may feel like this 894 00:56:01,120 --> 00:56:05,279 Speaker 2: is really daunting. There are so many large forces and 895 00:56:05,360 --> 00:56:08,799 Speaker 2: pressures that are working against this kind of progress. I 896 00:56:08,800 --> 00:56:10,480 Speaker 2: guess I am in a quoting kind of mood. 
I 897 00:56:10,560 --> 00:56:13,839 Speaker 2: think about Wes Jackson, something he said about if you're 898 00:56:13,880 --> 00:56:16,319 Speaker 2: not working on something that you plan on finishing in 899 00:56:16,320 --> 00:56:20,239 Speaker 2: your lifetime, you're not thinking big enough. And so when 900 00:56:20,280 --> 00:56:24,200 Speaker 2: we start to work with folks across demographics, across generations, 901 00:56:24,800 --> 00:56:26,920 Speaker 2: there is this appetite to learn, there is this 902 00:56:27,000 --> 00:56:30,080 Speaker 2: appetite for positive change. I see it through the creative 903 00:56:30,080 --> 00:56:33,720 Speaker 2: ways that students I'm working with are communicating about climate 904 00:56:33,800 --> 00:56:37,280 Speaker 2: change with their peers and families and neighbors and roommates. 905 00:56:37,800 --> 00:56:41,359 Speaker 2: I find that a source of encouragement. So 906 00:56:41,880 --> 00:56:44,560 Speaker 2: while we have been focused in here on diagnosing some 907 00:56:44,640 --> 00:56:47,759 Speaker 2: of the real daunting challenges, there are places where we 908 00:56:47,800 --> 00:56:49,439 Speaker 2: can look to for encouragement and hope. 909 00:56:49,640 --> 00:56:53,600 Speaker 3: Awesome. No, I really agree with what Max was saying about students. 910 00:56:53,719 --> 00:56:57,560 Speaker 3: The only moments lately when I do 911 00:56:57,600 --> 00:57:01,720 Speaker 3: feel optimistic are actually working with my students and being 912 00:57:01,800 --> 00:57:05,800 Speaker 3: out with my teen son. Yeah, you know, which is 913 00:57:05,880 --> 00:57:07,920 Speaker 3: kind of ironic because in other ways he drives me 914 00:57:07,920 --> 00:57:09,080 Speaker 3: absolutely bananas. 915 00:57:09,400 --> 00:57:10,040 Speaker 1: Yes he is. 916 00:57:10,640 --> 00:57:11,000 Speaker 3: He is.
917 00:57:11,280 --> 00:57:13,520 Speaker 4: You know, he's growing up to be a very savvy 918 00:57:14,000 --> 00:57:17,680 Speaker 4: and media literate person. Despite what we said earlier, he 919 00:57:17,960 --> 00:57:21,560 Speaker 4: is very aware of the climate crisis and the need 920 00:57:21,640 --> 00:57:25,160 Speaker 4: to do something about it. I feel hopeful when I 921 00:57:25,240 --> 00:57:27,440 Speaker 4: hear the kinds of things that he's thinking about and 922 00:57:27,560 --> 00:57:30,320 Speaker 4: wanting to do when it comes to climate change, because 923 00:57:30,320 --> 00:57:33,040 Speaker 4: he's on it, and I think his friends and roommates 924 00:57:33,080 --> 00:57:34,160 Speaker 4: and others are too. 925 00:57:34,440 --> 00:57:36,520 Speaker 1: Okay, well, it's kind of nice to end on a 926 00:57:36,560 --> 00:57:38,800 Speaker 1: little bit of hope and grit. I do feel like 927 00:57:38,880 --> 00:57:47,000 Speaker 1: people forget that these things take a long time. That's 928 00:57:47,040 --> 00:57:49,680 Speaker 1: it for this time. Make sure you're subscribed so you 929 00:57:49,720 --> 00:57:53,720 Speaker 1: don't miss an episode. You can find more on this season, 930 00:57:53,800 --> 00:57:58,360 Speaker 1: including transcripts and lots of related articles and background information, 931 00:57:58,640 --> 00:58:02,520 Speaker 1: on our website at Drilled Media. You can also sign 932 00:58:02,600 --> 00:58:06,400 Speaker 1: up for our newsletter there. Our producers for this season 933 00:58:06,520 --> 00:58:10,680 Speaker 1: are Martin Zaltz Austwick and Peter Duff. Our theme song 934 00:58:10,880 --> 00:58:14,000 Speaker 1: is Bird in the Hand by Foreknown. Our cover art 935 00:58:14,120 --> 00:58:17,760 Speaker 1: is by Matthew Fleming. Our First Amendment attorney is James 936 00:58:17,760 --> 00:58:23,000 Speaker 1: Wheaton with the First Amendment Project.
The show was created, written, 937 00:58:23,040 --> 00:58:26,360 Speaker 1: and reported by me Amy Westervelt. Thanks for listening and 938 00:58:26,520 --> 00:58:27,320 Speaker 1: see you next time.