Speaker 1: Hi, it's Greg Dalton. I'd like to hear your comments on the show, topics we should cover, and guest suggestions. You can reach me at greg at climate one dot org.

Speaker 1: This is Climate One. I'm Greg Dalton. Spreading climate misinformation is a favorite tool of fossil fuel companies looking to protect their bottom line.

Speaker 2: Whether they're arguing climate change isn't real, therefore we shouldn't act, or climate change isn't caused by humans, therefore we shouldn't act, or solutions won't work, therefore we shouldn't act.

Speaker 1: Companies use advertising and mass media to spread these messages, and in some cases want to defend their actions as protected free speech.

Speaker 3: This idea that, you know, if it's something that connects to policy I want to see, then it doesn't have to play by the same rules.

Speaker 1: But we can counter these false arguments through critical thinking and education.

Speaker 2: If we want a public who are resilient against misinformation, we need to build up their ability to spot these types of fallacies.

Speaker 1: Fossil fuel corporations have spent decades casting doubt in public about climate facts that their own scientists validated in company research. These tactics have included a concerted effort to recast political speech, banned and regulated in some contexts, as protected free speech, giving corporations more leeway in broadcasting their messages. This week's episode is a special collaboration with Amy Westervelt, an award-winning journalist and creator of the podcast Drilled. She brings us the backstory of the free speech argument fossil fuel companies are now using to support their efforts to spread climate misinformation.

Speaker 3: Most people think the debate over corporate free speech in America started with the Citizens United case in twenty ten.

Speaker 4: Mister Olson, are you taking the position that there is no difference in the First Amendment rights of an individual? A corporation, after all, is not endowed by its creator with inalienable rights.
So is there any distinction that Congress could draw between corporations and natural human beings for purposes of campaign finance?

Speaker 5: What the Court has said in the First Amendment context, New York Times versus Sullivan, Grosjean versus American Press, and over and over again, is that corporations are persons entitled to protection under the First Amendment.

Speaker 3: That was the late Justice Ruth Bader Ginsburg questioning attorney Ted Olson of the firm Gibson Dunn, who argued and won that case. Just a quick recap here. The case was about a film that had been made criticizing Hillary Clinton the first time she tried to run for president. It was funded by a cohort of right-wing organizations and corporations, including Koch Industries, so the Federal Election Commission had said that the movie couldn't screen without identifying itself as campaign material and noting its funders. The filmmakers and their attorneys argued that this violated their free speech rights, and they won, opening the door to unlimited corporate funding of political propaganda, what's generally referred to simply as dark money. But Citizens United was not the first battle in the war over corporate free speech, nor was it the last. The story actually begins back in the late nineteen sixties with Mobil Oil and its issue advertising program. It was a multifaceted strategy that included defining a personality for Mobil, aligning the company with cultural institutions, and advertising ideas rather than just gas. The strategy came from Mobil's VP of Public Affairs, Herb Schmertz, as a way to counter widespread criticism of oil companies in the press, and it was championed by the company's CEO, Rawleigh Warner. Here's Schmertz later in life describing Mobil's personality.

Speaker 6: Well, it was multifaceted. It was a personality where we believed very strongly in the importance of public policies.
Secondly, we believed fervently that as a sort of custodian of a large corporation, and the custodians of vast resources and employment and everything else, we were not doing our job if we did not participate in the marketplace of ideas. The third part of our personality was we believed that a democracy is composed of a host of free institutions. We believe in free markets, freedom of speech, freedom of the press, academic freedom, freedom to organize and participate in union activities.

Speaker 3: In addition to sponsoring Masterpiece Theatre starting in nineteen seventy, Mobil worked with The New York Times to create the advertorial. Every week, Mobil ran a piece in the Times op-ed section espousing some idea or another. Here's Schmertz describing them on the show Open Mind.

Speaker 7: Herb, thanks for joining me today.

Speaker 6: Great pleasure to be here, Dick.

Speaker 7: I want to turn as quickly as possible to a new fairy tale, the Mobil ad, or op-ed piece, or editorial, call it what you will.

Speaker 6: We call them pamphlets.

Speaker 7: Pamphlets, but they appear in newspapers.

Speaker 3: Yes, pamphlets, but they appear in newspapers. In the early nineteen seventies, Schmertz and Warner figured they were having such great luck with the newspaper advertorials and their various PBS specials that it was time to get Mobil content onto commercial TV. They reached out to CBS, ABC, and NBC to buy time, but got a surprise this time. CBS and ABC gave them an emphatic no. They described what Mobil was trying to do with their ads as propaganda and claimed it violated various ethics policies and maybe even some FCC laws. Schmertz went one by one to independent stations to place his TV advertorials instead. Here's a taste of one of them, from a Mobil information center.

Speaker 8: Good evening, I'm Dick Calliman.

Speaker 9: Most Americans have an exaggerated idea about oil company profits.

Speaker 3: Warner and Schmertz went on the offensive in general too. They wrote letters to the network heads.
They placed multiple New York Times advertorials about how the big TV networks were trying to silence them, and they gave speeches at various business groups about how this was a huge threat to corporate rights. It was the first time any company had talked about such a thing as corporate free speech in the seventies.

Speaker 10: There was a lot of public opinion that they would have been concerned about.

Speaker 3: This is Robert Kerr, a media law professor and researcher at Oklahoma State University. He's written two books about the evolution of corporate free speech and Mobil's role in it. He's talking here about the situation that oil companies found themselves in during the early seventies. There had been the big, high-profile oil spill in California in nineteen sixty-nine, and then in nineteen seventy-three the oil embargo hit. In response to US support of Israel during the Arab-Israeli War, Arab members of OPEC put a ban on exporting oil to the US. The effect was immediate.

Speaker 10: You know, I lived through that, and I remember people were scared all of a sudden. You know, this something that they were used to going to the gas pump and getting for almost nothing was not only going way up in price, but you might not even be able to buy any. And often you couldn't. The gas stations would run out, or you'd have a really long line, you'd wait for hours, and then maybe you still couldn't buy any. So, yeah, the public was really alarmed. And particularly the Carter administration in the late seventies seemed to be a lot less favorable toward the oil companies in general. You know, Jimmy Carter and his administration seemed willing to hold their feet to the fire.

Speaker 11: Anger and bewilderment are growing as more and more Americans cope with gasoline lines and empty pumps.
Good evening. For millions of Americans, this may be the worst weekend they've ever faced for finding gasoline to give them the automobile freedom they take as their due. Gasoline shortages are spreading across the country. Odd-even service, gasoline lines, and closed gas stations are becoming increasingly common, and the news from overseas tonight gives no promise of quick relief.

Speaker 3: People were scared and angry, and a lot of those emotions were being directed at oil companies. For Mobil, access to the press and the ability to get out its version of the story was critical to the company's ability to weather this storm, and that refusal from the commercial stations to run its ads was a huge potential threat to that strategy. Mobil toyed with the idea of filing a case itself that would formally establish the corporate right to free speech, but it worried that that could backfire. Instead, the company started filing amicus briefs in other cases.

Speaker 12: And Mobil was one of the leading corporations to fight for that legal right.

Speaker 3: This is doctor Robert Brulle, an environmental sociologist at Brown University.

Speaker 12: There was a pretty big effort to get a Supreme Court ruling that basically supported corporate speech and the right of corporations to do advertising not just of their products, but of their positions.

Speaker 3: That Supreme Court case he's talking about was First National Bank of Boston versus Bellotti. First National, along with two other banks and three corporations, had wanted to spend money to publicize their opposition to a ballot initiative that would permit Massachusetts to implement a graduated income tax. The Attorney General of Massachusetts said that violated a state law against funding campaigns that would influence the outcome of a vote. The bank sued, and the case went to the Supreme Court in nineteen seventy-seven. The ruling came out in nineteen seventy-eight. Here's Supreme Court Justice Lewis F. Powell giving that ruling.
Speaker 9: The First Amendment's primary concern, and therefore the Court's concern, always has been the preservation of free and uninhibited dissemination of information and ideas. If the restrictive view of corporate speech taken by the Massachusetts court were accepted, government would have the power to deprive society of the views of corporations.

Speaker 3: Powell is also credited with crafting the Powell Memorandum, which outlined the pro-corporate strategy that would guide the Republican Party from the early nineteen eighties to today. Bellotti is generally considered the precursor to Citizens United, and Mobil was hugely influential in securing that ruling. Here's Robert Kerr again.

Speaker 10: You know, it actually was very close when it first got to the Supreme Court. The justices could have gone the other way. Justice Powell kind of really finessed it and got that first precedent-setting case, Bellotti, into the case law. And then later, when it got to Citizens United, Justice Kennedy kind of ignores the overall body of case law and he goes back to Bellotti twenty-four times. It's really unusual to cite one case twenty-four times.

Speaker 3: So why does this matter today? Well, in addition to changing public discourse forever, these cases also laid the groundwork for the argument that oil companies are using today to defend climate disinformation. In some two dozen climate liability cases and some additional fraud cases, the oil companies are being accused of misleading the public on climate. The lawyer appointed to speak for all of the companies in these cases is Ted Boutrous, who's not only a well-known First Amendment attorney but also a partner at Gibson Dunn, the firm that secured that win in Citizens United back in twenty ten. Here's Boutrous speaking on the Climate One podcast in twenty twenty.
Speaker 13: I do want to take head-on the notion that the plaintiffs' lawyers in a lot of the climate change cases have been advocating, which is that the oil and gas companies had secret knowledge and they were then putting out misinformation, and they've tried to analogize it to tobacco and other areas. It just doesn't make any sense, because it was well known. The federal government knew the problems of climate change, the potential causes, and knew that there was an issue here.

Speaker 3: Other attorneys are making this argument on behalf of the oil majors as well. When it exhausted all options to dismiss the fraud case against it in Massachusetts, ExxonMobil filed an anti-SLAPP suit against the Attorney General's office there, claiming that the fraud case against it amounted to an effort to quash the company's First Amendment rights. SLAPP stands for strategic lawsuit against public participation. Anti-SLAPP statutes like the one in Massachusetts were meant to protect the press and civil society groups from corporations that wanted to silence critics. But these days it's become equally common for corporations to use these statutes to swat away legal complaints. Here's attorney Justin Anderson, a partner with Paul, Weiss, that's ExxonMobil's law firm, at a March twenty twenty-two hearing.

Speaker 14: The alleged misrepresentations are the statements that ExxonMobil has made about its views on climate policy, on energy policy. The anti-SLAPP statute provides a mechanism to have a case that is brought against someone for petitioning activity dismissed at the outset, before burdensome discovery is imposed on the party, before we have our executives come in to give testimony and depositions, before we're dragged into a courtroom where we have to defend ourselves.

Speaker 3: That phrase, petitioning activity, is really key here, because what it means in plain English is political speech.
And the argument ExxonMobil is making here, and that Boutrous has been making as well, is that because the oil companies' campaigns on climate are political speech, not commercial speech, they are protected by the First Amendment. It's the sort of argument Herb Schmertz would have been proud of. Here he is defending corporate PR in the eighties.

Speaker 8: Government intrusion into the marketplace of ideas would limit our freedom of speech and distort the selection of our leaders. People feel frustrated when the press doesn't deliver a complete story or an accurate story, so they bring in people who have the ability to add to the spectrum of facts, opinions, views, philosophies, so that the public can get a more balanced view.

Speaker 3: Today, the groundwork that Schmertz and Warner laid with Bellotti and various other cases, with issue advertising, and with their general advocacy for corporate free speech over the decades is the foundation for Big Oil's argument about climate denial: it couldn't be fraud; it was political speech, protected by the free speech rights they've spent the past fifty years securing.

Speaker 1: You're listening to a Climate One conversation about tactics for spreading climate misinformation. Coming up, one way to break down misleading reasoning.

Speaker 2: You can just take a flawed argument and transplant that logic into a parallel situation, usually the most absurd and extreme situation you can think of, and then use the same logic. And that makes it very clear and engaging and concrete when you're trying to explain the flawed logic to people.

Speaker 1: That's up next, when Climate One continues. This is Climate One. I'm Greg Dalton, and today we're talking about climate misinformation and how to challenge it. John Cook is a postdoctoral research fellow at the Climate Change Communication Research Hub at Monash University in Australia. He focuses on using critical thinking to build resilience against misinformation.
He says the most common climate misinformation in the US centers on climate policy being harmful, expensive, or ineffective.

Speaker 2: Ultimately, its goal is to delay climate action and maintain the status quo. And you will find that no matter what the argument is, the conclusion is always the same. Whether they're arguing climate change isn't real, therefore we shouldn't act, or climate change isn't caused by humans, therefore we shouldn't act, or solutions won't work, therefore we shouldn't act. It's always that same end point. And so it's about delaying action by reducing public support for climate action, and there are various pathways to do that. One is confusing people about the science. There's a famous two thousand and two memo written by a political strategist, Frank Luntz, where he basically argued to politicians who were trying to win the public debate about climate policy. He said, if you want to win the debate about climate policy and basically stop climate action, cast doubt on the scientific consensus that humans are causing global warming. If the public gets confused about the consensus, their attitudes about the policy change accordingly.

Speaker 1: Right, and that confusion and doubt of course goes back to the tobacco companies saying doubt is our product. There's been a decades-long effort by oil companies and others to cast doubt on climate science to allow them to continue to profit. The strategy has taken many forms and evolved over time. Can you walk us through that evolution, from deny, dismiss, delay, deflect?

Speaker 2: Yeah, there has been a gradual shift. We've done analysis over the last twenty years of climate misinformation, and the biggest shift we're seeing is a gradual transition from science misinformation to solutions misinformation. As the scientific evidence has gotten stronger and stronger and harder to deny, it becomes untenable to keep using these same zombie arguments that we've been reading on blogs and on social media for many years.
And so now they're arguing against climate solutions, arguing that climate policy might be harmful, arguing that renewables won't work, just more subtle arguments than the usual climate-change-is-a-hoax type misinformation.

Speaker 1: Right, that's no longer tenable in a world where floods and fires are rampant.

Speaker 2: I think what's most potent right now is culture-war-type misinformation, arguments that paint other people, who care about climate change and are trying to get climate action, as different to us, saying they're trying to take away our lifestyle or impinge on our freedom, and generally just trying to make the climate issue more tribal. The more tribal and polarized it becomes, the harder it is to get progress.

Speaker 1: And some of the techniques employed in misinformation that you cite are magnified minority, cherry picking, false dichotomy. How are those employed? And give some examples of each one, if you could.

Speaker 2: So, attacking the scientific consensus on climate change has been a common strategy over the last few decades, and one way to do that is the use of magnified minority. In other words, take a small group and make them look much bigger than they actually are and much more significant than they really are. And the most popular version of this technique is the Global Warming Petition Project. This is a website that features thirty-one thousand science graduates in the US who've signed a statement saying that humans aren't disrupting climate, and the point of this website is to say, hey, look, thirty-one thousand people dissent against the consensus; that proves that there isn't a scientific consensus. But when you look at the total number of science graduates in the US, it's millions and millions, and thirty-one thousand, while seeming like a big number, is actually a tiny fraction of a percent. It's magnifying a minority to make them look bigger than they really are.

Speaker 1: So, policy is harmful. What's that attack?
Speaker 2: Usually, arguing that climate policy is harmful takes the form of arguing that it's either going to ruin the economy or raise prices for people, and it really depends on the specific policy. But typically what this does is cherry-pick or oversimplify the policy. For example, this is a very Australian-centric one, but it's the one that immediately comes to mind. We brought in a carbon price in order to send a signal to the market to transition from fossil fuels to renewables, and this carbon price generated revenue for the government. But all that money was then given back, and it was mainly given back to lower-income families, and so it was a revenue-neutral carbon price; the public shouldn't have seen any change in their household budget. But what the misinformation targeting the policy did was say this is putting a price on carbon that's going to raise prices for families, while ignoring that money was going back to families. So usually attacks on climate policy will focus on one part of it but ignore the entire policy and the aspects of it that make it work better.

Speaker 1: And what are some other non-policy attacks on solutions?

Speaker 2: The most basic arguments attacking renewables are the sun doesn't shine at night or the wind doesn't always blow, and therefore renewables aren't a reliable source of energy. Which, again, is cherry-picking the information, because it ignores the fact that we have battery storage. And also, when you have combinations of wind and solar, particularly across a region, wind might not be blowing in one place, but it is at a different place, and when you have a network of renewables, then you get a more reliable source of energy.

Speaker 1: And then false dichotomy. What's an example of that?

Speaker 2: False dichotomy is when you're given two choices and you have to choose one of them, when both might be true, or maybe there's a third choice.
And the most common example of this in climate change, and this is a little bit technical and complicated, is looking at the ice core record. When we look at ice cores going back hundreds of thousands of years, in Antarctic ice cores, we see that when temperature goes up, CO2 goes up afterwards, by several hundred years roughly. And what that tells us is temperature went up before the CO2. And climate deniers look at this and say, well, either temperature drives CO2 or CO2 drives temperature; you have to choose one or the other. But that's actually a false dichotomy, because it's not a choice between one or the other. Both are actually true. Temperature does drive CO2: when it gets warmer, the ocean gives up CO2 into the atmosphere. And then when you have more CO2 in the atmosphere, that causes warming, because it's a greenhouse gas. Put those two together and you get a reinforcing feedback, and it's actually that reinforcing feedback that pulled the Earth out of ice ages in our past over the last eight hundred thousand years.

Speaker 1: John, you've also written about how people often substitute judgment about complex topics such as climate science with simpler judgments, for example the character or tribal identity of a person talking about climate science. How does that reliance on shortcuts fuel climate misinformation?

Speaker 2: Yeah, it's important to recognize that all of us are hardwired to make decisions based on snap mental shortcuts, or heuristics, and generally it serves us well; that's how we're able to escape a saber-toothed tiger jumping out of the bushes, or just immediate threats. The problem is, in this modern world, sometimes those mental shortcuts can lead us astray, and they can also make us vulnerable to bad arguments or misinformation. And it's an unfortunate reality that the solution to this problem is critical thinking.
We need to get better at spotting misinformation and at spotting attempts to mislead us. What are those different fallacies? Is this argument a false dichotomy, or does it use magnified minority or cherry picking or other misleading techniques? If we want a public who are resilient against misinformation, we need to build up their ability to spot these types of fallacies.

Speaker 1: Right, and that sort of complexity... Yeah, we want things, especially these days, that are fast, simple, understandable, and social media often distorts in doing that, in distilling things. And I was watching some of your videos online of you kind of dissecting the premise, and does the conclusion logically follow the premise, and it all seems very reasonable. I thought, yeah, but this is like bringing a knife to a gunfight. And so I'm just curious about your sort of very reasoned, logic-based approach in an information age where things are viral and fake and spreading so quickly regardless of their veracity.

Speaker 2: Yeah, it's really hard. I've struggled with those thoughts for many years. When we developed this critical thinking approach where you deconstruct arguments into premises and conclusion, I did that work with two critical thinking philosophers. They introduced me to the idea of parallel argumentation. You can explain logical flaws without having to go into the whole premise, premise, conclusion, is-it-logically-valid kind of analysis. You can just take a flawed argument and transplant that logic into a parallel situation, usually the most absurd and extreme situation you can think of, and then use the same logic, and that makes it very clear and engaging and concrete when you're trying to explain the flawed logic to people. And when they introduced this technique to me, I realized that this was what late-night comedians use every night. They'll say this person said this statement, and, well, that's just like being in this situation, and then they use the same logic.
Everyone laughs. They can immediately see that it's wrong, and they're entertained. But most importantly, the comedian has actually introduced a bit of critical thinking, because they've shown a logical fallacy in a very concrete, engaging way. The beauty of this approach is you can use non-polarizing examples to explain how misinformation is misleading or to explain a fallacy.

Speaker 1: So let's practice this. If I say the climate's changed before, it's changing now, it's always been changing. Yeah, climate changes, that's what it does. How would you respond?

Speaker 2: That argument is the same logic as saying, well, people have died of natural causes before cigarettes were invented, therefore cigarettes can't cause harmful effects. Or, people have died of cancer long before cigarettes were invented, therefore smoking doesn't cause cancer. It's the same logic, and it commits the single cause fallacy, in other words, saying that whatever caused something in the past must also be causing it now, when you can have multiple causes.

Speaker 1: Let's try another one. If I say that models are unreliable. Oh, climate models? No, they're not accurate.

Speaker 2: So actually we're doing an experiment on that now. And our approach has been to say models are a simplification of reality. They don't capture all of reality, and we used models to get astronauts to the moon. They're simplified versions of reality. Newton's laws of motion and Newton's law of gravity are simplifications. Models don't need to be perfect in order to give us useful results. And climate models, they're not perfect, they don't capture absolutely everything, but they capture enough to tell us that humans are causing climate change and that climate change has serious impacts.

Speaker 1: Many fossil fuel companies now are engaging in what some call greenwashing or climate washing, where they're making net zero commitments and stating that they're working toward climate solutions. You mentioned earlier
they are attacking solutions; there's this other approach, which is they're kind of co-opting solutions, saying we share the solutions, we're part of the solution.

Speaker 2: Yes. Greenwashing is another form of climate misinformation, particularly from industry, and it's a hard one. Often you need a lot of background information in order to fact-check whether what they're doing is actually helping the environment or whether it's just token behavior in order to portray themselves as being environmental when actually they're being quite destructive. But generally speaking, the strategy to counter greenwashing is the same as the strategy to counter other forms of climate misinformation. Learn the techniques and become familiar with them, so that when you see them in some corporate advertising, that's a red flag. The techniques of greenwashing, just a few of them that come immediately to mind, are vague or kind of meaningless terms. So they would say it's environmentally friendly, or they'll just use colors or imagery or environmental-sounding words, but it's all very loosey-goosey. The other red flag is when the main bread and butter of a company's business is environmentally destructive, and then they talk about something they're doing that's environmentally positive. Usually in those cases, what they're spending on this environmental activity is a tiny, tiny fraction of their overall budget.

Speaker 1: Yeah, that's the magnified minority: we're spending millions on renewables, but they're spending tens of billions on fossil fuels. Amy Westervelt reported elsewhere in the show that freedom of speech is often viewed as sacrosanct, we all know that, and has been used by fossil fuel companies as cover for their misinformation campaigns. What are the bigger implications of that in terms of free speech?

Speaker 2: It really depends on the specific situation.
But generally speaking, my policy, or my approach, is that the antidote to bad speech is more speech, or good speech, and that is kind of the principle that informs building public resilience to misinformation, so helping people to see through these false arguments from fossil fuel companies or other sources of misinformation.

Speaker 1: Right, and we've seen lots of attacks on science, and there seems to be, you know, it's related to the distrust of institutions, and that's certainly been rampant during the COVID-nineteen pandemic and has led to real harm and even the death of some vocal anti-vaxxers. Have you done research on whether personal experience affects people's receptivity to these myths, if they know someone who's been affected by climate or know someone who's been affected by COVID?

Speaker 2: There's some interesting research done by my colleagues at George Mason University, led by Ed Maibach. They looked at how personal experience can influence people's perceptions about climate change. And the way to think about this: imagine there are three segments of society. There's the alarmed and concerned, people who are on board about climate change. There's the dismissives at the other end, and then there's the mushy middle, the undecideds in the middle. What they found was personal experience of climate change doesn't affect the two groups at the ends. The people who are alarmed stay alarmed; the people who are dismissive stay dismissive. It's the people who are undecided in the middle, when they have personal experience with climate-related events like increasing extreme weather, those are the ones whose perceptions about climate change shift.

Speaker 1: You worked with Facebook to help them combat misinformation. What does that work look like, particularly in the climate realm?
Speaker 2: So Facebook launched the Climate Science Center, and initially the Climate Science Center was just about providing authoritative, reliable, accurate facts about climate change, and this was done in response to a lot of criticism they received about letting misinformation spread on their platform. And a lot of people were critical that this was not enough, including myself. I didn't have any association with them at the time, so I was quite blunt in saying producing facts while letting misinformation spread was like poisoning someone and then giving them a pamphlet about vegetables. But to their credit, they always recognized that just producing the Climate Science Center was a first step, and their intent was to gradually ratchet up their ambition and proactiveness in taking on climate misinformation. So their next step was to work with myself and two other climate communication researchers, Tony Leiserowitz and Sander van der Linden, and we went through the process of looking at the most common myths about climate change, and then we advised them on how to write debunkings of them. It's important when you're debunking misinformation not just to explain the facts, although that's crucially important, but also to explain the technique used to distort the facts. So fact, myth, fallacy, fact is a general structure that we recommend for debunking misinformation, and they used that. So we produced those with them, debunking the most common myths about climate change. Since then, it's been an ongoing collaboration, and they're still looking at other ways to use their platform to counter misinformation. It's been slow, slower than I would have liked, but there has been incremental progress. Misinformation is a really complicated problem. It involves psychology, culture, technology, science, a whole range of different factors, and we need to be throwing a lot of different tools at it.

Speaker 1: John Cook, thanks for sharing your insights on how to identify misinformation and how to respond to it.
564 00:37:00,000 --> 00:37:01,319 Speaker 2: Thanks Greg, it was great to talk to you. 565 00:37:01,280 --> 00:37:05,600 Speaker 1: Coming up: the implications of podcasts not being regulated 566 00:37:05,600 --> 00:37:07,880 Speaker 1: the same way as other types of media. 567 00:37:08,600 --> 00:37:12,279 Speaker 3: Every person who's putting out a podcast, it's up to 568 00:37:12,360 --> 00:37:17,640 Speaker 3: them entirely what their process is for fact checking or 569 00:37:18,000 --> 00:37:24,080 Speaker 3: any kind of backstop there on truthfulness, and we've really 570 00:37:24,120 --> 00:37:27,360 Speaker 3: seen in the last few years what that can lead to. 571 00:37:27,960 --> 00:37:37,840 Speaker 1: That's up next, when Climate One continues. We're talking about 572 00:37:37,880 --> 00:37:42,040 Speaker 1: climate misinformation. This week's show is a special collaboration with 573 00:37:42,080 --> 00:37:46,200 Speaker 1: Amy Westervelt, an award winning print and audio journalist. She's 574 00:37:46,280 --> 00:37:49,960 Speaker 1: founder of the Critical Frequency podcast network, which includes her 575 00:37:50,000 --> 00:37:54,520 Speaker 1: own show Drilled, a true crime style podcast about climate change. 576 00:37:54,960 --> 00:37:56,759 Speaker 1: I asked her to join me to reflect on what 577 00:37:56,760 --> 00:37:59,480 Speaker 1: we've heard so far in this show about ways fossil 578 00:37:59,480 --> 00:38:04,040 Speaker 1: fuel companies spread misinformation. Amy, welcome to Climate One. I'm 579 00:38:04,080 --> 00:38:05,319 Speaker 1: excited to be talking with you. 580 00:38:05,880 --> 00:38:08,160 Speaker 3: I'm super excited to be here. Thanks for having me. 581 00:38:08,480 --> 00:38:08,680 Speaker 1: Well. 582 00:38:08,719 --> 00:38:12,239 Speaker 1: Reflecting on your piece that opened this episode, it was 583 00:38:12,280 --> 00:38:15,040 Speaker 1: real interesting, and I've been thinking about the distinction between 584 00:38:15,040 --> 00:38:20,640 Speaker 1: commercial speech and political speech. Fossil fuels have amazing energy 585 00:38:20,680 --> 00:38:23,479 Speaker 1: density and they enable us to live our lives every day, 586 00:38:23,640 --> 00:38:26,719 Speaker 1: and fossil fuels are killing the natural systems that we 587 00:38:26,800 --> 00:38:30,480 Speaker 1: rely on every day. Both are true. Where do you 588 00:38:30,560 --> 00:38:33,840 Speaker 1: see the line between commercial speech and political speech that 589 00:38:33,880 --> 00:38:35,480 Speaker 1: you talked about in that opening segment?
So 598 00:39:06,200 --> 00:39:08,319 Speaker 3: a lot of people kind of questioned that and said, well, 599 00:39:08,480 --> 00:39:11,200 Speaker 3: where do you draw the line? What about airplanes and 600 00:39:11,239 --> 00:39:14,480 Speaker 3: air travel and cars? And if it has to do 601 00:39:14,600 --> 00:39:20,040 Speaker 3: with climate change, or a product's impact on the 602 00:39:20,160 --> 00:39:23,560 Speaker 3: environment or the world at large, you could theoretically just 603 00:39:23,680 --> 00:39:26,680 Speaker 3: get rid of any kind of advertiser. And they 604 00:39:26,800 --> 00:39:29,600 Speaker 3: had this very succinct response to that, which was, well, 605 00:39:30,280 --> 00:39:33,040 Speaker 3: all of those other categories are selling a product, and 606 00:39:33,080 --> 00:39:36,440 Speaker 3: the fossil fuel industry is very much selling ideas and 607 00:39:36,520 --> 00:39:41,680 Speaker 3: policy positions. They don't advertise gas anymore. Nobody really chooses 608 00:39:41,719 --> 00:39:46,400 Speaker 3: their gasoline based on brand, right? It's a commodity. So 609 00:39:46,520 --> 00:39:50,359 Speaker 3: to me, there's a very, very clear difference between the 610 00:39:50,400 --> 00:39:53,240 Speaker 3: way that the fossil fuel industry has advertised for the last 611 00:39:53,280 --> 00:39:58,319 Speaker 3: ten, fifteen years and the way most other industries advertise. 612 00:39:58,800 --> 00:40:01,600 Speaker 3: That is a pretty good illustration of the difference between 613 00:40:01,600 --> 00:40:03,359 Speaker 3: commercial and political speech. 614 00:40:03,560 --> 00:40:06,799 Speaker 1: Right, and then that advertising often gets into its branding. 615 00:40:06,880 --> 00:40:09,440 Speaker 1: And you know, another part of your segment that was 616 00:40:09,480 --> 00:40:14,280 Speaker 1: real interesting is former Mobil VP of Public Affairs Herb Schmertz 617 00:40:14,320 --> 00:40:17,759 Speaker 1: remarking on the company's personality and its participation in the 618 00:40:17,800 --> 00:40:21,960 Speaker 1: marketplace of ideas. Of course, these days, corporations are often 619 00:40:22,000 --> 00:40:26,640 Speaker 1: invited into public discourse by partisans and advocates. Disney just called 620 00:40:26,640 --> 00:40:31,439 Speaker 1: for rescinding Florida's Don't Say Gay law, to which I said, yay. Yeah, 621 00:40:31,480 --> 00:40:34,000 Speaker 1: so oil companies are not the only ones shaping their 622 00:40:34,000 --> 00:40:37,480 Speaker 1: image in the public policy sphere. So you know, 623 00:40:37,560 --> 00:40:39,680 Speaker 1: why is it bad for oil companies to do what 624 00:40:39,760 --> 00:40:41,640 Speaker 1: Disney and others are doing? 625 00:40:42,239 --> 00:40:45,520 Speaker 3: I personally believe it's not great for any of them 626 00:40:45,560 --> 00:40:48,560 Speaker 3: to do it, actually. Whether we agree with them or 627 00:40:48,640 --> 00:40:53,120 Speaker 3: not is sort of irrelevant. I think the invitation of 628 00:40:53,400 --> 00:40:59,919 Speaker 3: corporations into the public square has been a real problem 629 00:41:00,000 --> 00:41:04,279 Speaker 3: in America since the seventies.
And I think that you see 630 00:41:04,320 --> 00:41:08,319 Speaker 3: this right in this history of Mobil kind of involving 631 00:41:08,360 --> 00:41:11,799 Speaker 3: itself with this, that it was very much like, we 632 00:41:11,880 --> 00:41:16,320 Speaker 3: need to maintain this position in society to be able 633 00:41:16,400 --> 00:41:23,480 Speaker 3: to effectively lobby both the public and politicians for these 634 00:41:24,600 --> 00:41:27,640 Speaker 3: kinds of policies that we want, to control the narrative 635 00:41:27,680 --> 00:41:31,680 Speaker 3: about what's happening, you know, with our industry. It's actually 636 00:41:31,680 --> 00:41:34,080 Speaker 3: a really interesting time to be talking about this because 637 00:41:34,760 --> 00:41:37,520 Speaker 3: very similar things were happening then that are happening now, 638 00:41:37,840 --> 00:41:41,640 Speaker 3: where, you know, gas prices were high, and the 639 00:41:41,680 --> 00:41:44,080 Speaker 3: oil companies were saying, it's not our fault, it's the 640 00:41:44,120 --> 00:41:47,120 Speaker 3: government's fault, and there was this kind of, you know, 641 00:41:47,280 --> 00:41:49,960 Speaker 3: jockeying for control of the story. 642 00:41:50,520 --> 00:41:52,960 Speaker 1: When I heard that piece that you did, I thought of, 643 00:41:53,000 --> 00:41:57,600 Speaker 1: you know, the status of corporations as individuals, as legal individuals, 644 00:41:57,640 --> 00:41:59,960 Speaker 1: which is kind of another extension of what you pointed out. 645 00:42:00,280 --> 00:42:03,000 Speaker 1: They have First Amendment rights and they are an individual, 646 00:42:03,040 --> 00:42:07,120 Speaker 1: which made me think immediately of Stephen Colbert's line a 647 00:42:07,160 --> 00:42:10,520 Speaker 1: while back where he said, I'll believe that corporations are 648 00:42:10,560 --> 00:42:14,640 Speaker 1: individuals when Texas executes one. 649 00:42:15,880 --> 00:42:19,480 Speaker 3: Yes, yeah, exactly. And I mean Herb Schmertz's whole personality 650 00:42:19,520 --> 00:42:22,919 Speaker 3: thing really played into that too, this idea of like, oh, 651 00:42:22,960 --> 00:42:28,880 Speaker 3: if we imbue corporations with a personality and an opinion 652 00:42:29,200 --> 00:42:33,200 Speaker 3: and a soul, like a morality too, right, then that 653 00:42:33,480 --> 00:42:35,919 Speaker 3: makes it easier for us to convince the public that, 654 00:42:36,280 --> 00:42:38,920 Speaker 3: you know, we're good faith actors, we care about more 655 00:42:38,920 --> 00:42:41,680 Speaker 3: than just our bottom line. But the reality is that 656 00:42:42,040 --> 00:42:43,600 Speaker 3: they don't have to play by the same rules as 657 00:42:43,640 --> 00:42:46,359 Speaker 3: any other member of the community. You know, nobody else 658 00:42:46,400 --> 00:42:51,040 Speaker 3: gets as many benefits as corporations do in the realm 659 00:42:51,239 --> 00:42:52,520 Speaker 3: where they're not humans. 660 00:42:53,160 --> 00:42:55,399 Speaker 1: And I'm curious, you know, we also had John 661 00:42:55,480 --> 00:42:59,440 Speaker 1: Cook talking about deconstructing climate myths, how to identify and 662 00:42:59,480 --> 00:43:02,720 Speaker 1: respond to misinformation. What struck you about his piece? 663 00:43:03,239 --> 00:43:06,200 Speaker 3: I find him so interesting every time I read anything 664 00:43:06,239 --> 00:43:08,840 Speaker 3: of his or listen to him talk about this stuff.
665 00:43:09,080 --> 00:43:12,240 Speaker 3: So the thing that struck me this time was this notion, 666 00:43:12,360 --> 00:43:14,600 Speaker 3: and it wasn't necessarily new to me, but 667 00:43:14,719 --> 00:43:17,000 Speaker 3: just the way he phrased it: you know, we 668 00:43:17,040 --> 00:43:20,240 Speaker 3: had like twenty years of science denial and then twenty 669 00:43:20,320 --> 00:43:25,160 Speaker 3: years of solutions denial. It's a very straightforward way to 670 00:43:25,239 --> 00:43:28,359 Speaker 3: understand it. And you really have seen that over 671 00:43:28,400 --> 00:43:31,400 Speaker 3: the years, of, you know, okay, now we believe the science, like, 672 00:43:31,480 --> 00:43:34,480 Speaker 3: let's focus on these other things that will help us 673 00:43:34,480 --> 00:43:39,400 Speaker 3: delay policy and regulation and, you know, allow us to 674 00:43:39,480 --> 00:43:41,759 Speaker 3: kind of get as much out of these assets as 675 00:43:41,760 --> 00:43:44,279 Speaker 3: we can before we have to retire them, you know, 676 00:43:45,040 --> 00:43:47,160 Speaker 3: which is the name of the game. And, like, to 677 00:43:47,200 --> 00:43:51,640 Speaker 3: be honest, I don't necessarily blame oil companies for doing that. 678 00:43:51,680 --> 00:43:57,000 Speaker 3: They're doing what corporations are encouraged and incentivized and enabled 679 00:43:57,000 --> 00:44:00,160 Speaker 3: to do, right? So I am kind of of the 680 00:44:00,200 --> 00:44:02,160 Speaker 3: opinion of, well, if we want it to change, then we 681 00:44:02,280 --> 00:44:06,320 Speaker 3: have to change the rules so that, you know, they 682 00:44:06,400 --> 00:44:10,319 Speaker 3: can't do these things that are beneficial to them and 683 00:44:10,760 --> 00:44:13,799 Speaker 3: kind of impose a lot of liabilities and risks on 684 00:44:13,840 --> 00:44:14,640 Speaker 3: the rest of us. 685 00:44:15,360 --> 00:44:18,239 Speaker 1: The idea of solutions denial also, for me, 686 00:44:18,560 --> 00:44:21,799 Speaker 1: was really clarifying and crystallizing, when John said we've moved 687 00:44:21,800 --> 00:44:25,759 Speaker 1: from climate misinformation to solutions misinformation. And it made me think of 688 00:44:26,239 --> 00:44:29,880 Speaker 1: how many times I've heard people question what happens to 689 00:44:29,960 --> 00:44:33,400 Speaker 1: EV batteries after the useful life of the car. And 690 00:44:33,440 --> 00:44:35,680 Speaker 1: I've heard that so many times that I'm suspicious of 691 00:44:35,719 --> 00:44:38,719 Speaker 1: where that's being seeded and how that's being seeded. Like, 692 00:44:38,880 --> 00:44:41,719 Speaker 1: did those people all come up with that organically, or, 693 00:44:41,840 --> 00:44:44,760 Speaker 1: you know, is there some Facebook post somewhere, 694 00:44:44,800 --> 00:44:47,960 Speaker 1: some campaign to, like, sow doubt about EV batteries at 695 00:44:48,000 --> 00:44:50,200 Speaker 1: the end of life? And we know that that is 696 00:44:50,239 --> 00:44:52,960 Speaker 1: a solvable problem. And now there's a new company, 697 00:44:53,000 --> 00:44:55,160 Speaker 1: founded by a founder of Tesla, that's going 698 00:44:55,200 --> 00:44:57,279 Speaker 1: to try to harvest those batteries, et cetera. 699 00:44:57,440 --> 00:45:01,320 Speaker 3: Yeah, it's... I don't know.
It's so tough 700 00:45:01,360 --> 00:45:06,880 Speaker 3: because I think there are important and nuanced conversations to 701 00:45:06,960 --> 00:45:11,359 Speaker 3: be had about, you know, some of the unintended 702 00:45:11,360 --> 00:45:15,759 Speaker 3: consequences of the solutions to climate change too, right? Like, 703 00:45:15,840 --> 00:45:18,560 Speaker 3: I know there's a whole lot going on around 704 00:45:18,640 --> 00:45:22,279 Speaker 3: lithium mining right now, and yeah, you know, those 705 00:45:22,280 --> 00:45:25,680 Speaker 3: are very important conversations to be having. We don't 706 00:45:25,719 --> 00:45:29,360 Speaker 3: want to go into, you know, the next energy generation 707 00:45:29,600 --> 00:45:33,040 Speaker 3: with the same exact mindset except for, like, the source 708 00:45:33,080 --> 00:45:35,600 Speaker 3: of energy, or else we're going to repeat, you know, 709 00:45:35,719 --> 00:45:37,720 Speaker 3: the same mistakes and end up with a new problem. 710 00:45:37,800 --> 00:45:42,759 Speaker 3: Right? And unfortunately, it's like you almost can't have those 711 00:45:42,800 --> 00:45:48,920 Speaker 3: conversations without it being weaponized by, you know, people 712 00:45:49,400 --> 00:45:52,080 Speaker 3: who don't want to see climate policy or who don't 713 00:45:52,120 --> 00:45:56,319 Speaker 3: want to see an energy transition, to say, oh, see there, 714 00:45:56,520 --> 00:45:58,880 Speaker 3: you know, there's problems with this too. There's problems with 715 00:45:58,920 --> 00:46:01,560 Speaker 3: all of it. The other thing I thought was interesting 716 00:46:01,640 --> 00:46:04,560 Speaker 3: in your interview with John was where he talks about how 717 00:46:05,160 --> 00:46:09,080 Speaker 3: there's a real focus on kind of weaponizing the human 718 00:46:09,160 --> 00:46:14,040 Speaker 3: tendency towards tribal identity and sticking with our group and 719 00:46:14,080 --> 00:46:16,440 Speaker 3: digging into our opinions and those kinds of things, because 720 00:46:16,520 --> 00:46:20,640 Speaker 3: you really see that in the fracturing of the climate 721 00:46:20,680 --> 00:46:24,680 Speaker 3: movement too, not just like climate people versus people who 722 00:46:24,719 --> 00:46:27,360 Speaker 3: don't think we should act on climate, or centrists versus 723 00:46:27,400 --> 00:46:30,720 Speaker 3: progressives or whatever. Like, even in these things like should 724 00:46:30,760 --> 00:46:35,120 Speaker 3: we or should we not mine lithium for electrification, it's 725 00:46:35,160 --> 00:46:40,120 Speaker 3: like people can't even have, you know, a remotely nuanced 726 00:46:40,160 --> 00:46:41,239 Speaker 3: conversation about it. 727 00:46:41,600 --> 00:46:43,480 Speaker 1: And you know, a lot of this is playing out 728 00:46:43,560 --> 00:46:47,120 Speaker 1: in the podcasting space. And there was a big 729 00:46:47,480 --> 00:46:50,080 Speaker 1: dust-up recently with, you know, Joe Rogan. He's been 730 00:46:50,520 --> 00:46:53,959 Speaker 1: peddling both climate and COVID misinformation for a long time, 731 00:46:54,000 --> 00:46:57,200 Speaker 1: but it's COVID that got him in trouble. And then 732 00:46:57,600 --> 00:47:00,719 Speaker 1: you tweeted recently that he interviewed Michael Shellenberger, who I've 733 00:47:00,760 --> 00:47:04,920 Speaker 1: interviewed numerous times, who's running for governor of California.
734 00:47:05,000 --> 00:47:08,960 Speaker 1: So how does he embody kind of the evolution from 735 00:47:09,520 --> 00:47:13,640 Speaker 1: science misinformation to solutions misinformation? And what did you think 736 00:47:13,640 --> 00:47:16,800 Speaker 1: when you saw Rogan and Shellenberger together in that photo? 737 00:47:17,120 --> 00:47:19,000 Speaker 3: It was so interesting because I was 738 00:47:19,200 --> 00:47:22,640 Speaker 3: just listening to your conversation with John, and then I 739 00:47:22,680 --> 00:47:25,719 Speaker 3: saw that that interview had happened, and I 740 00:47:25,760 --> 00:47:27,640 Speaker 3: was like, oh, this is like a perfect example of 741 00:47:27,680 --> 00:47:31,120 Speaker 3: this evolution, because Joe Rogan has, you know, 742 00:47:31,239 --> 00:47:35,640 Speaker 3: interviewed lots of kind of garden variety climate deniers who, 743 00:47:36,480 --> 00:47:39,360 Speaker 3: you know, will say, actually, CO2 is good for us in 744 00:47:39,360 --> 00:47:42,799 Speaker 3: the atmosphere, and things of that nature. And now he's 745 00:47:42,880 --> 00:47:50,399 Speaker 3: graduated to Shellenberger, who, you know, likes 746 00:47:50,440 --> 00:47:54,279 Speaker 3: to kind of burnish his environmental credentials and say, I 747 00:47:54,320 --> 00:47:58,719 Speaker 3: was an environmental activist, and now I'm apologizing to 748 00:47:58,800 --> 00:48:02,800 Speaker 3: the world for all the alarmism that we created about 749 00:48:02,840 --> 00:48:06,160 Speaker 3: climate change, and yes it's a problem, but we don't 750 00:48:06,160 --> 00:48:10,120 Speaker 3: need to actually make any drastic changes. And he's very, very 751 00:48:10,239 --> 00:48:13,520 Speaker 3: much a big user of one of the strategies that 752 00:48:13,600 --> 00:48:18,040 Speaker 3: John Cook talked about, which is cherry picking data. I 753 00:48:18,080 --> 00:48:21,480 Speaker 3: went through his book when it came out and found, 754 00:48:21,680 --> 00:48:27,799 Speaker 3: I think, three thousand examples of cherry picked data that 755 00:48:28,000 --> 00:48:31,080 Speaker 3: was, like, making a very flawed argument. 756 00:48:31,719 --> 00:48:33,279 Speaker 1: So how much of this is, you know... you and 757 00:48:33,320 --> 00:48:36,600 Speaker 1: I both came out of, you know, traditional news backgrounds. 758 00:48:36,680 --> 00:48:38,680 Speaker 1: You know, when I worked at the AP, there was 759 00:48:38,719 --> 00:48:40,839 Speaker 1: a saying that if you think your mother loves you, 760 00:48:40,840 --> 00:48:43,560 Speaker 1: you better call and confirm that it's still true, you know. 761 00:48:43,760 --> 00:48:49,280 Speaker 1: But in Podland, the realm of podcasts, 762 00:48:49,320 --> 00:48:53,120 Speaker 1: you know, those traditional rules don't apply. 763 00:48:53,239 --> 00:48:56,919 Speaker 1: They're regulated differently. So, you know, talk about 764 00:48:57,080 --> 00:49:01,399 Speaker 1: how much of this is a real function of, you know, 765 00:49:02,080 --> 00:49:07,440 Speaker 1: the surge of podcasting, and how can Podland avoid becoming 766 00:49:07,480 --> 00:49:09,520 Speaker 1: the cesspool that is social media? 767 00:49:10,320 --> 00:49:15,160 Speaker 3: Yeah, I'm actually very concerned about this, because it is 768 00:49:15,200 --> 00:49:18,040 Speaker 3: governed by the exact same rules as Twitter and Facebook. 769 00:49:18,840 --> 00:49:21,759 Speaker 3: Podcasts are.
But I think the public thinks of podcasts 770 00:49:21,800 --> 00:49:27,560 Speaker 3: as being media, right, and therefore governed by media rules 771 00:49:27,600 --> 00:49:31,560 Speaker 3: like websites or newspapers or whatever, and it's not, actually. 772 00:49:32,239 --> 00:49:36,200 Speaker 3: So every person who's putting out a podcast, it's 773 00:49:36,280 --> 00:49:41,560 Speaker 3: up to them entirely what their process is for fact checking. 774 00:49:41,560 --> 00:49:42,440 Speaker 3: Or, you know. 775 00:49:42,600 --> 00:49:45,640 Speaker 3: I mean, there are some basic consumer protection laws, but 776 00:49:46,040 --> 00:49:53,000 Speaker 3: in terms of any kind of backstop there on truthfulness, no, 777 00:49:53,200 --> 00:49:57,120 Speaker 3: it's kind of up to each organization. And we've really 778 00:49:57,160 --> 00:50:00,400 Speaker 3: seen in the last few years what that can lead to. 779 00:50:00,880 --> 00:50:04,520 Speaker 3: Joe Rogan is a perfect example. You know, he 780 00:50:04,640 --> 00:50:07,399 Speaker 3: kind of takes this approach that, well, I'm 781 00:50:07,440 --> 00:50:11,040 Speaker 3: just sharing my opinion. And the problem there is that 782 00:50:11,239 --> 00:50:16,640 Speaker 3: when you're sharing your opinion and it sounds like expertise, 783 00:50:17,360 --> 00:50:19,920 Speaker 3: then it can be very misleading to people. 784 00:50:20,080 --> 00:50:24,600 Speaker 1: You know, so maybe he learned from me, maybe he took 785 00:50:24,600 --> 00:50:25,239 Speaker 1: a page from me. 786 00:50:26,080 --> 00:50:31,040 Speaker 3: Yeah, yeah, exactly. So I think that, you know, I'm 787 00:50:31,040 --> 00:50:35,400 Speaker 3: not a fan of censorship, and I think also, like, 788 00:50:35,440 --> 00:50:37,040 Speaker 3: the horse is kind of out of the barn. Like, 789 00:50:37,080 --> 00:50:40,319 Speaker 3: you're not gonna, you know, go back in time and 790 00:50:40,600 --> 00:50:44,000 Speaker 3: set up rules that the podcast industry, you know, 791 00:50:44,719 --> 00:50:49,399 Speaker 3: can live by, necessarily. But I do think that there's 792 00:50:49,400 --> 00:50:53,400 Speaker 3: an argument to be made to bring podcasts under FCC 793 00:50:53,960 --> 00:50:58,520 Speaker 3: regulations instead of the Federal Trade Commission, the FTC, which it's 794 00:50:58,600 --> 00:51:02,520 Speaker 3: under now. Not that the FCC is perfect, obviously; we 795 00:51:02,600 --> 00:51:06,880 Speaker 3: see tons of misinformation on cable news, for example, as well, 796 00:51:07,320 --> 00:51:12,400 Speaker 3: but there's at least some amount of more proactive action 797 00:51:12,640 --> 00:51:16,239 Speaker 3: to try to curb that there. And the other thing is, 798 00:51:16,520 --> 00:51:19,439 Speaker 3: I do think that you're seeing the industry itself start 799 00:51:19,560 --> 00:51:23,040 Speaker 3: to sort of take somewhat of a turn. You're 800 00:51:23,040 --> 00:51:25,200 Speaker 3: always going to have these kind of rogue actors, but 801 00:51:25,760 --> 00:51:29,080 Speaker 3: the podcast platforms are thinking about, you know, what can 802 00:51:29,120 --> 00:51:34,200 Speaker 3: we do to sort of let the more high quality 803 00:51:34,320 --> 00:51:38,640 Speaker 3: reported stuff rise to the top and highlight that stuff 804 00:51:38,760 --> 00:51:43,359 Speaker 3: versus some guy in his mom's garage, you know. And 805 00:51:43,920 --> 00:51:47,440 Speaker 3: companies are starting to hire fact checkers more and more too.
806 00:51:47,480 --> 00:51:49,839 Speaker 3: This has, like, become... just in the last couple of years, 807 00:51:49,880 --> 00:51:52,520 Speaker 3: I've seen a major, major shift, where before I had to 808 00:51:52,600 --> 00:51:56,319 Speaker 3: really convince people that that was worth spending money on. 809 00:51:56,640 --> 00:51:59,040 Speaker 3: And because of the FTC thing, the other problem with 810 00:51:59,080 --> 00:52:02,160 Speaker 3: podcasts is that the ads don't go by the same rules. 811 00:52:02,480 --> 00:52:05,600 Speaker 1: So we've struggled with that, how you, like, fact 812 00:52:05,680 --> 00:52:07,960 Speaker 1: check the ads that come on your platform. The Daily 813 00:52:08,000 --> 00:52:11,360 Speaker 1: got called out for some things on natural gas, I 814 00:52:11,400 --> 00:52:15,400 Speaker 1: think it was, that did not quite meet the 815 00:52:15,440 --> 00:52:16,800 Speaker 1: standards of the New York Times. 816 00:52:16,840 --> 00:52:18,080 Speaker 3: Yeah, right, exactly. 817 00:52:18,200 --> 00:52:18,759 Speaker 1: Yeah. 818 00:52:18,880 --> 00:52:22,960 Speaker 3: The fact that you saw this explosion in oil companies 819 00:52:23,000 --> 00:52:26,919 Speaker 3: in particular advertising in podcasts a few years ago, it's 820 00:52:27,000 --> 00:52:29,600 Speaker 3: for a reason. You know, they don't spend 821 00:52:29,640 --> 00:52:32,799 Speaker 3: money on stuff just to try things out. They're 822 00:52:32,920 --> 00:52:35,719 Speaker 3: very smart and very strategic. So if, you know, if 823 00:52:35,760 --> 00:52:39,760 Speaker 3: they're investing a lot there and in social media ads, 824 00:52:39,800 --> 00:52:43,000 Speaker 3: it's because they have more control over the story there. 825 00:52:43,360 --> 00:52:45,799 Speaker 1: Right, which all gets to the need for the educated, 826 00:52:46,000 --> 00:52:49,520 Speaker 1: discerning public to sort of, you know, check ourselves and ask 827 00:52:49,680 --> 00:52:53,680 Speaker 1: what's the difference between the editorial, the podcast, the advertisement. 828 00:52:54,080 --> 00:52:55,960 Speaker 1: One of the themes running through this is the narrative 829 00:52:55,960 --> 00:52:59,520 Speaker 1: of personal responsibility, both for climate action and, I guess, 830 00:52:59,560 --> 00:53:03,960 Speaker 1: for the information we take in, versus corporate responsibility: 831 00:53:04,440 --> 00:53:09,360 Speaker 1: corporate responsibility for climate, and corporate or producer responsibility for 832 00:53:09,480 --> 00:53:14,880 Speaker 1: media and for energy. You know, BP popularized the idea 833 00:53:14,880 --> 00:53:18,480 Speaker 1: of the personal carbon footprint twenty years ago, and I 834 00:53:18,600 --> 00:53:21,279 Speaker 1: respect your work a lot, and you have really gone 835 00:53:21,280 --> 00:53:25,680 Speaker 1: after, you know, the villains, energy companies, energy suppliers, as 836 00:53:25,840 --> 00:53:31,319 Speaker 1: villains in the climate story. And I've also pursued the 837 00:53:31,400 --> 00:53:35,080 Speaker 1: limitations and the truth of personal responsibility. And I 838 00:53:35,160 --> 00:53:38,480 Speaker 1: want to play a clip from Britt Wray, who's a 839 00:53:38,520 --> 00:53:42,080 Speaker 1: researcher at Stanford University, who had this to say.
840 00:53:42,640 --> 00:53:44,960 Speaker 15: It's a huge part of a lot of activist rhetoric 841 00:53:45,000 --> 00:53:49,480 Speaker 15: that we shouldn't be focusing on our individual minuscule impact 842 00:53:49,760 --> 00:53:53,280 Speaker 15: in relation to who's out there really spreading the damage, 843 00:53:53,320 --> 00:53:55,840 Speaker 15: you know, the fossil fuel companies, the corrupt politicians, the 844 00:53:55,840 --> 00:53:59,360 Speaker 15: lobbyists, et cetera, that are fueling the damage as we 845 00:53:59,400 --> 00:54:04,040 Speaker 15: speak and have for decades. And I really think that 846 00:54:03,440 --> 00:54:06,600 Speaker 15: that is, of course, true on an intellectual level in 847 00:54:06,640 --> 00:54:09,680 Speaker 15: many ways, but there's also perhaps a propulsion to turn 848 00:54:09,719 --> 00:54:13,319 Speaker 15: away from looking within because it brings up shame, it 849 00:54:13,320 --> 00:54:17,160 Speaker 15: brings up guilt, it brings up intolerable emotions that produce 850 00:54:17,200 --> 00:54:18,640 Speaker 15: a bunch of defensive reactions. 851 00:54:19,040 --> 00:54:23,160 Speaker 1: That's Britt Wray, who has a PhD in climate science communication. 852 00:54:23,280 --> 00:54:26,080 Speaker 1: So I'm going to ask you: can villainization 853 00:54:26,280 --> 00:54:30,400 Speaker 1: sometimes be easier than looking at ourselves and our 854 00:54:30,440 --> 00:54:31,239 Speaker 1: own complicity? 855 00:54:31,880 --> 00:54:35,200 Speaker 3: I definitely think it can be, and I also agree 856 00:54:35,200 --> 00:54:40,400 Speaker 3: with Britt. And I also think that there's, I'm going 857 00:54:40,440 --> 00:54:42,840 Speaker 3: to sound like a broken record here again, a real 858 00:54:42,920 --> 00:54:46,760 Speaker 3: need for nuance in the conversation around personal responsibility, because 859 00:54:47,000 --> 00:54:51,800 Speaker 3: the reality is that the top ten percent of consumers globally, 860 00:54:51,840 --> 00:54:56,279 Speaker 3: which most Americans fall into, are responsible for a much 861 00:54:56,360 --> 00:55:00,919 Speaker 3: larger proportion of global CO2 emissions than everyone else 862 00:55:00,960 --> 00:55:03,640 Speaker 3: in the world. Right? So I absolutely think that we 863 00:55:03,680 --> 00:55:07,759 Speaker 3: should look at that and take responsibility for, you know, 864 00:55:07,800 --> 00:55:11,279 Speaker 3: the ways that we're contributing to that. I 865 00:55:11,320 --> 00:55:14,480 Speaker 3: also think, just as a human, it feels better to 866 00:55:14,640 --> 00:55:18,840 Speaker 3: live according to your values than not, you know, on 867 00:55:18,960 --> 00:55:22,120 Speaker 3: a real basic level. And I think also 868 00:55:22,200 --> 00:55:27,280 Speaker 3: that there's something to be said for individual action beyond consumerism. 869 00:55:27,400 --> 00:55:30,000 Speaker 3: This is something that really bugs me, that the personal 870 00:55:30,000 --> 00:55:32,920 Speaker 3: responsibility stuff always gets whittled down to what we buy, 871 00:55:33,120 --> 00:55:36,320 Speaker 3: right, or how we travel. 872 00:55:36,320 --> 00:55:36,640 Speaker 1: You know. 873 00:55:37,160 --> 00:55:41,520 Speaker 3: But, like, individual action can also be civic action. It 874 00:55:41,520 --> 00:55:45,719 Speaker 3: can be political organizing, it can be finding ways to 875 00:55:46,960 --> 00:55:50,560 Speaker 3: make your community more resilient. It can be mutual aid.
876 00:55:51,080 --> 00:55:52,719 Speaker 3: There are lots and lots and lots of things that 877 00:55:52,760 --> 00:55:56,279 Speaker 3: have nothing to do with buying different stuff that are 878 00:55:56,320 --> 00:55:59,080 Speaker 3: individual actions that are very important and that are a 879 00:55:59,120 --> 00:56:01,960 Speaker 3: critical part of how we not only address this 880 00:56:01,960 --> 00:56:03,959 Speaker 3: problem but actually survive it, you know. 881 00:56:04,520 --> 00:56:07,879 Speaker 1: Right. And yes, Bill McKibben said years ago the most 882 00:56:07,880 --> 00:56:10,920 Speaker 1: important thing an individual can do is not act as 883 00:56:10,960 --> 00:56:14,080 Speaker 1: an individual. And recently I've been thinking the best thing 884 00:56:14,120 --> 00:56:16,239 Speaker 1: you can do is have relationships and make this part 885 00:56:16,239 --> 00:56:19,759 Speaker 1: of your life and your relationships, whoever those relationships are with, 886 00:56:20,400 --> 00:56:21,880 Speaker 1: to make climate part of it. 887 00:56:22,360 --> 00:56:25,120 Speaker 3: Yeah. And I know Katharine Hayhoe talks a lot about 888 00:56:25,680 --> 00:56:28,839 Speaker 3: the power of talking to other people about this, not 889 00:56:29,000 --> 00:56:32,040 Speaker 3: just in the vein of, you know, persuading people to 890 00:56:32,120 --> 00:56:34,160 Speaker 3: see your point of view or things like that, but 891 00:56:34,320 --> 00:56:39,799 Speaker 3: just to create community, to, like, actually help with 892 00:56:39,920 --> 00:56:43,400 Speaker 3: processing those feelings of shame and fear and anxiety and 893 00:56:43,480 --> 00:56:45,359 Speaker 3: grief and all of those things that come up with 894 00:56:45,400 --> 00:56:47,600 Speaker 3: this too. Like, you can't do that alone. You need 895 00:56:48,040 --> 00:56:50,480 Speaker 3: to talk to people, you know, but you speaking to 896 00:56:50,480 --> 00:56:54,240 Speaker 3: someone else about it can absolutely help them to feel 897 00:56:54,560 --> 00:56:57,000 Speaker 3: more like they're able to kind of work through that 898 00:56:57,040 --> 00:57:00,680 Speaker 3: stuff and get to a place where they can. To me, 899 00:57:00,760 --> 00:57:03,840 Speaker 3: it's actually not about finding villains at all. It's about 900 00:57:04,320 --> 00:57:07,239 Speaker 3: figuring out what drove a problem like the 901 00:57:07,680 --> 00:57:09,759 Speaker 3: climate crisis in the first place. Like, how do you 902 00:57:09,800 --> 00:57:13,120 Speaker 3: have a society that allows a small group of people 903 00:57:13,520 --> 00:57:16,080 Speaker 3: to make decisions that impact the whole world? 904 00:57:17,840 --> 00:57:18,080 Speaker 3: You know? 905 00:57:19,200 --> 00:57:21,720 Speaker 3: Like, how does that happen? How does it get to 906 00:57:21,760 --> 00:57:26,600 Speaker 3: this point where we're facing this catastrophe and everyone feels 907 00:57:26,640 --> 00:57:30,040 Speaker 3: really powerless to do anything about it, you know? So 908 00:57:30,120 --> 00:57:32,800 Speaker 3: that is interesting to me. I'm like, you know, how 909 00:57:32,800 --> 00:57:35,160 Speaker 3: did this system get built? And who built it? And 910 00:57:35,560 --> 00:57:37,280 Speaker 3: why did they have the power to build it? And 911 00:57:37,320 --> 00:57:39,880 Speaker 3: how do we... Because for me, I don't think you 912 00:57:40,000 --> 00:57:43,440 Speaker 3: get to effective solutions if you don't understand that.
How 913 00:57:43,480 --> 00:57:45,560 Speaker 3: do you solve a problem when you don't even know 914 00:57:45,840 --> 00:57:48,800 Speaker 3: the roots of it or where it came from? We 915 00:57:48,920 --> 00:57:51,160 Speaker 3: have to deal with the power structure, not just the 916 00:57:51,200 --> 00:57:51,880 Speaker 3: power source. 917 00:57:52,440 --> 00:57:55,640 Speaker 1: Oh yeah, right, it is ultimately about power. Yeah. Well, 918 00:57:55,640 --> 00:57:58,080 Speaker 1: Amy Westervelt, thank you so much for coming on Climate 919 00:57:58,120 --> 00:57:59,520 Speaker 1: One. It's been a real pleasure. 920 00:57:59,520 --> 00:58:02,360 Speaker 3: And yeah, thanks for having me. Thank you so much. 921 00:58:03,120 --> 00:58:07,040 Speaker 1: On this Climate One, we've been breaking down climate misinformation tactics 922 00:58:07,080 --> 00:58:10,720 Speaker 1: and ways to respond. Special thanks to Amy Westervelt for 923 00:58:10,760 --> 00:58:16,040 Speaker 1: this collaboration. Check out her excellent podcast Drilled. Climate One's 924 00:58:16,040 --> 00:58:20,320 Speaker 1: empowering conversations connect all aspects of the climate emergency. To 925 00:58:20,400 --> 00:58:23,320 Speaker 1: hear more, subscribe to our podcast on Apple or wherever 926 00:58:23,360 --> 00:58:28,720 Speaker 1: you get your pods. Talking about climate can be hard, difficult, depressing, awkward, 927 00:58:28,920 --> 00:58:32,600 Speaker 1: and it's critical to addressing the climate emergency. Please help 928 00:58:32,680 --> 00:58:35,040 Speaker 1: us get people talking more about climate by giving us 929 00:58:35,040 --> 00:58:38,040 Speaker 1: a rating or review or telling a friend. It really 930 00:58:38,080 --> 00:58:42,120 Speaker 1: does help advance the climate conversation. Brad Marshland is our 931 00:58:42,160 --> 00:58:45,880 Speaker 1: senior producer. Our producers and audio editors are Ariana Brocious 932 00:58:45,920 --> 00:58:50,600 Speaker 1: and Austin Colón. Our team also includes Arnov Gupta, Steve Fox, 933 00:58:50,640 --> 00:58:53,960 Speaker 1: and Tyler Reid. Our theme music was composed by George 934 00:58:54,000 --> 00:58:57,960 Speaker 1: Young and arranged by Matt Wilcox. Gloria Duffy is CEO 935 00:58:58,080 --> 00:59:01,000 Speaker 1: of the Commonwealth Club of California, the nonprofit and 936 00:59:01,080 --> 00:59:05,200 Speaker 1: nonpartisan forum where our program originates. I'm Greg Dalton.