Garrison: Greetings and salutations. This is It Could Happen Here. I'm Garrison Davis. I'm part of the research and writing team, and today we have a special treat for everybody. We are going to be running an interview with Geoff Mann. Geoff Mann is a co-author of the fantastic book Climate Leviathan: A Political Theory of Our Planetary Future. He's a professor; he teaches political economy and economic geography, and he's done lots of writing on capitalism and climate change. He's a fantastic resource, and I highly recommend his book. It can get a little academic, and it has a lot of big fancy words that I would have a hard time saying out loud, but it's a very good read. So I would recommend picking up the book if you want to read about economics and climate change and all that kind of stuff. But thankfully we interviewed him here on the pod, so if you're more so inclined, you can listen to this interview, which is going to play right after I'm done talking. So without further ado, here is our interview with Geoff Mann, talking about politics and climate change. Let's go.

Robert: So, the show: the first season of this, which we dropped years ago, in twenty nineteen... my basic experience as a journalist is in conflict reporting, so like Iraq, Syria, Ukraine. And it was, what would happen if there were to be a civil conflict in the United States? How would that actually look? How do these things look in the modern world? And all that jazz. This season we're doing: what is the world going to be, just based on what we know of how climate change is going to affect things? And that's all too bleak to get into without trying to provide some positive possibilities for how things could be, what adaptations could be made, and whatnot.
And I think what's so interesting about your book is that, with the exception of the best case scenario, all the scenarios you present seem very plausible to me. And I guess I'm wondering, of the ones that you've put forward in your book, is there one that seems more plausible to you right now? Are you at a point where you're expecting there to be kind of a regional breakdown, as in, this chunk of the world goes to this kind of climate Mao thing, climate Leviathan is this other chunk of the world? I'm wondering what you're seeing right now, just as we're watching this shit start to really hit home for people.

Geoff: Yeah, yeah. I'm always reluctant in these instances to say that I know more than anyone else about what's going to happen, so I hope it doesn't come across as in any way like me prognosticating, which, to be honest with you, Joel, the guy I wrote the book with, is much more comfortable doing. You should also chat with him if you ever get the chance; brilliant guy. But I think you're right. I do think there's a sort of fragmentation right now, whether or not, like, geopolitically in the sense you said, regional breakdowns, or in the way that kind of different trajectories could be happening simultaneously in different parts of the planet. How long that can last, or whether or not it just stays that way, I think is a super interesting question. Like, it does seem to me that the Chinese state, for a variety of reasons, some of which I probably can have a handle on and others I just don't know enough to know, the Chinese state approaches these problems in a really different way than we do in North America or Western Europe, for example.
And how they handle what is clearly fucking coming down their pipe, you know, not just with these floods, but the overall soil loss, the sort of massive internment in the West, an urbanization at a scale that is completely unsustainable because the countryside can't support its people anymore; you know, they have these permit systems and everything. How they approach that whole problem from an ecological breakdown perspective could very much, I think, take a kind of Leviathan-like form, but a much more authoritarian version. It will not, I don't think, in the short term look like Mao, in terms of a sort of revolutionary process. But here in North America, I think that this idea that Joel and I tried to float, about capital taking over and trying to basically maintain itself at the top of the hierarchy, and basically allowing the planet to break down but maintaining the social order in its own interests, I actually still think that's unfolding right in front of us. And I think Western Europe is the same, just managing it in a very different, kind of technocratic way. But I think you're right to identify not a global kind of coalescence, but rather a whole variety of conflicting trajectories. That would be my take on it right now.

Robert: When you're trying to have these conversations about what's coming down the pipe with people who are less buried in this than you are, how do you introduce the concept of climate Leviathan to them?

Geoff: Well, so, I mean, I run into this... this might be a terrible, what's the right word, comparison to make, but I run into this a lot, you know, just in the classroom, with students and stuff like that.
And basically the way I usually begin is I ask a question, because people often, I think quite rightly, make certain kinds of prognostications about, you know, climate change running out of control and destroying life as we know it in a rather immediate way. And I usually just say: that runs so counter to the interests of global capital that it's impossible to imagine it not responding, anywhere from a kind of minor tweak to, man, like, full-on emergency panic mode, depending upon the situation and conditions. And so I basically say the idea of climate Leviathan is precisely that: it's capital responding. Now, of course, we all, I think rightly, know that that response will never be adequate to the problem, not even at a purely sort of survivalist level. But I still think that in the medium term, that's what we're going to see. And that's how I usually introduce it. It's like, imagine capital responding to climate change, because it'll have to. It'll look like Leviathan.

Robert: Yeah, no, that makes sense. And one of the reasons I like that, and I like the way you and Joel frame things, is that I've grown very tired of... you know, I have some prepping kind of interests and stuff, I think that stuff is neat. But I've always felt like the obsession with collapse is not just silly, it's counterfactual. Because, barring some sort of, like, the ever-present possibility of a nuclear conflict or something, I don't see collapse as a realistic consequence of climate change. I see places collapsing; I see survivability in chunks of the world collapsing. But I think you're absolutely right: there's no way capital is going to allow everything to fall apart, because then they can't keep the whole thing going, you know.
Geoff: Yeah, yeah, and we can't either. They're desperate for us to keep doing that, right? So yeah, I agree totally.

Robert: Yeah. I'm wondering about this climate Leviathan. When you describe it, it doesn't sound great. It also sounds at least familiar. That's part of why it's very believable: it sounds like the way the system currently deals with every problem, right? These technocratic, half-competent, focus-grouped solutions that are generally too late and only occasionally effective. What scares me, and I forget the exact term, is essentially the more authoritarian version of this, the more authoritarian kind coming from a "we're not going to fix the problem, we're just going to protect whatever identitarian chunk we consider our base from it." Do you see that gaining strength right now? Like, looking at the lay of the land at the moment, where are you seeing that?

Geoff: Yeah, that's a really good comment, from my perspective. I agree with you. I think that stuff is serious, and I think, if I'm honest with you, it's a little bit of a missing bit in the book's argument, in the sense that I don't think we took seriously enough a version of Behemoth that doesn't deny climate change but only gives a shit about its own internal territories, or, you know what I mean, its own interests, so that it becomes a Behemoth-like "fuck you" to the rest of the world but at the same time takes climate very seriously. You know, I think some people call it eco-fascism. I'm not so sure; I don't think that term covers exactly what I'm trying to say, but maybe I just don't understand it well enough. But I do think that in the book, Joel and I don't take that prospect seriously enough.
And I think that is actually that kind of Mike Davis sort of thing, you know, if you guys read that piece, Who Will Build the Ark?

Robert: Mm hmm.

Geoff: Yeah, it's fucking awesome. It's an amazing piece of work. It's from about 2010, I think, in the New Left Review, and he writes basically about an elite kind of attempt to just sort of create islands of survivability, or more than survivability, islands of elite leisure in a sense, in a world that's falling apart around them. And I actually think that's totally believable. Like, I think that's more believable than we thought it was when we wrote the book.

Robert: Yeah. And I guess, on the question of whether or not to call it climate fascism, I think that climate fascism is a separate thing from the possibility of climate authoritarianism. Because I think you have this possibility of, all right, we have a state or a group of states that are going to produce very authoritarian measures in order to protect their people and their so-called way of life. And I think you also have the possibility of a more identitarian sort of thing, whether it's white nationalist or whatever. And I see maybe them feeding into each other. I don't know; it all gets very muddled. And I think one of the problems you have trying to prognosticate about the future is that there's always so many variables, and any kind of permutation of any of these things that you can dream up, you can see the seeds of it if you go out and find them. You can find the Christian dominionist chunk of this, like the eco-fascist thing, and you can find a more socialist version, and you can find a more white nationalist version. And it's just kind of anyone's guess as to what's going to pick up steam.
Geoff: Yeah, yeah. I mean, there's something, I think, important about the kind of thing that Joel and I are trying to do in the book, and there's also something that is perhaps inevitably arrogant about that effort. And that arrogance... well, I think both Joel and I would always say that the book, for us at least, was worth writing no matter what, but it would be wrong not to acknowledge the arrogance of that analysis. And as soon as you acknowledge the arrogance, you acknowledge a lot of the stuff you're talking about, right? Like the fact that, yeah, there's a million ways this could go, and some of them won't look like what we said they'd look like, and we have to think hard.

Robert: Well, that's what I find really intelligent about the way you set it up, because you're not saying, okay, this political party is going to evolve in these ways. You're saying that, from responses to other problems, and from responses even to climate change, these are kind of the patterns things are going to break down in. And I guess, before I hand it over to Garrison in a little bit, the last thing I wanted to really get into was Climate X, the term you all use for kind of the most optimistic scenario, the one that I don't think any of us believes in as much as we'd like to at the moment.

Geoff: Yeah, that's exactly it.

Robert: One of the things that we're trying to do here is envision how that might look.
And the best thing that I can come up with is a mix of really durable, widespread mutual aid networks to support some sort of mass general strike, in order to institute sweeping changes both to the nature of capitalism and to the social system we have, in order to reduce environmental harm, that sort of thing. That's the only thing I can imagine. You're talking about an effort more ambitious than landing a man on the moon, but at least it's a set of things that could achieve a goal. I don't think it's totally pie in the sky; it's a possibility. I'm wondering, when you think about the best case scenario, how, if everything breaks right, things could be resolved positively. What are you envisioning? What does your optimistic side say when you let it peek through?

Geoff: Yeah, I mean, I think it looks a lot like what you're describing. I think that there'll have to be, and I do think there will be, a kind of mass base to it, for sure. The question is, of course, whether it's too late, and whether it's effective, and all that other stuff. But what I don't believe, and I think you're hinting at this too, tell me if I'm misunderstanding, what I don't believe is that it will be a mass-based thing from what we might think of as a single or monolithic movement. It's because the ways that people manage what's coming our way are going to have to be, for one thing, very locally specific. As we know, like you said, there's collapses; people are dealing with different challenges in their own places, and not just environmentally, but of course with their own political histories and all that stuff.
I think it has to be an articulation, in these mass moments like you're describing, like a general strike or whatever, of a whole variety of movements that are actually organized primarily around meeting the needs of the people where they live: mutual aid societies, other kinds of distributional fixes, and, in this kind of chaotic breakdown like you describe, when things much more important than coffee are widely unavailable, those kinds of things.

Robert: Let's not downplay the importance of coffee.

Geoff: No, no, no, I would never, but for the moment... yeah, I get it, you know, like water, you know what I mean, or something. So I think you're right. It has to be, Climate X has to be, multiple, in the sense that it has to take many, many forms specific to the needs of the folks that are there. But what I am convinced of, which I wasn't always, actually, is that the effectiveness of movements like that, in terms of actual long-term effectiveness, will depend upon the extent to which they're democratic. They cannot be local authoritarianisms. We can't imagine this as sort of a series of climate change warlords. I don't think that is a realistic solution, even from a purely kind of managing-the-climate-change perspective.

Robert: Yeah. No, I've known a couple of warlords, and none of them are good at long-term planning.

Geoff: Yes, I've never known a warlord, but I can imagine you're right.

Robert: Garrison, you want to take it over for a bit?

Garrison: Yeah. So, I mean, I started doing just general kind of climate research about half a year ago, getting relatively deep into it, and one of your books was one of the things that kept coming up as recommended reading on the topic. And yeah, I found it super, super interesting.
A lot of stuff on the topic focuses on the potential physical effects happening to geography and to environments, but there's not as much on the political side of things, and how that's going to break down societally, in terms of, you know, freedom and liberty and sovereignty over specific states. So that's what really drew me into your book specifically, the special focus on that side of things. And the other part that got me pretty early on is the mitigation versus adaptation side of things, and how, to my understanding, we're crossing into the place where mitigation is becoming more and more difficult and adaptation is becoming, unfortunately, necessary, because there's a lot of ways that adaptation can be used by authoritarian states to make it harder to have change happen in the future. Would you speak on what types of mitigation efforts we might still have, and how adaptation is both going to be necessary and how there's going to be a dark side to some of those adaptations?

Geoff: Mm hmm. I would agree with your read there. I think we're at the point now where it's fair to say, if you ask the climate scientists, and I could be wrong, but I know a few climate scientists and I do quiz them on stuff like this sometimes, that we're at the point where mitigation efforts right now are actually purely adaptive. We are past many thresholds where somehow we could imagine escaping this problem, you know what I mean, like evading it, getting out the side door or something like that. So even our mitigation efforts right now are actually adaptive in that sense. And adaptation, I think, has become in many ways the holy grail of modern political economy.
Like, you know what I mean? How do we have our luxurious Western lifestyles and consumption patterns and all this in the middle of our collapsing ecosystems? How do we manage that? It's almost like, I think at some point in the book we say, and Joel and I have certainly said it a lot since: adaptation has become the progress of our time. If in the twentieth century we talked about progress, progress, that's what capitalism and liberalism deliver, now the best they can deliver is adaptation to a fucking crazy set of conditions. And so I guess I would say that, from a mitigation perspective, for sure, we still have the capacity to considerably cut emissions. I mean, we have the sort of pie-in-the-sky but hopeful things, like the elimination of the fossil fuel industry; that would do a lot. But we would still be, in the short term, I mean in the medium term, sort of fucked, and that's a big deal. So I do think mitigation efforts matter. I would never want to say, oh, don't bother, like some sort of accelerationist horseshit, because of course it will matter. But I do think that adaptation has become, in some sense, the mode through which we evaluate anything from political proposals to, you know, technical fixes. At least amongst people who are willing to admit there's a problem. I guess there's a whole world of people who somehow still don't.

Garrison: Yeah, that group of people is always larger than what I expect. After spending months reading so many climate books, I'm still struck at how basically the majority of people in America don't think it's a big problem. And that's, like... yeah.
Robert: I mean, you know, I think about fascism the last time it came around, and what a common attitude that was towards fascism sweeping Europe. And we eventually got on the same page about that, and only, like, a hundred million people died.

Geoff: Yeah, I know. But I guess I'm gonna say there are, I think, really strong ideological, if that's the right word, reasons for the persistence, not of denial in this kind of stupid, stereotypical way, you know, it's a Chinese hoax and all that stuff, I don't mean that. I mean more this kind of, sometimes people call it the new denialism, where you acknowledge it as a problem, but then you don't do anything about it. This is what we have in Canada. We have a national government that talks about it all the time and then subsidizes every oil project it can get its hands on. A sort of, "Yeah, yeah, it's a problem, and we're doing everything we can. Here's our new LNG pipeline," that kind of thing. I think that the dominant way of talking about the problem still contains a lot of weird uncertainty. People say things like, you know, there's a chance that we're heading towards water scarcity. It's like, no. No scientist thinks there's a chance. Everybody knows it's the case. But we frame it as if it's still this big uncertainty in the future, and I think that allows people to feel, like what Garrison's saying, a kind of... it gives room for, not doubt, but, like, distance or something. I don't know what to describe it as.

Robert: Yeah, I think distance. Because that's so deep, particularly in the American, and I guess the Canadian, psyche, right? Like, even just going back to the wars of the last century, this idea that, well, we're separated from it, we're far enough away from it.
And I think there was even an idea, among people who accepted the reality of climate change, in Canada specifically, that, well, it's just gonna make this place a better growing climate or whatever. Like, it's not going to lead to tornadoes. It's not going to lead to massive storm fronts of lightning, built by giant fire waves, destroying entire cities. That's not gonna happen.

Geoff: Yeah, yeah. One interesting thing, and this is just something I've been writing about lately, just for myself, I haven't published it or anything, but one of the things I've been trying to study a lot is the climate economic modeling, you know, the stuff that governments supposedly lean on to make their plans or determine their tax rates for carbon and this kind of stuff. And one of the crazy things about those models, as I've dug into them, both at the technical level, right down to the mathematical choices made, but also in how they're conceived, is that all those models are built, and therefore most of the policy expertise that's based on them is also built, on this idea that everything political will stay the same, everything political-economic will stay the same, things will just get hotter. So they are a model of stable capitalism in a warmer world. Does that make any sense?

Robert: No, yeah, it's this acceptance that it's going to get hotter, but it ignores the fact that that's going to increase the number of refugees, and that's going to provide fuel for the radical right, and that's going to lead to more exterminationist talk in mainstream politics. Like, yeah.

Geoff: Yes, I get what you're saying.
And what worries me is that that is absolutely... like, you can walk really far out on that limb, and then it just cracks, and then, you know, the whole fucking show is no longer tenable. And those moments, like you're describing, they'll be place-specific, I suppose, yes. But that's the shit that scares me.

Robert: Yeah, yeah. What didn't make sense to the normal person in the street a couple of weeks ago starts to make sense.

Garrison: Yeah. And I think there is definitely, like, an "everything will stay mostly the same" syndrome among a lot of the population, whether they fall on the conservative or liberal side; I think that's a whole lot easier. And even for people who are more radical, who are more on the left, there is this perception that capital is going to stay very similar to how we see it now. And I think a lot of people underestimate its adaptive capabilities. I think one of the useful things about the plague last year is that we've seen capital transform itself in very large ways, in terms of retail, industry, supply lines, very quickly. So we've seen that kind of large-scale transformation on a global scale. And we've also seen the other thing you talked about, creating islands of luxury, right, of people who are higher class, but also some people who are middle class, being able to basically create isolated pockets where they can live a life that's pretty similar to what they already had.
Where everyone else has to live in shit, like, everyone else has it so much worse, whereas the middle class gets to stay in this small bubble. And I think it's really useful to look at how that happened in the plague, you know, early on, with people flying to New Zealand to live in their cabins, and being like, yeah, that's gonna happen. And now, with Amazon Man going to space, it's the same thing, right? We're gonna see more and more extreme versions of this. I don't know, any thoughts on that type of thing, in terms of how we can look at past, smaller collapses or crumblings of societal norms that really show how capital is going to adapt, and how quickly it can adapt in some cases?

Geoff: I don't know if I have any specific thoughts, but, you know, I am with you on all of that analysis. For one thing, I think you're very, very right to emphasize the kind of robustness that capital keeps demonstrating. Like, we can knock it all we want...

Garrison: Yeah, exactly.

Geoff: That's exactly what I was just about to say, exactly. And it's tougher than virtually every other political-economic arrangement that came before it, at least in recent centuries. It does adapt in this remarkable way, or, I don't know, shapeshift. But I also think, just in terms of the kinds of dynamics you're describing...
Yeah, the inequality that persists today, not only in its purely economic form, in the sense that there's a few very rich, very powerful people and then the vast majority of the planet is quite far behind, to put it lightly. It's almost like a total disaster that, ideologically, this problem emerges precisely at the moment, it seems to me, when inequality is so widely understood to be just normal or natural. So that the reaction to almost any crisis is that, you know, the rich will be fine, and while people like me might say that sucks and that's shitty, for the most part it's widely accepted as just the way the world works right now. You know what I mean? There's never been a better argument for a wealth tax than there is right now. I can't think of one, maybe the robber barons in the States, in the late nineteenth and early twentieth century. But at least here in Canada, it's a total joke. We talk about it, but it's nowhere near happening. There's a sort of strange... I think Naomi Klein has written about this, the kind of poor timing of the fact that the climate crisis happened precisely when democracy and social democratic forces are at their weakest, or, at least not weakest, but, you know, not in a good spot.

Robert: Yeah. I mean, I don't know. It was always tempting earlier to talk about Syria, and how climate change contributed to that, and how that contributed to rising authoritarianism. And there's actually been some new analysis on that that has made me less confident in climate change as a driver of that conflict. But what I do, and why I do still talk about that when I talk about how all this is going to work, is that one
Is kind of one 537 00:31:04,920 --> 00:31:07,560 Speaker 1: of the things that's most important to understand is that, 538 00:31:07,640 --> 00:31:10,520 Speaker 1: like the problem is not just climate change, right, like 539 00:31:10,560 --> 00:31:12,160 Speaker 1: you said, it's not just that it's getting warmer, and 540 00:31:12,200 --> 00:31:14,479 Speaker 1: it's not even just that climate change is causing these problems. 541 00:31:14,520 --> 00:31:16,760 Speaker 1: Is that we have these pre existing problems. We have 542 00:31:16,840 --> 00:31:19,760 Speaker 1: all these these issues we didn't deal with for years 543 00:31:19,760 --> 00:31:22,200 Speaker 1: and years. Um. It's like an old house and you 544 00:31:22,200 --> 00:31:25,040 Speaker 1: didn't do the repair work necessary, and then you know 545 00:31:25,240 --> 00:31:29,479 Speaker 1: there's there's extreme weather and the weather does It's not 546 00:31:29,600 --> 00:31:32,040 Speaker 1: just the weather that causes the problems, it's you've got 547 00:31:32,040 --> 00:31:36,680 Speaker 1: all these all these issues that cascade. UM. One of 548 00:31:36,720 --> 00:31:38,560 Speaker 1: the terms I think we're using a lot because we're 549 00:31:38,560 --> 00:31:42,520 Speaker 1: trying to get people away from the from the discussion 550 00:31:42,520 --> 00:31:45,800 Speaker 1: of collapse, which I don't think is productive. UM. A 551 00:31:45,800 --> 00:31:49,120 Speaker 1: friend of both Garrisonizes, is an e R nurse who 552 00:31:49,200 --> 00:31:51,400 Speaker 1: has been kind of working through COVID and was talking 553 00:31:51,440 --> 00:31:55,800 Speaker 1: about the fact that like, um, you know, prior to COVID, 554 00:31:55,840 --> 00:31:58,440 Speaker 1: we had a shortage of healthcare workers. It was exacerbated 555 00:31:58,440 --> 00:32:02,040 Speaker 1: by COVID, more people quit, it was exacerbated by all 556 00:32:02,040 --> 00:32:05,680 Speaker 1: of these different sort of like issues structurally within Portland itself, 557 00:32:05,680 --> 00:32:07,760 Speaker 1: in the way the city is set up, and now 558 00:32:07,840 --> 00:32:11,880 Speaker 1: you've got um, all these different medical systems kind of 559 00:32:11,920 --> 00:32:14,320 Speaker 1: like falling apart at the point when they're most necessary. 560 00:32:14,640 --> 00:32:17,320 Speaker 1: And the term that he uses is the crumbles um, 561 00:32:17,360 --> 00:32:20,360 Speaker 1: which I like a lot. Just this this thing, it's 562 00:32:20,400 --> 00:32:23,360 Speaker 1: not ever going to just fall apart, but pieces of 563 00:32:23,400 --> 00:32:26,800 Speaker 1: it are breaking off at all times, and it's it's 564 00:32:26,880 --> 00:32:32,680 Speaker 1: this um. I think that I I think it's an 565 00:32:32,680 --> 00:32:35,760 Speaker 1: easier way to get people because like there's you know, 566 00:32:35,920 --> 00:32:38,000 Speaker 1: we're having an in Portland right now. We had this 567 00:32:38,080 --> 00:32:40,440 Speaker 1: fucking heatwave hit. At the same time, we got a 568 00:32:40,560 --> 00:32:44,040 Speaker 1: notice that because of supply line issues, um Oregon was 569 00:32:44,120 --> 00:32:47,400 Speaker 1: out of chlor I think it was chlorine in order 570 00:32:47,480 --> 00:32:51,440 Speaker 1: to like uh pure for the for the water filtration systems. 
And they were like, it'll be fine this time, we have enough stored up to deal with the shortage. But it's like, okay, but what about next time? And the same thing with jet fuel: because of COVID, to save money, companies fired all of their drivers, so there was not enough jet fuel, which was fine during COVID. But then these fires hit, and they're having to ground firefighting planes, because there's just no jet fuel, and what there is is being requisitioned to keep passenger planes going to and fro, and so you can't adequately fight the fire. It's all these seemingly little things that become big things. It's like climate change is steroids for all these little problems.

Geoff: Yeah, yeah, no, it makes a lot of sense. And they sort of tend to cascade in ways that we didn't predict.

Robert: Yeah. It's chaos theory stuff, right? We're tipping over the edge of chaos right now, past the point where it makes things more adaptable, and towards the point where it all just kind of spirals out of control.

Geoff: Yeah, it's hard to imagine talking about something like equilibrium right now. But since virtually all of our science is built on the model of equilibrium, that causes trouble.

Robert: Could you speak to that a little bit more? Because I am not a scientist, and I'm fairly certain Garrison isn't either, so I hadn't really thought of things in those terms.

Geoff: Yeah, I mean, I only think of it in those terms because my training is in economics, and that's the framing.
But in general, most of our complex models — that's not so true of the climate science models, necessarily, by any means — most of the ways we model the behavior of an ecosystem or an economy assume a sort of tendency toward normalizing, you know what I mean? That over time, a series of processes builds up a tendency toward a particular direction. In economics, the economy is understood to be self-correcting — not even just "kind of" self-correcting. If something unexpected happens in the economic system — they call it a shock — then of course the whole system shakes a bit, but the assumption is that the overall momentum and dynamics of the system will bring things back to normal. Does that make sense?

Robert: Yeah.

Geoff: And that's exactly how we model ecosystem behavior. Cut a hole in the middle of a forest, and the assumption is that for a second — a second on forest time — the forest is like, holy fuck, there's all this sunlight and there are weird animals in here that weren't here before. But over time, the forest's pattern of operation will bring it back to normal. This is why clear-cutting is supposed to be okay: eventually the ecosystem will recover. Most of our sciences are built on this kind of equilibrium-oriented model, this normalizing by the larger processes of a system that has its own momentum. Only a few ecological sciences break that pattern — people who study deserts, for example, who point out that deserts don't really have a middle. It doesn't help to talk about a desert's average temperature, because the desert never is that temperature. It just swings between extremes.

And I think that, at least from a scientific, technological-fix perspective on climate change, one of the biggest problems is that most of our science and knowledge can't deal with disequilibrium systems — ones where we actually don't know where they're going, we don't know where they'll end, and we don't have the tools, literally the mathematical tools, to manage them. So we can't model them, and then we don't know what to do.
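[Editor's note: here is a minimal toy sketch, added in editing, of the two mental models Geoff contrasts above. It is not from the interview or the book; the numbers — a pull of 0.5 back toward a "normal" of zero and a self-reinforcing feedback of 0.4 — are arbitrary illustrations, written in Python.]

    def step(x, pull=0.5, feedback=0.4):
        """One time step: relax toward a 'normal' of 0, plus a compounding feedback."""
        return x - pull * x + feedback * x ** 2

    def run(shock, steps=30):
        x = shock
        for t in range(steps):
            x = step(x)
            if abs(x) > 1e6:  # runaway: no equilibrium left to return to
                return f"shock {shock}: spirals out of control after {t + 1} steps"
        return f"shock {shock}: self-corrects back to ~{x:.4f}"

    # The unstable threshold sits at pull / feedback = 1.25. Below it, the
    # "shock, then back to normal" story holds; above it, the very same
    # rules describe a system with no normal to regain.
    print(run(1.0))   # -> shock 1.0: self-corrects back to ~0.0000
    print(run(1.5))   # -> shock 1.5: spirals out of control after 9 steps

[The equilibrium toolkit only sees the first case; the second is the kind of disequilibrium system Geoff is describing.]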
Robert: One of the things I've been reading in preparation for this that's been useful as an intellectual framework was written in 2012. It's called the Gonzo Futurist Manifesto, and it's a guy laying out, as someone who looked into all this, that everything is in a process of either free fall or massive change that a lot of people haven't caught up to. The framework he uses is "post-normal" — the acceptance that you're in a post-normal era — which I think is what you're getting at. The equilibrium isn't going to come back, and you're starting from a flawed position when you even consider that as a possibility; the shift has been too fundamental. It's the same thing politically with assuming anything could work the same way after Trump. No — we're in post-normal times, and it's never going back.

Geoff: Yeah. And that's why I think it can sometimes be quite problematic to talk to progressives of a different generation — say, my parents, who weren't by any means lefties, but who were old-school Canadian social democrats, big welfare state, that kind of stuff. The assumption is that that's what we need right now: we just need to get that back and everything will be cool. That's my dad's analysis of the problem.
Robert: Yeah, and I can see the temptation, sure.

Geoff: But it's totally not true.

Robert: Yeah. I mean, the temptation is profound, because if you have this feeling that things were good — whether that's right or wrong — you're naturally going to want to return to that. Which is what's going to be the fuel for the authoritarian version of this, but it's also going to be the fuel for Climate Leviathan. In any case, it's easier. The scary thing about trying to bring Climate X into being is that it's by far the best possible kind of solution you have, but it requires saying: fundamentally, the way we all live is going to have to change. Our attitudes toward democracy are going to have to change; our attitudes toward what a society is are going to have to shift on a fundamental level. Meanwhile, everyone else is saying, here's how we bring back what you used to have, so the coffee comes back to the stores.

Garrison: Yeah. That's kind of been the problem with a lot of left-leaning projects: it's a newer thing, and that's why none of them have lasted very long, or they've gone horribly wrong very quickly. Especially since what we're trying to do with Climate X is more like a stateless world, or at least a stateless area. That introduces a whole new problem that we haven't really seen on a mass scale outside of somewhere like Rojava. It's going to be a whole new problem to deal with, and a whole lot of people are going to be scared of that.

Geoff: Agreed. And I know I'm stating the obvious here, so I apologize, but part of me thinks I need to say it to remind myself: for good or ill — I guess it's for ill —
a lot of people, even though they know that how they live now is untenable in the larger frame — it's also how they live now. A lot of us, myself included, are invested in the way things work now, you know what I mean? The prospect of that radical change is a hard sell when this is what puts food on the table, this is how my kids go to school. That big leap we will have to demand of ourselves and others at some point in the near future, probably, is also justifiably terrifying to lots of people.

Robert: Yeah. And it's always easier — it's a lot less frightening — to tell people, "I can make it like it was."

Geoff: Yeah. And maybe it was awesome for some of them. For my dad, it was.

Garrison: I find it interesting that you bring up deserts in that framework of equilibrium — there being a "normal," and a desert not having one, always fluctuating between extremes. There was a popular anarchist book, I think also written in 2012, just called Desert, about climate change. It's not about how the whole world will literally turn into a desert; it's about how the desert model — there never being a normal again, fluctuating between extremes — is going to apply in a lot of places. Everything is going to get turned into its own version of a desert.

Robert: Yeah, the title is based on that old quote that empires make a desert and call it peace, and it's kind of seeing global capital as the empire.

Garrison: Yeah. It's online; it's a little bit of a manifesto.
But yeah — even as the crumbles start happening, we're never going to re-reach a place of stability. It's always going to be in flux; we're never going to get to that normal again. We may have coffee for a year, we may have insulin being produced locally, but we're not going to have the same false stability that we have now. We have an idea of stability now, and it's not true — you know, because Taco Bell doesn't have ground beef anymore.

Geoff: Wait, really? Are you guys serious?

Robert: Yeah, there are shortages. Taco Bell's not able to serve a lot of stuff.

Garrison: You don't really have Taco Bell there in Canada. You're not missing much.

Robert: Oh, but you guys have Tim Hortons, so no one's hands are clean.

Garrison: No, no, it's true. I didn't mean to absolve myself of responsibility. I just ate some Tim Hortons cereal this morning — my mother, who lives in Canada, sent me some. So that's what I ate for breakfast.

Geoff: Did you used to live in Canada?

Garrison: I did, yeah. I'm Canadian.

Robert: Oh yeah, he's Canadian as hell.

Garrison: I moved to Portland with my family; then most of my family moved back to Canada, which was probably the smart move. But as we see, both of these countries right now have "liberals" in charge, quote unquote, and both Trudeau and Biden have a lot of the same problems despite their generational gap. They have kind of the same effect in terms of what they say versus what they do. Biden talked about banning fracking in his campaign, and everyone further to the left of Biden knew: no, you're just lying. Come on.
And Trudeau made a lot of promises about pipelines, and that's not working out. I think one of the things we haven't talked about yet is the symbiotic relationship between the state and tech companies and oil companies, and how I see that being a large part of Leviathan: the government subsidizing, or just letting, tech companies try to fix the problem, and thereby increasing our reliance on capital and on those companies. In terms of geoengineering or carbon capture — if the government is going to help those companies do those things, then as soon as those companies go away, we get that much more carbon released immediately. It's a kind of self-preservation that I think capital is going to try for. I know you brought up stuff similar to this in Climate Leviathan. What do you see right now that's frightening in terms of tech getting its hands on not just government influence, but trying to make itself a necessary part of our world in terms of climate? Tech is necessary in a lot of ways right now, but specifically in terms of climate, how do you see that happening?

Geoff: Yeah, I think that's a really good point, and I'm not sure I've put enough thought into it, to be honest with you. But I do think you're right. It seems obvious to me right now that for tech — or "green" whatever they're calling themselves, this kind of stuff —
the goal — and I think it's actually quite explicit — is to make itself essential to how we deal with the problem. Which means, as I think you just said, that the first thing it requires is that we write off a whole bunch of ways of dealing with the problem so that we can prioritize this way of doing things. And once that becomes the way of managing things — carbon capture or something like that — then, as you say, once you start, it's like an addiction: you can't stop, or you're fucked. If we've written the other options off the table, or they just become untenable at some point, then yeah, the way that tech will be crucial to this is absolutely their plan. I would say it's probably already under discussion in big, serious ways at companies like Google and all the rest of them.

But the other thing I would say on this same front — and I think you just mentioned it too — is the question of what will happen with geoengineering. What I find scary there is not just the experimenting with the planetary system, which is pretty terrifying, but the political implications of the power associated with being able to manipulate the planet purely in the interest of maintaining capital's power. We're really talking about being willing to dicker with the entire planet rather than change the way that we live. That's astounding.

Robert: The scariest part of your book for me was when you started talking about that and space weaponry. I was thinking about this a lot yesterday, with Bezos going up in his penis rocket — I think I even talked about it on another show.
It's the intersection between the militarization of the atmosphere and the control of the atmosphere — basically making the atmosphere a thing that we... I think "colonize" is the wrong word, and it's kind of inappropriate next to actual colonization, but it's similar: it's another frontier to conquer. The next one is going to be the atmosphere itself — in terms of weapons, as you talked about in the book, and then geoengineering, and then, with Bezos talking about moving all the polluting stuff into space, it's the same thing. I was watching that happen yesterday, and your book was written a few years ago. Before I read it, I'd never thought about the space angle specifically, and now, with Bezos talking about it, it's like: wow, they're just going all in. What made you two think of that possibility? What was the thing you saw where you said, here's how we see this trend going, that it would result in this kind of colonization of the atmosphere and space?

Geoff: Well, if I'm honest with you, that was really Joel's brainchild — that part of the book about SRM, solar radiation management, and the space weaponry and that kind of stuff. That was a connection he made and pursued most rigorously, and he actually wrote that part of the book, because, as you can imagine, we had to split it up. So for me, the one who made that connection was Joel, and I wouldn't want to speak for him.
I think he's in conversation, both professionally and just interest-wise, with a whole range of international relations scholars who, inside the university, are considered kind of wacky — people who take space weapons seriously, sort of peripheral figures, you know what I mean. Joel has been in conversation with them for a very long time, so he knew all that literature before we even started; I'd never even heard of it. But I do think the connection is really compelling, and I know he's pursued it since. I hope you guys get a chance to chat with him. He's fucking brilliant.

Robert: We'd love to. You were just the easier person to contact.

Geoff: Yeah, and he loves to chat, and he's infinitely more articulate than me — you'll get way more out of him than you ever will from me. He's a brilliant guy. But that's a connection he made, and I wouldn't want to speak for how he got there.

Garrison: Sure. You're more interested — correct me if I'm wrong — you've done a lot more studying on the economics side of things. The other kind of tie-in to the space weaponry is the U.S. military itself. Do you see the military interacting with the economy in collapse — in crumble — scenarios? How do you see the military being used by the state not to solve these issues, but to mitigate some of them, or adapt to them?

Geoff: Yeah. Again, I don't want to claim I know any better than anyone else, but I do think the U.S. military in particular has a couple of advantages, if you want to call them that, in terms of the role it might play as dealing with certain kinds of crumbling becomes essential for the state.
The first is that it's been taking climate change seriously for decades.

Robert: Yeah — no mincing words in any of those reports.

Geoff: Exactly — very blunt and very accurate. They've known for a long time. They know the international security risks, they have plans, they're trying to design weapons that don't require fossil fuels. They take the problem seriously, so they're ahead in that sense. And also, I do think that the kind of localized — or, for lack of a better term, regionalized — crumbling that you were just discussing will make the militarization and securitization of certain parts of the economy — probably especially supply lines and certain production processes, maybe even something as central as agriculture — more and more essential. And as far as I'm concerned, at least in North America, the principal instrument of that securitization will be the military.

Garrison: I think the other interesting thing, in terms of the plague, is that that's the one thing we all saw the military do: be crucial in the vaccine distribution effort. So really, the past year has been an interesting glimpse into how we're going to use our capital and military power when stuff gets more and more unstable.

Geoff: It seems like in the States, that's how the state knows how to step in — via the military or via the police, you know what I mean? Because there isn't... as you know, here in Canada —
I mean, we've got lots of things that are problematic, but one of the things we didn't need for the distribution of the vaccine was anything more than our public health care system, which was extant and worked perfectly fine, you know what I mean? We didn't have to build up any new infrastructure; we just had to say: you people who are already doing this, do this too.

Robert: Yeah. And the lack of civil infrastructure in the States both makes us need to rely on the military and makes Americans' imaginations so small that the only way they can envision that kind of capacity is through military force or through policing — because the only civil infrastructure we fund is policing and the military.

Garrison: Well, the military is also the only thing Americans overwhelmingly trust. If you look at polling, there's no other branch of the government that is widely trusted by U.S. citizens. And that's because of the most successful propaganda campaign of all time — the military's partnership with Hollywood — but it is a reality.

Geoff: Huh. Yeah, that's a really interesting point.

Garrison: We'll probably want to wrap up soon, but the one other thing I want to bring up: the hardest part of looking at Leviathan, for me, is how incapable the UN is — how bad they are at doing their job. What do you think would need to change for something like the UN — or maybe not the UN specifically, but if we're going to have transnational cooperation of the state and capital to try to alleviate some of the worst aspects of climate change, what would need to happen to make that more realistic? Because the UN is not it, at least not right now.
Geoff: Yeah. I don't know if I have a good answer to that question, to be honest with you — partly because it's such a good question. I think the UN and the UNFCCC have proven — and I don't think it's too much to say this — that the international negotiations that have gone on around this, Copenhagen, Paris, Cancún, and so on, have literally gotten us nowhere. Seriously, nowhere. It's been more of a sort of long-term dithering. And it's hard for me, at least, to imagine that that framework, that approach to global problem-solving, is somehow going to be redeemed at the next meeting in Glasgow and everything will be fine. I think, from a purely realpolitik perspective, it's going to take something like the US and China creating a G2 and just making rules for the world. Those rules will be terrible, and it would be a kind of authoritarian Leviathan if that happens. But in terms of what we might actually anticipate happening in the medium term, that's much closer than any sort of global hug that's going to get us through this.

Garrison: I'm pretty young — I'm eighteen, part of the zoomers — and my friends who are my age don't have much hope for the future. The term we use is "doomer": we can't see anything besides doom and despair. For some people that drives them to nihilism; for some people it drives them to apathy; sometimes it drives them to anger and resentment and attack. Do you have any hope for what's going to happen in the next few years? I'm not sure if you have kids, but —
you're at least a teacher. What do you say to younger generations? How can we look at these very depressing problems and get a more useful outlook than just being doomers? Because the doomer reaction is natural — it's easy. I default to it every day. It takes active fighting to not just want to lie in my bed and cry. So: do you have hope? If so, where does it come from? Where can you see non-doomer outlooks being useful?

Geoff: Yeah, I think about this all the time too. I do have kids — they're almost exactly your age, seventeen and twenty. One just graduated high school; the other is partway through university. And interestingly enough — and I don't mean this as any kind of value judgment — neither of them is a doomer. I wouldn't call them hopeful, but they are not — and it would be totally reasonable to be — they are not obsessed with what the future seems not to hold. And I say that because I guess I have two sorts of responses. The first, and I apologize, is quite cliché, but I actually think it still matters: your generation will soon be in charge, and that's a very good thing. But that's the cliché.
The second part is that — like Robert was saying earlier, and I'm assuming you have a bit of a similar take, Garrison — I don't see this as a collapse process, but rather as us managing a series of radical changes in the way that systems work: crumblings, breakdowns, that kind of thing, but also changes. My hope lies in our capacity to use those moments not to fix things or make everything all better, but to work together generously with the folks with whom we're alive — there will be dickheads, for sure — to make the most of what we have in that moment and to work toward the future. And I don't see those moments running out. They will be there, and insofar as we leap into them generously — because we don't necessarily know any better than anyone else — we will always have the capacity for hopeful and actually joyous solidarity in confronting those problems. It won't always be fun, and I don't want to romanticize it. All I'm trying to say is that the world will not be without joy unless we choose to let it go there. And at least for now, I still feel like we can tell ourselves: we will confront what's coming our way. I don't know if it's going to be bad or good, but we will do so, and there will be a group of us who does it with good in our hearts. I would take that for what it's worth.

Garrison: Yeah. In a weird way, a lot of these crumbles will almost give an opportunity for radical freedom, because we think of ourselves as living in a free society, but we rely on so many things that are out of our control, and that makes us unhappy, consciously and subconsciously.
And when we're forced to be so active in our lives and in our communities, with people we love and care about — the one thing I do hope for is that it will give us more opportunities for some radical freedom: to be able to live in small communities that can be much freer than we are now, in terms of authoritarianism from the state and from companies and capital. As long as we don't fall into a full military dictatorship of capital and tech — that's the thing I would like. I don't know; it's really hard to talk about this stuff without sounding cheesy.

Robert: Yeah, especially when you try to sell people on the only possible optimistic outcomes here. This is another thing capitalism does well — presenting itself as a resilient system. When you talk about alternatives to capitalism, it's hard not to sound silly to people, because anything that isn't this specific system sounds like you're either going back and saying, "I want to do the Soviet Union again, but we'll get it right this time," or... I don't know. It's tough, because people have very much bought into the idea that anything that isn't a slight modification of what we're doing right now is silly Star Trek bullshit.

Geoff: I agree. You guys have probably both heard it — I come back to it all the time —
that famous quote from Fredric Jameson, where he says it's easier to imagine the end of the world than the end of capitalism. It gets attributed to tons of different people, but it was coined by Jameson — weirdly enough, he's an English professor, at Duke.

Robert: There's a good Ursula K. Le Guin comment along the same lines.

Geoff: Oh yeah, I'm sure. But I think you're right: it's a quip, and it's sort of superficial, but it's actually also true. A lot of the time these days, it does feel like it's literally easier to imagine us driving the planet to the end of its functioning than it is for many people to imagine otherwise. And I think you're right, Robert: when you tell people, "no, in fact, a lot of things could be otherwise, and we could have it quite quickly if we chose," people look at you like you have two heads.

Robert: Yeah. One of the most optimistic things I've experienced in the last several years was going to northeast Syria — which is a mess, and a very complicated situation — but sitting down with a militia in the desert and having these conversations about the future they were trying to build: here's our soil reclamation project, here's the way we're trying to alter things. The fact that they're able to even try is remarkable, and they've got some shit to deal with, you know.

Geoff: Amazing.

Robert: Yeah, thank you so much.

Geoff: Super nice to meet you both. I really appreciate it.

Garrison: And that is our interview with Geoff Mann, co-author of Climate Leviathan. You can find him on Twitter at @GeoffPMann — Geoff spelled with a G, not a J. Follow us on Twitter at @CoolZoneMedia and @HappenHerePod, and you can find me at @HungryBowTie.
Thank you for listening, and see you tomorrow.