1 00:00:01,440 --> 00:00:07,680 Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. 2 00:00:11,119 --> 00:00:13,680 Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's 3 00:00:13,840 --> 00:00:18,520 Speaker 2: Chuck, and it's just us again, Jerry who? And this 4 00:00:18,560 --> 00:00:20,760 Speaker 2: is Stuff You Should Know. That's right. 5 00:00:21,640 --> 00:00:26,560 Speaker 3: Another Econ edition. Typically not my favorite, but you know, 6 00:00:26,760 --> 00:00:28,639 Speaker 3: this one I could wrap my head around for the 7 00:00:28,640 --> 00:00:29,120 Speaker 3: most part. 8 00:00:30,200 --> 00:00:32,120 Speaker 2: I think that's one of the reasons why it's had 9 00:00:32,159 --> 00:00:35,800 Speaker 2: such an enormous impact on the world. Yes, because it 10 00:00:35,880 --> 00:00:39,160 Speaker 2: is so easy to wrap your head around. We should 11 00:00:39,159 --> 00:00:41,360 Speaker 2: probably say, we're talking about the tragedy of the Commons. 12 00:00:42,200 --> 00:00:45,720 Speaker 2: And for those of you who aren't familiar, it's this idea, 13 00:00:45,800 --> 00:00:51,600 Speaker 2: this concept that if you have a shared resource, a commons, say, 14 00:00:52,760 --> 00:00:55,360 Speaker 2: and people are able to use it at their own 15 00:00:57,000 --> 00:01:02,000 Speaker 2: leisure for their own purposes, then as they seek to 16 00:01:02,040 --> 00:01:07,320 Speaker 2: maximize their profits, they're going to overuse this commons, and 17 00:01:07,640 --> 00:01:12,080 Speaker 2: inevitably it'll be ruined because people can't have anything nice, essentially, 18 00:01:13,080 --> 00:01:13,360 Speaker 2: you know. 19 00:01:13,319 --> 00:01:16,679 Speaker 3: It's funny, you're gonna laugh at me here. They're all 20 00:01:16,680 --> 00:01:20,319 Speaker 3: gonna laugh at you. Central, the central area of my 21 00:01:20,440 --> 00:01:24,080 Speaker 3: high school was called the Commons.
Yeah, and it just 22 00:01:24,240 --> 00:01:26,800 Speaker 3: occurred to me thirty five years or so after I 23 00:01:26,840 --> 00:01:28,960 Speaker 3: graduated that that's what that meant. 24 00:01:29,840 --> 00:01:31,480 Speaker 2: Oh, I see, I thought you were going to say, 25 00:01:31,520 --> 00:01:34,360 Speaker 2: like my teenage years were the tragedy of the commons. 26 00:01:34,959 --> 00:01:37,320 Speaker 3: No, I just, I don't know. I never thought about 27 00:01:37,360 --> 00:01:38,959 Speaker 3: the word, because when you're in high school, 28 00:01:38,959 --> 00:01:41,240 Speaker 3: it's just, you know, the commons, we'll meet at the commons. But 29 00:01:41,319 --> 00:01:43,040 Speaker 3: it never occurred to me that that's what that meant, 30 00:01:43,080 --> 00:01:46,319 Speaker 3: just like a common area shared by everybody. 31 00:01:45,959 --> 00:01:48,280 Speaker 2: It didn't either to me. I know exactly what you're 32 00:01:48,280 --> 00:01:50,840 Speaker 2: talking about. There was, if not high school, then maybe 33 00:01:50,840 --> 00:01:54,080 Speaker 2: middle school. There was some school where that was called 34 00:01:54,080 --> 00:01:56,760 Speaker 2: that too. Yeah, and I wonder if it was like 35 00:01:56,800 --> 00:02:00,920 Speaker 2: a surreptitious shot at kids, like they're all sheep, because 36 00:02:01,440 --> 00:02:04,960 Speaker 2: I don't think that's true. But one of the major 37 00:02:05,120 --> 00:02:09,000 Speaker 2: uses for commons traditionally has been for grazing. That's a 38 00:02:09,040 --> 00:02:11,239 Speaker 2: really good example.
If you let a bunch of people 39 00:02:11,280 --> 00:02:16,280 Speaker 2: graze on a shared resource, and they're taking what they 40 00:02:16,320 --> 00:02:19,200 Speaker 2: can from that resource to make money for themselves, to 41 00:02:19,240 --> 00:02:25,280 Speaker 2: support themselves, then, just because humans are rational, selfish, horrible beings, 42 00:02:26,280 --> 00:02:29,760 Speaker 2: that meadow or that high school cafeteria will be ruined. 43 00:02:30,000 --> 00:02:32,760 Speaker 3: Yeah, that's right. And the idea of the tragedy of 44 00:02:32,760 --> 00:02:36,959 Speaker 3: the commons started out in nineteen sixty eight, in an article 45 00:02:37,000 --> 00:02:41,120 Speaker 3: from Science, the journal Science. It was called the Tragedy 46 00:02:41,160 --> 00:02:43,040 Speaker 3: of the Commons, and it was by a biologist named 47 00:02:43,040 --> 00:02:46,440 Speaker 3: Garrett Hardin, and he was sort of piggybacking on a 48 00:02:46,520 --> 00:02:52,000 Speaker 3: nineteenth century English mathematician and economist named William Forster, or, not Foster. 49 00:02:51,840 --> 00:02:56,600 Speaker 2: I would say, piggybacking or hijacking. 50 00:02:56,200 --> 00:03:01,440 Speaker 3: Hijacking. William Forster Lloyd. And you mentioned grazing, and this 51 00:03:01,480 --> 00:03:04,080 Speaker 3: is sort of the thought experiment Hardin went with, which 52 00:03:04,120 --> 00:03:06,400 Speaker 3: is like, you have a, you know, a grazing field, 53 00:03:06,840 --> 00:03:09,679 Speaker 3: and let's say three farmers have use of it, and 54 00:03:09,680 --> 00:03:12,320 Speaker 3: they're letting their cows and sheep, you know, graze 55 00:03:12,360 --> 00:03:16,200 Speaker 3: out there. But at some point one of them is 56 00:03:16,240 --> 00:03:18,480 Speaker 3: going to get an extra fat cow that they sell 57 00:03:18,520 --> 00:03:22,160 Speaker 3: for pretty good money.
And they're like, hey, well, if 58 00:03:22,160 --> 00:03:24,760 Speaker 3: that worked out, maybe I can add like an additional cow, 59 00:03:25,480 --> 00:03:28,919 Speaker 3: and maybe it'll make, you know, the value of all 60 00:03:29,000 --> 00:03:31,640 Speaker 3: of these cows go down a little bit, including my own, 61 00:03:31,720 --> 00:03:34,200 Speaker 3: because they're not getting as fat because I've added another 62 00:03:34,240 --> 00:03:36,680 Speaker 3: cow here, right. But if I can sell it for 63 00:03:36,760 --> 00:03:39,080 Speaker 3: this much money and it's only costing me a fraction 64 00:03:39,160 --> 00:03:41,040 Speaker 3: of that, for you know the. 65 00:03:43,880 --> 00:03:45,080 Speaker 2: Won't you extra cow? 66 00:03:45,600 --> 00:03:47,680 Speaker 3: Well, I mean it would only cost a fraction of 67 00:03:47,760 --> 00:03:51,440 Speaker 3: loss of resources for the other cows. Like I'm still 68 00:03:51,480 --> 00:03:54,400 Speaker 3: coming out ahead, and so bada bing, bada boom, 69 00:03:54,720 --> 00:03:55,520 Speaker 3: that's what I'm going. 70 00:03:55,480 --> 00:03:59,080 Speaker 2: To do, right. Yeah. So if everybody was making one 71 00:03:59,120 --> 00:04:02,520 Speaker 2: dollar off of their cow when everything was being grazed sustainably, 72 00:04:02,680 --> 00:04:06,440 Speaker 2: then if you add one more cow, now everybody's making 73 00:04:06,960 --> 00:04:10,280 Speaker 2: eighty five cents off of their cows, because you're starting 74 00:04:10,280 --> 00:04:14,200 Speaker 2: to overtax the commons. But that person who added that cow, 75 00:04:14,480 --> 00:04:16,960 Speaker 2: that's an extra eighty five cents they wouldn't have had before, 76 00:04:17,080 --> 00:04:18,119 Speaker 2: right yeah.
77 00:04:18,160 --> 00:04:19,839 Speaker 3: And if you're able to sell that cow for like 78 00:04:19,920 --> 00:04:23,440 Speaker 3: five bucks, that eighty five cents is a pittance compared 79 00:04:23,440 --> 00:04:25,640 Speaker 3: to what you're making on the other side. 80 00:04:25,839 --> 00:04:31,480 Speaker 2: Right. So there's always, under the logic of the tragedy 81 00:04:31,520 --> 00:04:35,320 Speaker 2: of the commons, there's always a reason to add another cow, 82 00:04:35,680 --> 00:04:37,960 Speaker 2: and add another cow, and add another cow, because your 83 00:04:38,040 --> 00:04:42,039 Speaker 2: returns are always going to be more than the cost, 84 00:04:42,839 --> 00:04:45,159 Speaker 2: which you can offset with the returns. Right. 85 00:04:45,520 --> 00:04:48,760 Speaker 3: Well, that and you're also on the assumption that like 86 00:04:49,040 --> 00:04:50,720 Speaker 3: everyone else is going to be adding cows. So I'm 87 00:04:50,760 --> 00:04:52,719 Speaker 3: not going to be the only sucker not adding cows. 88 00:04:53,160 --> 00:04:56,839 Speaker 2: Right. So you're either a sucker or you're just somebody 89 00:04:56,839 --> 00:05:00,719 Speaker 2: who wants to survive. Because if people start adding cows 90 00:05:01,520 --> 00:05:04,080 Speaker 2: and you're the one farmer who's like, I'm not going 91 00:05:04,160 --> 00:05:08,400 Speaker 2: to do that, I find that morally repugnant, I really 92 00:05:08,520 --> 00:05:12,359 Speaker 2: like sustainably grazing here, and I'll just take one for 93 00:05:12,440 --> 00:05:16,200 Speaker 2: the team. In very short order, that farmer would find 94 00:05:16,200 --> 00:05:20,480 Speaker 2: that they were no longer making money, exactly because 95 00:05:20,520 --> 00:05:25,680 Speaker 2: each additional cow was taking an additional fraction.
So let's 96 00:05:25,680 --> 00:05:28,280 Speaker 2: say that every time you added a cow, it reduced 97 00:05:28,320 --> 00:05:31,400 Speaker 2: the price that all the other cows got by fifteen percent. 98 00:05:32,400 --> 00:05:35,400 Speaker 2: That's not a steady price, because if you reduce a 99 00:05:35,480 --> 00:05:39,040 Speaker 2: dollar by fifteen percent, now you're getting eighty five cents 100 00:05:39,080 --> 00:05:42,240 Speaker 2: for a cow. But now when you're adding another cow, 101 00:05:42,360 --> 00:05:45,640 Speaker 2: that reduces that eighty five cents by fifteen percent. So 102 00:05:45,640 --> 00:05:47,680 Speaker 2: all of a sudden, you're making seventy two cents, and 103 00:05:47,720 --> 00:05:50,240 Speaker 2: then sixty one cents and then fifty two cents. Right. 104 00:05:50,600 --> 00:05:54,359 Speaker 2: But again, if you're adding cows, that's always extra, almost 105 00:05:54,400 --> 00:05:57,800 Speaker 2: found money. If you're not adding cows, then the cows 106 00:05:57,839 --> 00:06:00,400 Speaker 2: that you had originally are now getting less and less 107 00:06:00,440 --> 00:06:03,160 Speaker 2: and less, until you're making zero money, so you are 108 00:06:03,360 --> 00:06:08,480 Speaker 2: forced into adding cows. And that's why it's considered the tragedy. 109 00:06:08,480 --> 00:06:11,240 Speaker 2: That's why Garrett Hardin called it the tragedy of the 110 00:06:11,279 --> 00:06:16,920 Speaker 2: commons: because inevitably, under these circumstances, which have long 111 00:06:16,960 --> 00:06:21,880 Speaker 2: been considered to be universal circumstances, basically the commons will 112 00:06:21,920 --> 00:06:25,719 Speaker 2: always get ruined and depleted and everybody will be totally 113 00:06:26,520 --> 00:06:27,200 Speaker 2: up the creek. 114 00:06:27,520 --> 00:06:31,440 Speaker 3: That's right. And Hardin was basically like, there's a 115 00:06:31,440 --> 00:06:34,599 Speaker 3: couple of solutions here.
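The compounding price drop Josh just walked through can be checked in a few lines of Python. This is only a sketch of the episode's illustrative arithmetic; the one-dollar starting price and the fifteen-percent-per-added-cow reduction are the hosts' made-up numbers, not anything from Hardin's paper:

```python
# Hypothetical numbers from the episode: grazing starts at $1.00 per cow,
# and each extra cow knocks 15% off the per-cow price everyone gets.
price = 1.00
for extra_cows in range(1, 5):
    price *= 0.85  # each added cow compounds the 15% reduction
    print(f"{extra_cows} extra cow(s): ${price:.2f}")
# Prints $0.85, $0.72, $0.61, $0.52 -- the figures quoted above.
```

The compounding is the whole trap: the holdout farmer's original cows keep losing value with every round, while only the farmers who add cows collect the "found money."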
You can either divide this area 116 00:06:34,680 --> 00:06:37,839 Speaker 3: up, and now each person has their own little private 117 00:06:37,880 --> 00:06:40,679 Speaker 3: part of the pasture that they're only in control 118 00:06:40,760 --> 00:06:42,159 Speaker 3: of, and no one else can get on it and 119 00:06:42,200 --> 00:06:45,680 Speaker 3: graze there, or a government body is going to 120 00:06:45,720 --> 00:06:47,840 Speaker 3: have to step in and regulate this stuff and manage it. 121 00:06:48,880 --> 00:06:51,000 Speaker 3: And you know, if you listen to Hardin, and we'll 122 00:06:51,000 --> 00:06:53,080 Speaker 3: talk a little bit about the problems with this guy 123 00:06:53,120 --> 00:06:56,440 Speaker 3: in a second, Yeah, but if you listen to Hardin, 124 00:06:56,520 --> 00:06:59,040 Speaker 3: he'll say, when you divide this thing up 125 00:06:59,040 --> 00:07:01,640 Speaker 3: and just privatize it, everyone, you know, 126 00:07:02,720 --> 00:07:06,599 Speaker 3: is incentivized to basically preserve their plots and do it 127 00:07:06,640 --> 00:07:13,400 Speaker 3: sustainably so their profits are maximized, with, you know, making 128 00:07:13,400 --> 00:07:15,480 Speaker 3: sure the land isn't ruined so they can keep those 129 00:07:15,520 --> 00:07:16,600 Speaker 3: profits maximized. 130 00:07:17,080 --> 00:07:20,680 Speaker 2: Yeah, because now it's their land and they suddenly realize, oh, 131 00:07:20,720 --> 00:07:24,680 Speaker 2: this is a finite resource. If I start to degrade 132 00:07:24,760 --> 00:07:27,559 Speaker 2: this land, it's only coming out of my profits, because 133 00:07:27,600 --> 00:07:30,920 Speaker 2: the other farmer's land is not degraded.
So I'm the 134 00:07:30,920 --> 00:07:34,320 Speaker 2: only one in competition with myself here, really, and I 135 00:07:34,520 --> 00:07:37,400 Speaker 2: need to make sure that this land is preserved, because 136 00:07:37,440 --> 00:07:39,800 Speaker 2: now it's the goose that laid the 137 00:07:39,840 --> 00:07:41,480 Speaker 2: golden egg, and I need to keep it nice and 138 00:07:41,520 --> 00:07:46,520 Speaker 2: healthy and happy. That's the direct result of privatization, as 139 00:07:46,520 --> 00:07:49,200 Speaker 2: far as Garrett Hardin was explaining, as far as the 140 00:07:49,200 --> 00:07:53,320 Speaker 2: tragedy of the Commons goes, right? Simple solution: just give 141 00:07:53,360 --> 00:07:55,559 Speaker 2: people an incentive to take care of it by making 142 00:07:55,600 --> 00:07:58,920 Speaker 2: it their own. And the other way, like you said, 143 00:07:58,960 --> 00:08:02,360 Speaker 2: is government intervention. And this seems to be Hardin's 144 00:08:02,440 --> 00:08:06,960 Speaker 2: big, like, this is his favorite solution, I think: 145 00:08:07,200 --> 00:08:11,080 Speaker 2: bring the government in and just say, you can't over, 146 00:08:11,480 --> 00:08:14,280 Speaker 2: you can't graze more than this number of sheep, and 147 00:08:14,360 --> 00:08:18,239 Speaker 2: if you do, we'll wipe out your entire family line, 148 00:08:18,400 --> 00:08:20,120 Speaker 2: something along those lines. 149 00:08:20,400 --> 00:08:23,080 Speaker 3: Yeah.
But here's the thing, like, it wasn't like he 150 00:08:23,160 --> 00:08:26,360 Speaker 3: was some big champion of the government, because if you 151 00:08:26,400 --> 00:08:30,280 Speaker 3: read the whole article, Hardin is like, and another thing, 152 00:08:31,520 --> 00:08:34,920 Speaker 3: the reason why the world is headed toward a bad 153 00:08:34,920 --> 00:08:39,200 Speaker 3: place, or in a bad place already, is because freedom to 154 00:08:39,240 --> 00:08:42,160 Speaker 3: breed is intolerable. So there's just too many people. We're 155 00:08:42,160 --> 00:08:45,200 Speaker 3: making too many babies, and the welfare state that the 156 00:08:45,200 --> 00:08:49,360 Speaker 3: government is encouraging is taking away the natural consequence of 157 00:08:49,400 --> 00:08:51,480 Speaker 3: what happens if you have too many kids, like they're 158 00:08:51,520 --> 00:08:53,199 Speaker 3: not going to be able to sustain that, and the 159 00:08:53,280 --> 00:08:56,120 Speaker 3: kids are going to starve to death. So we're overbreeding. 160 00:08:56,720 --> 00:08:59,680 Speaker 3: And all of a sudden, everyone's going whoa, whoa, whoa. 161 00:09:00,000 --> 00:09:03,840 Speaker 3: And he said, and another thing, he said, this overbreeding, 162 00:09:04,120 --> 00:09:06,640 Speaker 3: it could actually be a strategy where a cohesive group 163 00:09:06,679 --> 00:09:09,320 Speaker 3: could come along to increase their power. And they're like, 164 00:09:09,320 --> 00:09:11,280 Speaker 3: wait a minute, are you talking about the great replacement? 165 00:09:11,559 --> 00:09:14,600 Speaker 3: And he said, I don't know what that is, right, 166 00:09:14,679 --> 00:09:16,679 Speaker 3: and they said, well, it'll be a thing at some point. 167 00:09:17,080 --> 00:09:18,760 Speaker 2: He's like, I like the ring of it though. Yeah, 168 00:09:18,760 --> 00:09:23,560 Speaker 2: it's catchy. So yeah. Yeah.
His whole thing was, we, 169 00:09:24,840 --> 00:09:30,040 Speaker 2: using scare quotes, are overbreeding, meant poor, non-white countries 170 00:09:30,040 --> 00:09:34,120 Speaker 2: are overbreeding. Right, that was basically his whole thing, and 171 00:09:34,160 --> 00:09:38,600 Speaker 2: that, yes, because there was such thing as international aid 172 00:09:39,280 --> 00:09:41,360 Speaker 2: in the way of food, in the way of money, 173 00:09:42,640 --> 00:09:48,400 Speaker 2: that people could just keep having kids, knowing 174 00:09:48,440 --> 00:09:50,959 Speaker 2: that the Western, wealthier governments were going to take care 175 00:09:51,000 --> 00:09:53,160 Speaker 2: of them. And he was like, that's not what 176 00:09:53,240 --> 00:09:56,439 Speaker 2: we should be doing, and here's why: the tragedy of 177 00:09:56,480 --> 00:09:59,839 Speaker 2: the commons. And to say that, by the way, 178 00:10:00,440 --> 00:10:05,560 Speaker 2: I read another paper of his too, called Lifeboat Ethics, 179 00:10:07,360 --> 00:10:11,520 Speaker 2: the Argument Against Helping the Poor. Yeah, that is the subtitle, 180 00:10:12,080 --> 00:10:14,360 Speaker 2: so he really just put it out there. But he 181 00:10:14,440 --> 00:10:18,520 Speaker 2: basically said, Okay, let's imagine that all of the countries 182 00:10:18,559 --> 00:10:23,200 Speaker 2: in the world are lifeboats. Some are ridiculously overloaded with people, 183 00:10:23,320 --> 00:10:27,880 Speaker 2: others, the wealthier, more advanced ones. Let's say that they're 184 00:10:27,880 --> 00:10:30,040 Speaker 2: in a lifeboat that seats one hundred and they only 185 00:10:30,080 --> 00:10:33,640 Speaker 2: have ninety people, but there's tons of people swimming around 186 00:10:33,679 --> 00:10:37,720 Speaker 2: them trying to get into the lifeboat.
Do you pick 187 00:10:37,840 --> 00:10:40,880 Speaker 2: ten people out of these, you know, thousands of people 188 00:10:40,920 --> 00:10:45,000 Speaker 2: out there to let come into your lifeboat? If so, 189 00:10:45,040 --> 00:10:47,920 Speaker 2: how do you pick those people? What do you tell 190 00:10:47,920 --> 00:10:51,760 Speaker 2: the people you don't pick? And he ultimately concludes, 191 00:10:53,000 --> 00:10:56,040 Speaker 2: just don't help anybody; then you don't have to worry 192 00:10:56,120 --> 00:10:59,560 Speaker 2: about playing favorites or anything like that. Plus you keep 193 00:10:59,600 --> 00:11:04,280 Speaker 2: your ten person safety buffer in case things change for 194 00:11:04,400 --> 00:11:08,040 Speaker 2: you in your lifeboat, and you're fine. He actually argues 195 00:11:08,920 --> 00:11:12,079 Speaker 2: against guilt. You shouldn't feel any guilt, and as a 196 00:11:12,120 --> 00:11:14,040 Speaker 2: matter of fact, we shouldn't put guilt on people for 197 00:11:14,679 --> 00:11:18,440 Speaker 2: making decisions like these, because they're just smart. And then secondly, 198 00:11:18,720 --> 00:11:21,199 Speaker 2: he also says that if you are one of those 199 00:11:21,200 --> 00:11:24,520 Speaker 2: people, like that one farmer who is like, overgrazing is 200 00:11:24,559 --> 00:11:26,800 Speaker 2: morally repugnant and now I'm not going to do it, 201 00:11:27,320 --> 00:11:29,320 Speaker 2: if you were like that farmer in this lifeboat and 202 00:11:29,320 --> 00:11:31,040 Speaker 2: you said, I just, I can't do this. I can't 203 00:11:31,080 --> 00:11:33,800 Speaker 2: sit there and watch people drown while I'm sitting here. 204 00:11:34,080 --> 00:11:36,000 Speaker 2: I'm going to give up my seat for somebody else.
205 00:11:37,559 --> 00:11:41,920 Speaker 2: Hardin argued that just by virtue of a person accepting 206 00:11:42,720 --> 00:11:46,840 Speaker 2: your seat, they are less moral than you, and that 207 00:11:46,960 --> 00:11:50,040 Speaker 2: over time, as more people in the lifeboat give up 208 00:11:50,040 --> 00:11:53,760 Speaker 2: their seat for moral reasons, morality will be replaced in 209 00:11:53,800 --> 00:11:56,640 Speaker 2: this lifeboat with self-interest, and then what do you 210 00:11:56,720 --> 00:12:01,360 Speaker 2: have then? So he makes all these, like, arguments where if 211 00:12:01,360 --> 00:12:05,000 Speaker 2: you're a rational person and you take emotion out of it, 212 00:12:05,040 --> 00:12:06,800 Speaker 2: you're like, you know, I guess that kind of makes 213 00:12:06,800 --> 00:12:09,000 Speaker 2: sense a little bit, but the moment you add 214 00:12:09,000 --> 00:12:11,760 Speaker 2: in any drop of humanity to it, you're like, 215 00:12:11,800 --> 00:12:14,920 Speaker 2: this is horrible, that this guy wrote a series of 216 00:12:14,960 --> 00:12:16,120 Speaker 2: papers arguing this. 217 00:12:17,040 --> 00:12:18,960 Speaker 3: Yeah, and he, you know, as far as the tragedy 218 00:12:19,000 --> 00:12:23,040 Speaker 3: of the Commons goes, he very explicitly says, like, you know, 219 00:12:23,360 --> 00:12:26,480 Speaker 3: you got to prioritize yourself here, and, you know, 220 00:12:26,600 --> 00:12:28,920 Speaker 3: what you're doing, and maximize your profits, and if you don't, 221 00:12:28,920 --> 00:12:31,680 Speaker 3: if you have ethics or something, then you're not 222 00:12:31,720 --> 00:12:32,880 Speaker 3: a very smart person. 223 00:12:33,160 --> 00:12:37,040 Speaker 2: Right, exactly. So this is his whole thing, that, you know, 224 00:12:37,160 --> 00:12:40,000 Speaker 2: like, all of it was about overpopulation.
The thing 225 00:12:40,080 --> 00:12:46,600 Speaker 2: is, it got diluted, or taken very literally, very quickly, 226 00:12:46,760 --> 00:12:49,560 Speaker 2: and it was applied to actual commons, as we'll see, 227 00:12:50,080 --> 00:12:55,880 Speaker 2: and it was stripped of its overpopulation, xenophobia, racism, 228 00:12:56,320 --> 00:12:59,679 Speaker 2: all of that stuff, and just got applied to real 229 00:12:59,720 --> 00:13:03,319 Speaker 2: world commons management. And to say it changed 230 00:13:03,360 --> 00:13:05,720 Speaker 2: things is the understatement of the century, Chuck. 231 00:13:05,800 --> 00:13:07,920 Speaker 3: That's right. Should we take a break? 232 00:13:08,880 --> 00:14:11,880 Speaker 1: Yeah, let's. All right, we'll be right back, everybody. 233 00:13:43,800 --> 00:13:47,480 Speaker 2: Okay. So, uh, the tragedy of the commons became like the 234 00:13:48,120 --> 00:13:51,560 Speaker 2: dominant way of looking at the world among Western nations 235 00:13:51,600 --> 00:13:55,959 Speaker 2: because it did two things. It said that you can 236 00:13:56,040 --> 00:14:00,240 Speaker 2: solve this problem everyone is going to have: any time 237 00:14:00,280 --> 00:14:02,880 Speaker 2: you have a shared resource, people are going to deplete it. 238 00:14:03,040 --> 00:14:04,720 Speaker 2: That's what it said, no matter where you are in 239 00:14:04,720 --> 00:14:08,440 Speaker 2: the world, no matter who you are. But we want 240 00:14:08,440 --> 00:14:11,040 Speaker 2: to save these shared resources. Because this paper came out 241 00:14:11,040 --> 00:14:13,880 Speaker 2: at about the same time that environmental consciousness came up, 242 00:14:14,320 --> 00:14:17,000 Speaker 2: so it was like perfect timing for that. But it 243 00:14:17,080 --> 00:14:21,640 Speaker 2: also said, just privatize, buddy.
And at that time, neoliberalism 244 00:14:21,720 --> 00:14:24,600 Speaker 2: was on the rise, and they're like, yes, privatize, take 245 00:14:24,600 --> 00:14:27,040 Speaker 2: all these government run things and put them in the 246 00:14:27,080 --> 00:14:29,280 Speaker 2: hands of corporations and we'll all be better off. 247 00:14:29,960 --> 00:14:32,000 Speaker 3: Yeah. And you know, one of the ways that happened 248 00:14:32,160 --> 00:14:34,120 Speaker 3: was, and we're going to talk about sort of different 249 00:14:34,240 --> 00:14:36,920 Speaker 3: versions of this here and there in this episode, but 250 00:14:37,880 --> 00:14:43,280 Speaker 3: environmental trading markets emerged, and that came from Canada, actually, 251 00:14:43,320 --> 00:14:46,080 Speaker 3: an economist named J. H. Dales in nineteen sixty eight. 252 00:14:46,840 --> 00:14:49,000 Speaker 3: But the eighties and nineties are when they really kind 253 00:14:49,000 --> 00:14:52,120 Speaker 3: of started flying off the shelf. But that's like when 254 00:14:52,160 --> 00:14:56,200 Speaker 3: the government comes in with a regulation, let's say, to 255 00:14:56,280 --> 00:14:59,080 Speaker 3: cap whatever. In this case, like, let's say they're capping 256 00:14:59,160 --> 00:15:00,280 Speaker 3: emissions on. 257 00:15:00,040 --> 00:15:01,640 Speaker 2: On baseball caps. 258 00:15:02,480 --> 00:15:05,680 Speaker 3: Baseball caps! I need to get rid of some baseball caps. 259 00:15:05,760 --> 00:15:06,600 Speaker 3: So that's great. 260 00:15:06,760 --> 00:15:08,520 Speaker 2: You have too many? Do you have any good Hawks ones? 261 00:15:08,560 --> 00:15:10,360 Speaker 2: I'm on the, I'm on the lookout for those. 262 00:15:10,480 --> 00:15:12,680 Speaker 3: I have one Hawks cap that I wear. 263 00:15:14,200 --> 00:15:14,680 Speaker 2: I gotcha. 264 00:15:15,000 --> 00:15:16,480 Speaker 3: I just gave away a second one because it was 265 00:15:16,520 --> 00:15:18,560 Speaker 3: too big.
But your head is too small for 266 00:15:18,600 --> 00:15:18,920 Speaker 3: this thing. 267 00:15:20,200 --> 00:15:23,160 Speaker 2: It's a really tiny head. I'm like that one 268 00:15:23,280 --> 00:15:24,840 Speaker 2: safari hunter in Beetlejuice. 269 00:15:24,880 --> 00:15:26,000 Speaker 3: Do you know what your hat size is? 270 00:15:26,920 --> 00:15:29,600 Speaker 2: Uh, like, point six? Point six. 271 00:15:29,640 --> 00:15:34,000 Speaker 3: Okay, that's good. So let's say it's fisheries, because we're 272 00:15:34,000 --> 00:15:35,600 Speaker 3: going to talk about the fisheries a lot. So yeah, 273 00:15:35,640 --> 00:15:37,720 Speaker 3: you know, the government will come in and say, hey, 274 00:15:38,920 --> 00:15:41,800 Speaker 3: you're overfishing, so we're going to cap how many fish 275 00:15:41,800 --> 00:15:43,720 Speaker 3: you can harvest, or make it a certain size and 276 00:15:43,760 --> 00:15:46,680 Speaker 3: a certain area or something like that. And then here's 277 00:15:46,720 --> 00:15:49,480 Speaker 3: your permit, Fishery A, and here's your permit, Fishery B, 278 00:15:49,640 --> 00:15:53,560 Speaker 3: and Fishery C. And they grant these permits. But, and 279 00:15:53,600 --> 00:15:56,920 Speaker 3: this is the key to the ETMs, environmental trading markets, 280 00:15:56,960 --> 00:16:00,560 Speaker 3: the word trade: you can trade, as in buy 281 00:16:00,640 --> 00:16:04,840 Speaker 3: and sell, these permits. Which, and this is something that 282 00:16:04,880 --> 00:16:08,160 Speaker 3: I don't fully understand, or maybe I do understand and 283 00:16:08,200 --> 00:16:12,960 Speaker 3: that's the whole point. But if Company A is, let's 284 00:16:12,960 --> 00:16:17,440 Speaker 3: say it's an environmental thing, like, hey, you're a factory and 285 00:16:17,720 --> 00:16:21,560 Speaker 3: we're incentivizing you to clean up your pollution, and here's 286 00:16:21,560 --> 00:16:23,600 Speaker 3: your permit or whatever.
If Company A does a really 287 00:16:23,640 --> 00:16:26,200 Speaker 3: good job and does the right thing and comes in 288 00:16:26,280 --> 00:16:29,600 Speaker 3: like way under the amount of emissions that they're supposed 289 00:16:29,640 --> 00:16:33,480 Speaker 3: to hit, they can simply sell that to Company B, who's like, man, 290 00:16:33,520 --> 00:16:36,320 Speaker 3: we're not too good at that. I mean, is it cheating? 291 00:16:36,600 --> 00:16:38,920 Speaker 3: Like, doesn't that kind of defeat the whole purpose? 292 00:16:39,600 --> 00:16:41,640 Speaker 2: No, it doesn't. And we talked about this in the 293 00:16:41,720 --> 00:16:45,720 Speaker 2: acid rain episode. Do you remember Whatever Happened to Acid Rain? 294 00:16:45,800 --> 00:16:46,080 Speaker 2: That thing? 295 00:16:46,160 --> 00:16:46,240 Speaker 1: Oh? 296 00:16:46,320 --> 00:16:50,920 Speaker 2: Yeah, it can work, and it does work. We actually 297 00:16:51,040 --> 00:16:55,640 Speaker 2: reduced sulfur dioxide emissions that were associated with acid 298 00:16:55,720 --> 00:16:59,720 Speaker 2: rain so much that the acid rain went away, and it 299 00:16:59,800 --> 00:17:03,600 Speaker 2: was like the ozone layer of, like, the early eighties, 300 00:17:03,640 --> 00:17:06,120 Speaker 2: I think, and seventies. Like, it was a big scary 301 00:17:06,200 --> 00:17:08,440 Speaker 2: thing, and we took care of it because of cap 302 00:17:08,480 --> 00:17:10,760 Speaker 2: and trade schemes. So it can work. 303 00:17:10,840 --> 00:17:13,920 Speaker 3: But how? That's what I don't get. If one company 304 00:17:14,520 --> 00:17:17,919 Speaker 3: is lowering their output but another one is increasing theirs 305 00:17:17,920 --> 00:17:20,679 Speaker 3: because they just bought the other company's, then how is 306 00:17:20,680 --> 00:17:22,280 Speaker 3: that a net loss? 307 00:17:22,040 --> 00:17:24,199 Speaker 2: Because of the cap?
Because you put the cap at 308 00:17:24,240 --> 00:17:27,160 Speaker 2: something close to a target that you want to reduce 309 00:17:27,240 --> 00:17:30,760 Speaker 2: things to, so you don't make like some sky high 310 00:17:30,800 --> 00:17:33,600 Speaker 2: cap that's more than what you're at now. You make 311 00:17:33,680 --> 00:17:35,840 Speaker 2: it less, and then maybe a couple of years later, 312 00:17:35,880 --> 00:17:38,639 Speaker 2: you make it less than that. So those caps, those 313 00:17:39,160 --> 00:17:43,159 Speaker 2: little shares or whatever, those allowances that they can trade, 314 00:17:43,800 --> 00:17:46,280 Speaker 2: they get more and more valuable the less and less 315 00:17:46,280 --> 00:17:48,720 Speaker 2: they represent, because the law is kind of bringing these 316 00:17:48,760 --> 00:17:52,679 Speaker 2: emissions down further and further. So ultimately you're rewarding a 317 00:17:52,720 --> 00:17:55,560 Speaker 2: company for reducing their emissions, because they can make money 318 00:17:56,080 --> 00:17:59,040 Speaker 2: selling that to another company, and that other company is 319 00:17:59,040 --> 00:18:01,960 Speaker 2: actually technically being punished, because they're having to shell out 320 00:18:02,000 --> 00:18:04,600 Speaker 2: more money than they budgeted for, because they're emitting more 321 00:18:04,640 --> 00:18:09,000 Speaker 2: than they are supposed to. So ultimately you're penalizing and 322 00:18:09,040 --> 00:18:13,040 Speaker 2: rewarding through this cap and trade scheme, but you're also 323 00:18:13,640 --> 00:18:17,400 Speaker 2: creating an artificial cap. All these companies could just pollute 324 00:18:17,480 --> 00:18:20,480 Speaker 2: as much as they want, but the government is saying no, 325 00:18:20,600 --> 00:18:23,120 Speaker 2: you actually can't. Here's your level. Here it is divided 326 00:18:23,119 --> 00:18:26,320 Speaker 2: among the ten companies. Go to town.
But does that 327 00:18:26,359 --> 00:18:26,800 Speaker 2: make sense? 328 00:18:26,880 --> 00:18:30,280 Speaker 3: Well, yeah. The thing I still don't get, though, is 329 00:18:30,480 --> 00:18:36,680 Speaker 3: the company that's, you know, buying extra emissions, whatever, output 330 00:18:37,280 --> 00:18:39,160 Speaker 3: from the company that's like really good at it, are 331 00:18:39,160 --> 00:18:40,520 Speaker 3: they allowed to go over the cap? 332 00:18:41,240 --> 00:18:44,000 Speaker 2: They, no. But let's say they have a cap 333 00:18:44,040 --> 00:18:47,639 Speaker 2: to put out ten pounds of CO two, right, yeah, 334 00:18:47,720 --> 00:18:50,240 Speaker 2: and they're going to, they know they're putting out fifteen 335 00:18:50,280 --> 00:18:53,359 Speaker 2: that year. Well, this other company that's putting out five 336 00:18:53,440 --> 00:18:56,040 Speaker 2: pounds of CO two that year, they can sell their 337 00:18:56,040 --> 00:18:59,320 Speaker 2: other five pounds. So no one goes over the 338 00:19:00,200 --> 00:19:05,040 Speaker 2: total cap, but different companies can contribute different amounts based 339 00:19:05,040 --> 00:19:08,800 Speaker 2: on how much of those allowance shares they can purchase 340 00:19:08,920 --> 00:19:10,480 Speaker 2: or sell. Does that make sense? 341 00:19:10,560 --> 00:19:12,399 Speaker 3: Yeah, I guess so. 342 00:19:13,000 --> 00:19:15,919 Speaker 2: I promise it works. It worked for sulfur dioxide. It 343 00:19:16,000 --> 00:19:18,479 Speaker 2: does work. The problem is, it's also like it 344 00:19:18,560 --> 00:19:21,640 Speaker 2: requires a lot of buy-in from industry or really 345 00:19:22,320 --> 00:19:24,119 Speaker 2: heavy-handed government regulation. 346 00:19:24,520 --> 00:19:26,919 Speaker 3: Well, either way.
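Josh's ten-versus-fifteen-pound example can be sketched as a toy ledger. Everything here is hypothetical: the company names, the "pounds of CO2" unit, and the numbers are just the episode's illustration of how trading keeps the total under the cap, not any real trading scheme:

```python
# Toy cap-and-trade ledger using the episode's hypothetical numbers.
allowances = {"A": 10, "B": 10}   # each company is allotted 10 pounds of CO2
emissions  = {"A": 5,  "B": 15}   # A under-emits by 5, B over-emits by 5

# A sells its 5 unused pounds of allowance to B.
spare = allowances["A"] - emissions["A"]
allowances["A"] -= spare
allowances["B"] += spare

total_cap = 20
assert sum(emissions.values()) <= total_cap               # the overall cap holds
assert all(emissions[c] <= allowances[c] for c in emissions)  # both are now covered
```

B pays A for those five pounds, which is the reward-and-penalty mechanism Josh describes: the clean company profits, the dirty one pays extra, and the regulator only has to enforce the total.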
It became a big deal in the eighties, 347 00:19:26,960 --> 00:19:29,520 Speaker 3: and by the end of the eighties, the World Bank 348 00:19:29,800 --> 00:19:33,520 Speaker 3: referred to the tragedy of the commons as quote the dominant 349 00:19:33,520 --> 00:19:39,840 Speaker 3: paradigm within which social scientists assess natural resource issues. So 350 00:19:41,520 --> 00:19:43,879 Speaker 3: I mean, there's a bunch of different 351 00:19:43,880 --> 00:19:46,440 Speaker 3: policies in a bunch of different areas, so you can't 352 00:19:46,480 --> 00:19:48,800 Speaker 3: just look at the entire thing as a whole and 353 00:19:48,840 --> 00:19:51,199 Speaker 3: say like, well, it's been a huge success. Like you 354 00:19:51,200 --> 00:19:53,520 Speaker 3: can pick out something like acid rain and say it 355 00:19:53,560 --> 00:19:57,879 Speaker 3: was a success in this case, but there's humans involved, so 356 00:19:59,320 --> 00:20:01,400 Speaker 3: you never can tell. Well, it's not that you never 357 00:20:01,400 --> 00:20:03,960 Speaker 3: can tell, like I say, you can forecast, but there's not 358 00:20:04,119 --> 00:20:07,480 Speaker 3: just one big blanket, like, nope, this is the exact solution. 359 00:20:07,880 --> 00:20:10,760 Speaker 2: Right, for sure. And it depends on the industry, 360 00:20:10,840 --> 00:20:13,520 Speaker 2: right, like you said, acid rain, that was a great 361 00:20:13,560 --> 00:20:17,639 Speaker 2: success story. Fisheries are another example.
If you take a 362 00:20:17,680 --> 00:20:19,720 Speaker 2: fishery like you were saying, and you divide up the 363 00:20:19,760 --> 00:20:26,080 Speaker 2: total catch between companies, those companies can actually go around 364 00:20:26,080 --> 00:20:30,320 Speaker 2: and buy, you know, huge portions of the other companies' shares, right, 365 00:20:30,680 --> 00:20:34,320 Speaker 2: and so all of a sudden, they're basically one company 366 00:20:34,359 --> 00:20:36,439 Speaker 2: with all of the rights to catch all of the 367 00:20:36,440 --> 00:20:39,879 Speaker 2: fish in that fishery because they managed to consolidate the 368 00:20:39,920 --> 00:20:43,840 Speaker 2: shares over time. And that's a huge problem, that's not 369 00:20:44,200 --> 00:20:46,800 Speaker 2: at all what you want. Like, yeah, that fishing 370 00:20:46,800 --> 00:20:50,000 Speaker 2: company is still staying within the total number of fish 371 00:20:50,040 --> 00:20:52,560 Speaker 2: that can be caught annually, but they ran all the 372 00:20:52,600 --> 00:20:55,280 Speaker 2: other people out of business. So that's not at 373 00:20:55,320 --> 00:20:57,800 Speaker 2: all what you want with it. So there's pluses 374 00:20:57,800 --> 00:21:03,120 Speaker 2: and minuses to the situation. The thing is, the 375 00:21:03,160 --> 00:21:06,879 Speaker 2: tragedy of the commons has largely been overblown, and the 376 00:21:06,960 --> 00:21:09,600 Speaker 2: idea that you can come out on top just by 377 00:21:09,640 --> 00:21:13,840 Speaker 2: privatizing things like natural resources has been a lesson learned 378 00:21:13,840 --> 00:21:17,199 Speaker 2: the hard way. Overall, that's not necessarily the 379 00:21:17,200 --> 00:21:19,399 Speaker 2: best way to do it, and it can really, really 380 00:21:19,480 --> 00:21:24,160 Speaker 2: harm some groups while enriching others tremendously. 381 00:21:24,400 --> 00:21:26,360 Speaker 3: Well.
Yeah, and like in the case of a fishery, 382 00:21:26,560 --> 00:21:29,800 Speaker 3: like if you're a small fishing crew or something, good 383 00:21:29,880 --> 00:21:32,399 Speaker 3: luck kind of working your way up and maybe establishing 384 00:21:32,440 --> 00:21:36,320 Speaker 3: yourself as one of the larger companies, especially if the 385 00:21:36,359 --> 00:21:38,639 Speaker 3: people that are buying these things up are people that 386 00:21:38,680 --> 00:21:41,520 Speaker 3: don't live around there, these corporations, right, that 387 00:21:41,680 --> 00:21:45,520 Speaker 3: don't necessarily have a local interest. And again, while they 388 00:21:45,600 --> 00:21:49,679 Speaker 3: might be, like, hitting, you know, staying under the threshold environmentally, 389 00:21:49,720 --> 00:21:52,879 Speaker 3: it might be working out, but you're consolidating the wealth. 390 00:21:53,400 --> 00:21:56,119 Speaker 3: Sometimes there's regulations on that where they say you have 391 00:21:56,160 --> 00:21:58,679 Speaker 3: to be, you know, like on the fishing boat, and like, 392 00:21:59,200 --> 00:22:00,440 Speaker 3: you know, like Chick-fil-A. If you're going to 393 00:22:00,480 --> 00:22:02,120 Speaker 3: buy a Chick-fil-A franchise, you gotta manage that thing. 394 00:22:02,080 --> 00:22:05,960 Speaker 2: Exactly right. You want that Chick-fil-A Cadillac. 395 00:22:05,880 --> 00:22:10,120 Speaker 3: Exactly. But sometimes there aren't regulations like that. And what's 396 00:22:10,200 --> 00:22:14,040 Speaker 3: been proven over and over, especially with ETMs, if there's 397 00:22:14,200 --> 00:22:17,880 Speaker 3: a way for a system to be exploited for profit, 398 00:22:18,000 --> 00:22:20,199 Speaker 3: then a company will come along and do it no 399 00:22:20,200 --> 00:22:20,520 Speaker 3: matter what.
400 00:22:21,200 --> 00:22:24,960 Speaker 2: That's basically the neoliberalism T-shirt, what you just said 401 00:22:25,000 --> 00:22:26,320 Speaker 2: there, you know, exactly. 402 00:22:27,160 --> 00:22:29,520 Speaker 3: Here's the thing, if you look at the actual 403 00:22:29,880 --> 00:22:34,280 Speaker 3: tragedy of the commons, Hardin sort of just conveniently 404 00:22:34,359 --> 00:22:36,639 Speaker 3: leaves a lot of stuff out. Like, he's assuming in 405 00:22:36,680 --> 00:22:39,960 Speaker 3: this argument that, like, the farmers that 406 00:22:40,000 --> 00:22:43,840 Speaker 3: are sharing this grazing spot aren't talking to each other 407 00:22:43,880 --> 00:22:46,720 Speaker 3: at all about what's going on and saying like, hey, 408 00:22:47,080 --> 00:22:48,920 Speaker 3: we're ruining this land. By the way, maybe we should 409 00:22:48,920 --> 00:22:51,159 Speaker 3: dial it back, like everybody get rid of two cows. 410 00:22:51,800 --> 00:22:55,960 Speaker 3: And it also assumes that their only interest is to 411 00:22:56,000 --> 00:22:58,239 Speaker 3: maximize profit and that they do not care for the land, and 412 00:22:58,240 --> 00:22:59,639 Speaker 3: that's just not always the case. 413 00:23:01,000 --> 00:23:03,959 Speaker 2: No. I mean, that's really a tough sell to a 414 00:23:04,080 --> 00:23:08,760 Speaker 2: neoliberal policy maker. Like, they're just like, you're crazy, you're 415 00:23:08,800 --> 00:23:11,080 Speaker 2: so naive for even thinking that. Of course their goal 416 00:23:11,119 --> 00:23:14,840 Speaker 2: is to maximize profits. But what's really cool is we'll 417 00:23:14,880 --> 00:23:19,280 Speaker 2: see a little later on real world examples prove that 418 00:23:19,280 --> 00:23:22,720 Speaker 2: that's not true. They disprove the tragedy of the commons 419 00:23:22,960 --> 00:23:26,080 Speaker 2: by themselves.
It wasn't like, oh, let's set up this crazy, 420 00:23:26,600 --> 00:23:29,360 Speaker 2: really kind of rickety experiment and see if we can 421 00:23:29,440 --> 00:23:32,120 Speaker 2: disprove the tragedy of the commons. No, people went out 422 00:23:32,119 --> 00:23:36,119 Speaker 2: and looked at real world, for example, indigenous treatment of 423 00:23:36,240 --> 00:23:39,840 Speaker 2: common resources, and they're like, these people have been managing 424 00:23:39,840 --> 00:23:42,760 Speaker 2: and not depleting these things for thousands of years now 425 00:23:42,800 --> 00:23:45,679 Speaker 2: because they came up with their own sensible rules that 426 00:23:46,080 --> 00:23:47,760 Speaker 2: aren't neoliberal in nature. 427 00:23:48,040 --> 00:23:51,480 Speaker 3: Yeah, for sure. If you look at the original, sort 428 00:23:51,520 --> 00:23:54,520 Speaker 3: of, like, it's not just a theoretical thing. If you 429 00:23:54,560 --> 00:23:58,119 Speaker 3: look at the paper by William Forster, I think he 430 00:23:58,160 --> 00:24:01,720 Speaker 3: said Forster earlier, not Forrester, Lloyd, this is the 431 00:24:01,760 --> 00:24:06,360 Speaker 3: one Hardin, would you say, hijacked to begin with? He did. 432 00:24:07,480 --> 00:24:09,920 Speaker 3: He was talking about actual English commons, which was how 433 00:24:09,920 --> 00:24:13,800 Speaker 3: it used to work in medieval England: the lord 434 00:24:13,840 --> 00:24:16,119 Speaker 3: would own all this land, but there were people that 435 00:24:16,200 --> 00:24:18,240 Speaker 3: lived on it. It wasn't like, hey, this is my 436 00:24:18,280 --> 00:24:21,199 Speaker 3: private land, no one can be here, at first.
So 437 00:24:21,240 --> 00:24:22,760 Speaker 3: the people that lived there, they, you know, they had 438 00:24:22,760 --> 00:24:24,760 Speaker 3: to graze their animals and fish out of the streams 439 00:24:24,800 --> 00:24:27,919 Speaker 3: and ponds and things like that, and they were allowed 440 00:24:28,000 --> 00:24:32,800 Speaker 3: to, you know, through certain age-old customs. Sometimes 441 00:24:32,800 --> 00:24:34,600 Speaker 3: the local government would step in and kind of help 442 00:24:34,680 --> 00:24:38,120 Speaker 3: manage this stuff and limit grazing and things like that. 443 00:24:38,160 --> 00:24:42,320 Speaker 3: But it wasn't like this set, codified system in place 444 00:24:42,520 --> 00:24:44,960 Speaker 3: all over England. It was just, they had been working 445 00:24:45,040 --> 00:24:48,240 Speaker 3: it out for centuries like this. It wasn't perfect, but 446 00:24:48,960 --> 00:24:51,760 Speaker 3: it was sustainable, and they didn't, like, just destroy their 447 00:24:51,840 --> 00:24:55,399 Speaker 3: land, because they all had self interest in keeping it sustainable. 448 00:24:55,440 --> 00:24:58,919 Speaker 2: Right. And then they were enclosed. And enclosure was this 449 00:24:59,160 --> 00:25:06,080 Speaker 2: huge, massive, often overlooked, world-changing event, and we 450 00:25:06,200 --> 00:25:08,600 Speaker 2: need to do an episode on the fencing of the 451 00:25:08,640 --> 00:25:12,680 Speaker 2: Commons someday, because it just changed the entire mentality. 452 00:25:12,680 --> 00:25:16,520 Speaker 2: Remember our episode on the Luddites and how they emerged 453 00:25:16,520 --> 00:25:18,840 Speaker 2: from a world where everything had just been completely turned 454 00:25:18,880 --> 00:25:22,400 Speaker 2: on its head. That's what happened with enclosure.
When 455 00:25:22,440 --> 00:25:25,920 Speaker 2: they fenced the commons, it changed everything, and the concept 456 00:25:25,960 --> 00:25:28,520 Speaker 2: of private property, like, really kind of developed out of that, 457 00:25:28,600 --> 00:25:31,280 Speaker 2: at least in the West, right. So we'll do a 458 00:25:31,280 --> 00:25:34,000 Speaker 2: whole separate episode on that, but suffice to say that 459 00:25:34,080 --> 00:25:37,840 Speaker 2: it seems to be that once the commons were fenced and 460 00:25:37,920 --> 00:25:41,800 Speaker 2: were no longer a shared resource, that's when the issues 461 00:25:41,840 --> 00:25:46,080 Speaker 2: started to come up. If you ask Karl Marx, he 462 00:25:46,119 --> 00:25:48,000 Speaker 2: would have said that this is where we came up 463 00:25:48,040 --> 00:25:51,679 Speaker 2: with the landless proletariat, the working class who had to 464 00:25:51,760 --> 00:25:55,680 Speaker 2: work for wages because they no longer owned anything. It 465 00:25:55,720 --> 00:25:59,040 Speaker 2: created the very, very wealthy class that didn't actually have 466 00:25:59,080 --> 00:26:00,959 Speaker 2: to do anything, because all they had to do 467 00:26:01,040 --> 00:26:04,399 Speaker 2: was start renting this private property of theirs to the 468 00:26:04,400 --> 00:26:09,720 Speaker 2: people who needed to work. It created a whole system 469 00:26:09,760 --> 00:26:13,840 Speaker 2: of problems. And in fact, some people are like, whatever, 470 00:26:13,960 --> 00:26:16,320 Speaker 2: however you feel about capitalism, you can kind of trace 471 00:26:16,400 --> 00:26:19,280 Speaker 2: it back to the beginning of capitalism, the fencing of 472 00:26:19,280 --> 00:26:22,600 Speaker 2: the commons.
The irony of all this is that William 473 00:26:22,640 --> 00:26:28,840 Speaker 2: Forster Lloyd was arguing against Adam Smith's capitalist idea 474 00:26:29,440 --> 00:26:32,679 Speaker 2: that the invisible hand of the market will always guide 475 00:26:32,720 --> 00:26:37,720 Speaker 2: things to a good outcome. He ultimately created the tragedy 476 00:26:37,760 --> 00:26:40,640 Speaker 2: of the commons as a thought experiment to show, like, no, actually, 477 00:26:41,040 --> 00:26:46,840 Speaker 2: people aren't guided to this common good. Instead, 478 00:26:47,240 --> 00:26:49,639 Speaker 2: they are going to act in their own self interest 479 00:26:49,680 --> 00:26:53,560 Speaker 2: and destroy this stuff. So Hardin actually took it and 480 00:26:53,800 --> 00:26:58,560 Speaker 2: turned it around as an argument for capitalism, for private 481 00:26:58,640 --> 00:27:03,520 Speaker 2: enterprise, privatizing stuff, this argument that was originally used to 482 00:27:03,560 --> 00:27:04,159 Speaker 2: disprove that. 483 00:27:04,800 --> 00:27:07,639 Speaker 3: Yeah, and if you were enclosing your own commons or 484 00:27:07,760 --> 00:27:12,040 Speaker 3: arguing in that favor at the time, you were saying, hey, 485 00:27:12,119 --> 00:27:15,440 Speaker 3: everything has been chaos up until this point. It's very 486 00:27:15,440 --> 00:27:19,320 Speaker 3: inefficient and there's got to be a more organized way 487 00:27:19,320 --> 00:27:23,200 Speaker 3: to do this. And that was, you know, it wasn't 488 00:27:23,520 --> 00:27:25,800 Speaker 3: a guise, I guess, but what they were really saying 489 00:27:25,920 --> 00:27:31,080 Speaker 3: was, we want this area, right, it's ours.
Yeah, 490 00:27:31,119 --> 00:27:33,919 Speaker 3: I mean, let's just oversell the chaos, maybe, because it 491 00:27:33,920 --> 00:27:37,680 Speaker 3: was actually working out okay for many centuries, but we're 492 00:27:37,720 --> 00:27:40,160 Speaker 3: going to just sell it as this chaotic mess 493 00:27:40,160 --> 00:27:42,399 Speaker 3: that needs to be cleaned up and organized. 494 00:27:42,800 --> 00:27:45,480 Speaker 2: Right. That's how it's done, it seems like, isn't it, 495 00:27:45,600 --> 00:27:49,000 Speaker 2: Charles? Like, people come along and create a problem 496 00:27:49,040 --> 00:27:52,600 Speaker 2: that's not actually there, for their own benefit, ultimately. 497 00:27:53,160 --> 00:27:57,560 Speaker 3: Yeah. I have a friend who, ever since college, he 498 00:27:57,640 --> 00:27:59,120 Speaker 3: was just one of those guys that, while we were 499 00:27:59,160 --> 00:28:03,760 Speaker 3: just like, whatever, he was worked up about everything, 500 00:28:03,800 --> 00:28:06,000 Speaker 3: about the world, about politics, about, you know, that 501 00:28:06,040 --> 00:28:07,640 Speaker 3: period of college where you just kind of check out 502 00:28:07,640 --> 00:28:09,160 Speaker 3: and all you care about is like where you're going 503 00:28:09,200 --> 00:28:11,200 Speaker 3: that night. All that to say, he was a very 504 00:28:11,200 --> 00:28:12,920 Speaker 3: smart guy back then, and he used to just say 505 00:28:12,960 --> 00:28:15,080 Speaker 3: stuff that used to shake all of us to our 506 00:28:15,119 --> 00:28:18,440 Speaker 3: core about what's coming, about this, that, or the other 507 00:28:18,480 --> 00:28:20,600 Speaker 3: with the government. And he would always say, but here's 508 00:28:20,640 --> 00:28:22,840 Speaker 3: how they're going to sell it to you. And, oh, yeah.
509 00:28:22,880 --> 00:28:25,560 Speaker 3: That always stuck with me, and he's totally right, like 510 00:28:25,880 --> 00:28:27,679 Speaker 3: they'll sell it to you as this, but then it 511 00:28:27,720 --> 00:28:28,359 Speaker 3: becomes this. 512 00:28:28,920 --> 00:28:32,720 Speaker 2: Right. Yeah, that's really interesting. I love stuff like that, 513 00:28:32,800 --> 00:28:37,639 Speaker 2: about how just massive sweeping changes come from just a 514 00:28:37,760 --> 00:28:41,000 Speaker 2: change in an idea, a change in, yeah, in perspective. And 515 00:28:41,080 --> 00:28:44,240 Speaker 2: that's a really great, that's what the tragedy of the 516 00:28:44,280 --> 00:28:47,000 Speaker 2: Commons was. It was an idea, it was a perspective. 517 00:28:47,520 --> 00:28:51,520 Speaker 2: Like, Garrett Hardin didn't undertake a bunch of different 518 00:28:51,560 --> 00:28:55,520 Speaker 2: experiments or field studies or anything like that. In his paper, 519 00:28:55,640 --> 00:28:57,960 Speaker 2: which was published in Science, by the way, it was 520 00:28:58,000 --> 00:29:02,000 Speaker 2: an essay of his own thoughts on something that he 521 00:29:02,080 --> 00:29:07,840 Speaker 2: presented so reasonably that people who were, again, neoliberals, who 522 00:29:07,880 --> 00:29:12,800 Speaker 2: were in favor of privatizing everything, including common resources, could 523 00:29:12,880 --> 00:29:16,080 Speaker 2: go to government policymakers and be like, here's how it works. 524 00:29:16,400 --> 00:29:19,880 Speaker 2: Doesn't that make sense? And the government said, yes, that 525 00:29:20,280 --> 00:29:23,600 Speaker 2: makes sense, let's start privatizing everything. And it was just 526 00:29:23,640 --> 00:29:27,600 Speaker 2: because this guy took this idea and made it approachable, 527 00:29:27,920 --> 00:29:31,640 Speaker 2: that's it. Yeah, it's nuts. It became fact even though 528 00:29:31,680 --> 00:29:35,360 Speaker 2: it was never fact.
I just find that fascinating, how 529 00:29:35,520 --> 00:29:37,560 Speaker 2: something like that can just change the world. 530 00:29:37,960 --> 00:29:39,840 Speaker 3: Well, you know, I think there were so many people 531 00:29:39,960 --> 00:29:42,560 Speaker 3: licking their chops. They were like, oh, here's a great 532 00:29:42,560 --> 00:29:45,480 Speaker 3: opportunity for us to jump on this bandwagon to take 533 00:29:45,520 --> 00:29:46,959 Speaker 3: more for ourselves. 534 00:29:47,240 --> 00:29:49,680 Speaker 2: Right. But what if the bandwagon had never been there? 535 00:29:49,720 --> 00:29:53,760 Speaker 2: Would anyone have figured out how to manipulate things so thoroughly, 536 00:29:53,920 --> 00:29:58,720 Speaker 2: especially so quickly too, without this one paper this one biologist wrote? 537 00:29:58,200 --> 00:29:59,920 Speaker 3: Yeah, someone would have. 538 00:30:00,800 --> 00:30:03,640 Speaker 2: Yeah, but it would have happened piecemeal, I think. I 539 00:30:03,720 --> 00:30:06,680 Speaker 2: don't know, it's questionable that it would have happened like 540 00:30:06,840 --> 00:30:07,320 Speaker 2: it had. 541 00:30:07,760 --> 00:30:09,480 Speaker 3: Yeah, maybe. 542 00:30:09,280 --> 00:30:09,959 Speaker 2: Let's take a break. 543 00:30:10,160 --> 00:30:12,080 Speaker 3: All right, we'll be right back. We're going to contemplate 544 00:30:12,120 --> 00:30:14,280 Speaker 3: that and come back with the definitive answer. All right. 545 00:30:48,640 --> 00:30:49,920 Speaker 3: We don't have a definitive answer. 546 00:30:49,920 --> 00:30:52,800 Speaker 2: I was just kidding. You got me. 547 00:30:52,960 --> 00:30:57,440 Speaker 3: We should talk about Elinor Ostrom, though.
This is a 548 00:30:57,480 --> 00:31:00,200 Speaker 3: woman who wrote a book in nineteen ninety called Governing 549 00:31:00,280 --> 00:31:04,640 Speaker 3: the Commons, colon, The Evolution of Institutions for Collective Action, 550 00:31:05,400 --> 00:31:07,800 Speaker 3: and she's probably, like, at the top of the list 551 00:31:08,080 --> 00:31:13,040 Speaker 3: for really bringing this to the upper echelons of the 552 00:31:13,040 --> 00:31:18,680 Speaker 3: political world worldwide, I guess. She took a bunch of 553 00:31:18,720 --> 00:31:25,200 Speaker 3: examples all over the world of controlled CPRs, like grazing 554 00:31:25,240 --> 00:31:28,520 Speaker 3: areas, always a good one, in Switzerland in this case, 555 00:31:29,600 --> 00:31:33,400 Speaker 3: forests in Japan, meadows in Japan, and of course fisheries 556 00:31:33,440 --> 00:31:35,520 Speaker 3: and things like that, in this case in the Philippines. 557 00:31:36,360 --> 00:31:38,680 Speaker 3: So really kind of picking different spots all 558 00:31:38,720 --> 00:31:42,040 Speaker 3: over the world and examining these and how they've worked out, 559 00:31:42,840 --> 00:31:47,880 Speaker 3: and basically argued that, like, hey, these things had 560 00:31:47,920 --> 00:31:51,560 Speaker 3: been working out for centuries, and it worked out pretty 561 00:31:51,600 --> 00:31:55,960 Speaker 3: good because everybody who was involved lived there. And when 562 00:31:55,960 --> 00:31:59,160 Speaker 3: you have local people that have long term interest in 563 00:31:59,800 --> 00:32:02,400 Speaker 3: the well being of their land and their area and 564 00:32:02,480 --> 00:32:07,680 Speaker 3: keeping that up.
Then the economic side of things, it 565 00:32:07,760 --> 00:32:09,480 Speaker 3: is not going to go away, but it's going to 566 00:32:09,520 --> 00:32:11,720 Speaker 3: take a back seat to ensuring that this land that 567 00:32:11,760 --> 00:32:15,240 Speaker 3: they live in stays as close to the way it is 568 00:32:15,280 --> 00:32:15,880 Speaker 3: as possible. 569 00:32:17,160 --> 00:32:21,200 Speaker 2: Yes. And so she was the one who went out 570 00:32:21,240 --> 00:32:24,720 Speaker 2: and actually did those field studies that Garrett Hardin didn't. Like, 571 00:32:24,760 --> 00:32:27,200 Speaker 2: she had the receipts to back up what she was saying, 572 00:32:27,880 --> 00:32:31,960 Speaker 2: which was, the tragedy of the commons isn't actually true. 573 00:32:32,200 --> 00:32:35,680 Speaker 2: It's certainly not in every case. And here's the ironic 574 00:32:35,720 --> 00:32:41,440 Speaker 2: thing, Chuck: the tragedy of the commons played out, 575 00:32:42,360 --> 00:32:47,000 Speaker 2: I feel like, for the most part, when things were 576 00:32:47,040 --> 00:32:51,360 Speaker 2: privatized and outside industry was allowed to come in and 577 00:32:51,480 --> 00:32:55,120 Speaker 2: have a share of the commons. That's when 578 00:32:55,680 --> 00:32:58,000 Speaker 2: the problems really began. 579 00:32:58,320 --> 00:33:00,000 Speaker 3: Yeah. 580 00:33:00,680 --> 00:33:02,560 Speaker 2: So I want to say one other thing too, because 581 00:33:02,560 --> 00:33:05,160 Speaker 2: I know I'm kind of known for having a very 582 00:33:05,200 --> 00:33:09,520 Speaker 2: hard-nosed, certain opinion about things, and it slants, you know, 583 00:33:10,000 --> 00:33:13,400 Speaker 2: in this particular direction typically. I don't know, in the 584 00:33:13,440 --> 00:33:19,760 Speaker 2: last few years or so, I've kind of lost my taste 585 00:33:18,320 --> 00:33:22,560 Speaker 2: for rigidity like that, or black and white thinking.
So 586 00:33:22,920 --> 00:33:24,880 Speaker 2: I don't want anybody to think that I'm just like, 587 00:33:25,120 --> 00:33:31,240 Speaker 2: neoliberalism equals bad, alternatives to neoliberalism all good. Like, I 588 00:33:31,400 --> 00:33:34,840 Speaker 2: just don't see the world like that anymore. So I 589 00:33:34,840 --> 00:33:38,040 Speaker 2: don't want anybody to think that I'm just like, I'm 590 00:33:38,080 --> 00:33:42,480 Speaker 2: bashing neoliberalism as if there's no redeemable quality to it whatsoever. 591 00:33:42,560 --> 00:33:45,040 Speaker 2: I just don't believe that. But in the case of 592 00:33:45,160 --> 00:33:48,000 Speaker 2: the tragedy of the commons, I feel like more often 593 00:33:48,040 --> 00:33:54,000 Speaker 2: than not, neoliberal policies, again, privatizing shared resources or taking 594 00:33:54,120 --> 00:33:56,120 Speaker 2: oversight away from the government and just putting it in 595 00:33:56,200 --> 00:33:58,920 Speaker 2: the hands of the market, have been disastrous for the 596 00:33:58,960 --> 00:34:02,480 Speaker 2: world over the last, like, forty, fifty years. It's 597 00:34:02,520 --> 00:34:06,880 Speaker 2: been really good for wealthy people and helping other people 598 00:34:07,280 --> 00:34:11,640 Speaker 2: get wealthier, but that ignores the expense on the backs 599 00:34:11,680 --> 00:34:14,480 Speaker 2: of other people that it's created too. I just wanted 600 00:34:14,520 --> 00:34:17,400 Speaker 2: to put that out there, yeah, that I don't want 601 00:34:17,400 --> 00:34:21,080 Speaker 2: anybody to think I'm just bashing neoliberalism, because it does help 602 00:34:21,120 --> 00:34:22,120 Speaker 2: in a lot of ways.
603 00:34:22,520 --> 00:34:24,680 Speaker 3: No. And I know you well, and I can tell 604 00:34:24,719 --> 00:34:28,200 Speaker 3: everybody that you are a very varied person who looks 605 00:34:28,200 --> 00:34:31,560 Speaker 3: at things from all angles and is very considerate. 606 00:34:33,080 --> 00:34:35,120 Speaker 3: You don't just shut anything down outright. You like to 607 00:34:35,200 --> 00:34:42,319 Speaker 3: consider things. Now, okay, so back to Ostrom, though. She 608 00:34:42,560 --> 00:34:46,319 Speaker 3: argues that, hey, we've shown time and time again, 609 00:34:46,360 --> 00:34:48,800 Speaker 3: I show in these real world examples in my book, 610 00:34:49,640 --> 00:34:53,000 Speaker 3: in Japan and in Switzerland and the Philippines with the 611 00:34:53,000 --> 00:34:57,480 Speaker 3: fisheries, that this can work. But there's some parameters that 612 00:34:57,560 --> 00:35:00,800 Speaker 3: have been shown to ensure that it works, and you 613 00:35:00,880 --> 00:35:02,959 Speaker 3: got to follow these or it's not going to work. 614 00:35:03,360 --> 00:35:06,400 Speaker 3: And she had narrowed it down to eight principles of, 615 00:35:06,680 --> 00:35:10,839 Speaker 3: like, a successful, you know, regulation of a commons, which 616 00:35:10,920 --> 00:35:15,400 Speaker 3: is, one, clearly defined boundaries. So who can use this area, 617 00:35:15,440 --> 00:35:18,680 Speaker 3: what is this area? Two, rules that fit local needs. And 618 00:35:18,719 --> 00:35:20,839 Speaker 3: this is something we've been kind of hammering on. It's 619 00:35:20,880 --> 00:35:23,279 Speaker 3: got to be the local parties that come up with 620 00:35:23,320 --> 00:35:28,960 Speaker 3: these rules. The addition of outside people stepping in 621 00:35:29,000 --> 00:35:31,520 Speaker 3: seems to be when all of the problems start, because 622 00:35:31,560 --> 00:35:33,719 Speaker 3: their interests are not the same as local interests.
623 00:35:34,239 --> 00:35:36,920 Speaker 2: Yeah, history kind of tells us that when outside groups 624 00:35:36,920 --> 00:35:39,239 Speaker 2: come in and start pushing the people who've been using 625 00:35:39,320 --> 00:35:42,600 Speaker 2: this thing for a thousand years, pushing them around, yeah, 626 00:35:43,960 --> 00:35:47,080 Speaker 2: it just doesn't go very well, typically, because the outside groups 627 00:35:47,120 --> 00:35:50,600 Speaker 2: don't necessarily understand this. And as we'll see, that doesn't 628 00:35:50,600 --> 00:35:53,320 Speaker 2: mean that there shouldn't be the involvement of any outside groups. 629 00:35:53,560 --> 00:35:56,520 Speaker 2: What Ostrom was saying that she found from these studies 630 00:35:56,600 --> 00:36:00,080 Speaker 2: is that you have to begin at the base: the 631 00:36:00,840 --> 00:36:03,160 Speaker 2: people who are really laying the groundwork and setting the 632 00:36:03,280 --> 00:36:05,480 Speaker 2: rules for this, they have to be the people who 633 00:36:05,520 --> 00:36:07,160 Speaker 2: are actually using this resource. 634 00:36:07,280 --> 00:36:10,360 Speaker 3: Yeah, for sure. The third one is group decision making, 635 00:36:10,480 --> 00:36:13,680 Speaker 3: so it's not just a couple of people or a 636 00:36:13,719 --> 00:36:16,719 Speaker 3: small board deciding these things, like get as many people 637 00:36:16,760 --> 00:36:20,920 Speaker 3: in there as possible, local people. Monitoring, you have to 638 00:36:21,000 --> 00:36:23,600 Speaker 3: have people monitoring, and, you know, she used her forest 639 00:36:23,680 --> 00:36:26,680 Speaker 3: land example in Japan, where she was like, hey, 640 00:36:26,680 --> 00:36:30,520 Speaker 3: they had locals monitoring the stuff and levying fines.
So 641 00:36:30,560 --> 00:36:32,480 Speaker 3: you've got to know that someone is out there doing this, 642 00:36:32,880 --> 00:36:35,320 Speaker 3: so people, you know, know they're going to be held accountable. 643 00:36:35,800 --> 00:36:38,239 Speaker 3: And then the next one ties into that: 644 00:36:38,280 --> 00:36:41,400 Speaker 3: when you do that, you've got to start out small 645 00:36:41,440 --> 00:36:43,440 Speaker 3: with these sanctions. It's got to be like a warning 646 00:36:43,480 --> 00:36:46,279 Speaker 3: first and then a small fine. You can't just come 647 00:36:46,320 --> 00:36:49,239 Speaker 3: in there with the billy club and say you're out. 648 00:36:50,040 --> 00:36:54,200 Speaker 3: You've got to encourage people to stick around and, like, 649 00:36:54,239 --> 00:36:56,759 Speaker 3: all right, here's a warning, but you can't keep letting 650 00:36:56,800 --> 00:36:58,799 Speaker 3: this happen. That kind of thing. 651 00:36:58,840 --> 00:37:02,920 Speaker 2: I was really curious what the ultimate punishment is for repeat offenders, 652 00:37:03,080 --> 00:37:05,640 Speaker 2: and the only example I could find was among a 653 00:37:05,680 --> 00:37:13,560 Speaker 2: group in Bali. And ostracism is what the person ultimately faces, which, yeah, 654 00:37:13,560 --> 00:37:17,640 Speaker 2: that's bad enough in, like, a Western society, 655 00:37:17,800 --> 00:37:20,880 Speaker 2: you know, but also you can make do, you know, 656 00:37:21,239 --> 00:37:23,880 Speaker 2: there's the internet. If you go into a store, the 657 00:37:23,880 --> 00:37:26,520 Speaker 2: person basically has to take your money for food. In 658 00:37:26,760 --> 00:37:30,840 Speaker 2: a more traditional culture, ostracism is like, you're in big trouble, 659 00:37:31,000 --> 00:37:34,239 Speaker 2: because groups rely on one another for help.
And the 660 00:37:34,280 --> 00:37:36,920 Speaker 2: example they gave was, if one of your relatives dies, 661 00:37:37,239 --> 00:37:40,640 Speaker 2: there's certain rites you have to perform, and if you're ostracized, 662 00:37:40,719 --> 00:37:43,040 Speaker 2: you have to figure that out all by yourself, because 663 00:37:43,080 --> 00:37:44,799 Speaker 2: the community is not going to show up and help 664 00:37:44,840 --> 00:37:49,360 Speaker 2: you with these death rites that your ancestors demand of you, 665 00:37:49,560 --> 00:37:51,360 Speaker 2: because that's just tradition. 666 00:37:51,840 --> 00:37:55,560 Speaker 3: Yeah, yeah, for sure. The last three of the eight: 667 00:37:56,239 --> 00:38:00,440 Speaker 3: easy dispute resolution, so just something on the 668 00:38:00,680 --> 00:38:03,200 Speaker 3: cheap that's pretty informal and not some big, drawn out, 669 00:38:03,239 --> 00:38:05,759 Speaker 3: expensive, you know, tribunal or something like that. 670 00:38:06,160 --> 00:38:08,359 Speaker 2: Yeah, no wigs, no powdered wigs. 671 00:38:08,040 --> 00:38:10,720 Speaker 3: No powdered wigs at all. You have to have support 672 00:38:10,760 --> 00:38:15,120 Speaker 3: from authorities. And it may be a local thing, but 673 00:38:15,160 --> 00:38:18,640 Speaker 3: you've got to have the blessing of a larger government 674 00:38:18,680 --> 00:38:21,080 Speaker 3: body to, at the very least, not get in there 675 00:38:21,120 --> 00:38:23,000 Speaker 3: and muck it up themselves and just sort of let you 676 00:38:23,040 --> 00:38:27,120 Speaker 3: do your thing, but maybe also support it. And then, finally, 677 00:38:27,160 --> 00:38:31,200 Speaker 3: building up to a larger system.
So again, keep it local, 678 00:38:31,640 --> 00:38:35,360 Speaker 3: but if it's a system that connects to a larger system, 679 00:38:35,440 --> 00:38:38,319 Speaker 3: like a creek that connects to a larger watershed, you're 680 00:38:38,320 --> 00:38:40,480 Speaker 3: going to have to start linking up with other local 681 00:38:40,520 --> 00:38:42,319 Speaker 3: groups and make it a larger thing. 682 00:38:43,120 --> 00:38:46,640 Speaker 2: Right. But the point that she made too is that 683 00:38:46,640 --> 00:38:50,920 Speaker 2: that works, that's actually scalable, which is good because, like 684 00:38:50,960 --> 00:38:52,920 Speaker 2: you said, you know, a creek is one thing, but 685 00:38:53,120 --> 00:38:56,239 Speaker 2: it depends on a watershed. And so if you can 686 00:38:56,440 --> 00:38:59,520 Speaker 2: get together with other people who are managing their own creeks, 687 00:38:59,560 --> 00:39:03,600 Speaker 2: you can manage the watershed based on the individual common, 688 00:39:05,080 --> 00:39:07,319 Speaker 2: I guess, rules that the groups have come up with, 689 00:39:07,520 --> 00:39:10,600 Speaker 2: and that you can scale that into more modern systems too. Right, So, 690 00:39:11,120 --> 00:39:15,880 Speaker 2: like say a city has somehow created a cap on 691 00:39:15,960 --> 00:39:19,880 Speaker 2: how much emissions that the factories in the cities can create. 692 00:39:21,360 --> 00:39:24,000 Speaker 2: You can combine that with other cities, and now you're 693 00:39:24,000 --> 00:39:26,239 Speaker 2: starting to manage a larger part of the atmosphere.
And 694 00:39:26,320 --> 00:39:28,719 Speaker 2: then that the state can come in and work with 695 00:39:28,800 --> 00:39:30,640 Speaker 2: all of those cities, and then the state can work 696 00:39:30,640 --> 00:39:32,080 Speaker 2: with other states, and all of a sudden, now you 697 00:39:32,160 --> 00:39:36,120 Speaker 2: have regional air quality being controlled from the rules that 698 00:39:36,239 --> 00:39:40,759 Speaker 2: are based on the actual stakeholders in each individual community. 699 00:39:41,120 --> 00:39:44,359 Speaker 3: That's right, And did she win something for this work? 700 00:39:44,960 --> 00:39:49,320 Speaker 2: She won the Nobel, which Livia put in parentheses, fake 701 00:39:49,520 --> 00:39:52,960 Speaker 2: Nobel Prize in Economic Sciences, because that's not one 702 00:39:53,000 --> 00:39:57,520 Speaker 2: that Nobel actually organized originally. 703 00:39:57,520 --> 00:40:00,520 Speaker 3: That's what she meant by that. But one of the things 704 00:40:00,520 --> 00:40:02,720 Speaker 3: that came out of the book. There's a University 705 00:40:02,760 --> 00:40:06,360 Speaker 3: of Chicago legal scholar named Lee Anne Fennell who 706 00:40:06,719 --> 00:40:10,080 Speaker 3: points to that book and says, you know, this 707 00:40:10,120 --> 00:40:14,680 Speaker 3: sort of corrects these legal theories that say property 708 00:40:14,760 --> 00:40:16,400 Speaker 3: is all or nothing, like you either own it 709 00:40:16,600 --> 00:40:19,560 Speaker 3: and have total control over it, or you don't. And 710 00:40:19,640 --> 00:40:23,560 Speaker 3: it's actually possible. You can have a common, you can 711 00:40:23,719 --> 00:40:28,120 Speaker 3: have an area that people share 712 00:40:29,080 --> 00:40:31,120 Speaker 3: and still have some autonomy.
And the example that Livia 713 00:40:31,160 --> 00:40:33,640 Speaker 3: gave was like a house and if you got a 714 00:40:33,719 --> 00:40:36,200 Speaker 3: family in the house, you know, let's say you know 715 00:40:36,280 --> 00:40:40,200 Speaker 3: two people that have coupled up and have children. You know, 716 00:40:40,280 --> 00:40:44,040 Speaker 3: it's a collective. But like you know, generally speaking, unless 717 00:40:44,080 --> 00:40:46,400 Speaker 3: you live in a in a terrible house with an 718 00:40:46,400 --> 00:40:50,040 Speaker 3: authoritarian dictator as the head of the household. 719 00:40:49,800 --> 00:40:51,880 Speaker 2: Or a stepdad that makes you feel stupid. 720 00:40:51,640 --> 00:40:53,920 Speaker 3: Yeah, a stepdad that makes you feel dumb. You've got, you know, 721 00:40:54,040 --> 00:40:55,440 Speaker 3: your kid has their room, and you know, you 722 00:40:55,440 --> 00:40:58,520 Speaker 3: can say, this is your room. You know, you have a 723 00:40:58,600 --> 00:41:01,520 Speaker 3: right to your room and you can basically decorate it 724 00:41:01,600 --> 00:41:03,600 Speaker 3: how you want, within within reason, and do what you 725 00:41:03,640 --> 00:41:06,120 Speaker 3: want in here. But it's still part of the. 726 00:41:06,320 --> 00:41:11,600 Speaker 2: House, right and technically your parents own that room. Yeah, 727 00:41:11,600 --> 00:41:16,520 Speaker 2: but because of custom, custom and tradition, they probably respect 728 00:41:16,600 --> 00:41:18,200 Speaker 2: that room as your private space. 729 00:41:18,920 --> 00:41:21,600 Speaker 3: Yeah. And that's something as a parent I found is 730 00:41:21,719 --> 00:41:25,360 Speaker 3: super important because at a certain age, and this has 731 00:41:25,400 --> 00:41:28,040 Speaker 3: been the last couple of years with Ruby, like you know, 732 00:41:28,120 --> 00:41:30,160 Speaker 3: I'm gonna go into my room and shut the.
733 00:41:30,160 --> 00:41:33,440 Speaker 2: Door, yeah, and you're like, no, you're not, well. 734 00:41:33,239 --> 00:41:36,040 Speaker 3: I'll open the door because I just like and it's 735 00:41:36,040 --> 00:41:37,680 Speaker 3: not in a spy way. I just like peeking in 736 00:41:37,760 --> 00:41:40,239 Speaker 3: and seeing the fun stuff she's doing. And Emily's like, 737 00:41:40,320 --> 00:41:41,879 Speaker 3: you know, leave the door shut. She's got 738 00:41:41,880 --> 00:41:42,680 Speaker 3: to have that autonomy. 739 00:41:42,719 --> 00:41:45,960 Speaker 2: And I'm like, good for Emily, that's really sweet. When 740 00:41:45,960 --> 00:41:47,600 Speaker 2: you open the door, do you suddenly open it and go 741 00:41:47,640 --> 00:41:48,120 Speaker 2: what are you doing? 742 00:41:49,120 --> 00:41:51,040 Speaker 3: I'll give it a, I'll give a little knock now, 743 00:41:51,239 --> 00:41:54,400 Speaker 3: you know. And you know she's usually in there drawing 744 00:41:54,440 --> 00:41:55,799 Speaker 3: pictures of cats and listening. 745 00:41:55,600 --> 00:41:57,400 Speaker 2: To music, reading Teen Beat. 746 00:41:57,960 --> 00:41:58,680 Speaker 3: Yeah. 747 00:41:59,080 --> 00:42:02,080 Speaker 2: Man, do you remember? So was your room your own 748 00:42:02,120 --> 00:42:03,799 Speaker 2: private sanctuary? Yeah? 749 00:42:03,800 --> 00:42:05,680 Speaker 3: I mean Scott and I shared a room until I 750 00:42:05,840 --> 00:42:12,000 Speaker 3: was I feel like I was probably ten or eleven. 751 00:42:12,360 --> 00:42:14,640 Speaker 2: That's the that's the room age right there. 752 00:42:14,800 --> 00:42:17,640 Speaker 3: Yeah, and that's when we split. But we were upstairs 753 00:42:17,640 --> 00:42:21,080 Speaker 3: at our house. It was just two bedrooms and separated 754 00:42:21,120 --> 00:42:25,960 Speaker 3: by an adjoining bathroom. And my sweet brother, he, I 755 00:42:26,040 --> 00:42:29,440 Speaker 3: was so sad about moving out of his room.
At first, 756 00:42:29,560 --> 00:42:32,800 Speaker 3: he agreed to keep his bathroom door open so I 757 00:42:32,800 --> 00:42:34,880 Speaker 3: could lay in bed and see through the bathroom to 758 00:42:34,960 --> 00:42:35,879 Speaker 3: him in his room. 759 00:42:35,960 --> 00:42:39,799 Speaker 2: God, he is such a he'd be the rancher that 760 00:42:39,880 --> 00:42:41,880 Speaker 2: would be like, I'm not overgrazing at all. 761 00:42:42,120 --> 00:42:45,400 Speaker 3: I know. He's one of the best. But yeah, so 762 00:42:45,480 --> 00:42:48,040 Speaker 3: I had my own room and it was like my 763 00:42:48,160 --> 00:42:50,839 Speaker 3: sports posters and I went through a Marilyn Monroe 764 00:42:50,840 --> 00:42:52,680 Speaker 3: phase for some reason, where I had a bunch of 765 00:42:52,719 --> 00:42:53,600 Speaker 3: her posters up. 766 00:42:53,920 --> 00:42:56,240 Speaker 2: I went through a James Dean phase that was similar. 767 00:42:56,440 --> 00:42:58,399 Speaker 3: Yeah, exactly, But yeah, it was my room. I took 768 00:42:58,440 --> 00:43:00,359 Speaker 3: a lot of pride in decorating it and doing my thing. 769 00:43:00,400 --> 00:43:04,480 Speaker 3: And my parents my father literally did not come upstairs. 770 00:43:05,640 --> 00:43:08,799 Speaker 3: That's a whole other story, and my mom did for 771 00:43:08,840 --> 00:43:11,440 Speaker 3: a while, but basically, I mean that was our zone 772 00:43:11,600 --> 00:43:12,319 Speaker 3: for the most part. 773 00:43:12,640 --> 00:43:16,880 Speaker 2: That's awesome, you know. I remember there not being anything 774 00:43:16,920 --> 00:43:21,640 Speaker 2: more satisfying than undertaking a total remodeling of your room, 775 00:43:22,000 --> 00:43:24,120 Speaker 2: moving stuff around, furniture all of a sudden, the beds 776 00:43:24,120 --> 00:43:27,840 Speaker 2: over here, and like doing that, it was it was 777 00:43:27,920 --> 00:43:30,839 Speaker 2: so it just changed everything. I loved that so much.
778 00:43:30,880 --> 00:43:32,600 Speaker 2: I have so many great memories of my room. 779 00:43:32,880 --> 00:43:34,680 Speaker 3: Yeah, me too, And I used to do that in the two 780 00:43:34,680 --> 00:43:39,480 Speaker 3: apartments I lived in, uh, and I ironically recently 781 00:43:39,520 --> 00:43:41,200 Speaker 3: had a thought about how much I used to enjoy 782 00:43:41,239 --> 00:43:44,200 Speaker 3: that and how like our house there, you can't really 783 00:43:44,320 --> 00:43:46,120 Speaker 3: do that, Like it's set up in a in a 784 00:43:46,160 --> 00:43:48,680 Speaker 3: way that I can't be like, hey, let's rearrange our 785 00:43:48,760 --> 00:43:51,120 Speaker 3: living room. It's like, well, you can't because. 786 00:43:51,680 --> 00:43:53,160 Speaker 2: Let's move the built-ins outside. 787 00:43:53,239 --> 00:43:56,480 Speaker 3: Yeah, exactly, So those days are over for me unfortunately. Yeah. 788 00:43:56,600 --> 00:43:59,240 Speaker 2: Same here. That's just how it goes as you get older. 789 00:43:59,239 --> 00:44:01,560 Speaker 2: It's one more thing that brings you joy or brought 790 00:44:01,600 --> 00:44:04,839 Speaker 2: you joy that's now dead, old leaf. 791 00:44:06,239 --> 00:44:08,880 Speaker 3: All right, Well, let's finish up on tragedy of the commons, 792 00:44:09,000 --> 00:44:14,800 Speaker 3: because we can use an example of Maine lobster fisheries 793 00:44:14,840 --> 00:44:19,600 Speaker 3: as a pretty good example of how people can start 794 00:44:19,640 --> 00:44:23,880 Speaker 3: to do the wrong thing but correct course on their own.
Yeah, 795 00:44:23,920 --> 00:44:26,080 Speaker 3: And that was the case in the nineteenth century when 796 00:44:26,160 --> 00:44:29,360 Speaker 3: the state of Maine started setting legal minimum sizes for 797 00:44:29,440 --> 00:44:33,040 Speaker 3: catching lobsters, and they couldn't enforce it that well at first, 798 00:44:33,040 --> 00:44:35,680 Speaker 3: and people broke the rules and were like, you know, 799 00:44:35,760 --> 00:44:38,160 Speaker 3: trying to make extra money or keep more lobsters than 800 00:44:38,160 --> 00:44:41,160 Speaker 3: they should for a while. But eventually they were like, hey, 801 00:44:41,200 --> 00:44:44,799 Speaker 3: wait a minute, we're not doing ourselves any favors here, 802 00:44:44,840 --> 00:44:47,640 Speaker 3: and if we're going to all continue this to sustain 803 00:44:47,760 --> 00:44:50,600 Speaker 3: our living doing this, we got to work together. So 804 00:44:50,640 --> 00:44:54,960 Speaker 3: they formed harbor gangs to start self-policing. 805 00:44:55,040 --> 00:44:59,960 Speaker 2: Basically, so Maine lobsterman harbor gangs. Is there anything scarier 806 00:45:00,360 --> 00:45:02,319 Speaker 2: than the sound of that? Just seeing them kind of 807 00:45:02,360 --> 00:45:04,320 Speaker 2: slowly come up on your boat and they're all like 808 00:45:04,600 --> 00:45:06,080 Speaker 2: they have chains in their hands. 809 00:45:06,360 --> 00:45:09,160 Speaker 3: Don't go over there. That's where the Hobba gangs are 810 00:45:09,800 --> 00:45:10,360 Speaker 3: very nice. 811 00:45:10,600 --> 00:45:12,239 Speaker 2: Boy, you just took me to Cabot Cove. 812 00:45:13,280 --> 00:45:16,680 Speaker 3: So they started self-policing, and everyone bought into this 813 00:45:16,840 --> 00:45:20,640 Speaker 3: idea that let's not ruin all of our livelihood here.
814 00:45:21,239 --> 00:45:24,279 Speaker 3: And one thing that gets pointed out is you can 815 00:45:24,400 --> 00:45:26,360 Speaker 3: do this in a case like that because lobsters are 816 00:45:26,360 --> 00:45:30,960 Speaker 3: close to shore. It's a small, small community. It's local 817 00:45:31,000 --> 00:45:33,879 Speaker 3: people. In the case of cod, and at some point 818 00:45:33,920 --> 00:45:36,640 Speaker 3: we should do one on the Cod Wars because that's 819 00:45:36,680 --> 00:45:40,400 Speaker 3: where this factors in, cod travel long distances and so 820 00:45:40,520 --> 00:45:43,480 Speaker 3: it's hard to regulate something like that that's widespread and 821 00:45:43,520 --> 00:45:44,800 Speaker 3: there's different countries involved. 822 00:45:46,200 --> 00:45:48,719 Speaker 2: So yeah, I guess tragedy of the Commons has been 823 00:45:48,800 --> 00:45:53,480 Speaker 2: largely debunked, even though it completely altered the world for 824 00:45:53,719 --> 00:45:55,359 Speaker 2: decades and decades and still does. 825 00:45:56,440 --> 00:45:58,799 Speaker 3: Yeah, I mean either blamed or credited with the birth 826 00:45:58,800 --> 00:45:59,600 Speaker 3: of capitalism. 827 00:45:59,600 --> 00:46:03,520 Speaker 2: That's no small thing, it's pretty big. Yeah, yeah, Yeah, 828 00:46:03,520 --> 00:46:05,719 Speaker 2: So we're gonna do one on fencing of the commons 829 00:46:05,719 --> 00:46:07,200 Speaker 2: one day, and then we're also going to do one 830 00:46:07,200 --> 00:46:09,120 Speaker 2: on neoliberalism. 831 00:46:08,280 --> 00:46:10,320 Speaker 3: Someday and the Cod Wars. 832 00:46:10,640 --> 00:46:14,080 Speaker 2: Yeah, and the Cod Wars. There you go. Well, since 833 00:46:14,160 --> 00:46:17,480 Speaker 2: Chuck corrected me and added Cod Wars, of course, that 834 00:46:17,520 --> 00:46:19,319 Speaker 2: means it's time, everybody, for listener mail.
First of all, quick correction to a correction. When 836 00:46:25,200 --> 00:46:29,240 Speaker 3: we got written in about the Mod Podge, I kept saying 837 00:46:29,280 --> 00:46:34,600 Speaker 3: modgepodge, right, it is Mod Podge, and the corrector got it right. 838 00:46:34,680 --> 00:46:38,320 Speaker 3: I just kept saying it wrong, wow, And I played 839 00:46:38,320 --> 00:46:40,880 Speaker 3: it off to myself as a joke, but it was not. 840 00:46:41,719 --> 00:46:44,800 Speaker 2: Modgepodge is better. I think the inventor of Mod Podge 841 00:46:44,840 --> 00:46:47,680 Speaker 2: is just like, oh, why didn't I call it modgepodge? 842 00:46:47,760 --> 00:46:48,279 Speaker 3: Mister vote? 843 00:46:48,400 --> 00:46:51,040 Speaker 2: That rhymes even. I wonder if it was from the 844 00:46:51,040 --> 00:46:53,360 Speaker 2: time when Mod Squad was out, because it does have 845 00:46:53,480 --> 00:46:56,600 Speaker 2: kind of like a hippie dippy flower look to it, 846 00:46:56,600 --> 00:46:57,080 Speaker 2: doesn't it. 847 00:46:57,480 --> 00:47:00,400 Speaker 3: Well. Decoupage is a very old thing, and I suppose 848 00:47:00,480 --> 00:47:04,719 Speaker 3: that meant a modern decoupage, is my guess. 849 00:47:05,120 --> 00:47:09,520 Speaker 2: Okay, don't you think? I don't know. I didn't ever 850 00:47:09,560 --> 00:47:11,600 Speaker 2: think about it more than I have in the last 851 00:47:11,680 --> 00:47:15,920 Speaker 2: couple of weeks. I know, right, we should have never brought 852 00:47:15,960 --> 00:47:16,319 Speaker 2: it up. 853 00:47:16,440 --> 00:47:18,640 Speaker 3: Oh man, how do you go? 854 00:47:18,840 --> 00:47:18,960 Speaker 2: Oh? 855 00:47:19,120 --> 00:47:21,359 Speaker 3: I remember now we mentioned Martha Stewart and I said, 856 00:47:21,360 --> 00:47:25,440 Speaker 3: there's the modge podge everywhere. Mm. All right.
So this is 857 00:47:25,480 --> 00:47:28,600 Speaker 3: from a friend in the Netherlands who said, good luck 858 00:47:28,600 --> 00:47:31,759 Speaker 3: pronouncing the name. But I think I've nailed it, so 859 00:47:31,800 --> 00:47:32,359 Speaker 3: we're going to see. 860 00:47:32,560 --> 00:47:33,920 Speaker 2: Okay, stay tuned for the end. 861 00:47:34,320 --> 00:47:36,880 Speaker 3: Hey, guys, just finished the episode on automats, and 862 00:47:36,880 --> 00:47:38,440 Speaker 3: I felt I needed to write in. I'm from the 863 00:47:38,480 --> 00:47:41,360 Speaker 3: Netherlands and believe it or not, automats are still a 864 00:47:41,360 --> 00:47:44,839 Speaker 3: big thing there. And I remember now, after I got 865 00:47:44,840 --> 00:47:47,280 Speaker 3: a couple of emails about these, being in Amsterdam 866 00:47:47,760 --> 00:47:49,680 Speaker 3: and seeing these FEBOs. 867 00:47:49,360 --> 00:47:53,239 Speaker 2: F-E-B-O places. I've not heard of that. 868 00:47:53,560 --> 00:47:57,240 Speaker 3: Well, it's an automat basically, except instead of a huge restaurant, 869 00:47:57,239 --> 00:48:00,879 Speaker 3: it's just like a you know, a smallish room. We 870 00:48:00,920 --> 00:48:02,799 Speaker 3: refer to it as eating out of the wall. These 871 00:48:02,800 --> 00:48:05,319 Speaker 3: places mostly just serve deep-fried food, but in 872 00:48:05,360 --> 00:48:08,759 Speaker 3: general it's quite fresh. Not sure why they're still so 873 00:48:08,840 --> 00:48:11,160 Speaker 3: popular though. Maybe it's because the Dutch aren't really known 874 00:48:11,520 --> 00:48:14,280 Speaker 3: for their fine dining, as we just want our meals 875 00:48:14,280 --> 00:48:16,840 Speaker 3: to be efficient and cheap.
Nevertheless, if you're ever around, 876 00:48:16,880 --> 00:48:19,320 Speaker 3: I invite you to take one of our famous croquettes 877 00:48:19,480 --> 00:48:22,480 Speaker 3: or bitter balls out of the wall, which are especially 878 00:48:22,480 --> 00:48:25,800 Speaker 3: good after a night of heavy drinking. I love the 879 00:48:25,840 --> 00:48:29,279 Speaker 3: show. Best regards, and that is from Hez. 880 00:48:30,880 --> 00:48:34,359 Speaker 2: Nice. Will you spell that for us non-Dutch speakers? 881 00:48:35,120 --> 00:48:36,799 Speaker 3: At least that's how I was told to say it. 882 00:48:36,800 --> 00:48:43,759 Speaker 3: It is pronounced, or, I'm sorry, spelled G-I-J-S, said Hes. 883 00:48:43,840 --> 00:48:46,960 Speaker 2: That is fantastic. That looks like I just like brushed 884 00:48:47,000 --> 00:48:48,040 Speaker 2: up against the keyboard. 885 00:48:49,200 --> 00:48:49,880 Speaker 3: It really does. 886 00:48:50,160 --> 00:48:51,520 Speaker 2: Can you pronounce it one more time? 887 00:48:52,440 --> 00:48:54,560 Speaker 3: What I've gotten and what I'm sticking with is Hez. 888 00:48:55,239 --> 00:48:58,120 Speaker 2: That is great. I hope you nailed it, Chuck, you 889 00:48:58,200 --> 00:49:02,160 Speaker 2: deserve two of them. I will say thanks a lot, Hes. 890 00:49:03,640 --> 00:49:07,240 Speaker 3: The name you're trying to pronounce is Hez. 891 00:49:08,880 --> 00:49:11,400 Speaker 2: We appreciate the email. Thanks for filling us in. I 892 00:49:11,480 --> 00:49:14,520 Speaker 2: had no idea about bitter balls and eating from the 893 00:49:14,560 --> 00:49:17,880 Speaker 2: wall, or out of the wall, in Amsterdam, so thank you. 894 00:49:18,200 --> 00:49:22,720 Speaker 2: And if you want to be like the name is Hes, 895 00:49:23,800 --> 00:49:26,799 Speaker 2: then you can send us an email. Send it 896 00:49:26,840 --> 00:49:32,440 Speaker 2: off to Stuff Podcast at iHeartRadio dot com.
897 00:49:32,640 --> 00:49:35,520 Speaker 3: Stuff You Should Know is a production of iHeartRadio. For 898 00:49:35,600 --> 00:49:39,759 Speaker 3: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 899 00:49:39,880 --> 00:49:41,720 Speaker 3: or wherever you listen to your favorite shows.