1 00:00:00,280 --> 00:00:02,840 Speaker 1: Brought to you by the reinvented two thousand twelve Camry. 2 00:00:03,160 --> 00:00:07,560 Speaker 1: It's ready. Are you? Welcome to Stuff You Should Know 3 00:00:08,160 --> 00:00:16,479 Speaker 1: from HowStuffWorks dot com. Hey, and welcome to 4 00:00:16,520 --> 00:00:19,240 Speaker 1: the podcast. I'm Josh Clark. With me as always is 5 00:00:19,280 --> 00:00:24,040 Speaker 1: Charles W. "Chuck" Bryant, sitting across from me. And 6 00:00:24,320 --> 00:00:27,920 Speaker 1: that makes this Stuff You Should Know, the podcast. There 7 00:00:28,000 --> 00:00:31,720 Speaker 1: you go. Is 8 00:00:31,720 --> 00:00:34,440 Speaker 1: there somebody fast forwarding through this part right now? Huh? Yes? 9 00:00:35,120 --> 00:00:40,360 Speaker 1: So, Chuck. Right? It is Chuck. Yes. Still, have you 10 00:00:40,400 --> 00:00:44,680 Speaker 1: noticed how often I say "right"? Yeah, it's mind numbing. 11 00:00:44,840 --> 00:00:47,360 Speaker 1: Plus someone will write in and say, do you know 12 00:00:47,479 --> 00:00:50,559 Speaker 1: you say "right" all the time? And it sounds like 13 00:00:50,560 --> 00:00:53,199 Speaker 1: I'm eating hard candy all the time. I know that's 14 00:00:53,240 --> 00:00:55,160 Speaker 1: not the case. You've never eaten anything in here. I 15 00:00:55,160 --> 00:01:02,760 Speaker 1: can attest to that. Yeah, I'm overly celebratory. Okay, Chuck. 16 00:01:02,840 --> 00:01:05,800 Speaker 1: As you know, I was a student of anthropology, still 17 00:01:05,840 --> 00:01:10,800 Speaker 1: consider myself such. And I first came upon this 18 00:01:10,920 --> 00:01:14,600 Speaker 1: term called carrying capacity when I took this 19 00:01:15,480 --> 00:01:19,720 Speaker 1: life-changing anthropology class, right? And I don't remember 20 00:01:19,800 --> 00:01:22,160 Speaker 1: the teacher's name anymore, but he was awesome.
He introduced 21 00:01:22,160 --> 00:01:25,839 Speaker 1: me to probably my favorite article or essay of all time, 22 00:01:25,880 --> 00:01:28,199 Speaker 1: "The Worst Mistake in the History of the Human Race," 23 00:01:28,760 --> 00:01:32,920 Speaker 1: right, by Jared Diamond. Awesome stuff. Not by Dustin Diamond? 24 00:01:33,080 --> 00:01:36,920 Speaker 1: Not by Mike Diamond. By Jared Diamond, the guy who wrote 25 00:01:36,920 --> 00:01:40,360 Speaker 1: Collapse and Guns, Germs, and Steel and stuff. That's 26 00:01:40,400 --> 00:01:43,559 Speaker 1: required reading, in my opinion. I just think you should 27 00:01:43,640 --> 00:01:48,000 Speaker 1: read that essay, not necessarily his books. But in this class I 28 00:01:48,040 --> 00:01:51,080 Speaker 1: was also introduced to carrying capacity, and there's this 29 00:01:51,200 --> 00:01:53,680 Speaker 1: really cool video he showed to get the point across. 30 00:01:53,880 --> 00:01:56,160 Speaker 1: And it's just a map of the world, right, and 31 00:01:56,200 --> 00:02:00,000 Speaker 1: there's red dots. It shows population growth, 32 00:02:00,600 --> 00:02:02,920 Speaker 1: and each red dot equals, I think, a million people. 33 00:02:03,480 --> 00:02:07,320 Speaker 1: And so it starts out in Africa, in Ethiopia, I 34 00:02:07,360 --> 00:02:10,520 Speaker 1: believe, the cradle of humanity.
And it starts there and 35 00:02:10,560 --> 00:02:13,399 Speaker 1: all, you know, very slowly. It's time 36 00:02:13,400 --> 00:02:16,680 Speaker 1: lapse, obviously, so the years go by like that, and 37 00:02:16,880 --> 00:02:21,080 Speaker 1: like, the red dots start appearing very slowly, 38 00:02:21,120 --> 00:02:24,320 Speaker 1: start moving out of Africa, spreading to Asia, to Europe, 39 00:02:24,480 --> 00:02:27,760 Speaker 1: all that, and then it starts to pop 40 00:02:27,840 --> 00:02:31,320 Speaker 1: up around North America and South America, and then all 41 00:02:31,360 --> 00:02:33,760 Speaker 1: of a sudden you get to, I think, like, 42 00:02:33,800 --> 00:02:38,360 Speaker 1: the sixteenth century, maybe a little later, the Industrial Revolution, 43 00:02:38,560 --> 00:02:40,480 Speaker 1: and all of a sudden this map just goes red, 44 00:02:41,080 --> 00:02:43,920 Speaker 1: and it's really jarring. It really gets the point across, 45 00:02:43,919 --> 00:02:49,240 Speaker 1: like, how quickly population has grown in the world 46 00:02:49,880 --> 00:02:52,560 Speaker 1: and the impacts of it. You know, that's why he 47 00:02:52,639 --> 00:02:55,320 Speaker 1: coupled this with carrying capacity, because it's like, well, yeah, 48 00:02:55,440 --> 00:02:58,320 Speaker 1: population growth. Who cares? Then you say, oh, well, 49 00:02:58,560 --> 00:03:03,520 Speaker 1: there's a limit to the amount of resources we have. 50 00:03:03,560 --> 00:03:06,760 Speaker 1: And that limit is called the carrying capacity of Earth, 51 00:03:07,240 --> 00:03:11,280 Speaker 1: meaning how much Earth can sustain human life. And there's 52 00:03:11,320 --> 00:03:14,520 Speaker 1: supposedly a point to it, right? Yeah, I got some stats. 53 00:03:14,639 --> 00:03:18,760 Speaker 1: There's my intro here. Here's a couple of stats, Josh. 54 00:03:18,840 --> 00:03:23,760 Speaker 1: The United Nations Population Division estimates that,
because five babies are 55 00:03:23,800 --> 00:03:30,800 Speaker 1: born every second, and you're like, crying, all that poop, 56 00:03:31,919 --> 00:03:34,160 Speaker 1: the world is going to have seven billion people by 57 00:03:34,240 --> 00:03:37,640 Speaker 1: year's end, they think. Seven billion. Yeah, we're at six 58 00:03:37,680 --> 00:03:40,160 Speaker 1: point nine two and change right now. So I mean 59 00:03:40,200 --> 00:03:43,839 Speaker 1: we're close. And to illustrate your point there 60 00:03:43,840 --> 00:03:48,920 Speaker 1: about the red dots spreading like a disease, that is, humans: 61 00:03:49,120 --> 00:03:52,280 Speaker 1: fewer than a billion people in eighteen hundred. Yeah, it 62 00:03:52,320 --> 00:03:55,080 Speaker 1: was like eight hundred million. Eight hundred million, dude. I mean, 63 00:03:55,080 --> 00:03:57,280 Speaker 1: it seems like ancient history, but it ain't that long ago. 64 00:03:58,160 --> 00:04:01,440 Speaker 1: Three billion people in nineteen sixty, and only six billion 65 00:04:01,480 --> 00:04:06,080 Speaker 1: people as recently as nineteen ninety-nine. Between nineteen fifty, Chuck, and nineteen eighty-seven, 66 00:04:07,080 --> 00:04:10,880 Speaker 1: the global population doubled from two point five billion to 67 00:04:11,000 --> 00:04:15,360 Speaker 1: five billion. That is crazy. And that's 68 00:04:15,440 --> 00:04:19,880 Speaker 1: what they call exponential growth. It's not just adding, like, 69 00:04:20,000 --> 00:04:23,800 Speaker 1: a million people a year, slow and steady. You're not adding a 70 00:04:23,839 --> 00:04:27,560 Speaker 1: fixed number. You know, population is doubling 71 00:04:27,600 --> 00:04:30,800 Speaker 1: in forty years. That's exponential growth.
And that is the 72 00:04:30,839 --> 00:04:36,440 Speaker 1: basis of what a guy named Thomas Robert Malthus, 73 00:04:36,160 --> 00:04:42,000 Speaker 1: an eighteenth-century English clergyman, predicted in his essay, 74 00:04:42,040 --> 00:04:45,760 Speaker 1: An Essay on the Principle of Population, basically saying human 75 00:04:45,800 --> 00:04:49,520 Speaker 1: growth is exponential. We have a big problem, because the 76 00:04:49,560 --> 00:04:53,599 Speaker 1: growth of food is not; it's linear, right? And 77 00:04:53,839 --> 00:04:57,080 Speaker 1: we're in trouble eventually. And he was fairly controversial at 78 00:04:57,080 --> 00:04:59,919 Speaker 1: the time. He was debated by a lot of people, 79 00:05:00,600 --> 00:05:03,400 Speaker 1: one of whom was this dude named William Godwin, and 80 00:05:03,480 --> 00:05:06,080 Speaker 1: he had a theory called the perfectibility of society, which 81 00:05:06,120 --> 00:05:08,880 Speaker 1: is basically, you know, we're humans, and 82 00:05:09,000 --> 00:05:11,600 Speaker 1: no matter how much we grow, we will be able 83 00:05:11,640 --> 00:05:15,440 Speaker 1: to counter that with advances in technology to allow us 84 00:05:15,440 --> 00:05:19,919 Speaker 1: to grow. So they debated like crazy. Godwin subsequently was 85 00:05:19,960 --> 00:05:26,960 Speaker 1: one of the first proponents of anarchism, and Malthus talked 86 00:05:26,960 --> 00:05:30,240 Speaker 1: about eugenics way back then, before it was eugenics. He said, 87 00:05:30,240 --> 00:05:32,159 Speaker 1: I could see something like this being possible, but he 88 00:05:32,160 --> 00:05:35,440 Speaker 1: said it's probably not something we should do. And he 89 00:05:35,600 --> 00:05:40,120 Speaker 1: also, incidentally, was one of the first people to 90 00:05:40,440 --> 00:05:45,159 Speaker 1: support or popularize the economic theory of rent.
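Malthus's core claim in the passage above, that population compounds while food supply grows by a fixed amount, can be sketched in a few lines of Python. This is an illustrative toy model; the starting values and growth rates below are made-up placeholders, not figures from the episode:

```python
# Toy model of Malthus's argument: population compounds (exponential),
# while food supply grows by a fixed amount each year (linear).
# All numbers here are illustrative placeholders, not historical data.

def first_shortfall_year(pop0, growth_rate, food0, food_added_per_year):
    """Return the first year the population exceeds the food supply."""
    pop, food = pop0, food0
    for year in range(1, 1000):
        pop *= 1 + growth_rate          # exponential: fixed *percentage*
        food += food_added_per_year     # linear: fixed *amount*
        if pop > food:
            return year
    return None

# 100 people growing 2% a year, food for 150 people growing by 2 a year.
print(first_shortfall_year(100, 0.02, 150, 2))
```

With these particular placeholders the compounding population overtakes the linearly growing food supply within a few decades, which is the shape of the argument rather than a real forecast.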
91 00:05:45,839 --> 00:05:48,520 Speaker 1: Really? Yeah, well, he was just all over the place, 92 00:05:48,560 --> 00:05:51,080 Speaker 1: wasn't he? Well, it all kind of ties into population, 93 00:05:51,120 --> 00:05:53,520 Speaker 1: because eugenics tied into it, because he was talking about 94 00:05:53,680 --> 00:05:58,960 Speaker 1: controlling population; and rent, he theorized, was only possible with 95 00:05:59,040 --> 00:06:03,240 Speaker 1: a surplus of resources, which allows you to 96 00:06:03,360 --> 00:06:06,400 Speaker 1: own a second place and rent it, or rent a 97 00:06:06,400 --> 00:06:09,920 Speaker 1: tool, or, you know, whatever people rent, man. So what 98 00:06:10,000 --> 00:06:15,240 Speaker 1: Malthus is talking about is generally classified as economics, right, 99 00:06:15,320 --> 00:06:19,080 Speaker 1: but it also stretches into all sorts of dirty, 100 00:06:19,160 --> 00:06:25,920 Speaker 1: nasty little areas like greed, ecology, population control, so eugenics, 101 00:06:26,040 --> 00:06:31,200 Speaker 1: family planning, abortion, infanticide, all sorts of stuff. 102 00:06:31,279 --> 00:06:35,280 Speaker 1: That has a lot of implications, far-reaching implications, right? 103 00:06:36,279 --> 00:06:38,400 Speaker 1: And so I didn't realize that there was somebody who 104 00:06:38,480 --> 00:06:41,480 Speaker 1: was a contemporary of his that argued, like, no, humans 105 00:06:41,480 --> 00:06:46,320 Speaker 1: will use technology to outstrip, to outpace this Malthusian 106 00:06:46,360 --> 00:06:48,200 Speaker 1: curse, is what it's called. Right. Yeah, that was 107 00:06:48,200 --> 00:06:49,719 Speaker 1: Godwin. There were a few people, too.
I 108 00:06:49,720 --> 00:06:51,919 Speaker 1: didn't realize that it was, at the time, but I 109 00:06:52,000 --> 00:06:54,520 Speaker 1: know that over the centuries people have been like, Malthus, 110 00:06:54,960 --> 00:06:57,600 Speaker 1: that was a great idea, but you really missed the mark, 111 00:06:57,680 --> 00:07:00,719 Speaker 1: and we're gonna use you as an example of how 112 00:07:00,800 --> 00:07:04,960 Speaker 1: badly somebody can get it wrong. Right. Because it 113 00:07:05,040 --> 00:07:08,080 Speaker 1: wasn't just technology. There's another aspect of it called the 114 00:07:08,120 --> 00:07:13,280 Speaker 1: demographic transition, which is basically, as we get 115 00:07:13,280 --> 00:07:16,600 Speaker 1: better with this technology, one of the things we come 116 00:07:16,680 --> 00:07:21,000 Speaker 1: up with is birth control, and while 117 00:07:21,240 --> 00:07:26,480 Speaker 1: our mortality rates are lowering, so too are our fertility rates, 118 00:07:27,240 --> 00:07:30,960 Speaker 1: and we eventually come to this thing called the replacement rate, 119 00:07:31,280 --> 00:07:35,000 Speaker 1: which is two point one children per household leads to 120 00:07:35,280 --> 00:07:39,640 Speaker 1: zero population growth. Right. And I think they said in 121 00:07:39,720 --> 00:07:43,600 Speaker 1: Western Europe the number was one point four in the 122 00:07:43,680 --> 00:07:48,960 Speaker 1: late nineties. Like, some people are afraid that Malthus 123 00:07:49,000 --> 00:07:51,200 Speaker 1: was correct at this point, and other people say that, 124 00:07:51,560 --> 00:07:54,040 Speaker 1: like in Europe and Asia, they worry about the opposite, 125 00:07:54,080 --> 00:07:55,480 Speaker 1: because, you know, they have the problem over there that 126 00:07:55,520 --> 00:07:57,720 Speaker 1: there aren't enough young people to take care of the 127 00:07:57,800 --> 00:08:02,280 Speaker 1: retirees one day.
Exactly, it's negative population growth. So who's 128 00:08:02,360 --> 00:08:05,480 Speaker 1: right? They do estimate, and who "they" is, I 129 00:08:05,480 --> 00:08:09,760 Speaker 1: don't know, but it just said researchers estimate that population 130 00:08:09,800 --> 00:08:12,000 Speaker 1: is not gonna level off until mid-century, at about 131 00:08:12,120 --> 00:08:14,920 Speaker 1: nine billion. Well, that's at best, if we 132 00:08:15,000 --> 00:08:17,560 Speaker 1: do level off. We could continue to keep going at the 133 00:08:17,680 --> 00:08:19,680 Speaker 1: rate we're at now. The replacement rate that leads to 134 00:08:19,760 --> 00:08:22,800 Speaker 1: zero population growth is two point one; right now 135 00:08:22,800 --> 00:08:26,720 Speaker 1: we're at two point six worldwide, with Africa 136 00:08:26,800 --> 00:08:29,920 Speaker 1: skewing us the other way. Sub-Saharan Africa has about 137 00:08:29,920 --> 00:08:33,240 Speaker 1: a five point one fertility rate, which means for every 138 00:08:33,240 --> 00:08:36,199 Speaker 1: household there's five point one children born. That point 139 00:08:36,200 --> 00:08:38,679 Speaker 1: one child, you always feel so bad for; he has 140 00:08:38,800 --> 00:08:43,760 Speaker 1: to kneel down, you know, on one leg. But 141 00:08:43,800 --> 00:08:46,199 Speaker 1: if we can get to zero population growth, then we're 142 00:08:46,200 --> 00:08:48,199 Speaker 1: not going to really have to deal with the Malthusian 143 00:08:48,320 --> 00:08:52,760 Speaker 1: curse, possibly ever. But we're not there. That's 144 00:08:52,840 --> 00:08:55,760 Speaker 1: one thing that Malthus didn't account 145 00:08:55,800 --> 00:08:59,240 Speaker 1: for: things like, as societies become more educated, 146 00:08:59,480 --> 00:09:03,280 Speaker 1: fertility rates tend to drop dramatically.
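The replacement-rate arithmetic above can be sketched with a deliberately crude generational model: each generation scales the population by fertility divided by the 2.1 replacement rate. The 28-year generation length is an assumed round number, and the model ignores age structure and mortality timing, so it illustrates the direction of the figures discussed, not the UN projection:

```python
# Crude generational sketch of the replacement-rate idea: each generation
# multiplies the population by (fertility / 2.1). The 28-year generation
# length is an assumption for illustration only.
import math

REPLACEMENT = 2.1  # children per household for zero population growth

def years_to_reach(pop0, fertility, target, gen_years=28):
    """Years until population reaches `target` at a constant fertility rate."""
    factor = fertility / REPLACEMENT
    if factor <= 1:
        return math.inf        # at or below replacement: never reached
    pop, years = pop0, 0
    while pop < target:
        pop *= factor
        years += gen_years
    return years

print(years_to_reach(7e9, 2.6, 9e9))   # world rate: two generations (56 years)
print(years_to_reach(7e9, 5.1, 9e9))   # sub-Saharan rate: one generation (28)
```

At a 2.1 fertility rate the function returns infinity, which is the zero-population-growth case the discussion describes.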
So that's another 147 00:09:03,280 --> 00:09:05,800 Speaker 1: way to put it off, too. So he was scoffed at; 148 00:09:05,920 --> 00:09:07,440 Speaker 1: like you said, there's a lot of people out there 149 00:09:07,440 --> 00:09:11,240 Speaker 1: who think he missed the mark. But 150 00:09:11,480 --> 00:09:13,560 Speaker 1: people have been doing a little bit of math lately 151 00:09:14,040 --> 00:09:17,880 Speaker 1: and have figured out that it's entirely possible that 152 00:09:17,960 --> 00:09:20,880 Speaker 1: he's right, that somewhere down the line, he's right. Yeah. 153 00:09:20,920 --> 00:09:23,719 Speaker 1: And at the basis, we should say, of Malthus's 154 00:09:23,800 --> 00:09:27,679 Speaker 1: whole thing is a lack of food and water, really; 155 00:09:28,480 --> 00:09:30,800 Speaker 1: we need air, food, water, shelter, and all that stuff, 156 00:09:31,040 --> 00:09:35,280 Speaker 1: but what he was mainly centered on was that eventually 157 00:09:35,320 --> 00:09:38,480 Speaker 1: food growth will not match up with population growth. 158 00:09:38,760 --> 00:09:42,240 Speaker 1: And a billion people go hungry every day already, so 159 00:09:42,240 --> 00:09:44,360 Speaker 1: I might argue that that's already the case. So 160 00:09:44,480 --> 00:09:49,120 Speaker 1: let's talk about carrying capacity, Chuck. If we had 161 00:09:49,160 --> 00:09:52,480 Speaker 1: not transitioned, which we have, which kind of proves the 162 00:09:52,920 --> 00:09:59,240 Speaker 1: positivists' camp, that we can be technological: 163 00:09:59,679 --> 00:10:03,800 Speaker 1: if we hadn't transitioned from hunter-gatherer to agriculture, 164 00:10:03,880 --> 00:10:05,959 Speaker 1: the carrying capacity of Earth would have been reached 165 00:10:05,960 --> 00:10:09,760 Speaker 1: at about a hundred million people.
Yes, because there's just 166 00:10:09,800 --> 00:10:11,960 Speaker 1: so many animals running around that we can kill. There's 167 00:10:12,040 --> 00:10:14,679 Speaker 1: only so many berries that are going to occur naturally 168 00:10:14,800 --> 00:10:19,400 Speaker 1: on the vine. Right. But we did transition 169 00:10:19,440 --> 00:10:23,040 Speaker 1: to agriculture before we hit the hundred million mark, 170 00:10:23,240 --> 00:10:28,400 Speaker 1: possibly, maybe not. Farming: we began to 171 00:10:28,520 --> 00:10:34,080 Speaker 1: use technology, which is growing crops, to feed ourselves. And 172 00:10:34,120 --> 00:10:38,360 Speaker 1: then we reached another point, right, where we hit 173 00:10:38,400 --> 00:10:44,439 Speaker 1: what was called the Green Revolution, remember that? Where 174 00:10:44,480 --> 00:10:46,280 Speaker 1: there was a lot of people who were saying about 175 00:10:46,320 --> 00:10:49,240 Speaker 1: a billion people are going to die, because we are 176 00:10:49,280 --> 00:10:52,240 Speaker 1: not going to be able to provide 177 00:10:52,280 --> 00:10:55,679 Speaker 1: food for all the people here. We've come 178 00:10:55,760 --> 00:10:58,320 Speaker 1: up with great vaccines and all this other technology that's 179 00:10:58,640 --> 00:11:02,320 Speaker 1: lowering the mortality rate. But that just means people are 180 00:11:02,400 --> 00:11:04,800 Speaker 1: living longer, and they need food longer over 181 00:11:04,840 --> 00:11:08,000 Speaker 1: their lifespan. Right. So what are we gonna do? Norman 182 00:11:08,040 --> 00:11:10,200 Speaker 1: Borlaug comes along and says, you know what we're gonna 183 00:11:10,200 --> 00:11:15,959 Speaker 1: do. Exactly: tapioca pudding for everybody and a Care Bear 184 00:11:16,000 --> 00:11:18,640 Speaker 1: in every garage. No, they went ahead with what he 185 00:11:18,640 --> 00:11:20,360 Speaker 1: said, because he was a genius.
He said, we're gonna 186 00:11:20,400 --> 00:11:23,400 Speaker 1: maximize the yield that we get out of arable land. 187 00:11:23,720 --> 00:11:25,600 Speaker 1: We're not just gonna plant some seeds and be like, 188 00:11:25,679 --> 00:11:29,520 Speaker 1: hope you grow. We're going to apply tons of pesticide, 189 00:11:29,600 --> 00:11:33,960 Speaker 1: tons of fertilizer, and we're going to squeeze corn the 190 00:11:34,000 --> 00:11:38,480 Speaker 1: size of your torso out of every plant. Right. Yeah, 191 00:11:38,520 --> 00:11:41,240 Speaker 1: he wasn't some, like, awful mad... That makes him 192 00:11:41,240 --> 00:11:44,600 Speaker 1: sound like some awful mad scientist, though in the eyes 193 00:11:44,640 --> 00:11:47,400 Speaker 1: of a lot of environmentalists, he... Well, I mean, 194 00:11:47,440 --> 00:11:51,040 Speaker 1: think about all the runoff, all the soil depletion. Also, 195 00:11:51,160 --> 00:11:54,240 Speaker 1: didn't he win a Nobel Prize? Sure. Yeah, he's 196 00:11:54,280 --> 00:11:56,960 Speaker 1: credited with saving that billion people that were predicted to 197 00:11:57,000 --> 00:12:00,199 Speaker 1: starve, because he came in just in time, because Earth 198 00:12:00,240 --> 00:12:03,920 Speaker 1: would have reached its carrying capacity for agriculture. So we've 199 00:12:03,920 --> 00:12:07,120 Speaker 1: had at least two different events where we were able 200 00:12:07,160 --> 00:12:11,360 Speaker 1: to leap forward through technology and avoid the Malthusian curse. Right. Yes. 201 00:12:11,640 --> 00:12:13,920 Speaker 1: So there are people out there who say, well, you know, 202 00:12:14,000 --> 00:12:16,600 Speaker 1: we're going to avoid it again. But what 203 00:12:16,679 --> 00:12:18,760 Speaker 1: will that be? Sure, we'll come up with another one.
204 00:12:19,080 --> 00:12:22,160 Speaker 1: So, I'm sorry, Chuck, we would have hit the carrying 205 00:12:22,200 --> 00:12:25,560 Speaker 1: capacity at a hundred million were we hunter-gatherers, right? What 206 00:12:25,679 --> 00:12:28,760 Speaker 1: are the predictions now? Well, they say, and this is 207 00:12:28,960 --> 00:12:31,360 Speaker 1: what I think is really interesting and completely sad, 208 00:12:31,920 --> 00:12:35,079 Speaker 1: that we have a potential carrying capacity of two 209 00:12:35,160 --> 00:12:39,200 Speaker 1: billion to forty billion, clearly past the two. So one 210 00:12:39,240 --> 00:12:41,760 Speaker 1: might ask, how can it be that big of a range? 211 00:12:42,080 --> 00:12:46,200 Speaker 1: And the answer is lifestyle. And here's a very sad 212 00:12:46,320 --> 00:12:50,080 Speaker 1: stat: if the entire Earth lived like middle-class Americans, 213 00:12:50,120 --> 00:12:53,280 Speaker 1: not the super rich, who, you know, probably consume more 214 00:12:53,600 --> 00:12:56,960 Speaker 1: energy and the like than your average human, just regular 215 00:12:56,960 --> 00:13:01,280 Speaker 1: middle-class American folks, who consume about three point three times 216 00:13:02,080 --> 00:13:05,800 Speaker 1: the subsistence level of food and two hundred and fifty 217 00:13:05,840 --> 00:13:09,800 Speaker 1: times the subsistence level of clean water, that 218 00:13:09,840 --> 00:13:12,040 Speaker 1: means, if everyone was like us, 219 00:13:12,440 --> 00:13:15,360 Speaker 1: the Earth could only support about two billion people.
So 220 00:13:15,440 --> 00:13:19,680 Speaker 1: what's going on is thirty percent of the Earth is consuming; I 221 00:13:19,720 --> 00:13:22,400 Speaker 1: don't have the exact percentage, but the other seventy percent of 222 00:13:22,440 --> 00:13:25,640 Speaker 1: the Earth is left with what's left, which is really, 223 00:13:25,679 --> 00:13:30,760 Speaker 1: really... It's just a disparity in the 224 00:13:30,800 --> 00:13:34,000 Speaker 1: allocation of resources and what's consumed. So that's why it 225 00:13:34,000 --> 00:13:36,160 Speaker 1: can be a range of two billion to forty billion: 226 00:13:36,280 --> 00:13:39,120 Speaker 1: because of the different lifestyles. If everyone lived alike, 227 00:13:39,160 --> 00:13:41,840 Speaker 1: there would be plenty for everyone and no one would 228 00:13:41,840 --> 00:13:45,959 Speaker 1: be starving. No, if everybody lived like us, we would all... 229 00:13:46,200 --> 00:13:51,240 Speaker 1: We would be like, sorry. Well, yeah, that's 230 00:13:51,320 --> 00:13:53,520 Speaker 1: where the forty billion number comes in. I've seen 231 00:13:53,559 --> 00:13:55,320 Speaker 1: thirty and I've seen forty on the high end for 232 00:13:55,320 --> 00:13:58,080 Speaker 1: the carrying capacity, and that's where every square inch of 233 00:13:58,160 --> 00:14:01,640 Speaker 1: arable land is being cultivated to its maximum yield, and 234 00:14:01,679 --> 00:14:03,959 Speaker 1: all people live in high-rises that are as high 235 00:14:04,000 --> 00:14:06,920 Speaker 1: as we can build them right now, right? And 236 00:14:06,960 --> 00:14:13,040 Speaker 1: we're mining asteroids for minerals and 237 00:14:13,280 --> 00:14:15,960 Speaker 1: all that. We're no 238 00:14:16,000 --> 00:14:20,680 Speaker 1: longer going to the Earth; we're going into outer space. Possibly.
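The two-billion-to-forty-billion range discussed above follows from a simple idea: carrying capacity is a fixed resource base divided by per-person consumption, and the binding constraint is whichever resource runs out first. In this sketch the resource totals are hypothetical placeholders chosen so the bare-subsistence case lands at the forty-billion high end; only the 3.3x food and 250x water multiples come from the discussion, so the second result illustrates the logic rather than reproducing the episode's two-billion figure:

```python
# Carrying capacity as the binding constraint: divide what the planet can
# supply by what each person uses, and take the minimum across resources.
# Resource totals below are hypothetical placeholders; the 3.3x and 250x
# consumption multiples are the ones cited in the discussion.

TOTALS = {"food": 40e9, "water": 40e9}  # units: "subsistence-person-years"

def capacity(multiples):
    """People supportable when each person uses `multiple` x subsistence."""
    return min(TOTALS[r] / m for r, m in multiples.items())

print(capacity({"food": 1.0, "water": 1.0}))     # everyone at bare subsistence
print(capacity({"food": 3.3, "water": 250.0}))   # middle-class American rates
```

Under these placeholder totals, water is the binding resource in the second case by a wide margin, which is why lifestyle swings the estimate so dramatically.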
239 00:14:21,000 --> 00:14:23,280 Speaker 1: I think that should have started about fifty 240 00:14:23,360 --> 00:14:27,560 Speaker 1: years ago, right? But that forty billion prediction 241 00:14:28,200 --> 00:14:32,359 Speaker 1: is based on the absolute minimum requirements: everybody, 242 00:14:32,440 --> 00:14:36,040 Speaker 1: forty billion people living on the planet, all using 243 00:14:36,080 --> 00:14:39,560 Speaker 1: the minimum amount, which is four liters of water a 244 00:14:39,640 --> 00:14:44,440 Speaker 1: day and about three ms of food a year, mostly grains. 245 00:14:45,040 --> 00:14:48,520 Speaker 1: And you can basically kiss meat goodbye, because we need 246 00:14:48,560 --> 00:14:51,720 Speaker 1: that land to grow our grains, rather than grow grains 247 00:14:51,720 --> 00:14:54,160 Speaker 1: to feed cows, which is another way that the West 248 00:14:54,200 --> 00:14:58,480 Speaker 1: consumes more than its fair share of resources, through 249 00:14:58,480 --> 00:15:01,720 Speaker 1: a meat-rich diet, where you're not only eating 250 00:15:01,720 --> 00:15:04,920 Speaker 1: the meat, you're eating the grains that the meat ate. Right. 251 00:15:05,360 --> 00:15:11,560 Speaker 1: So, Chuck, let me ask you something. If you 252 00:15:12,040 --> 00:15:14,760 Speaker 1: went home and turned on your tap 253 00:15:14,880 --> 00:15:17,360 Speaker 1: and there was hot water and it was flowing, and 254 00:15:17,400 --> 00:15:20,480 Speaker 1: it was as much as you liked, right, would you 255 00:15:20,600 --> 00:15:24,280 Speaker 1: care how you were getting that? What do you mean, 256 00:15:24,320 --> 00:15:28,360 Speaker 1: how it was being delivered? Through my faucet? Yes. 257 00:15:28,480 --> 00:15:31,360 Speaker 1: Is this a trick question? No, it's not. Let me rephrase.
258 00:15:31,800 --> 00:15:35,040 Speaker 1: If you went home and turned on your hot water 259 00:15:35,120 --> 00:15:37,920 Speaker 1: and there's as much hot water as you wanted, and 260 00:15:38,120 --> 00:15:40,680 Speaker 1: you knew it was coming from a sustainable source, 261 00:15:40,680 --> 00:15:44,760 Speaker 1: would you care? Yeah, I guess not. 262 00:15:44,880 --> 00:15:48,920 Speaker 1: But I'm kind of, like, a water saver. So you're a 263 00:15:48,960 --> 00:15:51,280 Speaker 1: water saver. What if you knew you didn't really have 264 00:15:51,400 --> 00:15:54,960 Speaker 1: to save water, because it was so sustainable? You wouldn't care. 265 00:15:55,280 --> 00:15:58,120 Speaker 1: No one cares, as long as we have the luxuries 266 00:15:58,160 --> 00:16:01,120 Speaker 1: that we're afforded. Right. And you don't care if 267 00:16:01,120 --> 00:16:05,240 Speaker 1: it came from burning banana peels. No one cares. The 268 00:16:05,360 --> 00:16:08,560 Speaker 1: problem with the course that we're 269 00:16:08,560 --> 00:16:13,440 Speaker 1: on right now, apparently, is that we are using 270 00:16:13,720 --> 00:16:18,280 Speaker 1: technology not to get more from less, but to get 271 00:16:18,520 --> 00:16:23,560 Speaker 1: more from more, more cheaply. Right. Yeah, it's 272 00:16:23,600 --> 00:16:25,480 Speaker 1: a uniquely human thing, they call it in the article, 273 00:16:25,520 --> 00:16:29,920 Speaker 1: which is pretty much true. But technological advancement is in 274 00:16:30,000 --> 00:16:32,960 Speaker 1: many ways leading to our habitat destruction. Ideally, at this 275 00:16:33,040 --> 00:16:38,680 Speaker 1: point everyone would be on solar, and the massive companies 276 00:16:38,720 --> 00:16:40,920 Speaker 1: would be solar powered and all that kind of thing.
277 00:16:41,040 --> 00:16:44,440 Speaker 1: And that's another great point: you know, you don't 278 00:16:44,520 --> 00:16:46,800 Speaker 1: care where your electricity comes from. Do you care if 279 00:16:46,840 --> 00:16:50,000 Speaker 1: it comes from a solar panel or wind? No, of 280 00:16:50,040 --> 00:16:53,720 Speaker 1: course you don't. You just want your electricity. So if 281 00:16:53,760 --> 00:16:57,200 Speaker 1: we had invested, or if we could invest, our technological 282 00:16:57,240 --> 00:17:02,680 Speaker 1: advances into getting what we have now from less, 283 00:17:02,800 --> 00:17:06,840 Speaker 1: from solar radiation or wind power, then 284 00:17:08,119 --> 00:17:11,560 Speaker 1: that's true cutting-edge technology, rather than, you know, 285 00:17:11,680 --> 00:17:15,920 Speaker 1: figuring out ways to deplete things faster, more cheaply, which 286 00:17:15,960 --> 00:17:18,200 Speaker 1: is the way we're going. Yeah, like, let's 287 00:17:18,200 --> 00:17:23,600 Speaker 1: say, a more efficient oil driller, or a more efficient 288 00:17:23,960 --> 00:17:27,200 Speaker 1: way of getting coal from a mountain, i.e., mountaintop removal. 289 00:17:27,520 --> 00:17:30,200 Speaker 1: So they're using technology, but they're using it in ways that 290 00:17:30,240 --> 00:17:34,680 Speaker 1: are also destroying the habitat. And sustainability is all about 291 00:17:34,720 --> 00:17:37,720 Speaker 1: finding the right balance in your habitat. So here's 292 00:17:37,720 --> 00:17:40,440 Speaker 1: the conclusion I came to from reading this, right: the 293 00:17:40,680 --> 00:17:45,200 Speaker 1: argument from the positivists' camp. I don't even think I'm 294 00:17:45,240 --> 00:17:48,600 Speaker 1: using that word correctly.
But the people who are 295 00:17:48,640 --> 00:17:54,000 Speaker 1: in the optimists' camp (sure, duh, right), they're saying, no, 296 00:17:54,160 --> 00:17:57,639 Speaker 1: Malthus was incorrect, because he failed to account for 297 00:17:58,640 --> 00:18:03,119 Speaker 1: human ingenuity, and as population grows, so too does the 298 00:18:03,200 --> 00:18:09,600 Speaker 1: number of geniuses, and that's where innovation comes from. Right. 299 00:18:09,760 --> 00:18:14,879 Speaker 1: I think the optimists are missing a point 300 00:18:15,000 --> 00:18:18,600 Speaker 1: in their model, and that is greed. You can't really 301 00:18:19,400 --> 00:18:27,520 Speaker 1: sway greed to benefit human ecology, can you, you know? 302 00:18:27,680 --> 00:18:30,600 Speaker 1: And you can't convince an entire population of people to 303 00:18:30,680 --> 00:18:33,160 Speaker 1: change their lifestyles, which is what it would take. That's 304 00:18:33,160 --> 00:18:36,800 Speaker 1: what I'm saying: you can't, because they don't care. 305 00:18:37,480 --> 00:18:40,080 Speaker 1: But if you could deliver them that same amount of 306 00:18:40,119 --> 00:18:44,040 Speaker 1: hot water, that same electricity, and it was coming from 307 00:18:44,040 --> 00:18:47,159 Speaker 1: a sustainable source, no one's going to fight that. Right. 308 00:18:47,240 --> 00:18:49,960 Speaker 1: It's having to fight that fight to 309 00:18:50,040 --> 00:18:53,040 Speaker 1: get the people who are controlling it to change over; 310 00:18:53,400 --> 00:18:56,800 Speaker 1: they're not going to do that. So there's that fatal 311 00:18:56,800 --> 00:18:59,440 Speaker 1: flaw in that model, the advantage the gloom-and-doom camp 312 00:18:59,520 --> 00:19:03,720 Speaker 1: has over the optimist camp: the optimists don't 313 00:19:03,720 --> 00:19:06,760 Speaker 1: account for greed. Have you ever seen Who Killed 314 00:19:06,760 --> 00:19:09,800 Speaker 1: the Electric Car? No?
I never did. I encourage people 315 00:19:09,840 --> 00:19:12,240 Speaker 1: to see that. That's pretty scary. The EV1 was... 316 00:19:12,920 --> 00:19:14,359 Speaker 1: I mean, I don't know if you remember, but the 317 00:19:14,400 --> 00:19:17,280 Speaker 1: EV1 was ready to go. There 318 00:19:17,280 --> 00:19:19,480 Speaker 1: were TV commercials; you can look up "EV1 commercial" 319 00:19:19,480 --> 00:19:22,960 Speaker 1: on YouTube, and they were running them on television. 320 00:19:23,119 --> 00:19:26,080 Speaker 1: Electric cars are here. They're not coming; they are here. 321 00:19:26,720 --> 00:19:30,440 Speaker 1: And boom, it was gone. Yeah, I'll check it out, 322 00:19:31,080 --> 00:19:32,640 Speaker 1: and I'll give you a few guesses as to why 323 00:19:32,680 --> 00:19:35,800 Speaker 1: it left so quickly. And not only were they gone, dude, 324 00:19:36,040 --> 00:19:47,320 Speaker 1: they literally gathered them all up and crushed them. Man. Exactly. Yeah, sad, 325 00:19:47,480 --> 00:19:50,640 Speaker 1: but go rent it. It's cool. Yeah, and powerful 326 00:19:50,640 --> 00:19:53,280 Speaker 1: lobbies out there. What else you got? I got nothing, man. 327 00:19:53,359 --> 00:19:54,760 Speaker 1: This is a good one for people to chew on. 328 00:19:54,880 --> 00:19:56,800 Speaker 1: I think so too. We just 329 00:19:56,880 --> 00:19:59,879 Speaker 1: encourage people, like we always do, just to, you know... 330 00:20:00,000 --> 00:20:02,119 Speaker 1: We're not saying, you know, quit your job and go, 331 00:20:02,240 --> 00:20:04,119 Speaker 1: like, build solar panels for a living and live on 332 00:20:04,160 --> 00:20:06,520 Speaker 1: a wind farm. You can do that; that'd 333 00:20:06,560 --> 00:20:09,240 Speaker 1: be awesome. But little things, little positive steps: 334 00:20:09,240 --> 00:20:11,640 Speaker 1: save a little water, save a little power.
I disagree, man, 335 00:20:11,920 --> 00:20:14,639 Speaker 1: what? I don't think the onus is on the people. 336 00:20:14,680 --> 00:20:18,800 Speaker 1: I think the onus is on the people who 337 00:20:18,920 --> 00:20:26,320 Speaker 1: are misdirecting technological advancement. I'd say it's on both. Agreed? You 338 00:20:26,320 --> 00:20:28,040 Speaker 1: don't think that this is on the people to conserve? 339 00:20:29,400 --> 00:20:32,480 Speaker 1: I think, I think it is. I 340 00:20:32,520 --> 00:20:34,320 Speaker 1: think we've put it on the people, but I don't 341 00:20:34,320 --> 00:20:39,159 Speaker 1: think it's going to make enough of an impact. All right, 342 00:20:39,240 --> 00:20:41,159 Speaker 1: I think it's on the policy makers. That's who I 343 00:20:41,160 --> 00:20:43,760 Speaker 1: think it's on. I think it's 344 00:20:43,760 --> 00:20:46,800 Speaker 1: on both. Um. Okay, well, that's a debate to be 345 00:20:46,920 --> 00:20:49,600 Speaker 1: played out on the Facebook page, if you ask me. Right, yeah, man, 346 00:20:49,640 --> 00:20:52,160 Speaker 1: we should set up a forum, um. So if you want 347 00:20:52,160 --> 00:20:54,880 Speaker 1: to learn more, type in has the Earth reached its 348 00:20:54,920 --> 00:20:58,639 Speaker 1: carrying capacity, or Thomas Malthus, M A L T H 349 00:20:58,800 --> 00:21:01,320 Speaker 1: U S, in the search bar at how stuff works dot com. 350 00:21:01,359 --> 00:21:04,000 Speaker 1: It will bring up some pretty cool stuff. Well, then 351 00:21:04,000 --> 00:21:07,440 Speaker 1: that means it's time for listener mail. All right, Josh, 352 00:21:07,480 --> 00:21:10,320 Speaker 1: I'm gonna call this, uh, how to make my 353 00:21:10,400 --> 00:21:15,200 Speaker 1: teenage son listen to your show, from Portland, Oregon. Hi, 354 00:21:15,280 --> 00:21:18,239 Speaker 1: guys and Jerry.
When you have a teenager, you will 355 00:21:18,320 --> 00:21:20,000 Speaker 1: quickly learn that you can't just tell them what to 356 00:21:20,040 --> 00:21:23,160 Speaker 1: do and expect them to do it. I remember those days. 357 00:21:23,880 --> 00:21:26,800 Speaker 1: It's so frustrating, because as a parent, you know that 358 00:21:26,920 --> 00:21:28,879 Speaker 1: your kid will love something and get lots out of it, 359 00:21:28,920 --> 00:21:30,720 Speaker 1: but you can't come right out and say it, or 360 00:21:30,720 --> 00:21:32,879 Speaker 1: they will never ever try the thing you told them 361 00:21:32,920 --> 00:21:36,320 Speaker 1: to try. For example, your podcast. I knew for a fact, 362 00:21:36,920 --> 00:21:38,639 Speaker 1: like I know that it will rain in Portland, that 363 00:21:38,720 --> 00:21:41,080 Speaker 1: my thirteen year old son Ethan would really love Stuff 364 00:21:41,080 --> 00:21:43,640 Speaker 1: You Should Know, because I love the podcast. I've turned 365 00:21:43,680 --> 00:21:45,520 Speaker 1: other people onto it and they love it. But I 366 00:21:45,600 --> 00:21:47,639 Speaker 1: knew I had to be sneaky in order for my 367 00:21:47,680 --> 00:21:50,679 Speaker 1: son to give it a try. Ethan is a fencer, 368 00:21:51,240 --> 00:21:53,200 Speaker 1: and at the time was also working on a research 369 00:21:53,240 --> 00:21:57,040 Speaker 1: project about Renaissance jousting and tournaments. So one Saturday I 370 00:21:57,119 --> 00:22:00,000 Speaker 1: was working in the kitchen. I played How Knights Work, 371 00:22:00,560 --> 00:22:02,800 Speaker 1: uh, to catch his interest. Every time he came in 372 00:22:02,800 --> 00:22:05,919 Speaker 1: the kitchen, I'd hit play. When he'd leave, I'd hit pause. 373 00:22:08,000 --> 00:22:09,600 Speaker 1: I figured he would just think, Man, these guys 374 00:22:09,600 --> 00:22:12,160 Speaker 1: take a long time to finish a sentence.
He would 375 00:22:12,200 --> 00:22:14,240 Speaker 1: hang around the kitchen longer and longer each time, and 376 00:22:14,240 --> 00:22:16,040 Speaker 1: I could tell I almost had him on the line, 377 00:22:16,200 --> 00:22:19,760 Speaker 1: like I was noodling. Although you would say I had 378 00:22:19,800 --> 00:22:22,840 Speaker 1: him on the arm. Yeah, there's no line. When it 379 00:22:22,880 --> 00:22:24,840 Speaker 1: was over, he said he already knew everything you talked 380 00:22:24,840 --> 00:22:27,360 Speaker 1: about in the podcast, but I could tell he was intrigued. 381 00:22:27,640 --> 00:22:29,399 Speaker 1: Then I hit him with the Scooby Doo show and 382 00:22:29,440 --> 00:22:31,840 Speaker 1: that was it. You had another fan. Now he has 383 00:22:31,880 --> 00:22:34,840 Speaker 1: downloaded the app for his iPod and listens each night 384 00:22:34,880 --> 00:22:38,680 Speaker 1: as he's going to sleep. And that's it. Yeah, that's 385 00:22:38,680 --> 00:22:43,679 Speaker 1: from Afton, a very sneaky mom, thank you, in Portland, Oregon. 386 00:22:43,880 --> 00:22:46,840 Speaker 1: That also kind of ties into the Cults and Brainwashing 387 00:22:46,840 --> 00:22:50,120 Speaker 1: episodes too, didn't it? Yeah. And she said, um, when 388 00:22:50,160 --> 00:22:51,679 Speaker 1: she replied, I asked her if I could read this. 389 00:22:51,720 --> 00:22:53,879 Speaker 1: She said sure. And she said, I guess he'll know 390 00:22:54,400 --> 00:22:57,000 Speaker 1: my little trick now, but he'll get such a kick 391 00:22:57,000 --> 00:22:59,880 Speaker 1: out of being mentioned, Ethan the fencer, he will forgive 392 00:22:59,880 --> 00:23:03,119 Speaker 1: her. Yeah, and at least he can rest assured that 393 00:23:03,160 --> 00:23:06,120 Speaker 1: she's not, like, putting anything in his soup to get 394 00:23:06,160 --> 00:23:08,560 Speaker 1: him to do what she wants. She uses more subtle 395 00:23:08,560 --> 00:23:11,920 Speaker 1: tactics than that. Right.
I wish you could put something 396 00:23:11,920 --> 00:23:14,000 Speaker 1: in soup to make people listen to this. I'd be 397 00:23:14,000 --> 00:23:16,840 Speaker 1: putting it in soup. Yeah, that's a good idea. I'd 398 00:23:17,119 --> 00:23:19,320 Speaker 1: put it in all soups. I'll tell you what, if 399 00:23:19,359 --> 00:23:21,280 Speaker 1: you have any suggestions of what we can put in 400 00:23:21,359 --> 00:23:23,320 Speaker 1: people's soup to get them to listen to Stuff You 401 00:23:23,320 --> 00:23:25,440 Speaker 1: Should Know, and to get them to go give us 402 00:23:25,480 --> 00:23:29,120 Speaker 1: a review on iTunes. Huh, yeah, that helps us out 403 00:23:29,200 --> 00:23:32,439 Speaker 1: when you do that. Uh, you should send us an 404 00:23:32,440 --> 00:23:36,280 Speaker 1: email, and you should send it to a specific email address. 405 00:23:37,119 --> 00:23:46,160 Speaker 1: That is stuff podcast at how stuff works dot com. 406 00:23:46,240 --> 00:23:48,800 Speaker 1: Be sure to check out our new video podcast, Stuff 407 00:23:48,840 --> 00:23:51,480 Speaker 1: from the Future. Join the How Stuff Works staff as we 408 00:23:51,520 --> 00:23:58,040 Speaker 1: explore the most promising and perplexing possibilities of tomorrow. Brought 409 00:23:58,040 --> 00:24:01,240 Speaker 1: to you by the reinvented two thousand twelve Camry. It's ready. 410 00:24:01,400 --> 00:24:01,760 Speaker 1: Are you?