Speaker 1: This delightful program is brought to you by Squarespace. Beautiful websites for beautiful bands, like walkthemoonband.com. So the name of their tour is Talking Is Hard. It's not that hard, guys. I'm doing it now. Most of the time I can't stop talking. I mean, you literally would have to cut me off. One time I told... yeah, yeah, yeah, Alec Baldwin, we get it, you are talking too much. But you like Squarespace and we like Squarespace, so we appreciate you getting on our podcast to help us plug one of our great sponsors.

Welcome to Stuff You Should Know from HowStuffWorks.com.

Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant, and Jerry's here, and it's Stuff You Should Know from the future. But not really. How are you doing? I'm fine. Well, good, that's good. I enjoyed this. This was kind of neat. Yeah, it was. It was funny, like when you're reading about futurology and futurologists, a.k.a.
futurists, you tend to want to make it more than it actually is, and when you look into the topic, it keeps having to be beaten down just because of the name alone. It sounds a little bit like, "I could be a seer." And you know, sometimes you're thinking they're using these really neat techniques to predict the future, and they're talking about some really mundane stuff, boring stuff: economic forecasts, things like that, how much oil will be left in thirty years, that kind of thing. But then on the other hand, if you're a futurologist, you may also be tasked with figuring out what technology we're going to be using in thirty years, or what color the shiny jumpsuits we're all going to wear will be, that kind of stuff. Yeah, I think one of my favorite things is to look at past future predictions. It's fun. Yeah, there's nothing that will make someone look less knowledgeable than going back to what they thought the future would look like in the year two thousand, back in the nineteen thirties or forties. Or sometimes some of those things happen.
Yeah, and then it's amazing. Yeah, that's like, wow, you know, because sometimes these guys are really, really dead on. And I was reading an article, I think it was in Harvard Business Review, and it was a post by Paul Saffo, who runs a venture capital firm, I believe, called Discern. Yeah. And Paul Saffo was saying, he was trying to get across that sci-fi authors and futurologists, their paths overlap quite a bit, but really there are pretty big distinctions. And even in this article they got lumped in together, because sci-fi writers definitely do use futurology techniques. But Paul Saffo was saying, like, yeah, but as a real futurologist you have to use logic, whereas if you're a sci-fi writer, you can just use your imagination. You don't have to back it up with anything. As a futurologist, you have to use logic that makes sense to whoever's hearing your prediction. Yeah, and I think that's one reason why some sci-fi writers have been right on the nose with some future predictions, because they're not hampered by logic and they can just, uh, free-form, you know.
But then it's just a lucky guess. No, I don't think so. I think they're still applying a lot of the same rules of futurology, but they're just not bound by, you know, the laws of... not the laws, but you know, the laws of logic. Yeah, exactly, I'm with you. But the best sci-fi, though, I think, is something that logically makes sense. Yeah, because otherwise it's just fantasy. Yeah, that's true. So futurology is recognizing and assessing potential future events. I could have sworn Jonathan Strickland wrote this, by the way it read, but it was not. It's very Strickland-esque. Nicholas Gerbis. Yeah, that's Strickland's alter ego. I wonder if it is. I've never met this Nicholas Gerbis. But the point Gerbis makes, which I think is good, is that it's a product of our times in many cases, depending on where we are as a society. And he makes a great point: during the Civil War there probably weren't a lot of rosy predictions for the future. The American Civil War.
But in the Gilded Age, people were a lot more optimistic, so they may have... you know, it's a whole different deal. Like during the Cold War, for instance: a lot of paranoia, a lot of cynicism, probably not going to be a rosy outlook for the future, right? Unlike during the Gilded Age, when it was rosier. Yeah, way more optimistic than the Cold War, which is kind of ironic, because the Gilded Age didn't have anything to be optimistic about. They were just pretending, hence the name. The thing is, what you've just said, though, is kind of an argument against futurology, because one of the big critiques of it is that futurologists aren't really doing anything. Even if you're commenting on the past or the future, you're still really commenting about your present, your contemporary time, or recent past, because that's what you've lived through and experienced. That's all you can really reflect on, and futurology seeks to go beyond that. Well, yeah, that makes sense.
So if you, like, look at this thing that is happening now or just happened, then what is going to be happening with that thing in ten years? And a lot of times it's based on the direction it's currently going. Yes. Okay, so Gerbis gives a pretty good example: the cell phone grew out of the telegraph, which ultimately is related further back to the smoke signal. Sure, right. Yeah. But if you were a futurologist hanging out around somebody who was sending smoke signals, would you be able to predict the cell phone? Probably not. Or could you predict the impact of the automobile or the highway system? Maybe. But would you predict that people would have sex in the back seat of a car because it provides a little... well, I don't think they did. Or urban sprawl? Yeah. Could you predict exurbs and edge cities just because the highways got built? Yeah, and not a lot of people did, even though a lot of people said there are going to be horseless carriages one day, and they're going to change things big time.
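That "project the current direction forward" idea is the simplest forecasting technique there is: trend extrapolation. A minimal sketch of it in Python, where the data points are entirely invented stand-ins for whatever is being forecast:

```python
# Naive trend extrapolation: fit a straight line to past observations
# and extend it forward. All numbers here are made up for illustration.

def linear_fit(xs, ys):
    """Least-squares slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical adoption figures (millions of users) over five years.
years = [0, 1, 2, 3, 4]
users = [10, 12, 14, 16, 18]

slope, intercept = linear_fit(years, users)
print(slope * 10 + intercept)  # project out to year 10
```

The catch, as the smoke-signal example shows, is that a straight line only ever predicts more of the same; it can extend the telegraph's trajectory, but it will never spit out the cell phone.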
People are going to be able to move around a lot more. But that doesn't mean that everybody saw every result of the automobile. It was a game changer. Yeah. Agreed. So, what we're saying here, and if it sounds a little weird that we're at once supporting and criticizing futurology, is that that's basically the fun thing to do when you talk about futurology: to criticize it and be awed by it, because a lot of times they really are super right. That's right. Futurology has been around for a long time. I mean, since people were writing fiction there were people predicting the future, but things didn't really get going, as far as being meaningful, until after World War Two, when the US started developing technological forecasting. Basically, it was really important to try and see where things were going militarily, right, because it was super expensive to develop new technologies and it could take a long time.
So they started thinking, hey, we need to get some people on board who can hopefully predict where we're headed, so we can make the right decisions. Yeah, because if it takes a really long time, like you said, to develop a weapon, by the time you have that weapon deployed in the field, you're going to need to know it's not already obsolete. The only way to do that is to predict what kind of warfare you're going to be engaged in. Because this was a time, at the end of World War Two, when so many inventions had come out of World War One and Two, war-machine inventions, that things were changing really quickly. You can kind of put modern futurology into the lap of one guy, an Air Force general named Hap Arnold, who saw that things were changing so fast that his air force needed to basically predict the future and see what direction it needed to go. So he looked around and he started tapping people to do that. One of the first people he tapped was a scientist, an aeronautical engineer named Theodore von Kármán.
Yes, he was a super smart dude, and he led a team that did predict a lot of stuff, like drones, and as far as, you know, the military using drones, not your uncle who flies one around the neighborhood to film stuff. He predicted the rise of Brookstone. Uh, target-seeking missiles, supersonic aircraft, and even the atom bomb. All of this was in one report to Hap Arnold. This guy knocked it out of the park. But he and his group were very much limited to small academic and military circles; the general public wasn't aware that this was going on. But his group, von Kármán's group, so accurately foresaw the direction that modern warfare was going that you can also very easily make the case that he basically created a roadmap to the future that the Air Force followed. So his prophecies were self-fulfilling, because he said go this way, and the Air Force went that way and created all this stuff.
Yeah. And then the military, well, the RAND Corporation specifically, which grew out of the US Air Force and Douglas Aircraft in the mid-forties, they said, well, having one person to say these things was great, but what we need is a team, and a consensus among this team. So they kind of, well, not kind of, they very much patented a technique they called the Delphi technique, D-E-L-P-H-I, and that is basically a technique where they're trying to get an agreed-on consensus from a number of people. So there's this very famous story about how the Navy, I think, lost a submarine, a nuclear submarine, or the Russians had lost a submarine, something like that. There was a lost sub that they wanted to find, and they had no idea where it was. So the Navy polled all these different experts in all these different fields that might have something to do with nuclear submarines: aeronautics, people from NOAA, all these people, right? And asked them, where do you think the sub is? And no one hit it on the nose.
But when they basically used a statistical distribution of these various opinions, these guesses of professionals, it led them right to that sub. And that's what the Delphi technique does, too. It takes opinions of experts in various fields and says, what do you think of this? And everybody sends in a questionnaire anonymously, and there are no group meetings, so the group doesn't bow to pressure and no leaders emerge. They're giving their unvarnished opinion. And then, after those opinions come in, they take that information and send it out again. So it goes in rounds and rounds and rounds until they finally come to a group consensus: that in the future, we're all going to be wearing metallic blue jumpsuits. Yeah. And what they're doing is generating what's known as a scenario. And a guy named Herman Kahn, K-A-H-N, worked with RAND in the nineteen fifties, and he's the one who kind of coined the term "scenario" as it applies to futurology.
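The rounds-until-consensus loop described here can be sketched in a few lines of Python. Everything specific is invented for illustration: the five expert guesses, the "each expert privately revises halfway toward the group median" rule, and the stopping threshold. A real Delphi study uses questionnaires and human judgment, not a fixed formula, but the shape of the process is the same:

```python
import statistics

# Toy Delphi-style process: experts answer anonymously, see the group's
# summary statistic, revise toward it, and rounds repeat until the
# spread of opinion is small. All numbers and rules are hypothetical.

def delphi(estimates, pull=0.5, tolerance=1.0, max_rounds=20):
    """Iterate anonymous revision rounds; return (consensus, rounds used)."""
    rounds = 0
    while statistics.pstdev(estimates) > tolerance and rounds < max_rounds:
        median = statistics.median(estimates)
        # Each expert privately moves partway toward the group median.
        estimates = [e + pull * (median - e) for e in estimates]
        rounds += 1
    return statistics.median(estimates), rounds

# Say five experts guess the year some technology will arrive.
consensus, n_rounds = delphi([2030, 2035, 2042, 2050, 2065])
print(consensus, n_rounds)
```

The design point the hosts call out is the anonymity plus iteration: each round narrows the spread without anyone dominating the room.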
A pretty good definition I found was that a scenario is "a detailed portrait of a plausible future world, one sufficiently vivid that a planner can clearly see and comprehend the problems, challenges, and opportunities that such an environment would present." So it's, you know, it's saying, in the future we're going to have a scenario where there are going to be robots in every house. And one of the biggest ways that they work on scenarios is with something called backcasting, which is starting at the end. Which is, you've got a robot in every house, and then you go backwards: how did you get there? Yeah, how you got there. Yeah, makes sense. Yeah. And that's a pretty cool scenario. They can also be as mundane as running a fire drill, where you're envisioning that a fire broke out in the high school gym and so everybody needs to get out. That's a scenario. It's as simple as that. Then there are the weather forecasts or economic forecasts that are run through computer algorithms.
The process the model runs through is the scenario, and it spits out a possible prediction. It's almost like effect, then cause, right? Yeah. Excellently put, thank you. So, Herman Kahn worked with RAND, and... did you look him up at all? Yeah, he's one of the inspirations for Dr. Strangelove. He was described as a super genius. Yeah, he was super smart, and he kind of was a bit of a celebrity at the time. He wrote a book in nineteen sixty-one called On Thermonuclear War, and then left RAND to form the Hudson Institute, where he basically was like, we're a group that is going to forecast the future. It was, like, a super popular book, and it spawned a lot of other, similar books. We need to take a break, but we'll get right back to this in a second.

So, Chuck, you were just talking about Herman Kahn being the super genius who was something of a celebrity. I read that Timothy Leary intimated that he had taken acid with him.
I believe that he was a part of the inspiration for Dr. Strangelove. And this book that he wrote, called The Year 2000: A Framework for Speculation on the Next Thirty-Three Years, basically established this outlook that America and capitalism could do anything, thanks to basically technological inventiveness. Yeah, let's hear some of these. There was a list in that book: one hundred technical innovations very likely in the last third of the twentieth century. A hundred. Some of the first ten: multiple applications of lasers. Boom. High-strength structural materials. Nailed it, wouldn't you think? Alloys. New or improved materials for equipment and appliances. Now that's easy. Yeah, anyone can say there will be better materials and predict that one. Longer-range weather forecasting, more reliable weather forecasting. No, I don't know about that one. I think that was us. How about this? Here are a few of the other ones: new techniques for cheap and reliable birth control. For sure. Yeah.
The pill... I don't know if the pill was around... we should do a whole thing... it may have been the same year, because it came out in sixty-seven, was it? Yeah. Well, this book came out in sixty-seven, right. Widespread use of nuclear reactors for power. Duh. Improved capability to change the sex of children or adults, gender reassignment. We did a great episode on that. Pervasive business use of computers. Yeah, they're all over. Personal pagers. Yeah, they came in, what... And then one of the other ones was home computers to run households and communicate with the outside world. Yeah, the Internet of Things. Yeah.
They also predicted the rise of the credit economy. Really? Yeah, that we currently are in. Interesting. Yeah. So, and that was just a list, basically a sidebar in this book. But the whole idea, that America and capitalism in the West could invent their way out of any problem we possibly ran across in the future, was the premise, or the position, of this book, and it caused an enormous furor in academic circles. And not just academic circles, because this book was one of the first to introduce to the public that there were such things as think tanks, like RAND. Yeah, and that these people were sitting there thinking about the future and writing books about it. And it kind of became a hit thing. But the Club of Rome was basically diametrically opposed to the outlook that Herman Kahn had. And the Club of Rome was a business consortium that conspiracy theorists say is basically the seat of the New World Order. They are. And the Club of Rome basically said, no...
We were establishing the gloom-and-doom camp: that there is such a thing as resource depletion and overpopulation, and we are basically doomed. Yeah, I mean, we've covered this a lot on the show, different people who have made wild predictions about, we're going to run out of this by this year. Thomas Malthus. Yeah, very Malthusian. One of the books that came out of the Club of Rome in nineteen seventy-two was called The Limits to Growth, by Donella H. Meadows, Dennis Meadows, Jørgen Randers, and William Behrens at MIT. And they had a very dire, apocalyptic outlook on the future, as did a lot of other people at the time, and a lot of these dire predictions were way off base, which happens over and over again. Yeah. And so on the Club of Rome's website, they defend The Limits to Growth, basically saying that it's often mis-cited as predicting the collapse of civilization due to resource overuse, and it doesn't do that.
But they did use the same kinds of techniques that Herman Kahn and some of his other colleagues were coming up with, by taking population data, food-production data, industrial production, pollution, and non-renewable-resource consumption, and then running scenarios through this model that they built using computers. And the scenarios they came up with were kind of grim. The thing is, even though they missed the mark, they still helped establish a very young idea: that you can't just throw your McDonald's styrofoam on the ground, you can't drive a car that gets two miles per gallon. Like, we can't live like everything is just forever abundant, like there's no such thing as scarcity. Yeah, it's a double-edged sword, though. Like, I totally agree, but when you're wrong about these things, it also gives cynics something to point to, to say, well, gee, we didn't run out of oil in the early nineteen eighties like you said we would, so why do anything about this? Yeah, I mean, man, that is a great point. It's a very great point.
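The kind of scenario run described here, feeding starting conditions into a model and stepping it forward, can be sketched in miniature. This is nothing like the real World3 model behind The Limits to Growth, which tracked five interacting systems; this toy tracks a single non-renewable resource, and every number in it is invented:

```python
# Drastically simplified scenario model: a fixed resource stock is
# drained by consumption that grows a few percent a year. Running the
# same model under different assumptions gives different "scenarios."
# All starting values and rates are hypothetical.

def run_scenario(resource=1000.0, consumption=10.0, growth=0.03):
    """Step the model one year at a time; return the year of exhaustion."""
    year = 0
    while resource > 0:
        resource -= consumption
        consumption *= 1 + growth  # demand compounds annually
        year += 1
    return year

# Compare two scenarios: business-as-usual growth vs. slower growth.
print(run_scenario(growth=0.03), run_scenario(growth=0.01))
```

The point of a run like this isn't the exact year it prints; it's the comparison between scenarios, which is exactly how such models end up being both useful and easy to mis-cite as hard predictions.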
But at the same time, 329 00:20:10,960 --> 00:20:14,000 Speaker 1: what you're seeing here between The Limits to Growth and 330 00:20:14,119 --> 00:20:18,080 Speaker 1: The Year Two Thousand, um, we still see this 331 00:20:18,160 --> 00:20:22,119 Speaker 1: today with climate change. You know, it's like, let's do 332 00:20:22,200 --> 00:20:24,679 Speaker 1: something about climate change. Then other people say, no, we 333 00:20:24,720 --> 00:20:26,480 Speaker 1: can invent our way out of it. And besides, if 334 00:20:26,480 --> 00:20:28,840 Speaker 1: we do something about climate change, it's gonna mess with 335 00:20:28,840 --> 00:20:32,480 Speaker 1: the economy. And these people are saying, forget about the economy. 336 00:20:32,760 --> 00:20:35,640 Speaker 1: We're all going to die, or not necessarily forget about 337 00:20:35,640 --> 00:20:38,479 Speaker 1: the economy. But maybe you can do both, you know. 338 00:20:39,680 --> 00:20:41,600 Speaker 1: You know, my whole deal with that has always been 339 00:20:41,920 --> 00:20:45,560 Speaker 1: just like why why take that risk? Well, we humans 340 00:20:45,560 --> 00:20:49,359 Speaker 1: aren't very good at like preparing for future risk, which 341 00:20:49,400 --> 00:20:51,440 Speaker 1: is I think one of the reasons why futurologists 342 00:20:51,440 --> 00:20:55,640 Speaker 1: are so revered and awed but also mocked and scorned, 343 00:20:56,080 --> 00:20:59,080 Speaker 1: because they're doing something that almost flies in the 344 00:20:59,119 --> 00:21:02,520 Speaker 1: face of human nature. Yeah, you're really putting yourself out 345 00:21:02,520 --> 00:21:06,159 Speaker 1: there when you predict this stuff. Uh, that was one 346 00:21:06,200 --> 00:21:08,200 Speaker 1: other episode that this just reminded me of, the ten 347 00:21:08,200 --> 00:21:12,960 Speaker 1: thousand year clock. That was a great one.
So um, 348 00:21:13,040 --> 00:21:16,399 Speaker 1: the military, the United States military obviously has used it 349 00:21:16,440 --> 00:21:20,280 Speaker 1: for years. Um then beginning uh, when was this, in 350 00:21:20,320 --> 00:21:24,040 Speaker 1: the sixties or seventies, that business got into it. So 351 00:21:24,119 --> 00:21:29,160 Speaker 1: in seventy two, I think, Royal Dutch Shell heard, somebody 352 00:21:29,200 --> 00:21:31,120 Speaker 1: at the top heard, that there wasn't gonna be any 353 00:21:31,160 --> 00:21:36,040 Speaker 1: oil left, and they went yeah. Businesses basically said, 354 00:21:36,160 --> 00:21:39,880 Speaker 1: wait a minute, there are people that can actually use 355 00:21:40,000 --> 00:21:43,280 Speaker 1: models to determine what the future might look like. How 356 00:21:43,320 --> 00:21:46,040 Speaker 1: can we use that to make money. Well, let's throw 357 00:21:46,080 --> 00:21:48,800 Speaker 1: money at him and find out exactly. A couple of 358 00:21:48,840 --> 00:21:53,080 Speaker 1: other places too, um, that were nascent think tanks 359 00:21:53,080 --> 00:21:56,760 Speaker 1: like RAND, were the Stanford Research Institute Futures Group 360 00:21:56,760 --> 00:22:01,720 Speaker 1: and the California Institute of Technology. UM, early like kind 361 00:22:01,720 --> 00:22:07,000 Speaker 1: of think tank breeding grounds, just smart people walking around 362 00:22:07,080 --> 00:22:09,760 Speaker 1: thinking about the future. But that wasn't enough. Um, you 363 00:22:09,840 --> 00:22:12,399 Speaker 1: can't just say this is what I think it's going 364 00:22:12,480 --> 00:22:15,040 Speaker 1: to be like, you have to back it up. And 365 00:22:15,040 --> 00:22:16,840 Speaker 1: we'll talk about how they back it up right after this. 366 00:22:32,760 --> 00:22:36,800 Speaker 1: How do they back it up? Well, they use different techniques. 367 00:22:36,840 --> 00:22:39,399 Speaker 1: If you're a futurist or a futurologist.
You're going 368 00:22:39,480 --> 00:22:43,119 Speaker 1: to be using techniques that um are pretty recognizable. But 369 00:22:43,160 --> 00:22:45,240 Speaker 1: the way you put them together and the things you 370 00:22:45,359 --> 00:22:48,760 Speaker 1: sort out, um, is what's going to make you successful 371 00:22:48,840 --> 00:22:52,360 Speaker 1: or not successful. Right, So you, you like, brainstorm ideas, Yeah, 372 00:22:52,359 --> 00:22:55,120 Speaker 1: that's probably where you start. Yeah, just just like Blue 373 00:22:55,119 --> 00:22:58,880 Speaker 1: Sky Territory as they say, yeah, you um imagine things 374 00:22:58,960 --> 00:23:02,840 Speaker 1: using scenarios or games, apparently game theory. We've got 375 00:23:02,840 --> 00:23:05,159 Speaker 1: to do that at some point. Yeah, I've been avoiding it, 376 00:23:06,880 --> 00:23:10,600 Speaker 1: like we could mess it up really bad, but we'll 377 00:23:10,640 --> 00:23:15,960 Speaker 1: do it. Um. That that changed the the futurism field 378 00:23:16,160 --> 00:23:18,760 Speaker 1: tremendously when they came up with game theory, because it's 379 00:23:18,760 --> 00:23:21,920 Speaker 1: a pretty good way of predicting how people will work. 380 00:23:21,960 --> 00:23:26,479 Speaker 1: And that's one of the big confounding factors is you 381 00:23:26,520 --> 00:23:29,119 Speaker 1: can predict something, follow every single one of these steps 382 00:23:29,160 --> 00:23:32,359 Speaker 1: that we're talking about right now, um, and then people 383 00:23:32,400 --> 00:23:34,880 Speaker 1: will just cut to the left all of a sudden, 384 00:23:35,480 --> 00:23:39,200 Speaker 1: and your prediction just fell to the wayside because humanity 385 00:23:39,240 --> 00:23:42,719 Speaker 1: went this way real quick. Or somebody invented a game changer, 386 00:23:43,520 --> 00:23:47,680 Speaker 1: a game changing product or innovation that nobody saw coming.
Yeah, 387 00:23:47,720 --> 00:23:51,680 Speaker 1: what's that called disruptive technology, is it? Yeah, that's a 388 00:23:51,720 --> 00:23:54,360 Speaker 1: good one, I like that. Uh, not a bad band name. 389 00:23:55,320 --> 00:23:57,560 Speaker 1: Oh I wonder if it's not taken. If so, it's 390 00:23:57,560 --> 00:24:00,800 Speaker 1: made up of like Silicon Valley rich guys, like, it's my 391 00:24:00,880 --> 00:24:05,080 Speaker 1: side band. Right. Um, you want to gather professional 392 00:24:05,080 --> 00:24:08,760 Speaker 1: opinions using, say, the Delphi technique. You want to do 393 00:24:08,840 --> 00:24:13,480 Speaker 1: historical analysis. Current trends are huge and can help 394 00:24:13,520 --> 00:24:16,000 Speaker 1: you as well. And then like you were saying, I 395 00:24:16,000 --> 00:24:22,560 Speaker 1: think you call it back masking. No, that's uh, Turn 396 00:24:22,680 --> 00:24:26,520 Speaker 1: me on, dead man, right, Yeah, that's what they do. 397 00:24:26,560 --> 00:24:30,639 Speaker 1: They listen to the Beatles backwards. Uh, where, what 398 00:24:30,760 --> 00:24:33,960 Speaker 1: was it? It's not back masking, I know, but where 399 00:24:33,960 --> 00:24:36,760 Speaker 1: you envision the future and then you work 400 00:24:36,800 --> 00:24:40,680 Speaker 1: your way backwards from it. When you do this, you 401 00:24:40,840 --> 00:24:44,760 Speaker 1: do all this stuff together, and, again, back cast. Back casting.
402 00:24:45,200 --> 00:24:47,520 Speaker 1: And when you're when you're using this along with the 403 00:24:47,880 --> 00:24:51,440 Speaker 1: computer algorithms that can model like the economy or the weather, 404 00:24:52,200 --> 00:24:55,800 Speaker 1: um oil consumption or something like that, you can come 405 00:24:55,920 --> 00:24:58,840 Speaker 1: up with something that you could rightly say is a 406 00:24:58,880 --> 00:25:01,720 Speaker 1: prediction or forecast for the future, where we're going to 407 00:25:01,760 --> 00:25:07,000 Speaker 1: be. Again, though, um, just things happen, like for example, 408 00:25:07,280 --> 00:25:12,120 Speaker 1: Herman Kahn did not predict the UM the oil crisis 409 00:25:12,160 --> 00:25:15,720 Speaker 1: that came the year after he wrote another famous book 410 00:25:16,160 --> 00:25:19,879 Speaker 1: in nineteen seventy two. Um he wrote a response, I 411 00:25:19,880 --> 00:25:22,600 Speaker 1: think, to Limits to Growth, and just totally missed the 412 00:25:22,640 --> 00:25:25,800 Speaker 1: oil crisis. Man. But how could he predict that? Because 413 00:25:25,840 --> 00:25:28,920 Speaker 1: the oil crisis came out of the OPEC oil embargo 414 00:25:29,040 --> 00:25:31,600 Speaker 1: that was punishment for the US being involved in 415 00:25:31,640 --> 00:25:34,399 Speaker 1: the Yom Kippur War. So you couldn't see that coming. No, 416 00:25:34,640 --> 00:25:38,320 Speaker 1: And that's the big problem with futurology. Yes, exactly, 417 00:25:38,880 --> 00:25:41,040 Speaker 1: our own US government has been wrong, like the U 418 00:25:41,080 --> 00:25:44,479 Speaker 1: S Department of the Interior announced twice, in nineteen thirty nine 419 00:25:44,520 --> 00:25:47,360 Speaker 1: and then in nineteen fifty one, that we only had 420 00:25:47,440 --> 00:25:51,359 Speaker 1: thirteen years of oil left. So weird that both times 421 00:25:51,359 --> 00:25:54,480 Speaker 1: it was thirteen years.
They don't like to bother people, 422 00:25:54,480 --> 00:25:56,840 Speaker 1: so they wait until there's thirteen years left, and they 423 00:25:56,880 --> 00:26:02,600 Speaker 1: sound the alarm. It's just such a specific number. Uh. What else, Well, 424 00:26:02,640 --> 00:26:05,160 Speaker 1: we we've talked about Moore's law before. That has aged 425 00:26:05,200 --> 00:26:10,320 Speaker 1: a little better than some other futurology predictions because, uh 426 00:26:10,320 --> 00:26:13,840 Speaker 1: it has been revised over the years, which is sort 427 00:26:13,840 --> 00:26:16,320 Speaker 1: of a cheat a little bit. But still, what I 428 00:26:16,359 --> 00:26:19,760 Speaker 1: really meant was, I think he went from eighteen months 429 00:26:19,800 --> 00:26:22,399 Speaker 1: to two years or something like that. But what's funny 430 00:26:22,440 --> 00:26:25,680 Speaker 1: is Gerbis stakes his position in this article. He's saying, 431 00:26:25,720 --> 00:26:28,359 Speaker 1: like the Limits to Growth and the other Club of 432 00:26:28,480 --> 00:26:31,919 Speaker 1: Rome stuff, they missed the mark um because they predicted 433 00:26:32,240 --> 00:26:38,680 Speaker 1: catastrophe, and Moore's law predicts um technological innovation, so it's successful. 434 00:26:39,240 --> 00:26:43,480 Speaker 1: So clearly Gerbis agrees with the Herman Kahn group rather 435 00:26:43,520 --> 00:26:46,359 Speaker 1: than the Club of Rome group. I don't think it's settled. 436 00:26:46,440 --> 00:26:48,760 Speaker 1: I think you uh, you can't just say, like the 437 00:26:48,880 --> 00:26:52,159 Speaker 1: gloom and doom camp has just been completely eradicated or 438 00:26:52,240 --> 00:26:55,240 Speaker 1: proven wrong. Agreed, you know, yeah, Moore's law, I don't 439 00:26:55,240 --> 00:26:58,320 Speaker 1: even think we said specifically. It predicts the number of 440 00:26:58,359 --> 00:27:02,000 Speaker 1: transistors on integrated circuits and computers doubles every two years.
441 00:27:02,640 --> 00:27:04,560 Speaker 1: And like we said, it's been updated and it's been 442 00:27:04,600 --> 00:27:09,000 Speaker 1: pretty consistent. And so with Herman Kahn's popularity and then 443 00:27:09,040 --> 00:27:16,359 Speaker 1: the big high profile, um, book publishing argument that he 444 00:27:16,400 --> 00:27:18,520 Speaker 1: got in with the Club of Rome, that led to 445 00:27:18,600 --> 00:27:20,879 Speaker 1: like a spate of other futurology books. I 446 00:27:21,080 --> 00:27:23,000 Speaker 1: can remember it being a big deal. When I was 447 00:27:23,000 --> 00:27:25,080 Speaker 1: a kid, I remember a lot of people talking about 448 00:27:25,920 --> 00:27:28,720 Speaker 1: the near and far future. The one that I ran 449 00:27:28,760 --> 00:27:31,119 Speaker 1: across in this article that I had heard of, but 450 00:27:31,160 --> 00:27:34,800 Speaker 1: I didn't know anything about, was Alvin Toffler's Future Shock. I 451 00:27:34,880 --> 00:27:38,280 Speaker 1: can't remember that, I think. Did you read it? The cover, 452 00:27:38,400 --> 00:27:41,320 Speaker 1: I guarantee, would just give you nostalgia, I'm sure. Um, 453 00:27:41,359 --> 00:27:43,320 Speaker 1: but it came out in nineteen seventy, and it predicts 454 00:27:43,320 --> 00:27:48,200 Speaker 1: a future where too much rapid change, technological change and advancement, 455 00:27:48,760 --> 00:27:51,960 Speaker 1: happens too quickly, and people get all sorts of 456 00:27:52,080 --> 00:27:57,360 Speaker 1: stressed and just worn out and um basically have all 457 00:27:57,640 --> 00:28:01,680 Speaker 1: all manner of terrible um reactions to it. And I'm like, oh, 458 00:28:02,119 --> 00:28:06,120 Speaker 1: it's like he predicted two thousand fifteen, so like a person's 459 00:28:06,400 --> 00:28:11,720 Speaker 1: emotions couldn't handle it. Yeah, we're just overwhelmed. Too much 460 00:28:11,840 --> 00:28:16,200 Speaker 1: rapid technological innovation happens too quick.
Do you think we're overwhelmed? 461 00:28:16,720 --> 00:28:19,320 Speaker 1: Like I get stressed out by like say social media 462 00:28:19,440 --> 00:28:21,280 Speaker 1: or something like that. Yeah. I wonder if it's a 463 00:28:21,560 --> 00:28:24,960 Speaker 1: people of a certain age maybe, Yeah. I would guess 464 00:28:24,960 --> 00:28:26,719 Speaker 1: if you're born into it, you're used to it, so 465 00:28:27,200 --> 00:28:32,160 Speaker 1: it would probably more likely apply to a transition population, 466 00:28:32,359 --> 00:28:35,800 Speaker 1: like the transitional generation. Is that what we are? Don't 467 00:28:35,800 --> 00:28:37,800 Speaker 1: you get stressed by social media? Don't you get, like, 468 00:28:38,000 --> 00:28:41,120 Speaker 1: all tense? And uh yeah, I mean I kind of 469 00:28:41,160 --> 00:28:44,720 Speaker 1: just hate it. Or having, like, having all 470 00:28:44,720 --> 00:28:48,560 Speaker 1: this information, and it's just so thin content 471 00:28:48,640 --> 00:28:53,400 Speaker 1: wise or value wise, but there's tons of it and 472 00:28:53,440 --> 00:28:57,840 Speaker 1: it's always coming at you. Wears me out. I got 473 00:28:57,880 --> 00:29:02,600 Speaker 1: the future shock, Chuck, and the jimmy legs. Uh no, 474 00:29:02,760 --> 00:29:06,600 Speaker 1: I totally agree. I'm like that. I just want to 475 00:29:06,640 --> 00:29:14,640 Speaker 1: shut it all down, just everybody. Not podcasts, those should 476 00:29:14,640 --> 00:29:17,560 Speaker 1: live on. Uh. So we talked about science fiction writers 477 00:29:17,640 --> 00:29:20,920 Speaker 1: and how they are easily off the hook because they're 478 00:29:20,920 --> 00:29:24,040 Speaker 1: just writers, right, They're not supposed to predict the future. Um, 479 00:29:24,120 --> 00:29:26,640 Speaker 1: but they have been.
You can't dismiss it because they've 480 00:29:26,680 --> 00:29:28,800 Speaker 1: been on the money or close to it a lot 481 00:29:28,840 --> 00:29:31,360 Speaker 1: over the years. Because, like we said, they're not hampered 482 00:29:31,400 --> 00:29:36,200 Speaker 1: by the rational laws of today. They can just say 483 00:29:36,200 --> 00:29:39,240 Speaker 1: whatever they want and if they're wrong, it's like, hey, dude, 484 00:29:39,240 --> 00:29:43,200 Speaker 1: I'm just writing stuff. Yes this is fiction. Um. But 485 00:29:43,840 --> 00:29:47,680 Speaker 1: a few of the highlights, um. Jules Verne, mid nineteenth 486 00:29:47,720 --> 00:29:51,400 Speaker 1: century, predicted uh, going to the moon in a spacecraft. 487 00:29:52,040 --> 00:29:55,280 Speaker 1: Not only that, so he predicted it would be shot 488 00:29:55,280 --> 00:29:59,880 Speaker 1: out of a cannon basically. Um. But the thing that 489 00:30:00,680 --> 00:30:04,600 Speaker 1: he really got though, was that he placed the moon 490 00:30:04,760 --> 00:30:07,920 Speaker 1: shot in Florida, like a hundred and thirty seven miles 491 00:30:07,920 --> 00:30:10,960 Speaker 1: from Cape Canaveral, where they do launch rockets to the moon. 492 00:30:11,120 --> 00:30:13,360 Speaker 1: Not bad. And for the same reason, too, like that 493 00:30:13,440 --> 00:30:16,760 Speaker 1: it's close to the equator. Oh is that why? 494 00:30:16,800 --> 00:30:19,520 Speaker 1: It's one of the reasons why. Plus Cape Canaveral is 495 00:30:19,600 --> 00:30:24,000 Speaker 1: largely protected um by the Gulf Stream from hurricanes. Like 496 00:30:24,040 --> 00:30:27,440 Speaker 1: as a hurricane comes ashore right before it starts to 497 00:30:27,480 --> 00:30:31,040 Speaker 1: get to Canaveral, it goes out again and then hits 498 00:30:31,480 --> 00:30:35,040 Speaker 1: North Carolina.
Interesting, that would be an interesting conversation to 499 00:30:35,120 --> 00:30:38,160 Speaker 1: have been in on when they were picking the place, like 500 00:30:38,200 --> 00:30:40,480 Speaker 1: where should we launch this? I mean, where should we 501 00:30:40,480 --> 00:30:44,320 Speaker 1: put all of our money in? Right. Uh, H. G. Wells. 502 00:30:44,680 --> 00:30:48,960 Speaker 1: He predicted tanks. Yeah, he was. Supposedly he was the 503 00:30:49,000 --> 00:30:51,400 Speaker 1: first guy to really think of himself as a futurist. 504 00:30:51,720 --> 00:30:54,520 Speaker 1: He predicted the atom bomb in nineteen fourteen, aerial bombing in 505 00:30:54,600 --> 00:31:00,760 Speaker 1: nineteen o eight. What? The name robot was actually coined 506 00:31:00,760 --> 00:31:04,080 Speaker 1: by a science fiction writer, a Czech writer named Karel 507 00:31:04,240 --> 00:31:10,040 Speaker 1: Čapek, and um in nineteen twenty one he named robots. I 508 00:31:10,080 --> 00:31:14,880 Speaker 1: think the all time winner, though, is Hugo Gernsback, and 509 00:31:15,080 --> 00:31:18,120 Speaker 1: Hugo Gernsback. If you're into science fiction, you recognize his 510 00:31:18,160 --> 00:31:21,400 Speaker 1: first name because he's who the Hugo Award is named after. 511 00:31:21,720 --> 00:31:23,880 Speaker 1: You may also recognize his last name too, if you're 512 00:31:23,920 --> 00:31:27,600 Speaker 1: a Hugo Gernsback fan. But back in uh, I think 513 00:31:27,680 --> 00:31:31,960 Speaker 1: the nineteen tens. Yeah, he wrote a book called Ralph 514 00:31:32,000 --> 00:31:36,800 Speaker 1: 124C 41+. He predicted everything 515 00:31:36,840 --> 00:31:40,120 Speaker 1: in this. Yeah, you know what that means? It's actually 516 00:31:40,160 --> 00:31:46,280 Speaker 1: a play on words. Uh, 124C 41, it means one 517 00:31:46,320 --> 00:31:51,520 Speaker 1: to foresee for one, another. You get it? Wow? Yeah, 518 00:31:51,520 --> 00:31:54,040 Speaker 1: that's great.
One to foresee for one, and then 519 00:31:54,080 --> 00:31:58,000 Speaker 1: another is the plus sign. That alone, I was sold, like, 520 00:31:58,040 --> 00:32:00,280 Speaker 1: I love this guy. It's just like that Van Halen 521 00:32:00,320 --> 00:32:04,680 Speaker 1: album OU812, exactly. So what has 522 00:32:04,720 --> 00:32:08,720 Speaker 1: he predicted? He predicted solar power, like the realistic use 523 00:32:08,720 --> 00:32:13,960 Speaker 1: of solar power. He predicted plastics, video phones, tape recorders, 524 00:32:15,000 --> 00:32:22,720 Speaker 1: um jukeboxes, loudspeakers, tinfoil, rustproof steel, synthetic fabrics, all in 525 00:32:22,800 --> 00:32:26,520 Speaker 1: one book. And he's famous, and the Hugo Awards are named 526 00:32:26,560 --> 00:32:29,640 Speaker 1: after him, because he wanted to make science fiction more 527 00:32:29,720 --> 00:32:33,720 Speaker 1: science based, you know, using that same logic. So he 528 00:32:33,760 --> 00:32:37,200 Speaker 1: would have been a very um, like almost a father 529 00:32:37,480 --> 00:32:40,680 Speaker 1: of futurology. Oh yeah, for sure. You know, here's 530 00:32:40,720 --> 00:32:44,280 Speaker 1: a few other things from that book. Um, this 531 00:32:44,320 --> 00:32:46,280 Speaker 1: one to me, I'm surprised no one's done this yet. 532 00:32:46,320 --> 00:32:49,920 Speaker 1: The appetizer, which is at a restaurant, an 533 00:32:50,000 --> 00:32:53,320 Speaker 1: advanced, scientifically advanced restaurant, will be a room that 534 00:32:53,360 --> 00:32:55,400 Speaker 1: you wait in before you get your table that's flooded 535 00:32:55,400 --> 00:32:59,320 Speaker 1: with gases that make you hungry. Oh, not bad. Just 536 00:32:59,360 --> 00:33:02,360 Speaker 1: have a seat in the appetizer room, right, we'll be ready shortly.
537 00:33:03,240 --> 00:33:06,560 Speaker 1: It's like bloody fingernail scratches in the walls where 538 00:33:06,680 --> 00:33:08,720 Speaker 1: people are trying to get to the other room where 539 00:33:08,720 --> 00:33:12,080 Speaker 1: the food is. Uh. The tel, the telautograph, which is 540 00:33:12,120 --> 00:33:16,720 Speaker 1: basically a fax machine. Uh. The telephot, which was a 541 00:33:16,720 --> 00:33:21,000 Speaker 1: picture phone. UM, had a universal translator where it translates 542 00:33:21,040 --> 00:33:24,160 Speaker 1: any language right there in your hand. Not bad. 543 00:33:24,880 --> 00:33:30,200 Speaker 1: And then this one I love. The vacation city was 544 00:33:30,280 --> 00:33:34,160 Speaker 1: a suspended city, a domed suspended city twenty feet 545 00:33:34,160 --> 00:33:37,640 Speaker 1: in the air that used a device that nullified gravity. 546 00:33:37,760 --> 00:33:42,640 Speaker 1: And in vacation City, no mechanical devices are permitted, because 547 00:33:42,680 --> 00:33:45,280 Speaker 1: it was supposed to be a true escape 548 00:33:45,320 --> 00:33:47,880 Speaker 1: from the mechanized world. That's awesome. Waiting for that one. And this 549 00:33:47,960 --> 00:33:51,200 Speaker 1: was in nineteen eleven. He predicted just that there would 550 00:33:51,200 --> 00:33:53,360 Speaker 1: be a need for that. That's like that town in 551 00:33:53,520 --> 00:33:58,280 Speaker 1: um West Virginia, Green something, West Virginia, where the people 552 00:33:58,280 --> 00:34:01,200 Speaker 1: who have electromagnetic sensitivity go, because you're not allowed to 553 00:34:01,240 --> 00:34:05,160 Speaker 1: have any electromagnetic stuff because there's like a radio telescope 554 00:34:05,160 --> 00:34:09,160 Speaker 1: there or something that could be interfered with. And you 555 00:34:09,160 --> 00:34:13,399 Speaker 1: could be Amish. Can you just be Amish?
Like, hey, 556 00:34:13,400 --> 00:34:15,520 Speaker 1: I want to be Amish. If you're Harrison Ford, you 557 00:34:15,520 --> 00:34:21,360 Speaker 1: could be, yeah. Or Woody Harrelson, Yeah right, you got 558 00:34:21,400 --> 00:34:23,879 Speaker 1: anything else? How about these predictions for the future? There's 559 00:34:23,880 --> 00:34:25,920 Speaker 1: a couple in here that are kind of funny. Ten 560 00:34:26,000 --> 00:34:29,920 Speaker 1: predictions that missed the mark, and these are real predictions. 561 00:34:30,040 --> 00:34:33,800 Speaker 1: UM In nineteen fifty seven, U. S. News and World Report 562 00:34:33,840 --> 00:34:36,080 Speaker 1: said that by the end of the century, we will 563 00:34:36,280 --> 00:34:40,400 Speaker 1: launch our freight across the continent with missiles. Like you 564 00:34:40,480 --> 00:34:43,359 Speaker 1: order something from Amazon in New York, instead of having 565 00:34:43,360 --> 00:34:46,040 Speaker 1: a fulfillment center nearby, they just put it in a 566 00:34:46,080 --> 00:34:50,120 Speaker 1: missile and shoot it to you. Didn't happen. Um, no, 567 00:34:50,360 --> 00:34:54,320 Speaker 1: but drones are coming. Really? Are they still on that? Probably. 568 00:34:56,960 --> 00:35:02,800 Speaker 1: A guy named Alex Lewyt predicted nuclear powered vacuum cleaners. Uh, 569 00:35:02,840 --> 00:35:06,280 Speaker 1: this one I think would be pretty great, dissolving dishes. 570 00:35:07,520 --> 00:35:09,960 Speaker 1: And uh, asked what it would be like in the 571 00:35:10,000 --> 00:35:14,479 Speaker 1: year two thousand, a science writer named Waldemar Kaempffert. 572 00:35:15,160 --> 00:35:20,160 Speaker 1: That's a lot of consonants in a row, man. He's a fabulous science writer 573 00:35:20,239 --> 00:35:23,000 Speaker 1: with a funny name.
Um, he said, 573 00:35:23,040 --> 00:35:28,000 Speaker 1: you would basically, uh, put your plate in two hundred and 574 00:35:28,040 --> 00:35:29,839 Speaker 1: fifty degree water at the end and it would just 575 00:35:29,840 --> 00:35:36,879 Speaker 1: dissolve. No more dishwashing. Uh. Bucky Fuller predicted that 576 00:35:37,239 --> 00:35:40,920 Speaker 1: Canada would be a subtropical climate because we'd build a 577 00:35:40,960 --> 00:35:45,520 Speaker 1: dome over it. That didn't happen. No, it didn't, which 578 00:35:45,560 --> 00:35:49,279 Speaker 1: is strange because Bucky Fuller was pretty sharp, dude. Here's 579 00:35:49,280 --> 00:35:52,640 Speaker 1: another one. Um, was he really? Yeah, Buckminster Fuller. 580 00:35:52,800 --> 00:35:55,479 Speaker 1: Oh I didn't pick up on that. He's who bucky 581 00:35:55,560 --> 00:36:00,920 Speaker 1: balls are named after. Really? Why? I don't know. He 582 00:36:00,960 --> 00:36:04,839 Speaker 1: may have invented them, I'm not sure. It's those 583 00:36:04,920 --> 00:36:10,839 Speaker 1: little balls, the magnetic spheres that, like, you shape into things. Uh. 584 00:36:10,880 --> 00:36:14,320 Speaker 1: Here's one, a Scottish geneticist that um said in the 585 00:36:14,400 --> 00:36:17,759 Speaker 1: nineteen twenties that in the future, most of the 586 00:36:17,760 --> 00:36:21,360 Speaker 1: babies would not be born, only one third would be 587 00:36:21,400 --> 00:36:24,560 Speaker 1: born as a result of pregnancy, and the other babies 588 00:36:24,600 --> 00:36:27,360 Speaker 1: would be born in a lab. They would be grown, 589 00:36:27,440 --> 00:36:31,960 Speaker 1: basically. Ectogenesis. Uh, here's the last one, Chuck, you ready?
591 00:36:33,960 --> 00:36:39,520 Speaker 1: The Research Institute of America, which sounds pretty smart, said that, um, 592 00:36:39,640 --> 00:36:43,400 Speaker 1: by seventy five, I'm sorry, this is several years before that, 593 00:36:43,440 --> 00:36:48,520 Speaker 1: we would all be driving, um, personal helicopters. Did not 594 00:36:48,640 --> 00:36:51,120 Speaker 1: pan out. Probably never will. I don't know if i'd 595 00:36:51,120 --> 00:36:55,520 Speaker 1: want a personal helicopter, you know. I was, uh, for 596 00:36:55,560 --> 00:36:58,400 Speaker 1: Emily's birthday, I rented a cabin in the North Georgia Mountains. 597 00:36:58,840 --> 00:37:01,239 Speaker 1: Did you take a personal helicopter there? No. But 598 00:37:01,600 --> 00:37:04,200 Speaker 1: I was sitting on the deck, we all were, and 599 00:37:04,360 --> 00:37:06,720 Speaker 1: way across the valley on the side of a mountain 600 00:37:06,800 --> 00:37:10,920 Speaker 1: was this huge, huge house, and I heard the sound 601 00:37:10,920 --> 00:37:13,359 Speaker 1: of a helicopter, and I saw a blinking light. 602 00:37:13,760 --> 00:37:16,840 Speaker 1: I got out the binoculars and this dude had a 603 00:37:16,880 --> 00:37:20,680 Speaker 1: helicopter, and he took it and he flew it down 604 00:37:21,000 --> 00:37:23,560 Speaker 1: about two miles to the lake at the bottom of 605 00:37:23,560 --> 00:37:26,920 Speaker 1: the valley. And uh, I guess he has a lake 606 00:37:26,920 --> 00:37:29,520 Speaker 1: house and a mountain house, and the easiest way to 607 00:37:29,560 --> 00:37:31,960 Speaker 1: get there is just to make the four minute helicopter flight. 608 00:37:32,120 --> 00:37:35,520 Speaker 1: That's crazy. Yeah, it was pretty amazing. Wow. I want 609 00:37:35,520 --> 00:37:37,759 Speaker 1: to know who that guy is. And that guy could 610 00:37:37,800 --> 00:37:40,040 Speaker 1: be a lady. Yeah, what am I saying?
It could 611 00:37:40,040 --> 00:37:44,680 Speaker 1: be Carly Fiorina. Yeah, is that... she's the woman who's 612 00:37:44,760 --> 00:37:52,880 Speaker 1: running for the GOP presidential nomination? Oh right, Fiorina, that's right, gotcha? 613 00:37:53,520 --> 00:37:57,440 Speaker 1: Go ahead? Oh sorry, Uh, let's see. Well, if you 614 00:37:57,440 --> 00:37:59,480 Speaker 1: want to know more about futurology, you can type that 615 00:37:59,520 --> 00:38:01,960 Speaker 1: word into the search bar at how stuff works dot com. 616 00:38:02,040 --> 00:38:05,239 Speaker 1: And since Chuck had an anecdote about helicopters, it's time 617 00:38:05,280 --> 00:38:09,799 Speaker 1: for the listener mail. Uh, it sort of looked like 618 00:38:09,800 --> 00:38:12,359 Speaker 1: one of those Magnum P.I. ones too. Well, if 619 00:38:12,400 --> 00:38:14,520 Speaker 1: I did have a personal helicopter, it would look an 620 00:38:14,560 --> 00:38:17,719 Speaker 1: awful lot like that, I'm sure. What? Hey, guys, my 621 00:38:17,800 --> 00:38:19,600 Speaker 1: name is Shelby. I'm honored for you to be reading this. 622 00:38:19,920 --> 00:38:22,000 Speaker 1: My husband and I love your show, and you've solved 623 00:38:22,000 --> 00:38:24,120 Speaker 1: our dilemma of what to listen to in our car together. 624 00:38:24,520 --> 00:38:25,759 Speaker 1: I want to let you know you did a great 625 00:38:25,840 --> 00:38:28,160 Speaker 1: job on the HIV AIDS podcast. However, I think you 626 00:38:28,800 --> 00:38:31,600 Speaker 1: missed telling a really important story about the AIDS crisis. 627 00:38:31,640 --> 00:38:34,240 Speaker 1: Just before the AIDS crisis broke, a method for treating 628 00:38:34,239 --> 00:38:38,360 Speaker 1: hemophilia called clotting factor concentrate was developed. It finally 629 00:38:38,440 --> 00:38:41,279 Speaker 1: let those suffering from the disease live into adulthood and 630 00:38:41,280 --> 00:38:44,399 Speaker 1: completely changed the landscape of the disorder.
By the time 631 00:38:44,600 --> 00:38:47,560 Speaker 1: HIV was discovered to be a blood borne virus, many 632 00:38:47,560 --> 00:38:50,200 Speaker 1: of those suffering from hemophilia already had it, not to 633 00:38:50,280 --> 00:38:54,400 Speaker 1: mention that many also contracted hepatitis. However, the pharmaceutical companies 634 00:38:54,400 --> 00:38:56,680 Speaker 1: did not begin to pasteurize the drug, in spite of 635 00:38:56,680 --> 00:38:59,600 Speaker 1: their knowledge that it was spreading HIV, until a strong 636 00:38:59,640 --> 00:39:02,919 Speaker 1: public outcry prompted a government intervention. I think the story 637 00:39:03,000 --> 00:39:04,879 Speaker 1: is not told often enough, and the injustice that these 638 00:39:04,880 --> 00:39:09,120 Speaker 1: individuals suffered at the hands of big Pharma is undoubtedly 639 00:39:09,160 --> 00:39:11,640 Speaker 1: one of the greatest our country has seen. Uh. There's 640 00:39:11,640 --> 00:39:14,440 Speaker 1: an extremely informative and sad documentary on the topic called 641 00:39:14,520 --> 00:39:19,120 Speaker 1: Bad Blood: A Cautionary Tale. Anyway, that's about it, uh, 642 00:39:19,120 --> 00:39:22,840 Speaker 1: and I'm sorry if I bummed everyone out. That is 643 00:39:23,120 --> 00:39:28,840 Speaker 1: from Shelby. Shelby, thank you for not only that illuminating email, 644 00:39:29,640 --> 00:39:33,520 Speaker 1: but also the documentary recommendation. We're always looking for those. Absolutely. 645 00:39:33,960 --> 00:39:35,400 Speaker 1: If you want to get in touch with us, you 646 00:39:35,440 --> 00:39:37,759 Speaker 1: can tweet to us at s y s K Podcasts. 647 00:39:37,840 --> 00:39:39,880 Speaker 1: You can join us on Facebook dot com, slash stuff 648 00:39:39,880 --> 00:39:41,680 Speaker 1: you Should Know.
You can send us an email to 649 00:39:41,840 --> 00:39:44,720 Speaker 1: Stuff Podcasts at how stuff Works dot com, and, as always, 650 00:39:44,760 --> 00:39:46,960 Speaker 1: visit us at our home on the web, Stuff you Should 651 00:39:46,960 --> 00:39:54,239 Speaker 1: Know dot com. For more on this and thousands of 652 00:39:54,239 --> 00:40:04,000 Speaker 1: other topics, visit how stuff Works dot com