1 00:00:03,040 --> 00:00:07,080 Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio. 2 00:00:12,800 --> 00:00:14,920 Speaker 2: Hey you welcome to Stuff to Blow your Mind. My 3 00:00:15,000 --> 00:00:16,320 Speaker 2: name is Robert Lamb. 4 00:00:16,280 --> 00:00:19,079 Speaker 3: And I'm Joe McCormick, and we're back with part three 5 00:00:19,280 --> 00:00:23,080 Speaker 3: in our series on Future Shock, the title and the 6 00:00:23,120 --> 00:00:27,440 Speaker 3: subject of a best selling futurology book from more than 7 00:00:27,480 --> 00:00:30,720 Speaker 3: fifty years ago. If you haven't heard parts one and 8 00:00:30,760 --> 00:00:32,720 Speaker 3: two yet, you should probably go back and listen to 9 00:00:32,760 --> 00:00:35,720 Speaker 3: those first. That'll help you understand what we're talking about today. 10 00:00:36,159 --> 00:00:38,720 Speaker 3: But I thought we could start with a recap here 11 00:00:38,760 --> 00:00:41,879 Speaker 3: at the top. So Future Shock is the name of 12 00:00:41,920 --> 00:00:45,879 Speaker 3: a very influential book published in nineteen seventy by an 13 00:00:45,920 --> 00:00:49,360 Speaker 3: author named Alvin Toffler, who was known to be a 14 00:00:49,400 --> 00:00:53,159 Speaker 3: close collaborator with his wife, Heidi Toffler. So in this 15 00:00:53,280 --> 00:00:56,000 Speaker 3: series we've sometimes been speaking of the ideas of the 16 00:00:56,040 --> 00:00:59,680 Speaker 3: Tofflers rather than just Alvin, though originally he was credited 17 00:00:59,680 --> 00:01:02,480 Speaker 3: as the sole author of the book. And I would 18 00:01:02,520 --> 00:01:06,360 Speaker 3: describe the main idea of the book Future Shock as follows. 
So, 19 00:01:07,040 --> 00:01:10,360 Speaker 3: of course, the human technological tool set is always changing 20 00:01:10,480 --> 00:01:14,240 Speaker 3: to some degree, but the age beginning in like the 21 00:01:14,280 --> 00:01:17,880 Speaker 3: second half of the twentieth century is truly a special 22 00:01:17,920 --> 00:01:21,160 Speaker 3: time in history. It's a time when technology is developing 23 00:01:21,360 --> 00:01:24,560 Speaker 3: much much faster than ever before. I think there are 24 00:01:24,560 --> 00:01:26,560 Speaker 3: a lot of data points you could use to show 25 00:01:26,600 --> 00:01:29,600 Speaker 3: that it's not just that it feels this way, but 26 00:01:29,680 --> 00:01:34,320 Speaker 3: this is objectively true, Like you could measure the rate 27 00:01:34,360 --> 00:01:38,360 Speaker 3: of acceleration of energy consumed around the world, the accelerating 28 00:01:38,480 --> 00:01:43,240 Speaker 3: number of patents issued, the productivity in various industries per worker, 29 00:01:44,240 --> 00:01:49,440 Speaker 3: the acceleration of time spent on technologically mediated activities, and 30 00:01:49,480 --> 00:01:53,600 Speaker 3: so forth. And the authors of this book argue that 31 00:01:53,680 --> 00:01:57,200 Speaker 3: these changes are so drastic we should think of the 32 00:01:57,720 --> 00:02:01,360 Speaker 3: time beginning in the mid twentieth century as a totally 33 00:02:01,480 --> 00:02:05,680 Speaker 3: new technology regime. Maybe the previous regimes began with the 34 00:02:05,720 --> 00:02:09,800 Speaker 3: transition from hunter gatherer lifestyle to agriculture, and then after 35 00:02:09,840 --> 00:02:13,600 Speaker 3: that the transition to the industrial age through inventions like 36 00:02:13,639 --> 00:02:16,240 Speaker 3: the steam engine, and then this would be the third one. 
37 00:02:16,280 --> 00:02:18,560 Speaker 3: And this new age that we exist in they call 38 00:02:18,639 --> 00:02:24,520 Speaker 3: the super Industrial Age, and the acceleration in technological change 39 00:02:24,840 --> 00:02:27,800 Speaker 3: that characterizes this age may of course come with lots 40 00:02:27,800 --> 00:02:30,600 Speaker 3: of benefits for human life. You can easily point to 41 00:02:30,880 --> 00:02:35,480 Speaker 3: medical advances that make human life longer and help 42 00:02:35,480 --> 00:02:38,239 Speaker 3: people live with less illness, and you know, all kinds 43 00:02:38,280 --> 00:02:43,359 Speaker 3: of things that are pretty unambiguous positive impacts on human life. 44 00:02:43,400 --> 00:02:47,280 Speaker 3: But the Tofflers argue that these changes also come with 45 00:02:47,360 --> 00:02:51,679 Speaker 3: a profound cost that we have to understand and prepare for. 46 00:02:52,160 --> 00:02:54,280 Speaker 3: And the cost that they focus on in the book 47 00:02:54,400 --> 00:02:59,000 Speaker 3: is a mass psychological condition that they call future shock, 48 00:02:59,240 --> 00:03:02,840 Speaker 3: and they compare future shock to the pre existing idea 49 00:03:02,880 --> 00:03:06,320 Speaker 3: of culture shock, which is the state of anxiety and 50 00:03:06,360 --> 00:03:09,480 Speaker 3: psychic distress brought on when somebody is plunged into an 51 00:03:09,600 --> 00:03:13,200 Speaker 3: unfamiliar culture where they don't speak the language, they don't 52 00:03:13,240 --> 00:03:16,080 Speaker 3: understand the laws or customs, they don't know how to 53 00:03:16,120 --> 00:03:19,760 Speaker 3: interact with anything. They say future shock is like that, 54 00:03:20,080 --> 00:03:24,520 Speaker 3: but for one's own culture, as it changes rapidly around 55 00:03:24,600 --> 00:03:27,080 Speaker 3: us due to the effects of the new science and 56 00:03:27,120 --> 00:03:30,240 Speaker 3: the new machines.
And they say this new environment will 57 00:03:30,280 --> 00:03:36,240 Speaker 3: be characterized primarily by transience, novelty, and diversity. So transience 58 00:03:36,600 --> 00:03:41,240 Speaker 3: things, situations coming into and out of being faster and faster, 59 00:03:41,880 --> 00:03:46,200 Speaker 3: arrangements lasting for shorter periods of time, of course, novelty 60 00:03:46,400 --> 00:03:48,840 Speaker 3: meaning new things you've never had to deal with before, 61 00:03:49,200 --> 00:03:52,000 Speaker 3: and diversity meaning just a lot of different things to 62 00:03:52,160 --> 00:03:55,600 Speaker 3: understand and decide between and choose from. And so the 63 00:03:55,600 --> 00:03:59,000 Speaker 3: Tofflers write that these changes in technology have profound effects, 64 00:03:59,760 --> 00:04:02,320 Speaker 3: not just on the gadgets we deal with, but you know, 65 00:04:02,360 --> 00:04:06,640 Speaker 3: they've got these secondary effects that revolutionize our work lives, 66 00:04:06,680 --> 00:04:09,920 Speaker 3: our family lives, our minds, and our culture. And that 67 00:04:09,960 --> 00:04:14,680 Speaker 3: the accelerating rate of change alters our culture faster than 68 00:04:14,720 --> 00:04:17,400 Speaker 3: most people are able to adapt, so we can never 69 00:04:17,560 --> 00:04:20,800 Speaker 3: get used to it. We can never grow accustomed to 70 00:04:20,880 --> 00:04:24,239 Speaker 3: the new normal as people can when overcoming culture shock, 71 00:04:24,279 --> 00:04:27,279 Speaker 3: because with culture shock you can eventually maybe learn the 72 00:04:27,360 --> 00:04:31,839 Speaker 3: language and learn the local norms and adapt. Or, conversely, 73 00:04:32,120 --> 00:04:34,920 Speaker 3: you can just go home.
With future shock, you can't 74 00:04:34,920 --> 00:04:36,719 Speaker 3: ever really do that because you can't go back to 75 00:04:36,760 --> 00:04:40,200 Speaker 3: the past, and by the time you get used to it, 76 00:04:40,240 --> 00:04:42,599 Speaker 3: you learn the new language and the new customs, it 77 00:04:42,640 --> 00:04:45,760 Speaker 3: has changed again. And technology will just keep changing the 78 00:04:45,760 --> 00:04:49,279 Speaker 3: world faster and faster, so we can never keep up. 79 00:04:49,839 --> 00:04:54,920 Speaker 3: And they say this leads to a widespread sense of unease, anxiety, frustration, 80 00:04:55,080 --> 00:04:58,760 Speaker 3: and confusion, the future shock that defines our age. 81 00:04:59,400 --> 00:05:01,719 Speaker 2: You know, I have to throw in here that in 82 00:05:01,760 --> 00:05:04,640 Speaker 2: the first episode I mentioned that when I explained the 83 00:05:04,680 --> 00:05:06,320 Speaker 2: concept again to my wife, she was like, Oh, well, 84 00:05:06,480 --> 00:05:08,440 Speaker 2: that's not real. I don't really believe that's the thing. 85 00:05:09,120 --> 00:05:12,919 Speaker 2: But then a listener wrote in and encouraged everyone to 86 00:05:13,000 --> 00:05:17,320 Speaker 2: watch a particular episode of SpongeBob SquarePants, an episode 87 00:05:17,400 --> 00:05:20,200 Speaker 2: or part of an episode titled SB one twenty nine, in 88 00:05:20,240 --> 00:05:24,240 Speaker 2: which Squidward travels into the future and is overcome by 89 00:05:24,279 --> 00:05:28,280 Speaker 2: all of the chrome technological advancements around him, including multiple 90 00:05:28,320 --> 00:05:31,400 Speaker 2: SpongeBob clones, curls up in a ball on the floor 91 00:05:31,440 --> 00:05:34,960 Speaker 2: and begins to go "future, future" over and over again.
92 00:05:35,880 --> 00:05:38,479 Speaker 2: My family watched this episode over the weekend, and my 93 00:05:38,520 --> 00:05:40,919 Speaker 2: wife tells me that now she understands future shock. 94 00:05:41,120 --> 00:05:44,279 Speaker 3: Oh okay, wait, she understands it, but still thinks that 95 00:05:44,360 --> 00:05:46,680 Speaker 3: it does not describe the reality we live in? Or 96 00:05:46,720 --> 00:05:49,200 Speaker 3: she does think it describes the reality we live in? 97 00:05:49,720 --> 00:05:53,720 Speaker 2: I would say that she understands the concept and sees 98 00:05:53,839 --> 00:05:57,320 Speaker 2: how you could apply that concept to some of our 99 00:05:57,360 --> 00:05:59,680 Speaker 2: interactions with technology today. 100 00:06:00,160 --> 00:06:04,240 Speaker 3: Okay, so she's partially converted, thinks it is partially descriptive 101 00:06:04,279 --> 00:06:04,960 Speaker 3: of reality. 102 00:06:05,360 --> 00:06:07,400 Speaker 2: Yeah, you can't argue with Squidward on this one. 103 00:06:07,800 --> 00:06:09,400 Speaker 3: I'd say that's where I am. I think it is 104 00:06:09,480 --> 00:06:12,200 Speaker 3: partially correct, partially descriptive of our reality. 105 00:06:12,920 --> 00:06:13,320 Speaker 4: Yeah. 106 00:06:13,520 --> 00:06:15,880 Speaker 3: Well, anyway, in this series, we've been taking a look 107 00:06:15,920 --> 00:06:20,039 Speaker 3: at this fifty three year old book of futurology to 108 00:06:20,120 --> 00:06:22,039 Speaker 3: see what we think about it, what we think the 109 00:06:22,080 --> 00:06:24,640 Speaker 3: Tofflers were right about, what we think they were wrong about, 110 00:06:25,320 --> 00:06:27,880 Speaker 3: if there's any empirical evidence that can be brought in 111 00:06:27,960 --> 00:06:31,559 Speaker 3: to assess the accuracy of their predictions, and so forth.
112 00:06:31,680 --> 00:06:34,039 Speaker 3: And so we've already talked about a number of their 113 00:06:34,560 --> 00:06:38,120 Speaker 3: specific predictions about the future. Though to be fair, I 114 00:06:38,160 --> 00:06:40,640 Speaker 3: do recall there's a part in the book where Toffler says 115 00:06:41,040 --> 00:06:44,839 Speaker 3: these are not quote predictions. You know, so when I say, like, 116 00:06:45,240 --> 00:06:47,160 Speaker 3: in the future, we're going to be living a lot 117 00:06:47,200 --> 00:06:49,880 Speaker 3: more underneath the ocean and having to learn to navigate 118 00:06:49,960 --> 00:06:53,279 Speaker 3: submarine environments and stuff, for some reason, he says they're 119 00:06:53,279 --> 00:06:57,160 Speaker 3: not predictions. They're more just sort of like imaginings, and 120 00:06:57,520 --> 00:07:01,480 Speaker 3: we shouldn't be overly concerned with whether the specifics are right, 121 00:07:01,600 --> 00:07:06,240 Speaker 3: but instead should think about the general trend. Okay, fair enough, Alvin. 122 00:07:06,600 --> 00:07:10,200 Speaker 3: But in fact, the way we use the word prediction, 123 00:07:10,360 --> 00:07:12,560 Speaker 3: I do think these are predictions, so I think it's 124 00:07:12,560 --> 00:07:14,559 Speaker 3: fair to describe them that way, even though he didn't 125 00:07:14,600 --> 00:07:15,480 Speaker 3: like using that word. 126 00:07:16,240 --> 00:07:17,880 Speaker 2: Yeah, I kind of see where he's coming from. Like, 127 00:07:17,920 --> 00:07:23,160 Speaker 2: he's not saying we will definitely put embryos on spaceships 128 00:07:23,160 --> 00:07:26,520 Speaker 2: and send them to other planets, but he points to 129 00:07:26,600 --> 00:07:30,880 Speaker 2: that as sort of like, in the tree of conceivable possibilities 130 00:07:30,920 --> 00:07:35,320 Speaker 2: based on like unchecked scientific advancement, that's somewhere we might 131 00:07:35,440 --> 00:07:38,840 Speaker 2: get to.
And if we do get there, are we 132 00:07:39,160 --> 00:07:42,760 Speaker 2: really ready to deal with that? And I think for the 133 00:07:42,800 --> 00:07:45,480 Speaker 2: most part he tends to lay out those ideas in 134 00:07:45,520 --> 00:07:48,600 Speaker 2: a very sort of neutral fashion, though again, it is 135 00:07:48,640 --> 00:07:52,320 Speaker 2: a book written in the late sixties, published in nineteen seventy, 136 00:07:52,400 --> 00:07:56,440 Speaker 2: and some of that cultural texture is there, as we've discussed. Agreed. 137 00:07:56,960 --> 00:08:00,560 Speaker 3: But anyway, in the previous episodes, we've talked about 138 00:08:00,600 --> 00:08:03,120 Speaker 3: these things, which I do think are fair to call predictions, 139 00:08:03,160 --> 00:08:08,120 Speaker 3: but you're right, he couches them in a more nuanced way, 140 00:08:09,120 --> 00:08:13,200 Speaker 3: but they range from quite thoughtful and accurately predictive to 141 00:08:13,400 --> 00:08:17,400 Speaker 3: totally wrong in extremely funny ways. In the former column, 142 00:08:17,640 --> 00:08:19,280 Speaker 3: one thing that stands out to me, I mean, it's 143 00:08:19,280 --> 00:08:21,440 Speaker 3: often been pointed out that, wow, you know, he really 144 00:08:21,480 --> 00:08:25,360 Speaker 3: sort of got a lot right about things like personal 145 00:08:25,400 --> 00:08:29,800 Speaker 3: computers and the internet and how media would change over time. 146 00:08:29,920 --> 00:08:33,280 Speaker 3: Like the rising importance of electronic media more and more 147 00:08:33,280 --> 00:08:35,640 Speaker 3: in human life. I think he was on track about 148 00:08:35,679 --> 00:08:37,760 Speaker 3: a lot of that.
One that really stood out to 149 00:08:37,800 --> 00:08:41,480 Speaker 3: me in the book is essentially the prediction of personally 150 00:08:41,600 --> 00:08:45,880 Speaker 3: curated news feeds as opposed to having to read the 151 00:08:45,920 --> 00:08:49,200 Speaker 3: same newspaper or watch the same evening news as 152 00:08:49,240 --> 00:08:53,720 Speaker 3: everybody else, which I think is a profoundly meaningful development 153 00:08:53,800 --> 00:08:56,120 Speaker 3: that I'm not so sure would have been easy to 154 00:08:56,280 --> 00:08:59,520 Speaker 3: predict in nineteen seventy. But as an example of the 155 00:08:59,559 --> 00:09:03,160 Speaker 3: things that they got wrong, you know, there is again 156 00:09:03,200 --> 00:09:05,280 Speaker 3: like leaning heavily on the idea that way more of 157 00:09:05,360 --> 00:09:08,000 Speaker 3: human life would shift to take place in the ocean 158 00:09:08,240 --> 00:09:11,240 Speaker 3: or in space. That hasn't really panned out yet and 159 00:09:11,280 --> 00:09:13,920 Speaker 3: I'm kind of doubtful whether it will. And there's a 160 00:09:13,920 --> 00:09:16,680 Speaker 3: bunch of stuff about the changing biomedical environment. Like, 161 00:09:16,840 --> 00:09:19,440 Speaker 3: they say a lot about cloning, which in some ways 162 00:09:19,480 --> 00:09:24,040 Speaker 3: I think is pretty accurate about like where the technological 163 00:09:24,080 --> 00:09:28,120 Speaker 3: capability could be going, but wrong about the way that 164 00:09:28,160 --> 00:09:31,480 Speaker 3: it would impact culture and the possibilities it would provide 165 00:09:31,520 --> 00:09:32,520 Speaker 3: to the average person. 166 00:09:33,000 --> 00:09:33,160 Speaker 4: Yeah. 167 00:09:33,280 --> 00:09:37,640 Speaker 2: Yeah, the cloning talk is very interesting. To cite a 168 00:09:37,679 --> 00:09:40,920 Speaker 2: couple of details from it, I mentioned the spaceships already.
169 00:09:40,920 --> 00:09:43,360 Speaker 2: That's definitely something that has brought up the idea that, well, 170 00:09:45,360 --> 00:09:47,680 Speaker 2: embryos just weigh less than people, so it's cheaper 171 00:09:47,720 --> 00:09:50,920 Speaker 2: to blast those into space, and presumably robots 172 00:09:50,920 --> 00:09:53,120 Speaker 2: will raise them and grow them when they get to 173 00:09:53,120 --> 00:09:56,559 Speaker 2: where they're going. But there's a bit where 174 00:09:56,600 --> 00:10:01,800 Speaker 2: they're citing molecular biologist Joshua Lederberg, and they raise the 175 00:10:01,880 --> 00:10:05,000 Speaker 2: problem that narcissists will be the most likely to clone 176 00:10:05,040 --> 00:10:10,000 Speaker 2: themselves, and this could result in just more narcissists. Though 177 00:10:10,200 --> 00:10:12,280 Speaker 2: Toffler mentions this is really more of a concern if 178 00:10:12,400 --> 00:10:17,480 Speaker 2: narcissism is biologically transferable rather than culturally transferable. And I 179 00:10:17,520 --> 00:10:19,680 Speaker 2: started looking into this a little bit, and I realized 180 00:10:19,720 --> 00:10:22,240 Speaker 2: we might have to come back and talk about narcissism 181 00:10:22,240 --> 00:10:24,439 Speaker 2: in more detail, because it looks like there have been 182 00:10:24,520 --> 00:10:30,600 Speaker 2: studies about narcissism as possibly being like something you can inherit. 183 00:10:30,720 --> 00:10:33,600 Speaker 2: But I'm not sure where we currently are regarding the 184 00:10:33,679 --> 00:10:35,800 Speaker 2: nature-nurture discussion of narcissism. 185 00:10:36,240 --> 00:10:39,199 Speaker 3: I mean, if it's like most personality traits that I've 186 00:10:39,240 --> 00:10:41,679 Speaker 3: read about, it's going to be a mix.
It will 187 00:10:41,760 --> 00:10:45,480 Speaker 3: usually be that there's some kind of a 188 00:10:45,559 --> 00:10:49,480 Speaker 3: heritable level of predisposition, but then that doesn't get 189 00:10:49,520 --> 00:10:51,240 Speaker 3: you all the way there. That just sort of like 190 00:10:51,320 --> 00:10:55,320 Speaker 3: somewhat increases or decreases the chance that a person will 191 00:10:55,559 --> 00:10:59,240 Speaker 3: in a given environment develop narcissism. And then there's 192 00:10:59,240 --> 00:11:03,079 Speaker 3: probably a huge influence of like you know, experience, childhood, 193 00:11:03,200 --> 00:11:04,520 Speaker 3: upbringing and things like that. 194 00:11:05,280 --> 00:11:07,720 Speaker 2: Yeah, it seems like there's a lot to work out 195 00:11:07,760 --> 00:11:11,559 Speaker 2: before we can actually make an argument for heritability of narcissism. 196 00:11:11,679 --> 00:11:15,920 Speaker 2: But the angle is interesting, I think, to me in 197 00:11:16,040 --> 00:11:19,280 Speaker 2: light of the TV series Foundation, which I don't know 198 00:11:19,280 --> 00:11:22,319 Speaker 2: how many listeners out there have been watching this. I 199 00:11:22,320 --> 00:11:25,360 Speaker 2: guess caveat that we did read some ads for Foundation 200 00:11:25,520 --> 00:11:29,320 Speaker 2: on the show, but afterwards we got into watching it 201 00:11:29,360 --> 00:11:31,080 Speaker 2: and it's really a great show. 202 00:11:31,360 --> 00:11:32,800 Speaker 3: But you're not being paid to say this. 203 00:11:32,960 --> 00:11:34,360 Speaker 2: I'm not being paid to say this, but it's a 204 00:11:34,440 --> 00:11:37,760 Speaker 2: terrific show.
But there's this whole angle that's apparently not 205 00:11:37,800 --> 00:11:42,120 Speaker 2: in the Asimov books but is quite fascinating on the series, 206 00:11:42,160 --> 00:11:45,000 Speaker 2: and that is that you have this genetic dynasty in 207 00:11:45,040 --> 00:11:47,920 Speaker 2: the Empire where you see a procession of Cleon the 208 00:11:47,960 --> 00:11:54,480 Speaker 2: First clones that rule this vast interstellar civilization, and of course, 209 00:11:55,080 --> 00:11:58,360 Speaker 2: many, if not most, of them are narcissists. But 210 00:11:58,520 --> 00:12:00,160 Speaker 2: you can also raise the question like how much of 211 00:12:00,160 --> 00:12:03,160 Speaker 2: that is tied up in the genes of Cleon the 212 00:12:03,200 --> 00:12:05,280 Speaker 2: First and how much of it is the way that 213 00:12:05,360 --> 00:12:09,240 Speaker 2: Cleon is raised, because, you know, they're raised. 214 00:12:09,320 --> 00:12:11,880 Speaker 2: Each one is raised to be the emperor of an 215 00:12:11,880 --> 00:12:13,880 Speaker 2: interstellar civilization. 216 00:12:13,760 --> 00:12:16,360 Speaker 3: And presumably raised by other narcissists. 217 00:12:17,040 --> 00:12:20,200 Speaker 2: Yeah, you actually do, because you have three at a time. 218 00:12:20,600 --> 00:12:23,680 Speaker 2: So there's a brother Dawn, a brother Day, and 219 00:12:23,720 --> 00:12:26,720 Speaker 2: a brother Dusk. There's like a young version of the clone, 220 00:12:27,400 --> 00:12:29,880 Speaker 2: an adult version of the clone, and like an elderly 221 00:12:30,000 --> 00:12:32,040 Speaker 2: version of the clone, with the middle clone being the 222 00:12:32,080 --> 00:12:38,840 Speaker 2: primary ruler, and so Day and Dusk are essentially raising Dawn.
Anyway, 223 00:12:38,840 --> 00:12:41,440 Speaker 2: we'll put a pin in that for later discussion. But 224 00:12:42,200 --> 00:12:44,839 Speaker 2: in general, the Tofflers do raise concerns over a number 225 00:12:44,840 --> 00:12:49,120 Speaker 2: of quote unquote birth technologies, especially the idea of engineering 226 00:12:49,120 --> 00:12:52,520 Speaker 2: certain properties into children. And this also crosses over into 227 00:12:52,600 --> 00:12:55,679 Speaker 2: concerns that are raised elsewhere about the future of the family, 228 00:12:55,720 --> 00:12:57,760 Speaker 2: and we talked about that in the previous episode. How 229 00:12:58,000 --> 00:13:00,360 Speaker 2: a lot of these concerns about like the future of 230 00:13:00,360 --> 00:13:03,720 Speaker 2: the family haven't really panned out, like, you know, the 231 00:13:03,760 --> 00:13:06,040 Speaker 2: idea that, well, people will just be raised in 232 00:13:06,080 --> 00:13:09,400 Speaker 2: communes and so forth. Now I feel like there are 233 00:13:09,640 --> 00:13:12,920 Speaker 2: sort of three things going into modern considerations of the 234 00:13:13,040 --> 00:13:17,240 Speaker 2: birth technology thing though, because on one hand, as we discussed, 235 00:13:17,240 --> 00:13:20,000 Speaker 2: perhaps the alarm was raised early enough and we have 236 00:13:20,120 --> 00:13:23,839 Speaker 2: been largely more careful in this area of technological advancement. 237 00:13:24,320 --> 00:13:26,680 Speaker 2: But I think we also might consider that we're just 238 00:13:26,760 --> 00:13:30,559 Speaker 2: not yet at the point of real crisis with birth technologies, 239 00:13:30,600 --> 00:13:33,120 Speaker 2: at least not at scale.
And also, you know, the 240 00:13:33,120 --> 00:13:36,440 Speaker 2: Tofflers might not be fully recognizing the overall benefits of 241 00:13:36,440 --> 00:13:39,440 Speaker 2: birth technologies, and that we might be less culturally open 242 00:13:39,480 --> 00:13:42,040 Speaker 2: to drastic changes in the basic family unit. After all, 243 00:13:42,520 --> 00:13:45,920 Speaker 2: I don't know, but I do know people, real people 244 00:13:45,960 --> 00:13:48,319 Speaker 2: in the world who have benefited from strategies you might 245 00:13:48,360 --> 00:13:52,400 Speaker 2: call birth technologies. And I wouldn't say it 246 00:13:52,440 --> 00:13:55,560 Speaker 2: felt super future shocky to me. It just feels like, well, 247 00:13:55,920 --> 00:13:58,920 Speaker 2: there are perhaps some extra options available today, and people 248 00:13:58,920 --> 00:14:00,679 Speaker 2: take advantage of those options if they can. 249 00:14:01,360 --> 00:14:03,520 Speaker 3: Yes. And you could also argue that the fact that 250 00:14:03,559 --> 00:14:08,439 Speaker 3: we're not currently surrounded by human clones could be an 251 00:14:08,440 --> 00:14:11,680 Speaker 3: indication that, like the Tofflers suggest in their sort of 252 00:14:11,720 --> 00:14:16,320 Speaker 3: solutions section of the book, it could be indicative of 253 00:14:16,360 --> 00:14:20,520 Speaker 3: the fact that, you know, the bodies governing scientific 254 00:14:20,560 --> 00:14:25,440 Speaker 3: research saw this change coming and were able to essentially 255 00:14:25,960 --> 00:14:29,200 Speaker 3: get enough thought and discussion out there early enough that 256 00:14:30,640 --> 00:14:33,960 Speaker 3: there hasn't been much temptation for scientists to experiment with 257 00:14:34,040 --> 00:14:35,840 Speaker 3: human cloning, at least not in the open. 258 00:14:36,400 --> 00:14:36,600 Speaker 4: Yeah.
259 00:14:36,920 --> 00:14:42,160 Speaker 2: Never underestimate humanity's power to fear change as well as 260 00:14:42,200 --> 00:14:43,920 Speaker 2: avoid doing anything with it. 261 00:14:44,320 --> 00:14:46,240 Speaker 3: But unfortunately, to come back on the other side, I 262 00:14:46,240 --> 00:14:50,920 Speaker 3: think a lot of times seeing a potentially dangerous or 263 00:14:50,960 --> 00:14:53,760 Speaker 3: disruptive change coming is not enough if there are people 264 00:14:53,880 --> 00:14:56,760 Speaker 3: who have powerful individual incentives to pursue it. 265 00:14:56,800 --> 00:14:57,200 Speaker 5: Anyway. 266 00:14:57,840 --> 00:15:00,480 Speaker 2: Yeah, for the record, I think clone babies, they're fine. 267 00:15:00,680 --> 00:15:04,400 Speaker 2: I'm pro clone babies now. 268 00:15:04,480 --> 00:15:07,080 Speaker 3: One thing we got into at length in part two 269 00:15:07,680 --> 00:15:10,880 Speaker 3: was a paper that tried to use empirical research from 270 00:15:10,880 --> 00:15:14,280 Speaker 3: the last fifty years to assess the general predictions that 271 00:15:14,760 --> 00:15:19,240 Speaker 3: the Tofflers made about how we will use time 272 00:15:19,040 --> 00:15:19,640 Speaker 5: in the future. 273 00:15:19,680 --> 00:15:23,880 Speaker 3: So this primarily concerns their prediction that the future would 274 00:15:23,920 --> 00:15:27,280 Speaker 3: be characterized by greater and greater transience in life, a 275 00:15:27,360 --> 00:15:30,960 Speaker 3: sort of general shortening of the length of episodes both 276 00:15:31,040 --> 00:15:36,000 Speaker 3: large and small, and a shortening of commitments throughout life.
277 00:15:36,160 --> 00:15:39,440 Speaker 3: The paper tried to compare these predictions to actual time 278 00:15:39,520 --> 00:15:43,160 Speaker 3: use studies and concluded that for the parts we could check, 279 00:15:43,280 --> 00:15:46,320 Speaker 3: the Tofflers' predictions about time use were actually pretty good. 280 00:15:46,480 --> 00:15:51,200 Speaker 3: Like we have seen time use become more fragmented, which 281 00:15:51,240 --> 00:15:54,920 Speaker 3: you could characterize as having more transience to it, 282 00:15:55,440 --> 00:15:59,000 Speaker 3: more fragmented, more irregular, and more overlapped, and people do 283 00:16:00,320 --> 00:16:04,560 Speaker 3: feel increasingly stressed and hurried about time, even in cases 284 00:16:04,600 --> 00:16:09,120 Speaker 3: where they actually objectively have more of it. However, there 285 00:16:09,120 --> 00:16:11,560 Speaker 3: were also some predictions they made about time use that 286 00:16:11,720 --> 00:16:15,440 Speaker 3: didn't really pan out, such as they sort of implied 287 00:16:15,440 --> 00:16:19,480 Speaker 3: that people would end up devaluing home life and spending 288 00:16:19,480 --> 00:16:22,240 Speaker 3: more time outside the home, coming and going a lot. 289 00:16:22,880 --> 00:16:26,680 Speaker 3: In fact, the research has shown the opposite. People in 290 00:16:26,760 --> 00:16:30,640 Speaker 3: high technology cultures seem to be valuing home life more 291 00:16:30,680 --> 00:16:34,160 Speaker 3: and more and spending more time inside the home. Though 292 00:16:34,240 --> 00:16:37,480 Speaker 3: it strikes me that this could possibly be interpreted as 293 00:16:37,520 --> 00:16:41,440 Speaker 3: an effect of changing technology and to some extent future shock.
294 00:16:42,160 --> 00:16:45,480 Speaker 3: So like the internet and media technology make it easier 295 00:16:45,560 --> 00:16:49,680 Speaker 3: to stay home without being bored, and it's possible that 296 00:16:49,760 --> 00:16:52,360 Speaker 3: like anxiety, which may or may not be related to 297 00:16:52,400 --> 00:17:05,840 Speaker 3: future shock, increasingly makes people hesitant to go out. So 298 00:17:05,880 --> 00:17:09,320 Speaker 3: we've already talked a lot about how the Tofflers described 299 00:17:09,400 --> 00:17:14,320 Speaker 3: future shock, why they argued it was likely to become 300 00:17:14,359 --> 00:17:17,919 Speaker 3: an increasing feature of human life. But they also spend 301 00:17:18,000 --> 00:17:20,920 Speaker 3: a significant amount of time in their book talking about 302 00:17:20,960 --> 00:17:24,520 Speaker 3: our reaction to it, like what could we do about 303 00:17:24,680 --> 00:17:28,040 Speaker 3: future shock to help alleviate and even prevent it? 304 00:17:29,119 --> 00:17:32,439 Speaker 2: Yeah, and they also lay out several different what they 305 00:17:32,480 --> 00:17:36,560 Speaker 2: call maladaptive coping strategies that can emerge, so ways that 306 00:17:36,720 --> 00:17:42,720 Speaker 2: we'll sort of deal with future shock on our own terms, 307 00:17:42,760 --> 00:17:45,480 Speaker 2: without perhaps even realizing that future shock is going on. 308 00:17:46,440 --> 00:17:50,600 Speaker 2: And I found these categories rather insightful. You know, I 309 00:17:50,640 --> 00:17:54,280 Speaker 2: have some caveats to add, but I want to go 310 00:17:54,320 --> 00:17:57,200 Speaker 2: through these because I think there's some interesting ideas here. 311 00:17:58,000 --> 00:18:02,560 Speaker 2: So the first category of maladaptive coping strategies for future 312 00:18:02,600 --> 00:18:05,439 Speaker 2: shock that they outline is that of the denier.
The 313 00:18:05,520 --> 00:18:09,520 Speaker 2: denier blocks out unwanted reality and clings to the idea 314 00:18:09,880 --> 00:18:14,679 Speaker 2: that any change is just superficial, quote: He finds comfort 315 00:18:14,760 --> 00:18:18,439 Speaker 2: in such cliches as "young people were always rebellious," or 316 00:18:18,800 --> 00:18:21,040 Speaker 2: "there's nothing new on the face of the earth," or 317 00:18:21,359 --> 00:18:24,200 Speaker 2: "the more things change, the more they stay the same." 318 00:18:24,800 --> 00:18:28,040 Speaker 3: This is funny because I did not expect, when they 319 00:18:28,080 --> 00:18:30,800 Speaker 3: said that there are deniers, this is what they were 320 00:18:30,800 --> 00:18:33,040 Speaker 3: going to mean by it. But I do see. So 321 00:18:33,080 --> 00:18:36,280 Speaker 3: they're saying, like, denier in the sense of denying that 322 00:18:36,359 --> 00:18:39,360 Speaker 3: anything is really different this time, when they're saying no, no, 323 00:18:39,400 --> 00:18:43,720 Speaker 3: it objectively is actually different this time. The technological regime 324 00:18:43,720 --> 00:18:47,719 Speaker 3: we're living in now is different. It is actually happening faster. 325 00:18:48,240 --> 00:18:50,840 Speaker 3: And so the denier here is saying, you know, it's 326 00:18:50,840 --> 00:18:51,919 Speaker 3: always been like this. 327 00:18:52,480 --> 00:18:57,000 Speaker 2: Yeah. And to the Tofflers, the denier essentially puts off change 328 00:18:57,320 --> 00:19:00,840 Speaker 2: until change is forced upon them in one single, 329 00:19:00,960 --> 00:19:05,960 Speaker 2: massive life catastrophe. So the preferable alternative would be, of course, 330 00:19:06,000 --> 00:19:08,760 Speaker 2: to take on a series of manageable problems and solve 331 00:19:08,800 --> 00:19:12,600 Speaker 2: those instead of dealing with one gigantic problem at the end.
332 00:19:13,000 --> 00:19:15,480 Speaker 2: And I think it's an interesting way of looking at things, 333 00:19:15,480 --> 00:19:17,480 Speaker 2: because I mean, certainly you don't need a future shock 334 00:19:17,560 --> 00:19:21,879 Speaker 2: scenario to see, you know, versions of this. You know, 335 00:19:21,920 --> 00:19:25,080 Speaker 2: there are all sorts of realities in life 336 00:19:25,280 --> 00:19:29,200 Speaker 2: that are conquerable if you deal with them as small battles, 337 00:19:29,240 --> 00:19:32,919 Speaker 2: as opposed to one enormous world ending battle. 338 00:19:33,440 --> 00:19:37,600 Speaker 3: I think the denier attitude can be attractive to a 339 00:19:37,640 --> 00:19:41,040 Speaker 3: lot of people because it sounds very carefree, you know. 340 00:19:41,760 --> 00:19:45,960 Speaker 3: It actually communicates a sense of confidence to say, like, ah, you 341 00:19:45,880 --> 00:19:48,679 Speaker 5: know, I wouldn't worry about it. Things have always 342 00:19:48,440 --> 00:19:52,200 Speaker 3: been like this, which can be an assuring thing 343 00:19:52,280 --> 00:19:56,480 Speaker 3: to hear, especially in the face of actual alarmism, 344 00:19:56,760 --> 00:19:59,200 Speaker 3: when people are like getting freaked out about something that's 345 00:19:59,240 --> 00:20:01,359 Speaker 3: not actually a problem, which happens all the time. 346 00:20:02,240 --> 00:20:04,480 Speaker 2: Yeah, I feel like with the denier I have 347 00:20:04,680 --> 00:20:07,080 Speaker 2: sort of two feelings about it. On one hand, 348 00:20:07,600 --> 00:20:09,080 Speaker 2: I feel like you do kind of have to block 349 00:20:09,080 --> 00:20:13,680 Speaker 2: it all out sometimes, like you can't constantly be waging 350 00:20:13,800 --> 00:20:17,080 Speaker 2: battles against change and dealing with change.
Sometimes you just 351 00:20:17,119 --> 00:20:20,080 Speaker 2: got to, like, you know, get to work and do 352 00:20:20,160 --> 00:20:24,760 Speaker 2: whatever you do. And as far as these generalities, you know, 353 00:20:25,280 --> 00:20:27,120 Speaker 2: the more things change, the more they stay the same, 354 00:20:27,280 --> 00:20:29,440 Speaker 2: or the young people are always rebellious and so forth. 355 00:20:29,440 --> 00:20:32,520 Speaker 2: There are a million of these. And on one hand, 356 00:20:32,880 --> 00:20:35,760 Speaker 2: reading this, I kept thinking, well, some of these are 357 00:20:35,800 --> 00:20:37,880 Speaker 2: kind of true, right? I mean, the reason we say 358 00:20:37,920 --> 00:20:40,320 Speaker 2: them is that there is at least some truth to them. 359 00:20:40,359 --> 00:20:42,840 Speaker 2: So I'm not sure we should just completely throw those 360 00:20:42,880 --> 00:20:45,439 Speaker 2: out or see those as just red flags. But on 361 00:20:45,480 --> 00:20:47,080 Speaker 2: the other hand, I do see where they're coming from. 362 00:20:47,080 --> 00:20:51,199 Speaker 2: Like, if you need a generality to 363 00:20:51,280 --> 00:20:53,840 Speaker 2: grab onto to keep from having to deal with change, 364 00:20:53,880 --> 00:20:54,760 Speaker 2: they are available. 365 00:20:55,280 --> 00:20:57,920 Speaker 3: Yeah, I mean, I guess another way of thinking about 366 00:20:57,960 --> 00:21:01,440 Speaker 3: it is that when you are considering whether things 367 00:21:01,480 --> 00:21:04,240 Speaker 3: are really different today or whether a 368 00:21:04,320 --> 00:21:07,600 Speaker 3: change is really happening in the world, there are, to 369 00:21:07,720 --> 00:21:10,159 Speaker 3: use statistics terms, type one and type two 370 00:21:10,320 --> 00:21:13,879 Speaker 3: errors happening all the time around us.
There are people 371 00:21:13,960 --> 00:21:17,040 Speaker 3: saying this is totally different, it's never been like this before, 372 00:21:17,160 --> 00:21:19,840 Speaker 3: when it hasn't, you know, it's not different. 373 00:21:20,160 --> 00:21:22,879 Speaker 3: And there are people who are ignoring things that are 374 00:21:22,960 --> 00:21:25,760 Speaker 3: totally different now by saying like, ah, don't worry about it, 375 00:21:25,520 --> 00:21:28,240 Speaker 3: it's always been like this, when it actually is something 376 00:21:28,520 --> 00:21:29,320 Speaker 3: totally different. 377 00:21:29,640 --> 00:21:31,600 Speaker 2: And I guess we should also acknowledge that, yeah, 378 00:21:31,600 --> 00:21:33,480 Speaker 2: things can also be both, right? I mean, you can 379 00:21:33,520 --> 00:21:36,439 Speaker 2: have something happening in a given culture that matches up 380 00:21:36,480 --> 00:21:41,120 Speaker 2: with expected trends, but to channel future shock 381 00:21:41,160 --> 00:21:43,520 Speaker 2: a little bit, like if things are happening at 382 00:21:43,560 --> 00:21:47,960 Speaker 2: an advanced rate technologically, then you're dealing with a slightly different scenario. 383 00:21:48,400 --> 00:21:50,600 Speaker 3: Also, if you're talking about future shock, you're talking about 384 00:21:50,640 --> 00:21:54,520 Speaker 3: psychological effects on the person, in which case an error 385 00:21:54,560 --> 00:21:58,480 Speaker 3: of the alarmist sort would still have negative psychological effects 386 00:21:58,520 --> 00:22:00,919 Speaker 3: on you because you would still be perceiving a change. 387 00:22:01,480 --> 00:22:04,880 Speaker 3: So you know, here's a false change that is freaking 388 00:22:04,920 --> 00:22:07,480 Speaker 3: you out alongside all of the real changes. 389 00:22:08,280 --> 00:22:10,240 Speaker 4: Yeah, all right.
390 00:22:10,320 --> 00:22:12,840 Speaker 2: Now, the next example that they give of a maladaptive 391 00:22:12,920 --> 00:22:16,120 Speaker 2: coping strategy is that of the specialist. So the specialist 392 00:22:16,160 --> 00:22:20,600 Speaker 2: doesn't block out everything. The specialist specializes in one area 393 00:22:20,640 --> 00:22:23,400 Speaker 2: of the changing world and keeps pace with that, which 394 00:22:23,440 --> 00:22:26,360 Speaker 2: creates the feeling that they're keeping pace with the larger 395 00:22:26,800 --> 00:22:30,040 Speaker 2: pace of change in the world. Quote, he narrows the 396 00:22:30,080 --> 00:22:33,360 Speaker 2: slit through which he sees the world. So it can 397 00:22:33,359 --> 00:22:37,119 Speaker 2: be superficially successful as a coping mechanism for a while. 398 00:22:37,560 --> 00:22:40,240 Speaker 2: But this too, they write, will eventually catch up with 399 00:22:40,320 --> 00:22:41,040 Speaker 2: the specialist. 400 00:22:41,400 --> 00:22:43,359 Speaker 3: This is the person who thinks that they're the master 401 00:22:43,480 --> 00:22:47,280 Speaker 3: of reality because they're in the crypto forums. It's like 402 00:22:47,480 --> 00:22:51,000 Speaker 3: I've figured out cryptocurrency, I know everything about the future. 403 00:22:51,359 --> 00:22:55,280 Speaker 3: And then suddenly, yeah. According to the Tofflers, 404 00:22:55,280 --> 00:22:58,080 Speaker 3: at least, that might sort of lull you for 405 00:22:58,119 --> 00:23:00,080 Speaker 3: a while, but then it'll all catch up to you 406 00:23:00,240 --> 00:23:03,040 Speaker 3: and you'll feel this shock where you're like, wait, I 407 00:23:03,119 --> 00:23:06,280 Speaker 3: don't understand, you know, what else is happening in culture 408 00:23:06,320 --> 00:23:06,960 Speaker 3: for some reason.
409 00:23:07,560 --> 00:23:10,480 Speaker 2: Yeah, yeah, so you think you're still stable on the bicycle, 410 00:23:10,480 --> 00:23:12,120 Speaker 2: but then it's going to fall over anyway. 411 00:23:13,520 --> 00:23:13,720 Speaker 4: Yeah. 412 00:23:13,720 --> 00:23:15,679 Speaker 2: This was an interesting one to think about because on 413 00:23:15,760 --> 00:23:18,320 Speaker 2: one hand, I feel like you do have to sort 414 00:23:18,320 --> 00:23:20,320 Speaker 2: of focus on the things you can change and adapt, 415 00:23:20,359 --> 00:23:22,640 Speaker 2: where you can adapt and where it makes the most 416 00:23:22,680 --> 00:23:26,320 Speaker 2: sense to adapt. But yeah, I suppose the idea is 417 00:23:26,359 --> 00:23:29,520 Speaker 2: that if that's your day to day reaction, it still 418 00:23:29,520 --> 00:23:30,920 Speaker 2: could eventually catch up with you. 419 00:23:31,960 --> 00:23:32,280 Speaker 4: All right. 420 00:23:32,320 --> 00:23:38,320 Speaker 2: The next category that they present is that of the reversionist, 421 00:23:39,119 --> 00:23:42,200 Speaker 2: one who rages against change and clings to the past, 422 00:23:42,240 --> 00:23:45,359 Speaker 2: adapting not to the future but to outdated modes of 423 00:23:45,400 --> 00:23:46,320 Speaker 2: what came before. 424 00:23:47,119 --> 00:23:47,320 Speaker 4: Now. 425 00:23:47,320 --> 00:23:49,919 Speaker 2: This one is interesting because the Tofflers note that this 426 00:23:49,960 --> 00:23:53,520 Speaker 2: sort of thing can manifest among both liberal individuals and 427 00:23:53,560 --> 00:23:57,439 Speaker 2: conservative individuals. They're just reaching back to different models of 428 00:23:57,480 --> 00:24:00,879 Speaker 2: the past, which I thought was interesting. So for me, the 429 00:24:01,720 --> 00:24:03,720 Speaker 2: example that came to mind is like you had someone 430 00:24:03,800 --> 00:24:07,160 Speaker 2: like Terence McKenna who called for an archaic revival.
431 00:24:07,520 --> 00:24:08,320 Speaker 4: You know, it's like, oh, you 432 00:24:08,240 --> 00:24:11,360 Speaker 2: need to go back to these older models of how 433 00:24:11,400 --> 00:24:14,639 Speaker 2: we viewed the world. And on the other side of 434 00:24:14,440 --> 00:24:17,760 Speaker 2: the spectrum you have, you know, social conservatives who call 435 00:24:17,840 --> 00:24:20,480 Speaker 2: for, what, a return to family values and so forth. 436 00:24:20,920 --> 00:24:24,440 Speaker 2: So both see an escape from the future into the 437 00:24:24,480 --> 00:24:28,040 Speaker 2: past to some degree. You know, this idea the Tofflers 438 00:24:28,119 --> 00:24:30,679 Speaker 2: drive home, you know, they're rallying around, we have to 439 00:24:30,720 --> 00:24:34,200 Speaker 2: return to what worked. And I don't think it's necessarily 440 00:24:34,560 --> 00:24:36,200 Speaker 2: you know, I don't think it's necessarily putting your head 441 00:24:36,200 --> 00:24:38,960 Speaker 2: in the sand to look backwards. But I do see 442 00:24:38,960 --> 00:24:39,920 Speaker 2: what they're going for here. 443 00:24:40,760 --> 00:24:42,960 Speaker 3: Well, yeah, one thing I would say about any model 444 00:24:43,040 --> 00:24:45,199 Speaker 3: that says, I want to go back, and you know 445 00:24:45,280 --> 00:24:47,080 Speaker 3: it used to be good in the past, and we 446 00:24:47,080 --> 00:24:51,000 Speaker 3: should make it like that again: that's always a fantasy. 447 00:24:51,080 --> 00:24:53,520 Speaker 3: I mean, you can get sort of closer and closer 448 00:24:53,560 --> 00:24:57,919 Speaker 3: to it, but like you're never really understanding exactly what the 449 00:24:57,960 --> 00:25:00,960 Speaker 3: past was like, what that past point that you're idealizing 450 00:25:01,040 --> 00:25:03,800 Speaker 3: is like. In many cases, it's just like a pure fantasy.
451 00:25:03,840 --> 00:25:06,960 Speaker 3: It's just people like sort of dreaming up what they 452 00:25:07,080 --> 00:25:09,840 Speaker 3: imagine the past was like, and then saying I want 453 00:25:09,840 --> 00:25:12,160 Speaker 3: to return to that. In some cases it might be 454 00:25:12,200 --> 00:25:16,199 Speaker 3: more based in actual knowledge about history, but no matter what, like, 455 00:25:16,280 --> 00:25:19,320 Speaker 3: you can't perfectly recreate that past time, it is gone. 456 00:25:19,600 --> 00:25:23,080 Speaker 3: There are things that have changed about the material world 457 00:25:22,840 --> 00:25:26,119 Speaker 3: that cannot be undone, so you're not actually going to 458 00:25:26,160 --> 00:25:27,840 Speaker 3: be able to go back. The best you could do 459 00:25:28,000 --> 00:25:29,680 Speaker 3: is try to kind of imitate it. 460 00:25:30,600 --> 00:25:31,240 Speaker 4: Yeah. 461 00:25:31,400 --> 00:25:34,639 Speaker 2: It reminds me of an interview I saw with a 462 00:25:34,760 --> 00:25:39,280 Speaker 2: musician whose work I like, and the interviewer asked him, 463 00:25:39,520 --> 00:25:41,840 Speaker 2: if you could live in any time in human history, 464 00:25:42,160 --> 00:25:45,480 Speaker 2: where or when would you live? And they said the Middle Ages. 465 00:25:45,520 --> 00:25:47,880 Speaker 2: And they're like, well, but then you wouldn't have electric guitars. 466 00:25:47,920 --> 00:25:51,040 Speaker 2: And they're like, oh, okay, I guess that's a good point. 467 00:25:50,880 --> 00:25:55,119 Speaker 3: So yeah, live now.
Yeah, but yeah. But personally, I 468 00:25:55,160 --> 00:25:58,399 Speaker 3: tend to see anybody who wants to go back to 469 00:25:58,440 --> 00:26:00,520 Speaker 3: a time in the past that they think is better, 470 00:26:01,680 --> 00:26:06,359 Speaker 3: that's always involving some kind of inaccurate idealization of the 471 00:26:06,400 --> 00:26:09,720 Speaker 3: past. It's just like failing to realize that there were 472 00:26:09,760 --> 00:26:13,560 Speaker 3: problems then too, and that many of the things that 473 00:26:13,680 --> 00:26:17,080 Speaker 3: you think were good about that time cannot be recreated. 474 00:26:18,200 --> 00:26:20,479 Speaker 2: And really this gets into the next category I think 475 00:26:20,560 --> 00:26:23,000 Speaker 2: quite well, and that is that of the super 476 00:26:23,040 --> 00:26:27,879 Speaker 2: simplifier who seeks a quote unquote unitary solution and goes 477 00:26:28,000 --> 00:26:30,560 Speaker 2: all in on it as a means of explaining the 478 00:26:30,560 --> 00:26:33,720 Speaker 2: world or simplifying the challenges of the present. You know, 479 00:26:33,800 --> 00:26:37,200 Speaker 2: it's, in their words, a quote simple way out 480 00:26:37,480 --> 00:26:40,680 Speaker 2: of burgeoning complexity of choice and general overstimulation. 481 00:26:41,160 --> 00:26:43,600 Speaker 3: The person who's got it all figured out because they 482 00:26:43,640 --> 00:26:47,040 Speaker 3: read a book or maybe an article, or nowadays maybe 483 00:26:47,040 --> 00:26:50,000 Speaker 3: they watched a video, and that person laid it 484 00:26:50,040 --> 00:26:52,360 Speaker 3: all out for them. Now they know, now they understand 485 00:26:52,400 --> 00:26:55,119 Speaker 3: the world because doctor so and so told them this 486 00:26:55,200 --> 00:26:56,640 Speaker 3: is what's really going 487 00:26:56,480 --> 00:26:58,080 Speaker 4: on, right right? Yeah.
488 00:26:58,240 --> 00:27:00,840 Speaker 2: They stress that this can kind of take on 489 00:27:00,920 --> 00:27:05,199 Speaker 2: a form as opposition or support. So, you know, 490 00:27:05,240 --> 00:27:08,080 Speaker 2: this worldview can be defined by 491 00:27:08,200 --> 00:27:11,240 Speaker 2: its savior or by its enemy. Like if, you know, 492 00:27:11,800 --> 00:27:13,560 Speaker 2: this is the bad guy, and once you realize this 493 00:27:13,640 --> 00:27:16,600 Speaker 2: is the bad guy, everything makes sense. Or this is 494 00:27:16,640 --> 00:27:19,320 Speaker 2: the solution, and once you realize that, everything makes sense, 495 00:27:19,359 --> 00:27:21,560 Speaker 2: and of course you can dip into both categories. 496 00:27:22,119 --> 00:27:26,399 Speaker 3: Unfortunately, I think the Internet these days is just rampant 497 00:27:26,480 --> 00:27:29,320 Speaker 3: with people who are trying to become sort 498 00:27:29,359 --> 00:27:32,800 Speaker 3: of the world-explaining cult leader to an army of 499 00:27:33,359 --> 00:27:36,040 Speaker 3: people on the internet who listen to their podcast or 500 00:27:36,080 --> 00:27:38,400 Speaker 3: you know whatever. It's like, I've got the one solution 501 00:27:38,520 --> 00:27:41,960 Speaker 3: to tell you how everything is. Everybody out there, be 502 00:27:42,119 --> 00:27:44,400 Speaker 3: wary of this, especially if the person is a very 503 00:27:44,480 --> 00:27:47,720 Speaker 3: charismatic speaker. Nobody is going to be able to explain 504 00:27:47,800 --> 00:27:51,320 Speaker 3: how everything works, or like, you know, nobody.
Nobody is 505 00:27:51,400 --> 00:27:53,520 Speaker 3: going to be the person who, like, ah, here's the 506 00:27:53,640 --> 00:27:55,919 Speaker 3: guy who's got it all figured out, and 507 00:27:55,960 --> 00:27:57,760 Speaker 3: now I can just listen to him and he'll tell 508 00:27:57,760 --> 00:27:58,560 Speaker 3: me what's going on. 509 00:27:59,000 --> 00:28:01,360 Speaker 2: Yeah, I think this kind of thing is especially noticeable 510 00:28:01,400 --> 00:28:05,520 Speaker 2: in conspiracy thought. But I suppose you could also argue 511 00:28:05,920 --> 00:28:09,600 Speaker 2: that it can be beneficial to some degree. Like if 512 00:28:09,600 --> 00:28:13,399 Speaker 2: one goes all in on a particular social cause, you know, 513 00:28:13,400 --> 00:28:15,680 Speaker 2: I can see the appeal of that, and if 514 00:28:15,800 --> 00:28:18,360 Speaker 2: them going all in on that particular social cause ends 515 00:28:18,440 --> 00:28:22,600 Speaker 2: up producing beneficial results, then I don't know. I mean, 516 00:28:22,600 --> 00:28:26,119 Speaker 2: are they happy? Are they fulfilled? Are people benefiting from this? 517 00:28:27,160 --> 00:28:28,560 Speaker 2: I can see where you can make it, you know, 518 00:28:28,600 --> 00:28:30,280 Speaker 2: a more complex 519 00:28:29,720 --> 00:28:31,200 Speaker 4: equation to figure out. 520 00:28:31,359 --> 00:28:33,840 Speaker 2: And I should also note that the book stresses that 521 00:28:35,000 --> 00:28:39,080 Speaker 2: this unitary solution might be found in various intellectual ideas, 522 00:28:39,160 --> 00:28:41,560 Speaker 2: which of course could even include the concept of future shock.
523 00:28:41,640 --> 00:28:44,320 Speaker 2: They note, like they point out, that, you know, 524 00:28:44,480 --> 00:28:46,480 Speaker 2: their book is not immune to this kind of thing, 525 00:28:47,360 --> 00:28:49,680 Speaker 2: but they also stress that it could be an investment 526 00:28:49,720 --> 00:28:52,880 Speaker 2: in action rather than thought, like, you know, 527 00:28:52,960 --> 00:28:55,200 Speaker 2: I'm going to do this one thing. I'm going to 528 00:28:55,240 --> 00:28:59,080 Speaker 2: be this one thing, and that is going to sort 529 00:28:59,120 --> 00:29:02,880 Speaker 2: of super simplify this realm of choices that would otherwise 530 00:29:02,920 --> 00:29:06,720 Speaker 2: confound me. And again, I don't know that that is 531 00:29:06,880 --> 00:29:10,640 Speaker 2: necessarily a bad thing in all cases, like if someone 532 00:29:10,680 --> 00:29:14,239 Speaker 2: goes all in on cross training or whatever, you know, 533 00:29:14,840 --> 00:29:16,239 Speaker 2: like this is who I am now, this is what 534 00:29:16,280 --> 00:29:18,240 Speaker 2: I'm doing. Like, I don't know, live and let live, 535 00:29:18,320 --> 00:29:19,640 Speaker 2: right? Well. 536 00:29:19,400 --> 00:29:23,080 Speaker 3: I mean, I think, even if you did 537 00:29:23,080 --> 00:29:26,840 Speaker 3: this with a very good book or something, I think 538 00:29:26,920 --> 00:29:31,880 Speaker 3: there's just no authority or no single text that 539 00:29:32,040 --> 00:29:35,400 Speaker 3: explains everything. And so if, 540 00:29:35,520 --> 00:29:39,360 Speaker 3: for every problem you're faced with, you try to say, well, 541 00:29:39,520 --> 00:29:42,280 Speaker 3: what does my text say about it?
And even if 542 00:29:42,280 --> 00:29:45,680 Speaker 3: that text is Future Shock, you know, what wisdom can 543 00:29:45,720 --> 00:29:47,840 Speaker 3: I get from the Tofflers that would make sense of 544 00:29:47,880 --> 00:29:49,760 Speaker 3: what's going on to me right now? I think that's 545 00:29:49,800 --> 00:29:52,920 Speaker 3: just very limiting. Like, you've got to reach out 546 00:29:52,960 --> 00:29:57,680 Speaker 3: broader and like have more influences on your mind 547 00:29:57,760 --> 00:30:00,280 Speaker 3: than just sort of like one leader text or a 548 00:30:00,360 --> 00:30:03,560 Speaker 3: leader idea or person that you always go back to 549 00:30:03,560 --> 00:30:04,600 Speaker 3: to explain everything. 550 00:30:05,320 --> 00:30:05,600 Speaker 4: Yeah. 551 00:30:05,680 --> 00:30:07,520 Speaker 2: Yeah, and I guess you could even apply it, of 552 00:30:07,520 --> 00:30:11,360 Speaker 2: course, to religion. Like, yes, one can. You could talk 553 00:30:11,400 --> 00:30:14,480 Speaker 2: about religion as being a super simplifier kind of approach, 554 00:30:15,040 --> 00:30:17,200 Speaker 2: and I think that would only be true to a 555 00:30:17,240 --> 00:30:19,320 Speaker 2: certain extent, because I think once you really get into 556 00:30:19,360 --> 00:30:23,080 Speaker 2: any given religion, there's a lot of complexity there. It's 557 00:30:23,120 --> 00:30:26,760 Speaker 2: not just a simple top down system, though you might 558 00:30:26,800 --> 00:30:29,160 Speaker 2: have like a slice of that again with a particular 559 00:30:29,240 --> 00:30:31,800 Speaker 2: charismatic individual at the heart of it and so forth. 560 00:30:32,280 --> 00:30:36,120 Speaker 2: But even then, it's like, okay, future changes are coming, 561 00:30:36,120 --> 00:30:38,280 Speaker 2: and they're coming at a pace that human beings can't keep 562 00:30:38,200 --> 00:30:39,800 Speaker 4: up with, the basic idea of future shock.
563 00:30:40,560 --> 00:30:45,760 Speaker 2: Retreating into religion or becoming super religious might help to 564 00:30:45,880 --> 00:30:48,280 Speaker 2: some degree on an individual level, but it is not 565 00:30:48,400 --> 00:30:51,200 Speaker 2: going to help at the level and at the scale 566 00:30:51,520 --> 00:30:54,800 Speaker 2: that they're talking about long term. I'm of course not 567 00:30:54,840 --> 00:30:58,440 Speaker 2: accounting for any supernatural aspects, just focusing on the natural 568 00:30:58,440 --> 00:30:59,680 Speaker 2: world scenario. 569 00:30:59,280 --> 00:31:02,960 Speaker 3: Here as a way of trying to find some kind 570 00:31:03,000 --> 00:31:05,640 Speaker 3: of explanation to help you understand what's going on in 571 00:31:05,680 --> 00:31:08,560 Speaker 3: this confusing world. And yeah, you know, it's hard. Like 572 00:31:08,600 --> 00:31:12,200 Speaker 3: the world is confusing, and to some extent that may be due 573 00:31:12,200 --> 00:31:16,480 Speaker 3: to the causes that the Tofflers identify, and maybe there 574 00:31:16,480 --> 00:31:19,720 Speaker 3: are other causes too. But yeah, it's like it's hard 575 00:31:19,760 --> 00:31:22,040 Speaker 3: to make sense of this world. And it can be 576 00:31:22,160 --> 00:31:25,200 Speaker 3: very comfortable to just find one authority to turn to 577 00:31:25,360 --> 00:31:29,000 Speaker 3: that'll tell you here's what's going on. But like, I'd 578 00:31:29,480 --> 00:31:31,960 Speaker 3: just warn people, like, don't do that. You don't want 579 00:31:31,960 --> 00:31:36,480 Speaker 3: to be the person who can't stop always referring 580 00:31:36,520 --> 00:31:40,680 Speaker 3: to the internet guy they just latched onto that explains everything. 581 00:31:40,280 --> 00:31:42,240 Speaker 4: You know. Oh yeah, yeah absolutely.
582 00:31:52,040 --> 00:31:54,440 Speaker 3: Now there's another thing I've been thinking about in terms 583 00:31:54,560 --> 00:32:01,120 Speaker 3: of maladaptive reactions to future shock, which is, what if 584 00:32:01,440 --> 00:32:04,360 Speaker 3: the idea of future shock is to some extent real, 585 00:32:04,400 --> 00:32:07,800 Speaker 3: if the Tofflers were to some extent on target in 586 00:32:08,400 --> 00:32:12,920 Speaker 3: describing a real phenomenon, but people don't quite realize it, 587 00:32:13,520 --> 00:32:17,680 Speaker 3: and instead it manifests as a kind of anxiety and 588 00:32:17,760 --> 00:32:22,000 Speaker 3: psychic distress that arises as if out of nowhere, and 589 00:32:22,080 --> 00:32:27,560 Speaker 3: people end up blaming it on unrelated third party phenomena. 590 00:32:28,240 --> 00:32:30,640 Speaker 3: You know, this kind of reminds me of something. So I 591 00:32:30,680 --> 00:32:33,400 Speaker 3: was reading the New York Times obituary for Alvin Toffler 592 00:32:33,560 --> 00:32:37,040 Speaker 3: that was written by Keith Schneider, published when Alvin died 593 00:32:37,080 --> 00:32:40,680 Speaker 3: in twenty sixteen, and there was a passage where the 594 00:32:40,720 --> 00:32:44,200 Speaker 3: obituary quoted a critic of Toffler's, and I thought this 595 00:32:44,320 --> 00:32:45,920 Speaker 3: was kind of interesting. So I want to read this 596 00:32:46,400 --> 00:32:51,000 Speaker 3: quote first. In recent years, benefiting from hindsight, some critics 597 00:32:51,000 --> 00:32:55,040 Speaker 3: said mister Toffler had gotten much wrong.
Shel Israel, an 598 00:32:55,080 --> 00:32:58,360 Speaker 3: author and commentator who writes about social media for Forbes, 599 00:32:58,440 --> 00:33:02,120 Speaker 3: took issue with mister Toffler in twenty twelve for painting quote 600 00:33:02,120 --> 00:33:05,760 Speaker 3: a picture of people who were isolated and depressed, cut 601 00:33:05,800 --> 00:33:09,440 Speaker 3: off from human intimacy by a relentless fire hose of 602 00:33:09,560 --> 00:33:13,360 Speaker 3: messages and data barraging us. But he added, we are 603 00:33:13,400 --> 00:33:16,920 Speaker 3: not isolated by it, and when the information overloads us, 604 00:33:17,280 --> 00:33:20,280 Speaker 3: most people are still wise enough to use the power 605 00:33:20,400 --> 00:33:24,080 Speaker 3: of the off button to gain some peace. You know, 606 00:33:24,240 --> 00:33:26,680 Speaker 3: I think this is an interesting and 607 00:33:26,800 --> 00:33:29,880 Speaker 3: funny example to highlight, because of course I do think 608 00:33:29,920 --> 00:33:32,280 Speaker 3: Toffler got a lot of stuff wrong, but I don't 609 00:33:32,320 --> 00:33:35,280 Speaker 3: personally think that this was one of them. Like just 610 00:33:35,320 --> 00:33:39,560 Speaker 3: to zero in on one example of information overload, it 611 00:33:39,600 --> 00:33:42,680 Speaker 3: really does seem to me that the bombardment of 612 00:33:42,840 --> 00:33:47,040 Speaker 3: information we're getting from media, especially now Internet based media 613 00:33:47,120 --> 00:33:50,360 Speaker 3: and social media, video content on the Internet and all that, 614 00:33:51,080 --> 00:33:55,400 Speaker 3: I personally think that probably is responsible for a lot 615 00:33:55,440 --> 00:33:58,959 Speaker 3: of feelings of isolation and depression.
And I think it 616 00:33:59,120 --> 00:34:01,440 Speaker 3: does make a lot of people more lonely, and a 617 00:34:01,440 --> 00:34:04,240 Speaker 3: lot of us really do have trouble using the off 618 00:34:04,280 --> 00:34:07,000 Speaker 3: button to escape from it. And here's kind of where 619 00:34:07,040 --> 00:34:09,680 Speaker 3: we get back to what I was wondering about, us 620 00:34:10,520 --> 00:34:15,960 Speaker 3: not identifying the actual cause of potential future shock. Apart 621 00:34:16,000 --> 00:34:20,359 Speaker 3: from the innately compelling, addictive qualities of social media and 622 00:34:20,520 --> 00:34:23,239 Speaker 3: Internet video media and all that, the things that like 623 00:34:23,360 --> 00:34:26,640 Speaker 3: naturally just keep us scrolling for more, often we are 624 00:34:26,680 --> 00:34:29,840 Speaker 3: not quite able to realize it is the media that 625 00:34:29,960 --> 00:34:32,840 Speaker 3: is causing us to feel so bad. And this is 626 00:34:32,840 --> 00:34:34,520 Speaker 3: true about a lot of things in life. We're often 627 00:34:34,600 --> 00:34:38,759 Speaker 3: just not good at identifying the causes of our own unhappiness. 628 00:34:39,080 --> 00:34:42,120 Speaker 3: There's a whole, like, therapy industry where a big part 629 00:34:42,160 --> 00:34:45,359 Speaker 3: of it is like helping people figure out what it 630 00:34:45,520 --> 00:34:48,120 Speaker 3: is in their life that is making them anxious, you know, 631 00:34:49,320 --> 00:34:52,120 Speaker 3: and so like it's just not always obvious to us 632 00:34:52,160 --> 00:34:56,040 Speaker 3: what the sources of our unease are. And so I 633 00:34:56,080 --> 00:35:00,440 Speaker 3: admit that I don't have, like, empirical evidence 634 00:35:00,440 --> 00:35:01,839 Speaker 3: for this. All I can go on is just sort 635 00:35:01,880 --> 00:35:05,960 Speaker 3: of my hunch about reality.
But my opinion is that 636 00:35:06,000 --> 00:35:08,600 Speaker 3: it is a really odd thing to choose to criticize, 637 00:35:08,640 --> 00:35:10,760 Speaker 3: because I think this is one of the more prescient 638 00:35:10,840 --> 00:35:15,680 Speaker 3: generalizations, situated among many less prescient ones, that, you know, 639 00:35:15,760 --> 00:35:18,960 Speaker 3: this information overload through media can really make people feel 640 00:35:19,000 --> 00:35:23,319 Speaker 3: isolated and lonely and distressed. And I wonder if it's 641 00:35:23,400 --> 00:35:26,920 Speaker 3: possible that a lot of future shock-like effects exist 642 00:35:26,960 --> 00:35:28,960 Speaker 3: in a context like that, where like they do have 643 00:35:29,040 --> 00:35:32,480 Speaker 3: this negative psychological effect on us, but we don't really 644 00:35:32,640 --> 00:35:37,600 Speaker 3: realize that it's the technology or the cultural changes downstream 645 00:35:37,640 --> 00:35:40,080 Speaker 3: from that technology that are the root cause of it. Instead, 646 00:35:40,120 --> 00:35:43,319 Speaker 3: it's just kind of like, life feels bad right now, 647 00:35:43,320 --> 00:35:45,680 Speaker 3: and I feel confused and afraid and I don't know 648 00:35:45,680 --> 00:35:47,880 Speaker 3: what's going on, but you don't realize why. 649 00:35:48,360 --> 00:35:52,480 Speaker 2: Yeah, yeah, because you can always find other reasons, 650 00:35:52,560 --> 00:35:55,000 Speaker 2: and not to say those other reasons aren't in play too. Like, yeah, 651 00:35:55,080 --> 00:35:58,359 Speaker 2: there are terrible things going on in the world, you know, 652 00:35:58,400 --> 00:36:00,759 Speaker 2: there are a lot of things to be concerned about, 653 00:36:01,320 --> 00:36:04,440 Speaker 2: and then certain generalities are also probably in play, you know. 654 00:36:04,520 --> 00:36:09,200 Speaker 2: I mean, the youth are always rebelling, right? I mean.
655 00:36:09,280 --> 00:36:09,360 Speaker 4: It 656 00:36:10,840 --> 00:36:12,680 Speaker 2: is often the case that it seems like it's harder 657 00:36:12,680 --> 00:36:14,440 Speaker 2: to make friends when you get older, things like that, 658 00:36:14,520 --> 00:36:16,920 Speaker 2: you know. And those are often cited as well. But 659 00:36:17,160 --> 00:36:20,120 Speaker 2: perhaps the technology is a part of these scenarios and 660 00:36:20,160 --> 00:36:20,839 Speaker 2: others as well. 661 00:36:21,200 --> 00:36:23,279 Speaker 3: So I guess part of what I'm reacting to is 662 00:36:23,640 --> 00:36:28,960 Speaker 3: maybe the assumption that when changes to our lives are 663 00:36:28,960 --> 00:36:32,040 Speaker 3: brought on by technology, it will be clear to us 664 00:36:32,080 --> 00:36:34,480 Speaker 3: that is what's happening. I don't know if it will 665 00:36:34,520 --> 00:36:34,920 Speaker 3: be clear. 666 00:36:35,680 --> 00:36:37,040 Speaker 4: Yeah, no, that's a great point. 667 00:36:37,719 --> 00:36:39,359 Speaker 3: But okay, we said we were going to get to 668 00:36:39,440 --> 00:36:42,320 Speaker 3: the section of the book where the Tofflers talk about 669 00:36:42,400 --> 00:36:49,280 Speaker 3: strategies for reacting to and weathering the storm of changes 670 00:36:50,040 --> 00:36:52,960 Speaker 3: brought on by technology. How should people 671 00:36:53,080 --> 00:36:56,719 Speaker 3: deal with future shock? Maybe prevent it in some cases 672 00:36:56,920 --> 00:36:59,040 Speaker 3: or alleviate its effects in others. 673 00:37:00,120 --> 00:37:02,000 Speaker 2: Yeah, and this is a very interesting part of the 674 00:37:02,000 --> 00:37:04,720 Speaker 2: book as well.
And again they're dealing with like essentially 675 00:37:04,760 --> 00:37:09,120 Speaker 2: strategies for survival of future shock, not for survival of 676 00:37:09,200 --> 00:37:14,600 Speaker 2: any particular, you know, realistic or even fanciful prediction of 677 00:37:14,600 --> 00:37:17,120 Speaker 2: what the future might consist of. Right, So, the first 678 00:37:17,120 --> 00:37:20,440 Speaker 2: thing they lay out is direct coping. This is managing 679 00:37:20,480 --> 00:37:24,160 Speaker 2: overstimulation and anxiety at the individual level, and not 680 00:37:24,160 --> 00:37:26,440 Speaker 2: just at a subconscious level as in 681 00:37:26,480 --> 00:37:29,280 Speaker 2: like the maladaptive practices, but at the conscious level of things. 682 00:37:29,640 --> 00:37:32,080 Speaker 3: Okay, so what does that mean? Managing it at the 683 00:37:32,120 --> 00:37:32,920 Speaker 3: conscious level? 684 00:37:33,680 --> 00:37:36,360 Speaker 2: I think this is basically the pushing the off switch 685 00:37:36,400 --> 00:37:39,440 Speaker 2: area of the equation, like the idea that like, okay, 686 00:37:39,760 --> 00:37:42,919 Speaker 2: at some point, you've got to be able to, 687 00:37:42,920 --> 00:37:46,000 Speaker 2: to some degree, step up and manage your overstimulation. 688 00:37:46,239 --> 00:37:48,200 Speaker 2: Like maybe you, you know, to go back to the 689 00:37:48,480 --> 00:37:51,520 Speaker 2: movie we watched most recently on Weird House Cinema. Maybe 690 00:37:51,560 --> 00:37:54,640 Speaker 2: you cut down your wall of television sets to just 691 00:37:54,719 --> 00:37:57,080 Speaker 2: one or two television sets that you play at one time. 692 00:37:57,160 --> 00:38:00,680 Speaker 2: You know, you make, like, the sensible decisions that you can 693 00:38:01,640 --> 00:38:04,680 Speaker 2: regarding your overstimulation.
You're not gonna have control over everything, 694 00:38:04,920 --> 00:38:07,200 Speaker 2: but you are going to have control over some things, right. 695 00:38:07,400 --> 00:38:10,040 Speaker 3: Right, Realize when it's getting to you, and then demand, 696 00:38:10,080 --> 00:38:12,840 Speaker 3: as David Bowie did, that the televisions get out of 697 00:38:12,960 --> 00:38:18,120 Speaker 3: my mind. Yeah, in a practical sense, by turning them off. 698 00:38:18,280 --> 00:38:21,600 Speaker 2: Yeah, okay, And that's again sometimes easier said than done. 699 00:38:21,600 --> 00:38:23,720 Speaker 2: But okay, we acknowledge that that could be one step. 700 00:38:24,040 --> 00:38:26,520 Speaker 3: Well, I mean, it's certainly easier to do if you 701 00:38:26,760 --> 00:38:30,560 Speaker 3: have put in place a sort of cultural regime around 702 00:38:30,600 --> 00:38:33,680 Speaker 3: yourself to remind you that that's an option and to, like, 703 00:38:34,320 --> 00:38:36,760 Speaker 3: you know, bring it up over and over. It's easier 704 00:38:36,760 --> 00:38:39,359 Speaker 3: then than it is if you're just like sitting there 705 00:38:39,400 --> 00:38:43,560 Speaker 3: with your media devices and like never even hearing, you know, 706 00:38:43,680 --> 00:38:46,279 Speaker 3: people say, hey, have you considered that this media might 707 00:38:46,360 --> 00:38:49,120 Speaker 3: be making you less happy than you could be, or 708 00:38:49,200 --> 00:38:51,600 Speaker 3: causing you anxiety? Maybe you should do something. 709 00:38:52,160 --> 00:38:52,359 Speaker 4: Yeah. 710 00:38:52,400 --> 00:38:54,600 Speaker 2: And then again, it's also easier to imagine with the 711 00:38:54,600 --> 00:38:57,560 Speaker 2: basic like nineteen seventies television scenarios, like the television is 712 00:38:57,560 --> 00:38:59,600 Speaker 2: stressing you out, We'll just turn that baby off, man.
713 00:39:00,239 --> 00:39:03,040 Speaker 2: But it's different now, like when you have so many 714 00:39:03,080 --> 00:39:05,759 Speaker 2: people that, you know, work through the internet, that need 715 00:39:05,800 --> 00:39:07,920 Speaker 2: the Internet and social media for their job, Like you 716 00:39:07,960 --> 00:39:11,879 Speaker 2: can't necessarily easily just hit the 717 00:39:11,840 --> 00:39:14,040 Speaker 3: off switch on all of that, right, I have to 718 00:39:14,080 --> 00:39:16,080 Speaker 3: have my phone on because I might get a work 719 00:39:16,120 --> 00:39:20,359 Speaker 3: email I have to reply to. But then also, yeah, 720 00:39:20,400 --> 00:39:23,439 Speaker 3: there's the old Twitter or whatever it's called. 721 00:39:23,480 --> 00:39:26,200 Speaker 4: Now, yeah it's the X. 722 00:39:26,280 --> 00:39:29,319 Speaker 2: It's the X, Joe, right, because you exit out and you 723 00:39:29,280 --> 00:39:31,720 Speaker 4: make it go away. Right. I'm sure I'm the millionth 724 00:39:31,800 --> 00:39:32,640 Speaker 4: person to make that 725 00:39:32,680 --> 00:39:34,719 Speaker 3: joke. The first time I've heard it. 726 00:39:34,920 --> 00:39:37,800 Speaker 2: I've actually, when I have checked it out once or twice, 727 00:39:37,880 --> 00:39:40,080 Speaker 2: you know, to see what somebody had to say about something, 728 00:39:40,120 --> 00:39:42,400 Speaker 2: I found my finger going to the X logo 729 00:39:42,480 --> 00:39:44,920 Speaker 2: because I identify it with a canceling out, 730 00:39:44,960 --> 00:39:47,960 Speaker 2: an exiting out, and, I mean, it just feels like 731 00:39:48,040 --> 00:39:50,640 Speaker 2: a design error. But who am I to say? 732 00:39:50,840 --> 00:39:52,560 Speaker 3: Get out of my mind! 733 00:39:52,880 --> 00:39:53,200 Speaker 4: All right.
734 00:39:53,239 --> 00:39:56,480 Speaker 2: The next thing that they highlight is the idea of 735 00:39:56,800 --> 00:40:00,640 Speaker 2: personal stability zones, so a way of managing change in life 736 00:40:00,760 --> 00:40:04,640 Speaker 2: rather than suppressing it, sensible preparations for the future, planning 737 00:40:04,680 --> 00:40:09,000 Speaker 2: for things that fulfill you, and avoiding unnecessary changes, 738 00:40:09,120 --> 00:40:13,040 Speaker 2: especially the just-'cause changes, you know, like we're changing 739 00:40:13,040 --> 00:40:15,600 Speaker 2: this just because, you know, an update in something you don't 740 00:40:15,600 --> 00:40:19,120 Speaker 2: really need, but we're just changing it because we need 741 00:40:19,160 --> 00:40:21,960 Speaker 2: the next upgrade. Like figuring out ways to avoid that 742 00:40:22,280 --> 00:40:24,080 Speaker 2: and maintain stability. 743 00:40:23,920 --> 00:40:27,120 Speaker 3: Okay, so you understand that there are a lot of 744 00:40:27,200 --> 00:40:30,640 Speaker 3: changes that are going to happen that you cannot avoid. 745 00:40:30,840 --> 00:40:33,359 Speaker 3: So you identify what are the things that you can 746 00:40:33,480 --> 00:40:36,440 Speaker 3: avoid changing, that don't have to change, and you can 747 00:40:36,520 --> 00:40:39,120 Speaker 3: keep those stable to help give yourself a sort of 748 00:40:39,160 --> 00:40:39,840 Speaker 3: a foothold. 749 00:40:40,239 --> 00:40:40,479 Speaker 4: Yeah. 750 00:40:41,239 --> 00:40:44,480 Speaker 2: The next one that they highlight is that of situational grouping. 751 00:40:44,640 --> 00:40:49,160 Speaker 2: So future shock absorbers: social organization based not on what 752 00:40:49,360 --> 00:40:52,320 Speaker 2: we are, but what we are becoming.
753 00:40:53,080 --> 00:40:56,239 Speaker 3: Right, So they say, because life is just going to 754 00:40:56,239 --> 00:40:59,960 Speaker 3: be characterized by more and more change all of the time, 755 00:41:00,880 --> 00:41:05,840 Speaker 3: we should have social groupings that help people through changes. 756 00:41:05,880 --> 00:41:10,400 Speaker 3: They're designed to be sort of social identification groups based 757 00:41:10,440 --> 00:41:14,600 Speaker 3: on the changes that you're experiencing, for example, sort of 758 00:41:14,640 --> 00:41:18,120 Speaker 3: social clubs or social organizations for people who are moving 759 00:41:18,160 --> 00:41:21,200 Speaker 3: to a new city, or people who are changing careers, 760 00:41:21,480 --> 00:41:24,240 Speaker 3: or people who are getting divorced or whatever. 761 00:41:24,440 --> 00:41:27,040 Speaker 2: The ultimate irony, based on what we were just saying, 762 00:41:27,239 --> 00:41:29,800 Speaker 2: is that you do kind of find this with various, 763 00:41:30,239 --> 00:41:32,439 Speaker 2: like, Facebook groups, right, I mean, there is a group 764 00:41:32,480 --> 00:41:36,400 Speaker 2: for everything these days. And you know, I didn't attempt 765 00:41:36,440 --> 00:41:38,560 Speaker 2: to do anything like a tally, but I'm betting there are, 766 00:41:39,600 --> 00:41:43,440 Speaker 2: you know, a huge number of groups that are positioned 767 00:41:43,440 --> 00:41:46,120 Speaker 2: to help people going through various changes in life, or 768 00:41:46,200 --> 00:41:47,680 Speaker 2: could certainly be utilized for them. 769 00:41:48,120 --> 00:41:50,719 Speaker 3: Well, from my own recent memory, I know that there 770 00:41:50,719 --> 00:41:53,000 Speaker 3: are a lot of things that are geared toward people 771 00:41:53,040 --> 00:41:55,400 Speaker 3: who currently have or are about to have a baby.
772 00:41:56,040 --> 00:41:59,680 Speaker 2: Yes, yes. So anyway, with all of these it would be interesting 773 00:41:59,719 --> 00:42:03,040 Speaker 2: to hear listener feedback. But hey, listeners, 774 00:42:03,520 --> 00:42:06,680 Speaker 2: if you have benefited from some sort of a group 775 00:42:06,680 --> 00:42:10,120 Speaker 2: in real life or online, et cetera, that is essentially 776 00:42:10,200 --> 00:42:13,920 Speaker 2: situational grouping, it'd be interesting to hear from you, all right. 777 00:42:13,960 --> 00:42:17,320 Speaker 2: The next one is crisis counseling, one-on-one counseling 778 00:42:17,400 --> 00:42:21,120 Speaker 2: during the crisis of adaptation. 779 00:42:21,239 --> 00:42:23,240 Speaker 2: I felt like this one made 780 00:42:23,280 --> 00:42:25,520 Speaker 2: a fair amount of sense, right, I mean, if 781 00:42:25,560 --> 00:42:28,360 Speaker 2: you're going through some sort of a crisis, be it 782 00:42:28,400 --> 00:42:31,040 Speaker 2: a crisis of future shock or something else, it makes 783 00:42:31,080 --> 00:42:35,160 Speaker 2: sense to get professional help if professional help is available 784 00:42:35,239 --> 00:42:37,320 Speaker 2: and, you know, affordable and so forth. 785 00:42:37,880 --> 00:42:39,920 Speaker 3: I mean, it seems like it would just be generally 786 00:42:39,960 --> 00:42:43,960 Speaker 3: socially beneficial to have more free crisis counseling for crises 787 00:42:44,000 --> 00:42:45,600 Speaker 3: of whatever cause. 788 00:42:45,880 --> 00:42:48,640 Speaker 2: Right right, Yeah, And it also kind of 789 00:42:48,640 --> 00:42:50,399 Speaker 2: falls in line with the whole idea of, like, when 790 00:42:50,440 --> 00:42:53,120 Speaker 2: are you going to get involved 791 00:42:53,120 --> 00:42:55,160 Speaker 2: in a crisis scenario?
Are you going to get involved 792 00:42:55,160 --> 00:42:57,879 Speaker 2: in those little winnable battles or that one massive life 793 00:42:57,920 --> 00:43:01,120 Speaker 2: destroying catastrophe? And I think you can 794 00:43:01,200 --> 00:43:04,239 Speaker 2: certainly point to examples in the world where, yeah, 795 00:43:04,400 --> 00:43:07,080 Speaker 2: some sort of crisis counseling wasn't available, and then at 796 00:43:07,080 --> 00:43:09,000 Speaker 2: the very end, you end up with a situation nobody 797 00:43:09,000 --> 00:43:09,600 Speaker 2: wanted to have 798 00:43:09,480 --> 00:43:10,080 Speaker 4: to contend with. 799 00:43:11,280 --> 00:43:13,960 Speaker 2: All right, the next thing that they recommend is halfway 800 00:43:14,120 --> 00:43:19,120 Speaker 2: house future shock absorbers, so urban recreation centers for rural 801 00:43:19,160 --> 00:43:23,320 Speaker 2: people entering into these systems. I'm not so sure about this. 802 00:43:23,360 --> 00:43:25,640 Speaker 2: And this is the basic idea that, like, cities are 803 00:43:25,640 --> 00:43:27,120 Speaker 2: just going to get bigger and bigger, and they're just 804 00:43:27,120 --> 00:43:30,960 Speaker 2: going to keep drawing in people from rural environments, and 805 00:43:31,040 --> 00:43:33,839 Speaker 2: they're going to go through future shock, especially as they 806 00:43:34,000 --> 00:43:36,320 Speaker 2: enter into these urban centers, and so you're going to 807 00:43:36,400 --> 00:43:38,879 Speaker 2: need to have a halfway house, a place 808 00:43:38,880 --> 00:43:41,839 Speaker 2: where they can go and become better at, and more 809 00:43:41,840 --> 00:43:43,440 Speaker 2: adapted to, what's going on in the city.
810 00:43:43,800 --> 00:43:46,480 Speaker 3: Yeah, I don't know how much 811 00:43:46,520 --> 00:43:48,759 Speaker 3: sense this one makes. Then again, I mean, I guess 812 00:43:48,800 --> 00:43:51,959 Speaker 3: the idea of a halfway house, you know, the main 813 00:43:52,560 --> 00:43:54,880 Speaker 3: sense in which we're familiar with the halfway house is 814 00:43:54,960 --> 00:43:59,680 Speaker 3: like for people coming out of institutions, like people who 815 00:43:59,719 --> 00:44:03,480 Speaker 3: have been in, you know, in a prison or 816 00:44:03,600 --> 00:44:06,000 Speaker 3: maybe in like a mental health care facility or something 817 00:44:06,080 --> 00:44:09,239 Speaker 3: like that, will have a sort of semi-stable house 818 00:44:09,280 --> 00:44:11,440 Speaker 3: where it's like, you know, everything is the same and 819 00:44:11,480 --> 00:44:14,319 Speaker 3: it's a controlled environment for a part of the day, 820 00:44:14,440 --> 00:44:17,200 Speaker 3: maybe at night, where you stay there, and then you 821 00:44:17,680 --> 00:44:22,759 Speaker 3: are being slowly reacclimated into the rest of society by 822 00:44:22,800 --> 00:44:26,160 Speaker 3: spending part of your day outside the house. So I 823 00:44:26,160 --> 00:44:28,000 Speaker 3: don't know. I mean, to think about it in that 824 00:44:28,080 --> 00:44:32,680 Speaker 3: broader sense, I could maybe imagine something like that. But 825 00:44:32,719 --> 00:44:35,120 Speaker 3: then again, I don't 826 00:44:35,160 --> 00:44:38,480 Speaker 3: know how much, like, the future shock is really different 827 00:44:38,600 --> 00:44:43,279 Speaker 3: between, say, like, the rural environment and the city environment. 828 00:44:44,920 --> 00:44:48,120 Speaker 5: I don't know. Maybe I'm wrong about that one. 829 00:44:48,239 --> 00:44:48,759 Speaker 4: Yeah, I don't know.
830 00:44:48,800 --> 00:44:50,879 Speaker 2: It also feels like the whole idea of future shock 831 00:44:50,960 --> 00:44:52,440 Speaker 2: is that you would have to always be in the 832 00:44:52,520 --> 00:44:55,360 Speaker 2: halfway house, because you'd never be able to fully adapt. 833 00:44:55,719 --> 00:44:56,480 Speaker 4: Yeah. 834 00:44:56,560 --> 00:44:59,280 Speaker 2: So yeah, this one I was definitely 835 00:44:59,360 --> 00:45:02,239 Speaker 2: less sure about. The next two are interesting to talk about, 836 00:45:02,280 --> 00:45:06,760 Speaker 2: though I'm not sure how sensible they are either. 837 00:45:07,040 --> 00:45:09,480 Speaker 2: The first is that you would have enclaves of the past, 838 00:45:09,840 --> 00:45:14,840 Speaker 2: so communities in which turnover, novelty, and choice are deliberately limited, 839 00:45:16,040 --> 00:45:22,920 Speaker 2: basically Westworld or some other, like, past or simplified-environment world. 840 00:45:23,280 --> 00:45:25,439 Speaker 2: I guess if we're to realistically imagine it, it's like, Okay, 841 00:45:25,480 --> 00:45:28,320 Speaker 2: there's so much internet, there's internet everywhere. For the weekend, 842 00:45:28,400 --> 00:45:31,319 Speaker 2: we're going to go to No Internetville. And then when 843 00:45:31,360 --> 00:45:33,160 Speaker 2: you go there, you're going to feel a certain bit 844 00:45:33,200 --> 00:45:35,880 Speaker 2: of relief, and we often do. If you 845 00:45:35,920 --> 00:45:38,120 Speaker 2: travel around, you may find yourself in a situation that 846 00:45:38,200 --> 00:45:41,640 Speaker 2: is essentially No Internetville. It may be your mom's house 847 00:45:41,680 --> 00:45:44,720 Speaker 2: in the country.
It may be, you know, you've traveled 848 00:45:44,719 --> 00:45:47,879 Speaker 2: to somewhere where the cell reception isn't there, or you don't 849 00:45:47,880 --> 00:45:50,759 Speaker 2: have an international cell plan or something like that. But 850 00:45:50,840 --> 00:45:54,000 Speaker 2: this would be a place that is deliberately created for 851 00:45:54,040 --> 00:45:54,640 Speaker 2: this purpose. 852 00:45:55,680 --> 00:45:58,200 Speaker 3: This was one section of the solutions here where I 853 00:45:58,280 --> 00:46:01,760 Speaker 3: was noticing what I thought was kind of a recurring 854 00:46:01,840 --> 00:46:05,840 Speaker 3: problem with the book, which in my opinion was, I 855 00:46:05,880 --> 00:46:08,920 Speaker 3: think the book sort of has a problem projecting a 856 00:46:09,120 --> 00:46:13,520 Speaker 3: realistic picture of what is likely to be done. Like, 857 00:46:14,120 --> 00:46:16,520 Speaker 3: there are a lot of recommendations and predictions in the 858 00:46:16,520 --> 00:46:18,520 Speaker 3: book that are things like, you know, yeah, 859 00:46:18,760 --> 00:46:22,799 Speaker 3: we will create future shock absorption zones where people can 860 00:46:22,880 --> 00:46:27,040 Speaker 3: go live as, like, you know, a medieval farmer 861 00:46:27,080 --> 00:46:29,160 Speaker 3: for a few months so that they can recover from 862 00:46:29,200 --> 00:46:32,520 Speaker 3: the pace of change in modern life. And I was 863 00:46:32,640 --> 00:46:34,560 Speaker 3: kind of thinking, like, apart from whether or not that's 864 00:46:34,560 --> 00:46:37,960 Speaker 3: a good idea, like, practically, how is that going to happen? 865 00:46:38,160 --> 00:46:42,160 Speaker 3: You know, like ninety nine percent of people, they can't 866 00:46:42,280 --> 00:46:44,000 Speaker 3: afford to do that.
They've got to go to work 867 00:46:44,080 --> 00:46:46,480 Speaker 3: to make enough money to survive, take care of their 868 00:46:46,520 --> 00:46:49,400 Speaker 3: families every day, and all that. You can't just, like, 869 00:46:49,960 --> 00:46:53,560 Speaker 3: leave for however long you need to, you know, 870 00:46:53,640 --> 00:46:57,160 Speaker 3: recover like this. So I think a lot of predictions 871 00:46:57,680 --> 00:47:01,960 Speaker 3: sort of have the unspoken assumption of a post-scarcity 872 00:47:02,040 --> 00:47:05,680 Speaker 3: abundance future where everything is, like, you know, there's a 873 00:47:05,760 --> 00:47:09,479 Speaker 3: very generous state that, like, provides for whatever people need 874 00:47:09,600 --> 00:47:12,040 Speaker 3: in order to adapt, and, you know, it's almost like 875 00:47:12,080 --> 00:47:15,600 Speaker 3: anything not prevented by the laws of physics will be possible. 876 00:47:16,160 --> 00:47:18,080 Speaker 3: But I think it turns out that there are, like, 877 00:47:18,400 --> 00:47:22,040 Speaker 3: big problems of will and incentive that prevent a lot 878 00:47:22,080 --> 00:47:26,160 Speaker 3: of things from happening, even if they are both desirable 879 00:47:26,320 --> 00:47:30,120 Speaker 3: and physically, technologically possible, just because, like, I don't know, 880 00:47:30,200 --> 00:47:33,640 Speaker 3: it would, like, take resources and will to make this 881 00:47:33,760 --> 00:47:36,600 Speaker 3: possible for people in general, and, you know, that 882 00:47:36,920 --> 00:47:37,640 Speaker 3: hasn't happened. 883 00:47:37,960 --> 00:47:41,480 Speaker 2: Yeah, I mean, they would require a drastic restructuring of society, 884 00:47:41,520 --> 00:47:45,560 Speaker 2: which ultimately is something they kind of push for. Yeah, 885 00:47:45,640 --> 00:47:47,520 Speaker 2: but I think we'll get back to that in 886 00:47:47,560 --> 00:47:50,560 Speaker 2: a minute.
The flip side of Westworld, for those 887 00:47:50,560 --> 00:47:54,080 Speaker 2: of you who are familiar with the original movies at least, 888 00:47:54,280 --> 00:47:57,760 Speaker 2: is of course Futureworld. They also argue for enclaves 889 00:47:57,760 --> 00:48:01,600 Speaker 2: of the future, where people may experience aspects of the 890 00:48:01,600 --> 00:48:05,359 Speaker 2: future in advance, kind of a halfway house for future living. Right, 891 00:48:05,400 --> 00:48:07,160 Speaker 2: It's like, well, I don't know if I'm gonna be ready 892 00:48:07,000 --> 00:48:07,560 Speaker 4: for the future. 893 00:48:07,840 --> 00:48:09,640 Speaker 2: Well, it's all right, we have a place you can 894 00:48:09,680 --> 00:48:11,919 Speaker 2: go for the weekend and that'll help you get ready 895 00:48:11,960 --> 00:48:12,200 Speaker 2: for it. 896 00:48:12,600 --> 00:48:15,919 Speaker 3: Perfect, same caveats I had before. But yes, I think 897 00:48:16,440 --> 00:48:19,000 Speaker 3: that should be, and it should be publicly subsidized. Everybody 898 00:48:19,040 --> 00:48:21,319 Speaker 3: should get to go to Future World, not just 899 00:48:21,400 --> 00:48:22,080 Speaker 3: the super rich. 900 00:48:22,160 --> 00:48:33,080 Speaker 5: As John Hammond said. Now. 901 00:48:33,160 --> 00:48:35,400 Speaker 2: The next one is, I think, a little bit of 902 00:48:35,400 --> 00:48:38,279 Speaker 2: a head scratcher, but also makes sense, but also is 903 00:48:38,320 --> 00:48:40,000 Speaker 2: something that I think we're just going to do on 904 00:48:40,040 --> 00:48:42,279 Speaker 2: our own as human beings. And that is what they 905 00:48:42,320 --> 00:48:46,239 Speaker 2: call global space pageants. So the introduction of additional stability 906 00:48:46,239 --> 00:48:49,080 Speaker 2: points and rituals into a society to give it structure 907 00:48:49,440 --> 00:48:52,600 Speaker 2: as older stability points and rituals fall away.
Again, one 908 00:48:52,640 --> 00:48:55,160 Speaker 2: of their whole things about future shock is, like, even 909 00:48:55,239 --> 00:48:58,880 Speaker 2: the institutions and ideas that you would cling to for 910 00:48:59,719 --> 00:49:02,160 Speaker 2: stability as you're moving into the future and dealing with 911 00:49:02,200 --> 00:49:05,239 Speaker 2: all these other changes, those are falling away too, and 912 00:49:05,280 --> 00:49:08,719 Speaker 2: so you feel completely unmoored from the past. Their 913 00:49:08,760 --> 00:49:10,400 Speaker 2: answer is, well, we just need to keep coming up 914 00:49:10,400 --> 00:49:13,480 Speaker 2: with new things. But aren't we doing that already? Like, 915 00:49:13,560 --> 00:49:16,920 Speaker 2: International Cat Day is a thing and we all love it, 916 00:49:17,239 --> 00:49:20,520 Speaker 2: but it's not like rooted in deep human history or anything. 917 00:49:20,560 --> 00:49:23,560 Speaker 2: I think we're always going to keep 918 00:49:23,600 --> 00:49:26,279 Speaker 2: coming up with things like that. We're going to keep creating these 919 00:49:26,360 --> 00:49:27,880 Speaker 2: quote unquote stability points. 920 00:49:28,360 --> 00:49:30,080 Speaker 3: I think this is a really good point. I think 921 00:49:30,280 --> 00:49:33,400 Speaker 3: in general it's true that we underestimate the importance of 922 00:49:33,520 --> 00:49:36,920 Speaker 3: rituals and the work they do in helping us feel 923 00:49:37,000 --> 00:49:40,040 Speaker 3: stability amidst the inevitable changes of life. 924 00:49:40,600 --> 00:49:41,400 Speaker 4: Yeah. Absolutely.
925 00:49:41,840 --> 00:49:44,840 Speaker 2: And then the final point, the survival tactic that 926 00:49:44,880 --> 00:49:46,600 Speaker 2: they mention, this is a big one, and this one 927 00:49:47,120 --> 00:49:49,360 Speaker 2: makes perfect sense, really, and that is a future-facing 928 00:49:49,480 --> 00:49:53,520 Speaker 2: education system, which is kind of a broad concept, but 929 00:49:53,800 --> 00:49:57,839 Speaker 2: certainly in, like, individual takes on it, this makes 930 00:49:57,840 --> 00:50:00,759 Speaker 2: perfect sense. Like, you should be educating people in a 931 00:50:00,760 --> 00:50:04,600 Speaker 2: way to where the lessons they're learning are applicable to 932 00:50:04,680 --> 00:50:07,480 Speaker 2: the future, preparing them for change, etc. 933 00:50:08,360 --> 00:50:11,280 Speaker 3: Yeah, And a big part of the future-facing education 934 00:50:11,440 --> 00:50:18,480 Speaker 3: system both necessitates and would involve programs for predicting the future. 935 00:50:18,960 --> 00:50:22,520 Speaker 3: So in a lot of ways, the solutions section of 936 00:50:22,920 --> 00:50:26,560 Speaker 3: the Tofflers' book could be seen overall as a case 937 00:50:26,719 --> 00:50:32,160 Speaker 3: for futurology itself. A big part of the Tofflers' solution is, 938 00:50:32,239 --> 00:50:36,920 Speaker 3: for example, what they call anticipatory democracy, which would be 939 00:50:37,120 --> 00:50:41,640 Speaker 3: mechanisms for democratically predicting what will happen in the future 940 00:50:42,160 --> 00:50:45,800 Speaker 3: and setting long term goals. So, you know, technocratic planning 941 00:50:46,080 --> 00:50:49,480 Speaker 3: will have tangible objectives to aim for.
And they spend 942 00:50:49,480 --> 00:50:52,280 Speaker 3: a lot of time talking about the benefits of having 943 00:50:52,680 --> 00:50:57,319 Speaker 3: professional future imagineers who specialize in predicting the future in 944 00:50:57,400 --> 00:51:00,719 Speaker 3: various ways so we can anticipate what's coming and plan 945 00:51:00,840 --> 00:51:03,480 Speaker 3: around it to reduce future shock impacts. 946 00:51:03,520 --> 00:51:04,279 Speaker 2: And uh. 947 00:51:04,840 --> 00:51:08,800 Speaker 3: This could involve anything from banning a particularly dangerous coming 948 00:51:08,800 --> 00:51:13,640 Speaker 3: technological development to coming up with these future shock absorbers, 949 00:51:13,680 --> 00:51:16,440 Speaker 3: like you were just talking about, Rob, in advance of 950 00:51:16,480 --> 00:51:20,000 Speaker 3: the coming changes. But ultimately they say that this work 951 00:51:20,080 --> 00:51:23,080 Speaker 3: cannot be all top down. It can't all be from, 952 00:51:23,200 --> 00:51:26,440 Speaker 3: like, you know, the government Office of Predicting the Future 953 00:51:26,560 --> 00:51:31,439 Speaker 3: with professional imagineers. There's also an important role for 954 00:51:31,560 --> 00:51:36,719 Speaker 3: ad hoc democratic future anticipation. Potentially, they say, maybe you 955 00:51:36,719 --> 00:51:40,560 Speaker 3: should assemble these units out of the people, so, like, 956 00:51:40,719 --> 00:51:43,760 Speaker 3: people maybe would be selected at random, 957 00:51:43,880 --> 00:51:47,839 Speaker 3: just like juries are, from the demos, to come 958 00:51:47,880 --> 00:51:50,120 Speaker 3: together and predict what's going to happen in the future 959 00:51:50,560 --> 00:51:54,319 Speaker 3: and say how that interacts with goals that they 960 00:51:54,400 --> 00:51:58,000 Speaker 3: have for their society and what should be done about it.
961 00:51:59,120 --> 00:52:00,799 Speaker 2: I'm going to read a quote from the book here 962 00:52:00,800 --> 00:52:04,239 Speaker 2: where they get into this. Quote: The time has 963 00:52:04,280 --> 00:52:08,239 Speaker 2: come for dramatic reassessment of the directions of change, a 964 00:52:08,280 --> 00:52:11,640 Speaker 2: reassessment made not by the politicians or the sociologists or 965 00:52:11,640 --> 00:52:15,480 Speaker 2: the clergy or the elitist revolutionaries, not by technicians or 966 00:52:15,520 --> 00:52:19,359 Speaker 2: college presidents, but by people, by the people themselves. We need, 967 00:52:19,440 --> 00:52:22,160 Speaker 2: quite literally, to, quote, go to the people with a 968 00:52:22,239 --> 00:52:25,120 Speaker 2: question that is almost never asked of them: What kind 969 00:52:25,200 --> 00:52:27,760 Speaker 2: of a world do you want ten, twenty, or thirty 970 00:52:27,840 --> 00:52:30,680 Speaker 2: years from now? We need to initiate, in short, a 971 00:52:30,719 --> 00:52:34,759 Speaker 2: continuing plebiscite on the future. The moment is right for 972 00:52:34,840 --> 00:52:38,279 Speaker 2: the formation in each of the high technology nations of 973 00:52:38,320 --> 00:52:41,759 Speaker 2: a movement for total self review, a public self examination 974 00:52:41,840 --> 00:52:45,040 Speaker 2: aimed at broadening and defining, in social as well as 975 00:52:45,080 --> 00:52:49,319 Speaker 2: merely economic terms, the goals of quote unquote progress. On 976 00:52:49,480 --> 00:52:51,440 Speaker 2: the edge of a new millennium, on the brink of 977 00:52:51,480 --> 00:52:54,320 Speaker 2: a new stage of human development, we are racing blindly 978 00:52:54,360 --> 00:52:56,719 Speaker 2: into the future. But where do we want to go? 979 00:52:57,160 --> 00:53:00,239 Speaker 2: What would happen if we actually tried to answer 980 00:53:00,360 --> 00:53:00,920 Speaker 2: this question?
981 00:53:01,360 --> 00:53:04,920 Speaker 3: That's a very good point. And I don't know, I 982 00:53:04,920 --> 00:53:08,880 Speaker 3: don't know about, like, the particular mechanics that the Tofflers suggest, 983 00:53:09,080 --> 00:53:11,200 Speaker 3: but I would say I think it would be a 984 00:53:11,200 --> 00:53:16,279 Speaker 3: good thing if the conversation of democracies was way more 985 00:53:16,360 --> 00:53:19,600 Speaker 3: focused on the long term goals and on the future 986 00:53:19,719 --> 00:53:23,360 Speaker 3: instead of just, like, quarreling over what is or is 987 00:53:23,440 --> 00:53:24,560 Speaker 3: not happening in the present. 988 00:53:25,360 --> 00:53:29,440 Speaker 2: Yeah, so the Tofflers contended that ultimately change 989 00:53:29,480 --> 00:53:33,360 Speaker 2: can certainly remain the protagonist of the human story, provided 990 00:53:33,400 --> 00:53:37,520 Speaker 2: that wild, unchecked and unanticipated change is controlled through a 991 00:53:37,560 --> 00:53:42,040 Speaker 2: host of personal and systemic changes. You know, what they're 992 00:53:42,080 --> 00:53:44,840 Speaker 2: proposing here is hardly as advanced as something like the 993 00:53:44,840 --> 00:53:48,239 Speaker 2: psychohistory of Isaac Asimov's Foundation series, which is, 994 00:53:48,280 --> 00:53:51,240 Speaker 2: you know, sort of an advanced sci fi concept 995 00:53:51,239 --> 00:53:54,840 Speaker 2: of futurology where you're, like, mathematically modeling the trends in 996 00:53:54,880 --> 00:53:58,520 Speaker 2: the future. But this still, at its heart, 997 00:53:58,920 --> 00:54:00,960 Speaker 2: remains a challenge, and, you know, we can point 998 00:54:01,800 --> 00:54:04,319 Speaker 2: to various examples in the modern world to 999 00:54:04,480 --> 00:54:08,960 Speaker 2: underline this.
You know, our inability to really completely 1000 00:54:09,000 --> 00:54:12,359 Speaker 2: appreciate changes that are coming and feel like they are 1001 00:54:12,400 --> 00:54:15,239 Speaker 2: real to us, and then act on that information. And 1002 00:54:15,840 --> 00:54:17,640 Speaker 2: I think climate change is perhaps one of the more 1003 00:54:17,719 --> 00:54:21,680 Speaker 2: alarming examples. You know. In response to scientific consensus that 1004 00:54:21,800 --> 00:54:26,280 Speaker 2: modern industrialized society is inflicting considerable change on our environment 1005 00:54:26,320 --> 00:54:29,000 Speaker 2: and altering the course of temperature and climate, a certain 1006 00:54:29,000 --> 00:54:32,040 Speaker 2: amount of social and political effort has been applied to 1007 00:54:32,080 --> 00:54:35,680 Speaker 2: the problem, but not seemingly enough so far to alter 1008 00:54:35,960 --> 00:54:38,480 Speaker 2: the course to the degree that is advisable. 1009 00:54:38,840 --> 00:54:40,760 Speaker 3: And this seems to come back to 1010 00:54:40,840 --> 00:54:43,520 Speaker 3: some of the points we raised earlier about, like, problems 1011 00:54:43,520 --> 00:54:46,040 Speaker 3: of will and incentives. Like, there are also people who 1012 00:54:46,080 --> 00:54:49,480 Speaker 3: have countervailing incentives and a lot of power to try 1013 00:54:49,480 --> 00:54:50,759 Speaker 3: to pursue those incentives. 1014 00:54:51,440 --> 00:54:53,719 Speaker 2: Yeah, and then just, you know, individually, on the human level, 1015 00:54:53,760 --> 00:54:57,520 Speaker 2: it's like, we just have difficulty dealing with problems that 1016 00:54:57,920 --> 00:55:02,600 Speaker 2: extend into the future, you know, especially the 1017 00:55:02,640 --> 00:55:07,680 Speaker 2: future that exists beyond our own lifespan.
So you know, 1018 00:55:07,719 --> 00:55:09,160 Speaker 2: this is not to say we should give up hope 1019 00:55:09,239 --> 00:55:12,399 Speaker 2: or anything. You know, obviously on the climate front there's 1020 00:55:12,600 --> 00:55:14,839 Speaker 2: a lot of momentum out there, and small changes do 1021 00:55:14,960 --> 00:55:18,200 Speaker 2: add up. But paradoxically, some of the ideas that the 1022 00:55:18,239 --> 00:55:21,960 Speaker 2: Tofflers outline actually seem to work against these efforts, 1023 00:55:21,960 --> 00:55:24,880 Speaker 2: because people, again, don't want to change, and change, 1024 00:55:25,400 --> 00:55:27,680 Speaker 2: be it, you know, various changes in your life to 1025 00:55:27,760 --> 00:55:30,600 Speaker 2: lessen the effects of climate change or the changes brought 1026 00:55:30,640 --> 00:55:35,000 Speaker 2: on by climate change itself, those can be ignored, avoided, 1027 00:55:35,080 --> 00:55:39,840 Speaker 2: and explained away via the various maladaptive coping mechanisms discussed earlier, 1028 00:55:40,040 --> 00:55:41,800 Speaker 2: at least for a certain amount of time. 1029 00:55:42,480 --> 00:55:46,520 Speaker 3: Yeah, thinking of direct analogies to, you know, the denier mindset. Oh, 1030 00:55:46,520 --> 00:55:48,200 Speaker 3: it's always been like this, you know. 1031 00:55:48,320 --> 00:55:51,320 Speaker 2: Oh, okay, this isn't really different, nothing's different. 1032 00:55:51,360 --> 00:55:52,279 Speaker 4: This is just the same. 1033 00:55:52,360 --> 00:55:54,600 Speaker 2: This is just, you know, this is like the Newsweek 1034 00:55:54,600 --> 00:55:58,960 Speaker 2: cover from the nineties. But yeah, exactly. Anyway, you know, 1035 00:55:59,000 --> 00:56:01,120 Speaker 2: the basic paradox here was certainly in play with 1036 00:56:01,160 --> 00:56:03,560 Speaker 2: just future shock in general as a broader topic.
According 1037 00:56:03,560 --> 00:56:06,440 Speaker 2: to the Tofflers, we must change or we 1038 00:56:06,480 --> 00:56:07,400 Speaker 2: will be changed. 1039 00:56:08,719 --> 00:56:09,080 Speaker 4: You can. 1040 00:56:09,120 --> 00:56:11,880 Speaker 2: You can either figure out those small battles that 1041 00:56:12,000 --> 00:56:14,759 Speaker 2: can be won and win those small battles, or you're 1042 00:56:14,800 --> 00:56:17,719 Speaker 2: going to deal with that much larger conflict that is 1043 00:56:17,760 --> 00:56:21,000 Speaker 2: going to be much more difficult to deal with. And again, 1044 00:56:21,040 --> 00:56:24,000 Speaker 2: coming back to that basic question, what kind of future do 1045 00:56:23,880 --> 00:56:26,640 Speaker 4: we actually want? And if, you know, 1046 00:56:26,480 --> 00:56:29,840 Speaker 2: we answer that question, and we answer it honestly and intelligently, 1047 00:56:30,320 --> 00:56:33,360 Speaker 2: then do we have, you know, do we have 1048 00:56:33,400 --> 00:56:35,000 Speaker 2: the will to do something about it? 1049 00:56:35,400 --> 00:56:38,520 Speaker 3: Well said. Well, Rob, I am glad you spurred 1050 00:56:38,600 --> 00:56:41,680 Speaker 3: us to do this exploration of future shock. Like 1051 00:56:41,680 --> 00:56:43,400 Speaker 3: I said in the first episode, I'd never read the 1052 00:56:43,400 --> 00:56:47,160 Speaker 3: book before, I was only a little aware of 1053 00:56:47,160 --> 00:56:50,920 Speaker 3: the concept, and I think this is the 1054 00:56:51,080 --> 00:56:53,000 Speaker 3: kind of book that I think a lot of times 1055 00:56:53,000 --> 00:56:55,719 Speaker 3: people don't go back to.
But I think there's a 1056 00:56:55,760 --> 00:56:59,040 Speaker 3: lot you can learn from it, even in the places it misses; 1057 00:56:59,200 --> 00:57:01,040 Speaker 3: there's a lot you can learn from the 1058 00:57:01,080 --> 00:57:04,680 Speaker 3: things that books like this get right and wrong. It's 1059 00:57:04,680 --> 00:57:08,560 Speaker 3: interesting to see, like, what they got wrong, and why you 1060 00:57:08,680 --> 00:57:11,879 Speaker 3: might think that they would end up thinking like that. 1061 00:57:12,760 --> 00:57:13,319 Speaker 5: So I don't know. 1062 00:57:13,719 --> 00:57:16,280 Speaker 3: I love this sort of thing. The futurology of 1063 00:57:16,320 --> 00:57:18,840 Speaker 3: the past is endlessly interesting to me. 1064 00:57:19,360 --> 00:57:20,400 Speaker 4: Yeah, this is a fun one. 1065 00:57:20,440 --> 00:57:22,400 Speaker 2: So again, we'd love to hear from everyone out there 1066 00:57:22,840 --> 00:57:26,360 Speaker 2: about these episodes, or the book itself if you have 1067 00:57:26,840 --> 00:57:29,760 Speaker 2: read it on your own, or if you've watched the 1068 00:57:29,800 --> 00:57:33,840 Speaker 2: TV documentary version of it, which again can be streamed 1069 00:57:33,840 --> 00:57:37,280 Speaker 2: in some places, though not in great quality. Let us 1070 00:57:37,320 --> 00:57:39,919 Speaker 2: know; we would love to hear from you. But yeah, 1071 00:57:39,920 --> 00:57:41,360 Speaker 2: on that note, we're going to go ahead and close 1072 00:57:41,400 --> 00:57:44,400 Speaker 2: out our look at Future Shock, though we may discuss 1073 00:57:44,440 --> 00:57:46,040 Speaker 2: some of it a little bit more in listener mail 1074 00:57:46,080 --> 00:57:50,000 Speaker 2: episodes ahead.
We'll remind you here that Stuff to Blow 1075 00:57:50,000 --> 00:57:53,400 Speaker 2: Your Mind is primarily a science podcast, with core episodes 1076 00:57:53,440 --> 00:57:56,439 Speaker 2: on Tuesdays and Thursdays, listener mail on Mondays, a short 1077 00:57:56,480 --> 00:57:59,360 Speaker 2: form artifact or monster fact on Wednesdays, and on Fridays, 1078 00:57:59,360 --> 00:58:01,400 Speaker 2: we set aside more serious concerns to just talk about 1079 00:58:01,400 --> 00:58:03,600 Speaker 2: a weird film on Weird House Cinema. 1080 00:58:03,760 --> 00:58:07,400 Speaker 3: Huge thanks as always to our excellent audio producer JJ Posway. 1081 00:58:07,720 --> 00:58:09,360 Speaker 3: If you would like to get in touch with us 1082 00:58:09,400 --> 00:58:12,120 Speaker 3: with feedback on this episode or any other, to suggest 1083 00:58:12,120 --> 00:58:14,160 Speaker 3: a topic for the future, or just to say hello, 1084 00:58:14,320 --> 00:58:17,200 Speaker 3: you can email us at contact at stuff to Blow 1085 00:58:17,200 --> 00:58:25,720 Speaker 3: your Mind dot com. 1086 00:58:25,840 --> 00:58:28,760 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 1087 00:58:28,840 --> 00:58:31,640 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, 1088 00:58:31,800 --> 00:58:48,919 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.