Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Robert: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

Joe: And I am Joe McCormick.

Robert: So the world is, of course, always changing. We know this to be true, and yet, I don't know about the rest of you, and this is probably going to vary greatly. I was actually talking with my wife about this right before I came in. She was like, what are you recording about today? And I was like, well, we're going to be talking about this concept, future shock. And she's like, oh yeah, you've explained it to me before, can you explain it again? And I did, and she said, oh, that's not real, I never feel that way. So I do want to acknowledge straight up that not everyone is necessarily going to feel this way. It is, I guess, subjective. But I don't know, for me, there are times when it feels like change is accelerated, or that the changes occurring in the world, social, technological, scientific, geopolitical, et cetera, are kind of like the topiary animals in Stephen King's novel The Shining, in the book as opposed to the Stanley Kubrick movie. It's not a hedge maze, it's hedge animals, and you look away, and then you look back, and they've moved in closer. So sometimes it feels like that to me with technological advancements and so forth, that they've been steadily advancing on us while our backs are turned, and we only begin to notice, perhaps seemingly too late, that, oh, these things are basically here.

Joe: Okay. So you're saying in this episode, or this series, you wanted to talk about the idea of future shock, and this is brought on by some recent feelings about technology, primarily AI.
Robert: Yeah, I mean, I think AI is one of the main metrics of change that a lot of people are talking about right now. AI has come up on the show quite a bit over the years, including the idea of AI and creativity as recently as just a couple of years ago. Me personally, I found that these concepts felt exciting, maybe a little bit threatening, but not in an immediate sense. At the same time, there were also a lot of optimistic ideas concerning what the future might look like with generative AI and so-called creative AI systems in place: that AI would essentially be our partner in change, that it would be a collaboration. I remember seeing talks about this and examples of how this would play out. You'd adapt to using these new tools as part of your creative process, while in other fields individuals would reskill to adapt to the change. It seemed like there was kind of a roadmap in place, and it eased any rising future shock you might have.

Joe: Okay, so you used to feel more like, whatever changes are going to be brought on by AI, we're taking the proper steps to cushion those blows.

Robert: Yeah. Or at least, not to say that I wasn't exposed to other ideas and more negative views of what could occur, but it seemed like there was enough positivity out there that I was able to sort of buy into it. And then I remember, before the pandemic, at some point, I forget which year this was, I attended a talk at the World Science Festival in which physicist Max Tegmark referenced this idea, this kind of topography of human abilities and jobs, with the idea being that the higher elevations of this topography, like the mountain peaks, were going to be the most protected from the rising sea levels of artificial intelligence. You can find this illustration online, but it's like, down there in the water, that's where Jeopardy, chess, arithmetic, and rote memorization are. Those are already underwater.
But then, as you move up through the topography, eventually you're going to reach the heights of art, book writing, and science.

Joe: So his idea was, things like art, and I'm looking, you've got the image here for me to look at too, things like maybe art, cinematography, writing, science, theorem proving, these are all the peaks that are going to be the last things that AI can reach and replicate.

Robert: Right, right. And there would be other things in the lowlands that would be kind of like next to go, like vision and speech recognition, driving, for example. And not to say that this is now inaccurate or anything, but at the time when I first saw this image, it was interesting, maybe somewhat concerning, but it still didn't feel immediate. But then in the summer of 2022, I chatted with Mike Sharples, the author of Story Machines: How Computers Have Become Creative Writers. And I know we had a lot of fun on some subsequent listener mails using some of these technologies to generate text. But I asked Sharples about this image, this idea from Max Tegmark, and I asked him what he thought about this projection, you know, the idea that human book writing, for example, was in the highlands, and he said that he thought the waters were already considerably high.

Joe: I guess one problem maybe affecting a picture like this, or a topography like this, is that it considers book writing as one thing, whereas it might be quite difficult for AI to write a certain kind of book aimed at a certain kind of audience, but quite easy for AI to write a different kind of book with a different purpose in mind.

Robert: Yeah, yeah. I mean, to sort of throw in a sci-fi example: it's like if you were to find out, oh, the aliens are among us now, and they are disguising themselves as people. The disguises are horrible, but the mere fact that they are now doing it is enough for alarm, you know.
I guess it's that sort of thing. Because right now, you know, can AI write, quote unquote, the Great American Novel? But can it do other things that can be passed off, at least in, you know, the self-publishing marketplace and that sort of thing? Has it reached the point where it is of concern in education, in publishing in general?

Joe: Yes. I think I was just reading an article about people relying on AI-generated travel guides. This is supposed to be the kind of thing that you might imagine AI could do well, because it's just sourcing publicly available information from the Internet and then compiling that into a book. And it's like, oh, okay, so here are the restaurants you can go to in this city, or something like that. But I think I recall, in the article I was reading, that people had been sent wildly astray by these things. But that's still stuff that seems like mere, I don't know, compilation of publicly existing factual information, which should be more easily achievable by AI than, say, writing a really beautiful, expressive literary novel that is meaningful to people.

Robert: Yeah, yeah. And I actually have a friend who experimented with vacation planning via AI, and he thought it was amazing what you could do. But then, of course, came the realization that you've got to do the legwork of fact-checking everything and making sure that it all lines up, because you don't want to just go off with this being your main bit of planning. So yeah, I don't know. Recently it seems like there have been varying waves of, you know, excitement, enthusiasm, concern, malaise, all of this concerning generative AI. You see this wave sort of catching people at different points. Like, I remember when I first saw what some of these visual generative AI programs could do, I was like, wow, that's amazing, look at this, this is kind of great.
It can make a dream like reality, and you can sort of create images that no one is going to create for commercial reasons, or even, you know, for personal artistic reasons, and you can make them sort of real. And then, for me personally, I don't know, that enthusiasm waned a lot when I began to see sort of the soullessness of it, at least from my perspective. It certainly waned when I saw visual artists that I know, or at the very least follow online, registering their concern over how these systems worked, how they were sourcing information. And you continue to see people at different points. You know, some people out there are just discovering some of this technology, and they're at the high point, and maybe they're going to stay up there for a little while. So I don't know. I find I have to sort of exercise some restraint when interacting with people. I don't want to be the person who's immediately trying to squash everyone's excitement for new technology.

Joe: You know, specifically when it comes to the arts, there has long been a kind of implicit model at work in our culture. I'm sure you'll know what I mean when I describe this. It says, well, an artist will sort of have two simultaneous careers. They'll have the stuff that they care about working on, that they don't expect to make money on, that's difficult to make money on, but might actually be quite beautiful and meaningful, and we are glad that we have that stuff in the culture. It is enriching, even if it is not a major source of money moving from place to place. And then, on the other hand, they've got to have a day job, so they do, you know, illustrations for advertisements or something like that, work that's not very inspired and not very fulfilling, but is the way they can use their skills to make some money to pay the bills while they do this other thing.
And so if that secondary thing is now, it's like, oh well, we can just get a program to do that for us, I wonder how that affects the other half of the equation, because now you have an artist who can't subsidize the kind of art they want to do by doing this other job.

Robert: Yeah, yeah, I know exactly what you mean. It's like saying, oh, don't worry, we're not going to replace your art, we just want to replace that side gig you had where you were doing, you know, some mockups of concepts for various projects. And yeah, that may very well be the thing that's sustaining the other efforts.

Joe: So I think that's one way in which, when people have said, like, oh, you know, AI is not going to replace great works of art, they might be kind of missing some of the practical realities of what it means to make a living as an artist.

Robert: Yeah. And so all this was going on, and then it wasn't too long after this that a number of people that we both know lost their writing and editing jobs to generative AI. That was especially a moment for me, one that seemed to come just shockingly fast. It's like people were saying this is the sort of thing that could occur, and I still didn't think that it was on the cusp of happening, and then it did.

Joe: You thought it was farther away.

Robert: Yeah, it just seemed like those topiary animals were farther off, but then suddenly they're right here. So yeah, it suddenly felt like a lot of these advancements were far closer to me and the people around me than they had been previously. It seemed like some of the rosy ideas concerning how it could all play out were maybe not quite as accurate. And I found that it made me feel a bit anxious, you know.
All while everything else in the news cycle was going on, from congressional testimony about UFOs, to the general grindstone of political coverage, to the equally suddenly very real ramifications of climate change, which of course are basically reflected in that Max Tegmark concept, you know, trying to understand the rising threat of AI by comparing it to the literal rising sea waters due to climate change. And it's interesting to use one to understand the other when both suffer at times from this lack of feeling like an immediate concern to many people, even when, here we are, you can just watch the news any given day and see it playing out in real time. So anyway, I didn't think about it much at the time, but then later I was chatting with some friends and I was reminded once more of Alvin Toffler's 1970 book Future Shock, which I'd read myself for the first time maybe ten years ago. And I thought, well, we should come back and revisit it. Like, I wanted to revisit it at this point in my life. I feel like there's more to discuss in terms of where we are in the world at this point, and it also deals with a number of classic Stuff to Blow Your Mind concepts of futurism and change.

Joe: I always enjoy reading about the predictions people made about the future from the distant past. So this book is now fifty-three years old, is that right?

Robert: Yeah, yeah, it's over half a century old. So it's interesting to look back on because, as you might expect, so many ideas are thrown out in this book that, you know, some hauntingly hit the mark, some are way off base, and many of them are just sort of a snapshot of the time.
I mean, this is a book that came together in the late 1960s, you know, at this point of drastic change in America and in the world, and this is what one individual, or a pair of individuals, put together about all of it.

Joe: There is a lot about hippies in this book.

Robert: Yeah, yeah, there's a lot of looking at what the hippie subculture is doing and how it can be like a magic ball, in a sense, to understand the future. And some of it does pan out, like some of the future predictions made by looking at the hippies work; other things, not so much. So we'll definitely get into some of those examples.

Joe: Yeah. So, as I was saying, I love these things, even when they're ludicrously wrong. You know, I like "Criswell predicts," and "future events such as these will concern us all in the future, because that is where we're going to be living the rest of our lives." But, unlike you, I'd never read this book before. I just finished it for the first time literally hours before we started recording here, and I was struck by a very interesting mix of reactions. On one hand, Future Shock is exactly the kind of book that I think one needs to be wary of. I would put it in this category of charismatic big cultural thesis books, books that have a basically easy-to-understand, charismatic idea at the core, that's like, here's the thing that explains what's going on in culture. Books like that can be kind of epistemically dangerous, because it's very appealing to land on a theory that finally makes sense of things. Culture is so confusing, I don't understand what's happening in the world today, and then, oh, here's a thesis that explains what's going on. Now I finally understand it, and you can use that as your lens so that now the world makes sense.
And so it is a book in a way like that, which is especially funny because there is a section in the book warning about books of that sort, and theories of that sort, that explain everything about culture.

Robert: Yeah, and if memory serves, they do kind of acknowledge that this book could be exactly the sort of thing that could be a maladaptive reaction to future shock, to just go all in on it.

Joe: Exactly, yeah. So they have that consciousness, and I think that's fair, and that's a good kind of self-consciousness for the author, or authors. Maybe we can talk about whether we should be talking about the author as Alvin Toffler or as Alvin and Heidi Toffler. The book I read had the author name Alvin Toffler on it, but from what I understand, they're now sort of understood to be co-authors, though she was sort of an anonymous co-author. Is that right?

Robert: Yeah, that's my understanding as well.

Joe: He was a writer, a journalist, and a futurist; she was a researcher and an editor. According to her obituary in The New York Times, she, quote, "served an essential, though anonymous, collaborative role alongside her celebrated husband," and in later works she is credited as co-author. So I think we'll probably use the names interchangeably.

Robert: Like, sometimes we may say "he" when we could easily say "they"; we may say "Toffler," and other times we may say "the Tofflers." But I think it's widely recognized that they worked together on these.

Joe: Okay, yeah, so for now I'll say the Tofflers. I think the Tofflers do show that self-consciousness and acknowledge it, which is useful. But like I said, it's always good to approach books like this cautiously, because there are good books like this that can offer some good insights, but rarely are they correct in everything they claim.
And also, you just have to be conscious that books like this can be more appealing than they deserve. The explanatory power they seem to offer can prove too alluring, and it can easily slip past our defenses arguments that we would notice are weak in another context. Like, I remember at one point, when it was making a point about the diversity of options available in the world today, things you can give your interest to or spend your time on, it said, quote, "Book clubs are finding it increasingly more difficult to choose monthly selections that appeal to large numbers of divergent readers." And at first you can just kind of read that sentence and be like, yeah, yeah. And then I stopped and I was like, wait a minute, how would one know this? No evidence for that claim is given. It's just something that sounds plausible. It probably is, but I have no idea if that statement is true. And there's just a lot of stuff like that in this book, and in books like it: statements that are part of this march of evidence toward the thesis about what's going on in culture, and they sound plausibly enough true, they kind of fit into the rhythm of the argument being developed, and they just wash over you, and you think, yeah, yeah, yeah. So anyway, I guess my point is, when you realize that's happening to you in a book, you should stop and think, okay, wait a minute, maybe I should be a little cautious here.

Robert: Yeah. I mean, we've covered books of this nature before, I mean, the bicameral mind.
You know, I've talked about the works of Terence McKenna on the show before, and I think these are books by individuals who had some amazing ideas, some amazing viewpoints, some things to say that were at times important, at times entertaining, and, you know, also sometimes perhaps incorrect. But yeah, it's like, do you go all in on it? Do you go all in on the Stoned Ape hypothesis, or do you read it, you hear it, but you also keep a foot in other realities as well? So yeah, I think it is important to maybe not go all in on an idea like this, but I still think that there is a lot to learn from it and to draw out of it.

Joe: Yeah, I mean, I think the healthy attitude is, when you're reading something like this, don't get sucked in and say, oh, here's the person who's explaining it all, this is now the teacher and I am the student. Instead, you regard it as: this is a person who's making a series of claims, and you remember to evaluate those claims. And like I said, I think some of the claims made here are pretty good and pretty insightful, and I'll identify those as we go on.

Robert: Now, it also helps that Future Shock just has a great title, the title of the book and also the name of the central thesis, the idea that we as individuals and we as a society are suffering from future shock. So it's no surprise that this book was a huge success when it came out. A lot of people read it, it was translated into so many different languages, and it's my understanding that it's never been out of print. People continue to read it and other works by the Tofflers to this day. So a lot of people were exposed to the ideas in this book. But also, the title itself inspired various things.
So in the British comic book series 2000 AD, there is a regularly occurring section called Thrag's Future Shocks. I don't think it has anything really to do with the central premise here, other than it sounds cool. It's Thrag, or, oh, it's Tharg, I'm sorry. Yes, Tharg's Future Shocks. I don't know that I've ever actually read Tharg's Future Shocks. I've read a lot of Judge Dredd over the years, but only a few things outside of that in 2000 AD. So yeah, I don't think I've read Tharg per se.

Joe: Who is Tharg?

Robert: He's kind of like, you know, a Crypt Keeper, the futuristic monster Crypt Keeper of 2000 AD.

Joe: I see.

Robert: There was also, and I've never seen this, but from 1976 through 1979, James Brown hosted a variety show, and it was called Future Shock, which, again, I don't know how the musical content here was supposed to actually be instilling us with future shock. I think it just sounded cool. There's also a 1994 Vivian Schilling B movie called Future Shock. It has Bill Paxton in it. I haven't seen it, but I get the impression that it only gets into the idea of future shock at a surface level.

Joe: I think I watched it maybe freshman year of college. I do not remember anything about it at all, except that it wasn't good.

Robert: It's not the tour de force that Soultaker was, right? Wasn't she also in Soultaker?

Joe: I don't remember a Soultaker movie.

Robert: Okay. Oh, but we're burying the lede here, because there was also a wonderful 1972 documentary based on the book Future Shock, covering some of the key ideas involved here, hosted by Orson Welles.

Joe: Ah, the French.

Robert: Oh boy.
This one you can definitely find on streaming services, not in great quality, but in, you know, semi-watchable quality. It's somewhat cheesy, still a lot of fun. It certainly leans into the more theatrical aspects of the whole premise, and Orson Welles hams it up a lot. But it does have some effective moments of weirdness. There's one part very early on that really resonated with me when I first saw it, and it continues to resonate with me. So early in the documentary we see Orson Welles, this is late-career Orson Welles, at an airport, having apparently just landed on an airplane. He's smoking a pipe or a cigar or something, and, the way he's talking, it sounds like he came up with future shock himself. He says: "In the course of my work, which takes me to just about every corner of the globe, I see many aspects of a phenomenon which I'm just beginning to understand. Our modern technologies have achieved a degree of sophistication beyond our wildest dreams. But this technology has exacted a pretty heavy price. We live in an age of anxiety, a time of stress, and with all our sophistication, we all are in fact the victims of our own technological strength. We are victims of shock, of future shock." And I'm not accurately presenting it here, but the way he delivers that last line always kind of struck a chord with me, because, you know, Orson Welles, even late in his career, is still a showman. He's hamming it up a lot in this particular documentary, but in that one line I feel like he pours a great deal of compassion into it. He's telling you: look, everything that you've been feeling, perhaps without being able to identify all the causes or even put a name to it, there's a reason you feel like this, and we can put a name to it. It's not your fault, and you are not alone.
Joe: I mean, Orson Welles is a great host to sell any concept. He can really infuse it with feeling. If you've never listened to the outtakes of Orson Welles recording commercials about frozen peas and getting really mad at how bad the copy is, I recommend looking that up. "Every July, peas grow there."

Robert: I don't think I've seen that.

Joe: It's really good.

Robert: But, you know, I feel like this is something that perhaps some people needed to hear in 1970, and maybe some people need to hear it today. So, yeah, I thought it would be rewarding to revisit some aspects of the Tofflers' future shock concept here, talk about how it stacks up or doesn't stack up to today's world, and what we might learn from revisiting the concept.

Joe: Okay. So, fifty-three years on now, we're doing a retrospective on Future Shock.

Robert: Well, it is the future, Joe. It is where we're going to be living the rest of our lives.

Joe: Yes. So, just a little bit more detail, just to get some dates: Alvin Toffler lived 1928 through 2016.

Robert: Heidi Toffler lived 1929 through 2019. Alvin Toffler is credited with coining the term "future shock" in a 1965 article for Horizon magazine, and then they spent the next five years researching, interviewing, editing, and writing, putting together this book, first published in 1970. In short, it attempted to capture the bleeding edge of a rapidly advancing world of science, technology, mass communications, and economics. And on these counts alone, it's often cited as having predicted things like personal computers, the Internet, cable television, and, of course, the current arch-enemy of many companies, telecommuting.

Joe: Now, one trick you can always pull as a futurologist is to make lots of predictions.
As anybody who knows anything about gambling odds knows, if you make lots of predictions, you're just upping the chances that some of them will hit, even if a lot of them miss, and then people remember the hits but not the misses. However, I would say, in the Tofflers' defense, some of the things they get right, I think they do get right in a pretty thoughtful way. It actually seems like they're working out the steps, predicting in a fairly deterministic fashion how their world at the time would lead to something that did fundamentally happen, though maybe not all the details happen the way they think. Like, at one point they do talk about a future of having personally curated, personalized news feeds, but they're talking about these as print-on-demand newspapers.

Robert: Yeah, so the spirit of the thing certainly holds up. I mean, that's how so many people get their news now, via a social media feed, but it's not a printed newspaper.

Joe: Also, they kind of present this as if it's pretty much a great thing, and, I mean, you can't expect people to always work out the implications of everything. So I still think that's fairly insightful.

Robert: Yeah. So, just from the futurology angle, I think this is the kind of thing you see in other works, whether nonfiction futurology or science fiction. You know, Neuromancer by William Gibson has some great ideas about, like, a virtual cyber future, but at the same time they're still using fax machines, stuff like that. I mean, sci-fi is full of that.
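An aside on Joe's gambling-odds point above, since it is easy to make concrete: if a futurist makes n independent predictions, each with only probability p of coming true, the chance that at least one of them lands is 1 - (1 - p)^n, which climbs toward certainty surprisingly fast. Here is a minimal sketch in Python; the 5 percent hit rate and the independence assumption are purely illustrative choices, not figures from the book.

# Sketch of the "make lots of predictions" effect: assumes n independent
# predictions, each with probability p of coming true. Both numbers are
# illustrative assumptions, not anything the Tofflers claim.
def p_at_least_one_hit(n: int, p: float) -> float:
    """Probability that at least one of n independent predictions hits."""
    return 1.0 - (1.0 - p) ** n

for n in (1, 10, 50, 100):
    print(f"{n:>3} predictions at 5% each -> "
          f"{p_at_least_one_hit(n, 0.05):.1%} chance of at least one hit")

# Prints roughly: 5.0%, 40.1%, 92.3%, 99.4%. A hundred long-shot predictions
# all but guarantee at least one remembered "hit," even if nearly all miss.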
Robert: But I think one of the things that really separates Future Shock from so many of these other nonfiction works, and this is where the title comes in, is that it's not just about what the future will consist of, but about how human beings are going to cope with these changes, and with the pace of these changes. The Tofflers were apparently amazed at how little there seemed to be at the time on the topic of adaptivity, especially considering that you had people saying, these are the advancements that are going to occur, this is where our technology is taking us, and we're going to adapt to these changes. He writes, quote: "In the most rapidly changing environment to which man has ever been exposed, we remain pitifully ignorant of how the human animal copes." And, you know, it's interesting how a lot of what he's observing there in 1970 is still the case now. Like, how many different technology companies are pushing some sort of new thing that's going to break the old pattern of how we live our lives, but they haven't really worked out all the potential problems? It's like, here's how we're going to communicate now; no, we didn't think about how this might lead to radicalization and so forth.

Joe: Yeah. I mean, if the spirit of the technological industries can be summed up as "move fast and break things," they're trying to look at, okay, if humans are the things that are getting broken, how does that happen? What happens when the humans break as a result of these changes in technology and their downstream effects on society?

Robert: Yeah, and so really, what happens when humans break due to rapid advancements in technology, that's essentially future shock, according to the Tofflers.
Joe: Now, one thing, again, about these big cultural thesis books that you always have to be careful of: it's very appealing any time somebody says, here's how now is totally different from anything that ever happened before. It's always appealing to think that you live in a unique time in history. But I think the specific argument they're making is pretty well grounded. In fact, I think you can pretty well show that the factual core of the idea of future shock, that technology is changing faster, and changing our lives faster, than at any other time in human history, is correct. That's pretty much inarguable, I would say.

Robert: Yeah. There are times, though, where it's like, let's just follow these various extrapolations to the worst possible ramifications of a given trend. And this is especially true in the TV special, the documentary. There's a part where they're talking about an artificial elbow as "one more step towards an artificial man," where it's like, I mean, I guess maybe so, but really, I don't know.

Joe: Well, and also, you could accurately point out at the time that the density of new medical breakthroughs in the late sixties, as they were writing this book, was just huge. There were so many recent medical breakthroughs compared to what happened in a similar ten-year chunk of time the century before, or the century before that. So yes, things were definitely changing faster, but that leads to parts where, I think there's one part where they're like, you know, with the new heart transplants and other organ transplants, will this lead to roving gangs of murderers who kill people to harvest their organs for transplants?

Robert: Yeah, yeah, that seems like, wow, we're really just following all the worst-case possibilities here, to this post-apocalyptic vision of liver thieves.
589 00:31:50,600 --> 00:31:54,000 Speaker 3: Fortunately, in the future, you can also implant tracking devices 590 00:31:54,000 --> 00:31:55,640 Speaker 3: in your liver so that you can know where they 591 00:31:55,640 --> 00:31:57,240 Speaker 3: took it and who's got your liver now. 592 00:31:58,000 --> 00:31:59,920 Speaker 2: Yeah, but then what do you do when you 593 00:32:00,040 --> 00:32:04,080 Speaker 2: encounter a man made entirely from stolen livers and now 594 00:32:04,120 --> 00:32:07,000 Speaker 2: he's a separate... They don't explore that idea, but yeah, 595 00:32:07,000 --> 00:32:09,360 Speaker 2: there's that. They did hold back on a few things, 596 00:32:09,400 --> 00:32:09,800 Speaker 2: I guess. 597 00:32:10,680 --> 00:32:13,000 Speaker 3: But actually, when you come back to it, so it's 598 00:32:13,040 --> 00:32:15,440 Speaker 3: kind of funny, the artificial elbow, one step closer to 599 00:32:15,480 --> 00:32:19,480 Speaker 3: an artificial man. Like, that's funny, but also it does 600 00:32:19,600 --> 00:32:21,960 Speaker 3: get at something they do in the book that I 601 00:32:21,960 --> 00:32:24,680 Speaker 3: think kind of makes sense, which is, they're saying, when 602 00:32:24,760 --> 00:32:29,960 Speaker 3: we have these, say, like, technological biomedical breakthroughs that 603 00:32:30,080 --> 00:32:33,160 Speaker 3: can change out human body parts and maybe even can 604 00:32:33,200 --> 00:32:37,479 Speaker 3: affect human brains and things like that, it may well 605 00:32:37,560 --> 00:32:41,880 Speaker 3: force us to reckon with medical 606 00:32:41,920 --> 00:32:44,520 Speaker 3: ethics problems that we've never had to consider before. And 607 00:32:44,560 --> 00:32:48,440 Speaker 3: what happens when we're facing brand new medical ethics situations 608 00:32:48,480 --> 00:32:51,160 Speaker 3: that have never existed before, and we're facing tons of 609 00:32:51,200 --> 00:32:54,080 Speaker 3: them and they're coming on rapidly? That is a real 610 00:32:54,080 --> 00:32:57,640 Speaker 3: thing to be concerned about. Like, how fast new medical 611 00:32:57,680 --> 00:33:01,280 Speaker 3: technologies are coming online will present scenarios of things that 612 00:33:01,320 --> 00:33:04,520 Speaker 3: can be done to and with human brains and human 613 00:33:04,560 --> 00:33:07,840 Speaker 3: bodies and human embryos, things where we've never had 614 00:33:07,880 --> 00:33:10,040 Speaker 3: to work out this problem before of what's the right 615 00:33:10,120 --> 00:33:12,080 Speaker 3: thing to do here. And it puts you in a 616 00:33:12,120 --> 00:33:13,840 Speaker 3: tough situation of decision making. 617 00:33:14,280 --> 00:33:17,200 Speaker 2: Yeah, yeah, yeah. And that's where the future shock again 618 00:33:17,360 --> 00:33:21,440 Speaker 2: kicks in. Yeah, the idea, you think, well, an artificial 619 00:33:21,440 --> 00:33:23,560 Speaker 2: man, we'll never make that, and then suddenly there's an 620 00:33:23,600 --> 00:33:26,280 Speaker 2: artificial man, and then you're like, well, now I'm confounded. 621 00:33:26,400 --> 00:33:30,800 Speaker 2: Now I have the future shock. Now, in discussing the 622 00:33:30,800 --> 00:33:33,840 Speaker 2: shape of future societies
in the book, it's also worth 623 00:33:33,840 --> 00:33:36,080 Speaker 2: noting the language is not always as sensitive as it 624 00:33:36,120 --> 00:33:39,680 Speaker 2: would be today, even as it gets some things very 625 00:33:39,720 --> 00:33:42,480 Speaker 2: wrong and some things right about the future shape of, 626 00:33:42,520 --> 00:33:47,560 Speaker 2: say, the family. Specifically, in discussing the possibility of a 627 00:33:47,600 --> 00:33:51,360 Speaker 2: future in which homosexual marriage is common and in which 628 00:33:51,400 --> 00:33:55,760 Speaker 2: same sex couples use adoption to grow their families, everything 629 00:33:55,800 --> 00:33:59,280 Speaker 2: is basically presented by the Tofflers matter of factly, but 630 00:33:59,360 --> 00:34:03,080 Speaker 2: the words marriage and parents are placed in quotations, which 631 00:34:03,280 --> 00:34:05,440 Speaker 2: certainly feels offensive reading it today. 632 00:34:05,840 --> 00:34:07,840 Speaker 3: Yeah, with these kinds of things, again, I feel 633 00:34:07,840 --> 00:34:11,080 Speaker 3: like there's a mix of things going on. At some 634 00:34:11,239 --> 00:34:16,600 Speaker 3: points it feels kind of open minded and progressive and 635 00:34:17,040 --> 00:34:20,799 Speaker 3: accepting about different ways of thinking about family arrangements and 636 00:34:20,800 --> 00:34:23,439 Speaker 3: stuff like that. But then also there are parts where 637 00:34:23,480 --> 00:34:27,160 Speaker 3: it takes, like, moments to emphasize how weird 638 00:34:27,239 --> 00:34:30,680 Speaker 3: everything will feel. Uh, and I think maybe it's part 639 00:34:30,800 --> 00:34:33,520 Speaker 3: of the thesis that that would be true, that, like, 640 00:34:34,200 --> 00:34:38,040 Speaker 3: there will be new social arrangements that not everyone will 641 00:34:38,560 --> 00:34:41,960 Speaker 3: immediately accept or will know, you know, 642 00:34:42,200 --> 00:34:45,040 Speaker 3: how to incorporate into their view of the world, which 643 00:34:45,080 --> 00:34:48,400 Speaker 3: is just true. But sometimes it can feel like, you know, 644 00:34:48,440 --> 00:34:51,239 Speaker 3: they're suggesting, like, wow, look at these weird people, which 645 00:34:51,280 --> 00:34:52,360 Speaker 3: is not very nice. 646 00:34:52,960 --> 00:34:55,520 Speaker 2: Yeah. Yeah, like there's a certain tone deafness in 647 00:34:55,640 --> 00:34:59,000 Speaker 2: labeling a section Homosexual Daddies. This is a part about 648 00:34:59,000 --> 00:35:03,600 Speaker 2: parenting, as if the concept were just altogether futuristic and as if history 649 00:35:03,640 --> 00:35:06,719 Speaker 2: and contemporary nineteen seventy did not contain plenty of gay 650 00:35:06,719 --> 00:35:09,719 Speaker 2: men who were also fathers. Yeah, so, I mean, that 651 00:35:10,280 --> 00:35:12,759 Speaker 2: again just doesn't hold up.
So, but at the 652 00:35:12,800 --> 00:35:16,719 Speaker 2: same time, they're essentially correct on the future of same 653 00:35:16,760 --> 00:35:20,600 Speaker 2: sex couples and their families, while also being somewhat off 654 00:35:20,600 --> 00:35:22,920 Speaker 2: the mark when it comes to, say, the possible future 655 00:35:23,000 --> 00:35:27,120 Speaker 2: of the relaxing of polygamy laws, because again, coming back 656 00:35:27,160 --> 00:35:29,799 Speaker 2: to the hippies, they were like, well, hippies are living 657 00:35:29,880 --> 00:35:33,680 Speaker 2: in communes, hippies are, you know, taking multiple spouses. 658 00:35:33,960 --> 00:35:38,320 Speaker 2: Therefore this will be part of the future as well. 659 00:35:39,280 --> 00:35:41,919 Speaker 2: It hasn't really worked out that way. 660 00:35:42,200 --> 00:35:45,200 Speaker 3: Yeah, they predict, like, that there will be a rise 661 00:35:45,200 --> 00:35:47,880 Speaker 3: in, like, five parent families where each of the parents 662 00:35:47,920 --> 00:35:51,239 Speaker 3: can specialize in different things and all that, because it 663 00:35:51,280 --> 00:35:54,200 Speaker 3: will be necessary because of the technology and the economies 664 00:35:54,200 --> 00:35:54,800 Speaker 3: of the future. 665 00:35:55,160 --> 00:35:56,640 Speaker 2: Yeah, and you see similar things in some of the 666 00:35:57,120 --> 00:36:01,440 Speaker 2: sci fi of the time period as well. Joe Haldeman's 667 00:36:01,480 --> 00:36:04,799 Speaker 2: nineteen seventy four book The Forever War. I believe we've 668 00:36:04,800 --> 00:36:08,920 Speaker 2: talked about this one on Weird House Cinema because he 669 00:36:09,000 --> 00:36:12,480 Speaker 2: was involved in the writing of, oh goodness, Robot Jox. 670 00:36:13,719 --> 00:36:16,400 Speaker 2: But The Forever War is a great book about interstellar 671 00:36:16,440 --> 00:36:19,840 Speaker 2: war fought across time, and with time dilation playing an 672 00:36:20,000 --> 00:36:23,359 Speaker 2: enormous factor in the lives of the soldiers in this war, 673 00:36:24,280 --> 00:36:27,440 Speaker 2: and it kind of progressively depicts, like, the 674 00:36:27,480 --> 00:36:31,200 Speaker 2: sexual politics of an imagined future as the central 675 00:36:31,280 --> 00:36:34,600 Speaker 2: character keeps dipping into societies and technologies that have advanced 676 00:36:34,640 --> 00:36:37,760 Speaker 2: significantly since he last, like, jumped across time and space. 677 00:36:39,239 --> 00:36:41,880 Speaker 2: And for the most part it feels, you know, pretty, 678 00:36:42,000 --> 00:36:46,440 Speaker 2: like, liberal and open minded in its consideration 679 00:36:46,600 --> 00:36:52,279 Speaker 2: of future societies and future sexuality. But it also, like, 680 00:36:52,600 --> 00:36:53,840 Speaker 2: ends up, you know, coming up with sort of 681 00:36:53,840 --> 00:36:56,160 Speaker 2: futuristic lingo for describing all of this. So there are 682 00:36:56,160 --> 00:36:59,840 Speaker 2: a lot of discussions of quote unquote homosex, which 683 00:37:00,280 --> 00:37:04,960 Speaker 2: feel a bit weird reading the book today, even 684 00:37:05,040 --> 00:37:07,960 Speaker 2: if it is discussed as, like, a logical social progression 685 00:37:08,239 --> 00:37:09,320 Speaker 2: in the novel itself. 686 00:37:10,040 --> 00:37:13,120 Speaker 3: Yeah, I have not read that book, but that makes sense.
687 00:37:13,520 --> 00:37:16,399 Speaker 3: And so there are plenty of things, I think, in 688 00:37:16,680 --> 00:37:19,080 Speaker 3: this book from fifty three years ago that did not 689 00:37:19,320 --> 00:37:23,440 Speaker 3: age wonderfully. So some of it would be, like, ways 690 00:37:23,480 --> 00:37:27,560 Speaker 3: of talking about things, even if I think the idea 691 00:37:27,600 --> 00:37:30,359 Speaker 3: of the authors is to portray them somewhat sympathetically. Yeah, 692 00:37:30,640 --> 00:37:34,200 Speaker 3: just the language used feels not as sympathetic as the 693 00:37:34,239 --> 00:37:36,560 Speaker 3: authors would probably want if they were writing it today. 694 00:37:37,239 --> 00:37:50,759 Speaker 2: Yeah. Now, to come back to the central thesis here, 695 00:37:51,719 --> 00:37:55,200 Speaker 2: future shock itself. They do write of it as a disease, 696 00:37:55,760 --> 00:38:00,840 Speaker 2: as a quote unquote social illness. They write, future shock 697 00:38:01,000 --> 00:38:05,719 Speaker 2: is the dizzying disorientation brought on by the premature arrival 698 00:38:05,800 --> 00:38:08,560 Speaker 2: of the future. It may well be the most important 699 00:38:08,600 --> 00:38:13,760 Speaker 2: disease of tomorrow. I do like that, the premature arrival 700 00:38:13,800 --> 00:38:16,879 Speaker 2: of the future, which of course makes sense and doesn't 701 00:38:16,920 --> 00:38:19,319 Speaker 2: make sense at the same time, but does kind of 702 00:38:19,360 --> 00:38:22,200 Speaker 2: adequately sum up this feeling where it's like, whoa, whoa, 703 00:38:22,400 --> 00:38:25,399 Speaker 2: hold on, are we already at this point in our 704 00:38:25,440 --> 00:38:27,160 Speaker 2: technological advancement? 705 00:38:27,520 --> 00:38:29,040 Speaker 3: Well, I feel like we should get more into the 706 00:38:29,080 --> 00:38:31,640 Speaker 3: specifics of what they mean when they say future shock. 707 00:38:31,719 --> 00:38:36,440 Speaker 3: What exactly is this condition or disease or state of 708 00:38:36,440 --> 00:38:38,120 Speaker 3: being they're describing? 709 00:38:38,360 --> 00:38:40,480 Speaker 2: Well, they do point out that it does have some 710 00:38:40,560 --> 00:38:43,239 Speaker 2: things in common with the concept of culture shock, which 711 00:38:43,320 --> 00:38:46,560 Speaker 2: was already a buzzword at this point in time, especially for 712 00:38:46,640 --> 00:38:49,920 Speaker 2: Americans traveling to other cultures and feeling overwhelmed by it, 713 00:38:50,360 --> 00:38:53,600 Speaker 2: and culture shock alone would probably be a fascinating topic 714 00:38:53,680 --> 00:38:55,960 Speaker 2: for us to talk about. I was reading that there 715 00:38:56,000 --> 00:38:59,440 Speaker 2: is a Canadian anthropologist by the name of Kalervo Oberg 716 00:38:59,760 --> 00:39:03,600 Speaker 2: who in nineteen fifty four, like, basically mapped out this 717 00:39:03,680 --> 00:39:06,680 Speaker 2: kind of adjustment period of culture shock. So there's like 718 00:39:06,680 --> 00:39:10,520 Speaker 2: a honeymoon period, and then there's this period called negotiation, 719 00:39:10,840 --> 00:39:14,600 Speaker 2: and this is a high anxiety period, followed by adjustment 720 00:39:14,880 --> 00:39:18,319 Speaker 2: and then ultimately adaptation.
So, you know, already we have 721 00:39:18,320 --> 00:39:20,520 Speaker 2: a pre existing model of, like, what happens when you're 722 00:39:20,520 --> 00:39:27,279 Speaker 2: thrust into a different social and geographic world. You know, 723 00:39:27,480 --> 00:39:29,480 Speaker 2: you have, like, maybe a period of excitement, and then 724 00:39:29,560 --> 00:39:32,120 Speaker 2: you start feeling weird about everything. Then you go through 725 00:39:32,480 --> 00:39:35,200 Speaker 2: some adjustment, and then you eventually reach this point where 726 00:39:35,239 --> 00:39:36,400 Speaker 2: you're adapted to it. 727 00:39:36,680 --> 00:39:40,479 Speaker 3: So their idea is that future shock is like culture shock. 728 00:39:40,560 --> 00:39:42,840 Speaker 3: So culture shock is when you're plunged into a culture 729 00:39:43,000 --> 00:39:46,920 Speaker 3: that you are not adapted to, so you can't predict 730 00:39:47,000 --> 00:39:51,400 Speaker 3: people's reactions appropriately. You don't know what the customs are, 731 00:39:51,800 --> 00:39:55,279 Speaker 3: you don't understand everything that people are saying, you don't 732 00:39:55,280 --> 00:39:58,359 Speaker 3: know exactly how to communicate correctly. There are things all 733 00:39:58,400 --> 00:40:00,719 Speaker 3: around you that you don't know how to use or 734 00:40:00,760 --> 00:40:04,080 Speaker 3: interact with. And over time, you can adapt to this 735 00:40:04,160 --> 00:40:08,160 Speaker 3: in a country: as you become acclimated to the local culture, 736 00:40:08,239 --> 00:40:10,440 Speaker 3: you learn what everything's for, you learn the language, you 737 00:40:10,520 --> 00:40:12,960 Speaker 3: learn better how to communicate, you learn what the customs are, 738 00:40:13,000 --> 00:40:16,200 Speaker 3: and so forth. But what they're saying is, imagine 739 00:40:16,280 --> 00:40:19,400 Speaker 3: there's culture shock, but it's for the whole world, and 740 00:40:19,440 --> 00:40:22,120 Speaker 3: it's for your own culture also, because the culture that 741 00:40:22,160 --> 00:40:25,759 Speaker 3: you're being plunged into, the unfamiliar environment, is not a 742 00:40:25,800 --> 00:40:29,040 Speaker 3: different place but a different time, and it just 743 00:40:29,200 --> 00:40:32,239 Speaker 3: keeps changing. So unlike culture shock, where you can 744 00:40:32,280 --> 00:40:34,840 Speaker 3: eventually look forward to saying, okay, this is 745 00:40:34,840 --> 00:40:37,239 Speaker 3: a temporary experience, and then I'll go back to my 746 00:40:37,320 --> 00:40:39,920 Speaker 3: own culture where I know how to predict things and 747 00:40:39,960 --> 00:40:42,200 Speaker 3: how to do things and interact with people and communicate, 748 00:40:42,920 --> 00:40:44,960 Speaker 3: in this, you can't go back. There's no way to 749 00:40:45,000 --> 00:40:47,279 Speaker 3: go home. There's only the future, and it's just going 750 00:40:47,320 --> 00:40:49,759 Speaker 3: to keep changing, and in fact, it's just going to 751 00:40:49,840 --> 00:40:51,000 Speaker 3: keep changing faster. 752 00:40:52,000 --> 00:40:56,040 Speaker 2: Yeah, which, you know, just that description may raise 753 00:40:56,200 --> 00:40:59,560 Speaker 2: some folks' anxiety.
Yeah, this feeling that, like, you can't 754 00:40:59,560 --> 00:41:03,040 Speaker 2: go back to something that felt comfortable. You're just going 755 00:41:03,080 --> 00:41:08,160 Speaker 2: to be in, like, technological free fall for the duration 756 00:41:08,280 --> 00:41:11,320 Speaker 2: of your life. They write that future shock quote is 757 00:41:11,360 --> 00:41:15,160 Speaker 2: a time phenomenon, a product of the greatly accelerated rate 758 00:41:15,200 --> 00:41:18,960 Speaker 2: of change in society. It arises from the superimposition 759 00:41:19,320 --> 00:41:22,360 Speaker 2: of a new culture on an old one. It is 760 00:41:22,440 --> 00:41:24,719 Speaker 2: culture shock in one's own society. 761 00:41:25,200 --> 00:41:27,520 Speaker 3: And I think an important thing to understand about their 762 00:41:27,680 --> 00:41:30,440 Speaker 3: vision of future shock is that it's not just the 763 00:41:30,600 --> 00:41:36,120 Speaker 3: standard conscious resistance to change that, you know, people often exhibit, 764 00:41:36,160 --> 00:41:40,000 Speaker 3: and that in some ways is associated with kind 765 00:41:40,000 --> 00:41:43,000 Speaker 3: of, like, cultural conservatism or something, where there's, like, a, 766 00:41:43,120 --> 00:41:45,200 Speaker 3: you know, oh, I like things how they used to be, 767 00:41:45,280 --> 00:41:48,640 Speaker 3: I don't want them to change. Instead they're saying that, well, 768 00:41:48,680 --> 00:41:51,080 Speaker 3: of course there is that, but then there's also something 769 00:41:51,120 --> 00:41:53,600 Speaker 3: that just affects people more broadly, which is that the 770 00:41:53,719 --> 00:41:57,879 Speaker 3: technology in our surroundings is changing, and it's changing economics 771 00:41:57,880 --> 00:42:01,800 Speaker 3: and business and culture and everything so fast that even 772 00:42:01,840 --> 00:42:05,080 Speaker 3: people who are not consciously resistant to change are 773 00:42:05,080 --> 00:42:07,680 Speaker 3: in a kind of state of heightened anxiety all the 774 00:42:07,760 --> 00:42:11,160 Speaker 3: time trying to figure out what's going on and adapt 775 00:42:11,200 --> 00:42:11,560 Speaker 3: to it. 776 00:42:12,200 --> 00:42:15,160 Speaker 2: Yeah. Yeah, and these adaptations, you know, this is 777 00:42:15,200 --> 00:42:17,680 Speaker 2: something we'll get into in the next episode, I think. 778 00:42:17,719 --> 00:42:20,640 Speaker 2: You know, these various ideas of, you know, again, like, 779 00:42:21,080 --> 00:42:24,279 Speaker 2: more broadly, what are the defining characteristics of future shock? 780 00:42:24,320 --> 00:42:27,120 Speaker 2: But also, what are some of the maladaptive ways that 781 00:42:27,160 --> 00:42:29,879 Speaker 2: people end up coping with future shock? I find this 782 00:42:30,239 --> 00:42:34,160 Speaker 2: section very interesting. So, yeah, these are going to be 783 00:42:34,160 --> 00:42:36,640 Speaker 2: some of the key areas we dive into. The book 784 00:42:36,760 --> 00:42:38,920 Speaker 2: obviously spends a lot of time approaching the topic from 785 00:42:38,920 --> 00:42:45,240 Speaker 2: different angles: social, technological, business, employment. It gets into transience, 786 00:42:45,560 --> 00:42:50,600 Speaker 2: disposable society, population issues again, modular human beings, and cybernetics, 787 00:42:50,760 --> 00:42:52,319 Speaker 2: all sorts of stuff.
We're not going to try to 788 00:42:52,360 --> 00:42:54,560 Speaker 2: cover everything, but we're going to at least cover some 789 00:42:54,600 --> 00:42:56,840 Speaker 2: of these key bits and some of the things that 790 00:42:56,880 --> 00:43:00,480 Speaker 2: maybe spoke to us the most, revisiting this concept in 791 00:43:00,600 --> 00:43:01,839 Speaker 2: the year twenty twenty three. 792 00:43:02,160 --> 00:43:03,600 Speaker 3: All right, so yeah, I think we're gonna have to 793 00:43:03,640 --> 00:43:06,200 Speaker 3: call this first episode here, but we will be back 794 00:43:06,239 --> 00:43:08,840 Speaker 3: next time to talk about some of these central ideas 795 00:43:08,880 --> 00:43:11,200 Speaker 3: in the book, what we think about them, whether we 796 00:43:11,280 --> 00:43:14,600 Speaker 3: think they were on track or not, and what this 797 00:43:14,640 --> 00:43:17,440 Speaker 3: book, like, what a book of futurology looks like fifty 798 00:43:17,480 --> 00:43:18,000 Speaker 3: years later. 799 00:43:18,719 --> 00:43:22,359 Speaker 2: Yeah, yeah. So we will see you, gentle listeners, in 800 00:43:22,440 --> 00:43:27,839 Speaker 2: the future. That'll be on Thursday. A reminder that Stuff 801 00:43:27,880 --> 00:43:29,879 Speaker 2: to Blow Your Mind is a science podcast with core 802 00:43:29,920 --> 00:43:33,160 Speaker 2: episodes on Tuesdays and Thursdays, listener mail on Mondays, a 803 00:43:33,200 --> 00:43:36,720 Speaker 2: short form Artifact or Monster Fact on Wednesdays, and on Fridays 804 00:43:36,719 --> 00:43:38,960 Speaker 2: we set aside most serious concerns to just talk about 805 00:43:39,000 --> 00:43:42,160 Speaker 2: a weird film. And if everything goes according to plan, 806 00:43:42,360 --> 00:43:44,840 Speaker 2: this week's weird film will also be one that is 807 00:43:44,960 --> 00:43:47,240 Speaker 2: concerned to some degree with the future. 808 00:43:47,840 --> 00:43:51,480 Speaker 3: Huge thanks as always to our excellent audio producer JJ Posway. 809 00:43:51,719 --> 00:43:53,520 Speaker 3: If you would like to get in touch with us 810 00:43:53,520 --> 00:43:56,200 Speaker 3: with feedback on this episode or any other, to suggest 811 00:43:56,200 --> 00:43:58,320 Speaker 3: a topic for the future, or just to say hello, 812 00:43:58,680 --> 00:44:01,160 Speaker 3: you can email us at contact at Stuff to 813 00:44:01,160 --> 00:44:09,600 Speaker 3: Blow Your Mind dot com. 814 00:44:09,640 --> 00:44:12,600 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 815 00:44:12,680 --> 00:44:15,480 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, 816 00:44:15,640 --> 00:44:32,600 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.