Speaker 1: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

Speaker 2: And I am Joe McCormick, and it's Saturday, so we're going into the vault for an older episode of the show. This time we're going back several years to an episode we did on the Black Mirror episode Bandersnatch. This was before Weird House Cinema, so it's not a Weird House Cinema episode; we featured it in a core episode and talked about all kinds of things connected to the plot. It originally aired January ninth, twenty twenty.

Speaker 1: Now, why are we talking about it right now? Well, it's my understanding that Netflix is set to remove Bandersnatch from its platform on May twelfth. And unlike most films, you know, when it leaves one platform it can just pop up on another, right? Some other streaming service can offer it, or it'll come out on physical media maybe, or you can find a stream somewhere.
But of course the thing about Bandersnatch is that it's interactive. It is a choose-your-own-path experience, one of the few interactive experiences that they actually did, and once it is off of the Netflix platform, you're not going to be able to experience it in the same way. Which I thought was brilliant. I thought it was an amazing interface of not only the technical aspects of it, but just the way that Black Mirror made use of that technology to tell a compelling story about choices and the illusion of choice.

Speaker 2: Going to be a shame to lose it, but it seems like a fitting end for an episode of Black Mirror to actually just be destroyed by the kind of whims of a technological behemoth.

Speaker 1: It does, it really does. So yeah, you've literally got, like, what, a couple of days, maybe even down to hours at this point by the time you're listening to this episode.
So jump in there and experience Bandersnatch, or re-experience it while you have time, and enjoy our thoughts from the year twenty twenty about it.

Speaker 2: "And the Banker, inspired with a courage so new it was a matter for general remark, rushed madly ahead and was lost to their view in his zeal to discover the snark. But while he was seeking with thimbles and care, a Bandersnatch swiftly drew nigh and grabbed at the Banker, who shrieked in despair, for he knew it was useless to fly. He offered large discount, he offered a cheque, drawn to bearer, for seven pounds ten."

Speaker 1: "But the Bandersnatch merely extended its neck and grabbed at the Banker again. Without rest or pause, while those frumious jaws went savagely snapping around, he skipped and he hopped, and he floundered and flopped, till, fainting, he fell to the ground. The Bandersnatch fled as the others appeared, led on by that fear-stricken yell, and the Bellman remarked, 'It is just as I feared,' and solemnly tolled on his bell."

Speaker 3: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio's How Stuff Works.

Speaker 1: Hey, welcome to Stuff to Blow Your Mind.
My name is Robert Lamb.

Speaker 2: And I'm Joe McCormick. And Robert, are you feeling more frabjous or more frumious today?

Speaker 1: I guess more frumious would be my answer. That was, of course, the poem The Hunting of the Snark by Lewis Carroll. But I guess a number of people are probably more familiar with the Bandersnatch from another poem by Lewis Carroll, that being "Jabberwocky," yes, where the Bandersnatch is just alluded to as another monstrous creature that might be running around the woods.

Speaker 2: I love it when a poet first names something in kind of a listicle kind of way, you know, poem as listicle, and then later it comes through in another poem with more force. I think that sort of happened with the Demogorgon, right?

Speaker 1: Yeah, yeah, I think so. And in this case, yeah, the Bandersnatch, there's not a lot really said about it in the writings of Lewis Carroll.
Lewis Carroll, by the way, was the pen name of Charles Lutwidge Dodgson, who lived eighteen thirty-two through eighteen ninety-eight, and he first introduced the Bandersnatch, again just in a list of creatures that might pop up, in his eighteen seventy-two novel Through the Looking-Glass, in that poem "Jabberwocky," and then it pops up again in the eighteen seventy-six poem that we just read, The Hunting of the Snark. Which we didn't...

Speaker 2: We didn't read the whole of the poem; that was just an excerpt from it, I think. Where the, like... who's the Banker? The Banker is one of these people who goes on a voyage hunting the snark. I've read that that poem has been interpreted by some as metaphorical, you know, that it's supposed to be an allegory about the search for human happiness and contentment. But then also I think I've heard it alleged that the poem actually has no allegorical meaning, that it's just kind of silly.

Speaker 1: Yeah, I mean, in that sense, it's kind of enigmatic. And the creature itself is enigmatic, scarcely described, but certainly best avoided at all costs.
There's no way to outrun it, no way to escape its intensity. And by the way, "frumious" is a combination of "fuming" and "furious." Carroll just ran these two words together to make a nice new adjective for a strange monster.

Speaker 2: It's a perfectly cromulent word, very frabjous. And I was wondering, do you need a vorpal sword if you go up against a Bandersnatch, or is that only for the Jabberwock?

Speaker 1: Well, it certainly worked on the Jabberwock. I don't know about the Bandersnatch. There are no tales of slaying it, are there? At least not in Lewis Carroll's original work.

Speaker 2: Does the vorpal sword show up as an artifact in D&D?

Speaker 1: It does. Yeah, it certainly does. Yeah, pretty good sword. Oh yeah, yeah, very good sword. Now, the name Bandersnatch has been invoked many times over the years in works of fantasy and science fiction. I've seen it pop up as a space slug and other such creatures. Sometimes it's just kind of an enigmatic name for, like, a government project or something, because it's a great... it's a great name.
In depictions of Lewis Carroll's work, it often takes on a mammalian character. Nineteenth-century children's illustrator Peter Newell depicted it as kind of a furry, horned beast that might resemble a cat or maybe a wolf-like creature, and this one, this is a very popular image. And then film adaptations have depicted it as both boar-like and cat-like. The twenty ten Tim Burton adaptation has a very memorable creature design for the Bandersnatch.

Speaker 4: Oh, the Tim Burton Alice in Wonderland? Is that what you're talking about?

Speaker 1: Yeah, yeah. I've seen the first one.

Speaker 4: I never ventured that far into late Burton.

Speaker 1: Well, it had some things going for it. It had a really good cast, it had some interesting character designs, I'll say that, and a very monstrous-looking Bandersnatch. Okay, now, just a couple of other interesting tidbits about Lewis Carroll. He was a mathematician. He worked in geometry and new ideas in algebra, logic, machines, ciphers. So between this and other details of his life, there's a lot of Black Mirror to the originator of the Bandersnatch.
Also, in Hallucinations, the book by the late Oliver Sacks, Sacks points out that Carroll was known to suffer from classical migraines, and that Caro W. Lippman and others have suggested that his migraine experiences may have contributed to the way he envisioned Through the Looking-Glass and Alice in Wonderland, like the skewing of time and space. Also, you have auditory hallucinations that are not uncommon in migraines, as well as olfactory hallucinations. I've also seen descriptions of this lifting feeling, this feeling of being moved through space.

Speaker 2: Yeah, I guess the extension of the lightheadedness that comes on with the aura and all that.

Speaker 1: Yeah.

Speaker 2: You know, there's actually an asteroid named Bandersnatch.

Speaker 1: Oh, I didn't know this.

Speaker 2: Yeah, I looked this up. 9780 Bandersnatch. It's a main belt asteroid, so it's out beyond the orbit of Mars. Discovered in nineteen ninety-four by Japanese astronomers Takeshi Urata and Yasuhiro Shimizu at the Nachi-Katsuura Observatory, and it was named, of course, after the frumious Bandersnatch.
Speaker 1: Awesome. Now, this is just sort of the introductory material on the Bandersnatch, because for the vast majority of this episode we're going to be talking about what is, I guess, the most recent cinematic invocation of the Bandersnatch, and that is the Black Mirror episode. Well, it's not even an episode. It's a Black Mirror film that came out on Netflix in December of, what was it, twenty eighteen? So a little over a year ago, and there's a lot to unpack here.

Speaker 2: I actually didn't watch it until this week. I knew you wanted to do an episode about it, and I was like, okay, I'll finally see what all the fuss is about.

Speaker 4: I was very impressed.

Speaker 1: Yeah, I'll certainly get more into my thoughts on it later. I was impressed with it when it came out, and then, since we were going to do the episode, I rewatched it earlier this week for the first time since its original release, and I have to say, I thought it held up.
I even got a different ending, and a different dead end at one point, than I had encountered previously. So it's like how every time you watch a film that you like again you find new things, but in this case you can actually get a different ending.

Speaker 2: Yeah. Now, we're going to be exploring today some of the science and the ideas and philosophy that are alluded to in Bandersnatch. But in doing so, of course, this will involve some spoilers for this strange film. So, I would say, we're not going to, like, go through and explore every possible ending or anything like that. But if you haven't seen it yet and you don't want anything at all spoiled, you should probably stop here and go watch it first before you listen to the rest of the episode. But if you've already seen it, or you haven't seen it and you don't care about minor spoilers that don't go all the way to all the endings, then, you know, forge ahead with us.

Speaker 1: Please.
However, some of you may be asking the question: what are you talking about? What is Black Mirror? So we should probably take a few minutes to just refresh you on what this is. Is "refresh" the word? Would that be the word? Or should we shock you to the bone? All right. So Black Mirror is, in essence, a sci-fi anthology television series in the same vein as The Twilight Zone, The Outer Limits, these various shows we've discussed in the past.

Speaker 2: I might call it, pretty often, techno-horror. Not every episode is the same, but there's essentially no horror movie as scary as the scariest episodes of Black Mirror, especially the ones that manage to take fairly plausible technological scenarios and follow them to their logical conclusions. I mean, it's a show that's very good at conjuring up the worst possible nightmares of, like, the intersection of capitalism and technology.

Speaker 1: Yeah, definitely. Episodes tend to have a technological swing to the story, and they tend to deal on some level with contemporary anxiety about current technology and emerging technology. What are these technologies doing to our lives?
What may they do to our lives in the future? And, you know, sometimes they take varying speculative leaps there, of course, since it is science fiction. But, I would say, you typically leave an episode of Black Mirror feeling a little worse about the world. I know that Netflix, their current masters, are very into the whole binge model, but I personally find it very difficult to binge Black Mirror, in part because each episode, of course, is a self-contained story with characters and a plot, et cetera. But then also, it's like they're often like a punch in the gut, and I just can't sit there and take one punch after the other.

Speaker 2: Apart from one very sweet, very nice episode, there's essentially nothing that makes me feel as bad as Black Mirror.

Speaker 1: Well, you know, I was thinking about this, because there are definitely some very bleak episodes. There are episodes of Black Mirror that I admire that I will never watch again. But then I look back at some of my favorite episodes. My favorite episodes are probably San Junipero, USS Callister, and Metalhead.
One of those is still pretty bleak, but two of those are actually pretty upbeat, probably the most upbeat episodes of the show. And maybe that's the reason I would come back to them, because if I'm going to double dip, I want to double dip for optimism's sake. Now, we can't talk about Black Mirror without talking about the creator behind it, the main creative individual behind it, and that is Charlie Brooker, a British writer and humorist. The earliest thing he worked on that I was familiar with was Chris Morris's excellent news satire Brass Eye, and then he also created a pretty great zombie movie titled Dead Set in two thousand and eight, in which the zombie apocalypse breaks out in and around a Big Brother-style reality TV production.

Speaker 2: I feel like two thousand and eight, or maybe two thousand and seven, two thousand and eight, was like peak zombie satire movie.

Speaker 1: Yeah.
And the thing about this one, though: the premise sounds like a comedy, and so I acquired a copy of it thinking, oh, this is a comedy, and this is a guy that worked on Brass Eye, this is going to be hilarious. And it is not a straight-up comedy. It is a pretty terrifying film. But you see shades of that in Black Mirror. Sometimes there is a premise that could sound like a joke, but then it is taken and considered with such intensity that it works.

Speaker 2: Yeah. What if, like, a major tech company used eye-tracking software to make sure you were always watching their ads, and if you didn't watch them, they would ring sirens in your brain and deduct money from your bank account until you started watching the ads again? Sounds like a joke, but if you just say, take that seriously for a bit and explore it, that becomes like a nightmare of techno sci-fi.

Speaker 1: Absolutely. Now, Black Mirror began in twenty eleven. Two seasons and a holiday special came out and ran on Channel 4 in the UK.
Then Netflix started carrying it, and Netflix became the owner, or the main publisher, of the program, however you want to look at it, starting with season three in October of twenty sixteen. All in all, it has thus far gone five seasons, twenty-one episodes, and that's not counting the film Bandersnatch, which again came out in December of twenty eighteen.

Speaker 2: These bits of publisher information will actually become relevant later on as we discuss the story.

Speaker 4: Because the ideas are there.

Speaker 1: Yeah, because at the end we definitely get into some scenarios where we have to consider the fact that Netflix is the business daddy behind Black Mirror.

Speaker 2: So Bandersnatch, the Black Mirror film, was actually directed by David Slade, who... I was like, where do I know that name from? He's done several things, but one of them was one of the Twilight movies.

Speaker 1: Yes, and I've seen that particular one. It was the second one, I think, and I'm not a huge Twilight fan, but that is a very watchable Twilight movie, and it has a great soundtrack.
It's got Thom Yorke on it. Oh yeah, yeah. He also directed the Black Mirror episode Metalhead that I alluded to earlier, and there are some callbacks to Metalhead in the Bandersnatch episode.

Speaker 2: Now, I guess one thing we haven't gotten fully into so far is the fact that the Black Mirror movie Bandersnatch is an interactive movie, which makes it very unique.

Speaker 1: Right. This was the big selling point on it, and indeed it's a key part of the way you consume it, but it is also very important thematically. Like, true to form, the creators here really thought long and hard about how to utilize an interactive system within the work and make the work comment on that system as well.

Speaker 2: Right, the interactive system being Netflix, like the fact that the user can make inputs on the movie.
Speaker 1: Yeah, basically, that's what it amounts to. You start off watching it, and it seems like a normal Netflix presentation, but then, in my case, my Xbox One controller would suddenly vibrate, and then at the bottom of the screen you're suddenly presented with two choices and a timer, and you have to choose, you know, what is going to happen, what the character is going to do, et cetera. Now, when you're just checking out the film, you might not realize how much work goes into this, but it apparently took a huge amount of work to shoot all these various branching paths, because it becomes this tree, this branching system of possibilities, when you start presenting the user with these interactive choices. For instance, previously the longest episode of Black Mirror was an episode called Hated in the Nation, which was eighty-nine minutes long. That's feature length, right? Ninety minutes is usually the length you shoot for with a feature film.
Well, when you're watching Bandersnatch, depending on your choices, the film can run anywhere between ninety minutes and two and a half hours. And in order to make this work, as pointed out by Jackie Strause in The Hollywood Reporter, this means they had to shoot like five hours of footage so that they could actually cover all of these various choices. Wow. And, for instance, the first time I watched it, there were plenty of scenes I did not see, and then when I watched it again, there were scenes that I saw the first time that I did not see, and I got an entirely different ending that I didn't even know about. And then there are, of course, various Easter eggs, and even, I've read, quote, "golden Easter eggs" that are spread throughout, things that most viewers will not find unless they spend a great deal of time going through, and going back through, and backing up, et cetera.

Speaker 2: With this interactive piece encouraging unhealthy, obsessive behavior.

Speaker 1: Yeah, exactly. I mean, it is Black Mirror.
Now, as 348 00:18:46,359 --> 00:18:52,000 Speaker 1: far as the choices you make in Bandersnatch, you start 349 00:18:52,000 --> 00:18:56,360 Speaker 1: off making very small choices that seem fairly inconsequential. For instance, 350 00:18:57,200 --> 00:19:01,200 Speaker 1: choosing the main character's breakfast cereal. His father 351 00:19:01,480 --> 00:19:04,119 Speaker 1: shows two boxes and you decide which one he's going 352 00:19:04,200 --> 00:19:05,200 Speaker 1: to have for breakfast. Yeah. 353 00:19:05,200 --> 00:19:07,399 Speaker 2: I think it was what, like Frosted Flakes or Sugar 354 00:19:07,400 --> 00:19:09,840 Speaker 2: Puffs or something. What I realized after I 355 00:19:09,880 --> 00:19:12,600 Speaker 2: made that choice was, I was like, oh no, I 356 00:19:12,600 --> 00:19:15,640 Speaker 2: think I chose the brand that I was more familiar with. 357 00:19:15,920 --> 00:19:18,800 Speaker 1: Ah, we'll come back to that later. That is an 358 00:19:18,840 --> 00:19:21,399 Speaker 1: important point that we'll come back to later on in 359 00:19:21,440 --> 00:19:24,399 Speaker 1: the episode. But yeah, at first it's what kind 360 00:19:24,440 --> 00:19:26,359 Speaker 1: of cereal does he want? All right? It doesn't 361 00:19:26,359 --> 00:19:28,320 Speaker 1: seem to matter much, and it also gives you a 362 00:19:28,400 --> 00:19:32,520 Speaker 1: chance to try out the technology, low stakes. But 363 00:19:32,960 --> 00:19:35,159 Speaker 1: then also later on, you choose what music he's going 364 00:19:35,240 --> 00:19:38,320 Speaker 1: to listen to when he's on the bus, which 365 00:19:38,320 --> 00:19:39,800 Speaker 1: is kind of fun. I think you get to choose 366 00:19:39,840 --> 00:19:44,359 Speaker 1: between a Eurythmics song and something else. I can't remember 367 00:19:44,440 --> 00:19:44,800 Speaker 1: the other one. 368 00:19:44,800 --> 00:19:46,680 Speaker 4: I think it's Thompson Twins.
369 00:19:46,359 --> 00:19:49,840 Speaker 1: That's it. Yeah. And then later on he's in a 370 00:19:49,920 --> 00:19:51,919 Speaker 1: record store and you get to choose which record he's 371 00:19:51,960 --> 00:19:53,840 Speaker 1: going to buy. And this is also pretty great, because 372 00:19:53,840 --> 00:19:57,439 Speaker 1: one of the choices is Tangerine Dream's excellent nineteen seventy 373 00:19:57,480 --> 00:19:59,960 Speaker 1: four album Phaedra, which is incredible. 374 00:20:00,200 --> 00:20:00,800 Speaker 4: Absolutely. 375 00:20:00,840 --> 00:20:03,000 Speaker 2: In fact, I was listening to that again this morning 376 00:20:03,080 --> 00:20:05,880 Speaker 2: while I was doing some prep for this episode. 377 00:20:06,000 --> 00:20:09,320 Speaker 1: Yeah, it's excellent stuff. However, this time around I forced 378 00:20:09,359 --> 00:20:12,919 Speaker 1: myself to choose the other album instead, the other album 379 00:20:12,960 --> 00:20:18,320 Speaker 1: being Isao Tomita's The Bermuda Triangle, which is very strange. Yeah, 380 00:20:18,440 --> 00:20:21,040 Speaker 1: very strange work, but very good. I was really not 381 00:20:21,080 --> 00:20:24,280 Speaker 1: that familiar with this artist or this work, which is 382 00:20:24,280 --> 00:20:26,720 Speaker 1: apparently kind of hard to come by on streaming unless 383 00:20:26,760 --> 00:20:30,240 Speaker 1: you just find like a YouTube full album rip. But yeah, 384 00:20:30,240 --> 00:20:33,000 Speaker 1: this is just a taste of the soundtrack. Bandersnatch 385 00:20:33,119 --> 00:20:36,080 Speaker 1: has a wonderful soundtrack, including not only these artists but 386 00:20:36,119 --> 00:20:40,080 Speaker 1: also Depeche Mode, Laurie Anderson. Great stuff. But let's 387 00:20:40,119 --> 00:20:43,680 Speaker 1: come back to the choices you make in this interactive system. 388 00:20:44,240 --> 00:20:48,760 Speaker 1: So again, they start off seeming largely inconsequential.
They start 389 00:20:48,800 --> 00:20:50,840 Speaker 1: off seeming a little bit fun. You know, it's just 390 00:20:50,960 --> 00:20:53,639 Speaker 1: surface level stuff like what's his breakfast cereal, what's his 391 00:20:53,720 --> 00:20:57,919 Speaker 1: musical choice? But then they become increasingly high stakes and 392 00:20:58,000 --> 00:21:01,800 Speaker 1: even nerve racking to decide on. Suddenly your controller 393 00:21:02,200 --> 00:21:05,800 Speaker 1: vibrates and you're presented with this choice, and sometimes dread. Yeah, 394 00:21:05,960 --> 00:21:08,280 Speaker 1: you feel this dread, because sometimes with the choices, neither one 395 00:21:08,359 --> 00:21:11,320 Speaker 1: is all that great. Sometimes the choices are kind of horrible, 396 00:21:11,640 --> 00:21:15,280 Speaker 1: and there's at least one point where you have no choice. 397 00:21:14,920 --> 00:21:18,719 Speaker 1: There's something to select, but there's no alternative selection, and 398 00:21:18,760 --> 00:21:21,840 Speaker 1: that feels maddening as well. And you have a 399 00:21:21,880 --> 00:21:23,359 Speaker 1: timer, you have like, what, I think it's ten 400 00:21:23,480 --> 00:21:26,600 Speaker 1: seconds to choose something, and if you don't choose, Netflix 401 00:21:26,680 --> 00:21:29,920 Speaker 1: chooses for you. But Netflix reports ninety four percent of 402 00:21:30,000 --> 00:21:34,240 Speaker 1: viewers actively made choices when they watched Bandersnatch. 403 00:21:34,359 --> 00:21:37,200 Speaker 2: Now, in my experience, it wasn't that they chose for 404 00:21:37,240 --> 00:21:39,679 Speaker 2: you at random. It was that whichever one of the 405 00:21:39,720 --> 00:21:42,240 Speaker 2: two options was highlighted, and it was like a, you know, 406 00:21:42,400 --> 00:21:45,520 Speaker 2: on off toggle, like you couldn't select neither one.
You 407 00:21:45,600 --> 00:21:47,720 Speaker 2: were just selecting one or the other, and then you 408 00:21:47,760 --> 00:21:50,000 Speaker 2: could go with it or you could 409 00:21:50,080 --> 00:21:52,520 Speaker 2: not go with it, and whichever one you had highlighted 410 00:21:52,600 --> 00:21:57,360 Speaker 2: would just proceed. So there's this 411 00:21:57,400 --> 00:22:00,800 Speaker 2: horrible sense of, like, helplessness that that imposes on you 412 00:22:00,880 --> 00:22:01,480 Speaker 2: as the viewer. 413 00:22:03,080 --> 00:22:04,960 Speaker 1: Sometimes I'll go into a 414 00:22:05,080 --> 00:22:07,560 Speaker 1: restaurant or a bar and they'll have Netflix on, playing 415 00:22:07,600 --> 00:22:13,120 Speaker 1: some show. I've never seen them showing Bandersnatch. Probably for this 416 00:22:13,119 --> 00:22:15,840 Speaker 2: reason, letting all the bar patrons vote to decide. 417 00:22:15,920 --> 00:22:18,440 Speaker 1: Well, yeah, or just going crazy, like, why is nobody 418 00:22:18,480 --> 00:22:20,840 Speaker 1: clicking a button? Why is nobody interacting with this? Don't 419 00:22:20,920 --> 00:22:24,200 Speaker 1: let that choice go through. So eventually, as 420 00:22:24,200 --> 00:22:27,920 Speaker 1: you interact with Bandersnatch, a warping of time occurs. 421 00:22:27,960 --> 00:22:31,000 Speaker 1: You find yourself coming back around to past choices like 422 00:22:31,040 --> 00:22:33,880 Speaker 1: a wanderer lost in a maze. And of course, 423 00:22:33,920 --> 00:22:36,280 Speaker 1: befitting of a maze, there is a sort of minotaur 424 00:22:36,359 --> 00:22:38,360 Speaker 1: in all of this. There is the Bandersnatch. 425 00:22:38,680 --> 00:22:41,840 Speaker 2: Wait, is it the Bandersnatch or is it the demon Pax?
426 00:22:42,000 --> 00:22:44,359 Speaker 1: It is the demon Pax, yeah, but it 427 00:22:44,440 --> 00:22:48,000 Speaker 1: is also the Bandersnatch. Like, its design 428 00:22:48,080 --> 00:22:51,480 Speaker 1: is roughly based on that illustration of the Bandersnatch we 429 00:22:51,520 --> 00:22:53,760 Speaker 1: talked about. Okay, cool. All right. We're going to take 430 00:22:53,760 --> 00:22:55,720 Speaker 1: a quick break, but when we come back we will 431 00:22:55,720 --> 00:22:58,679 Speaker 1: get into the themes of Bandersnatch and into the 432 00:22:58,840 --> 00:23:05,359 Speaker 1: nature of choice and free will. All right, we're back. 433 00:23:05,840 --> 00:23:09,399 Speaker 1: So there are a lot of interesting ideas, cool themes, 434 00:23:09,920 --> 00:23:13,920 Speaker 1: historical tidbits that are thrown together, well, not thrown together, 435 00:23:14,720 --> 00:23:19,920 Speaker 1: stitched together, reassembled in Bandersnatch that give it its unique feel. 436 00:23:20,680 --> 00:23:22,640 Speaker 1: Here's just a list of some of the things. First 437 00:23:22,680 --> 00:23:26,320 Speaker 1: of all, video game design circa nineteen eighty four, because 438 00:23:26,320 --> 00:23:27,920 Speaker 1: that is the setting, nineteen eighty four. 439 00:23:28,080 --> 00:23:30,399 Speaker 2: Yeah, it takes place in the eighties, with eighties music, 440 00:23:30,480 --> 00:23:34,320 Speaker 2: eighties fashion, all that stuff. But they're also programming, you know, 441 00:23:34,440 --> 00:23:37,960 Speaker 2: old school adventure games for, like, the Commodore sixty four 442 00:23:38,160 --> 00:23:38,600 Speaker 2: and stuff. 443 00:23:38,680 --> 00:23:41,320 Speaker 1: Yeah. Another huge part of it are Choose Your Own 444 00:23:41,320 --> 00:23:45,439 Speaker 1: Adventure books, which are directly referenced.
And then there is 445 00:23:45,480 --> 00:23:49,080 Speaker 1: a book within Bandersnatch titled Bandersnatch that is this 446 00:23:49,760 --> 00:23:52,919 Speaker 1: enormous tome that we're told is essentially a choose your 447 00:23:52,960 --> 00:23:57,720 Speaker 1: own adventure type scenario. Do you have any fond memories 448 00:23:57,760 --> 00:23:59,200 Speaker 1: of Choose Your Own Adventure books? 449 00:24:00,440 --> 00:24:01,440 Speaker 4: I was obsessed with them. 450 00:24:01,840 --> 00:24:03,919 Speaker 2: I loved them when I was in elementary school, and 451 00:24:03,920 --> 00:24:06,840 Speaker 2: I would love them despite the fact that, you know, 452 00:24:06,880 --> 00:24:10,000 Speaker 2: you die in most of the endings. Like, it imposes 453 00:24:10,040 --> 00:24:14,000 Speaker 2: a kind of horrible paranoid fatalism on a child, I think, 454 00:24:14,040 --> 00:24:17,040 Speaker 2: where, you know, oh, this is a book about exploring 455 00:24:17,080 --> 00:24:19,399 Speaker 2: the Arctic, but almost no matter what you do, you 456 00:24:19,440 --> 00:24:21,879 Speaker 2: get eaten by a polar bear, or you fall beneath 457 00:24:21,880 --> 00:24:25,000 Speaker 2: the ice and you can't get out. I guess my 458 00:24:25,080 --> 00:24:27,120 Speaker 2: young brain was drawn to that kind of thing, though. 459 00:24:27,160 --> 00:24:29,439 Speaker 2: You know, I had that, like, morbid obsession with peril 460 00:24:29,520 --> 00:24:32,800 Speaker 2: and danger and death and all that. But also I'm 461 00:24:32,880 --> 00:24:36,439 Speaker 2: curious what is so appealing about the Choose Your Own 462 00:24:36,480 --> 00:24:38,720 Speaker 2: Adventure books, because one thing we should say is that 463 00:24:38,880 --> 00:24:43,000 Speaker 2: Bandersnatch is not the first interactive film, and previous 464 00:24:43,040 --> 00:24:49,119 Speaker 2: attempts at interactive films have generally been very unpopular.
I 465 00:24:49,119 --> 00:24:52,520 Speaker 2: think a lot of times people don't actually enjoy the 466 00:24:52,640 --> 00:24:56,760 Speaker 2: experience of choosing the outcome of a film, and I 467 00:24:56,800 --> 00:24:58,520 Speaker 2: think there are reasons for that. I mean, for one thing, 468 00:24:58,560 --> 00:25:01,200 Speaker 2: it's just hard to make a story where, like, 469 00:25:01,240 --> 00:25:03,560 Speaker 2: so many different options of how the story could 470 00:25:03,720 --> 00:25:06,119 Speaker 2: go would all be equally satisfying. I mean, there's a 471 00:25:06,160 --> 00:25:08,640 Speaker 2: reason that an author writes a story a certain 472 00:25:08,320 --> 00:25:11,800 Speaker 1: way. Right. For instance, one film that we've talked about 473 00:25:11,800 --> 00:25:16,320 Speaker 1: on the show before, William Castle's Mister Sardonicus from nineteen 474 00:25:16,440 --> 00:25:20,720 Speaker 1: sixty one, was marketed as having an interactive 475 00:25:20,720 --> 00:25:22,720 Speaker 1: element, in that at the end of this you got 476 00:25:22,720 --> 00:25:25,200 Speaker 1: to choose the fate of the villain. Would it be, 477 00:25:25,280 --> 00:25:30,119 Speaker 1: you know, justice or mercy? And the thing is, audiences 478 00:25:30,200 --> 00:25:34,479 Speaker 1: never chose mercy for this horrible villain. Of course, they 479 00:25:34,520 --> 00:25:38,159 Speaker 1: always chose justice, and so there were even accusations that 480 00:25:38,200 --> 00:25:41,280 Speaker 1: they never even shot the alternate version. Like, there was 481 00:25:41,400 --> 00:25:44,919 Speaker 1: the idea that the interactive element was just, you know, 482 00:25:45,000 --> 00:25:47,679 Speaker 1: the pitch was just the marketing, but there was no 483 00:25:47,760 --> 00:25:51,240 Speaker 1: actual interactive element.
William Castle, I think, claimed otherwise, saying 484 00:25:51,280 --> 00:25:53,840 Speaker 1: yes, they did shoot the sequence. I do not know 485 00:25:54,040 --> 00:25:57,719 Speaker 1: personally if that's true or not, if this footage has 486 00:25:57,720 --> 00:26:01,040 Speaker 1: ever materialized, but what I did read was that 487 00:26:01,680 --> 00:26:07,160 Speaker 1: generally people point to nineteen sixty seven's Kinoautomat as 488 00:26:07,240 --> 00:26:10,080 Speaker 1: the first truly interactive film. But even then, I think 489 00:26:10,080 --> 00:26:12,800 Speaker 1: there are only like four choices that could be made, 490 00:26:13,080 --> 00:26:15,560 Speaker 1: and this film was also, I think, largely comedic. 491 00:26:16,160 --> 00:26:19,119 Speaker 2: Okay, well, I mean, I would say there are many 492 00:26:19,200 --> 00:26:22,240 Speaker 2: reasons why this format doesn't always work. For some reason, 493 00:26:22,280 --> 00:26:24,160 Speaker 2: it worked for me as a kid with the Choose 494 00:26:24,200 --> 00:26:27,480 Speaker 2: Your Own Adventure books. I loved those. But I mean, 495 00:26:27,560 --> 00:26:29,840 Speaker 2: one problem, I think, is that it's hard to make 496 00:26:29,840 --> 00:26:32,919 Speaker 2: all the narrative branches as good as each other, but 497 00:26:33,000 --> 00:26:34,320 Speaker 2: another one is just, like... 498 00:26:34,320 --> 00:26:36,680 Speaker 1: Yeah, like, for instance, when you finish it, I don't 499 00:26:36,680 --> 00:26:38,639 Speaker 1: think there was ever a sense where I'm like, okay, 500 00:26:38,680 --> 00:26:41,440 Speaker 1: that's the ending I got. No, I want the good ending, 501 00:26:41,520 --> 00:26:43,640 Speaker 1: or I want the robust ending. You go 502 00:26:43,600 --> 00:26:45,359 Speaker 2: back and do it again. It's more like a video 503 00:26:45,400 --> 00:26:45,840 Speaker 2: game or...
504 00:26:45,840 --> 00:26:48,200 Speaker 1: So I don't want the ending where I randomly die. 505 00:26:48,400 --> 00:26:52,160 Speaker 1: Like, the story of Super Mario is not that he's 506 00:26:52,240 --> 00:26:55,800 Speaker 1: killed by a mutant turtle three minutes into the game, 507 00:26:55,920 --> 00:26:58,399 Speaker 1: you know. I mean, that's not an epic tale. 508 00:26:58,440 --> 00:27:00,520 Speaker 2: So in some ways, I think the Choose Your Own 509 00:27:00,520 --> 00:27:03,680 Speaker 2: Adventure books are sometimes better thought of as, like, a 510 00:27:03,720 --> 00:27:08,119 Speaker 2: puzzle to solve than as, like, a narrative to be experienced. 511 00:27:08,119 --> 00:27:10,600 Speaker 2: And another big difference, I will say, is that one 512 00:27:10,640 --> 00:27:14,240 Speaker 2: of the great pleasures of watching a movie or reading 513 00:27:14,240 --> 00:27:16,240 Speaker 2: a book, or, you know, engaging in any kind of 514 00:27:16,320 --> 00:27:19,880 Speaker 2: narrative with an author as storyteller and you as the passive audience, 515 00:27:20,359 --> 00:27:24,119 Speaker 2: is a surrendering of responsibility for what is about to 516 00:27:24,160 --> 00:27:28,960 Speaker 2: happen, in your own mind. You give up that responsibility, 517 00:27:29,000 --> 00:27:32,280 Speaker 2: and suddenly, you know, when bad things continue to 518 00:27:32,280 --> 00:27:35,199 Speaker 2: happen in the story, when characters make disastrous decisions that 519 00:27:35,359 --> 00:27:38,960 Speaker 2: unfold and increase the peril and heighten the drama, you're 520 00:27:38,960 --> 00:27:42,679 Speaker 2: not responsible for what's happening. You're just witnessing it, and 521 00:27:42,480 --> 00:27:45,440 Speaker 2: that witnessing is very fun. It's peeking through a hole 522 00:27:45,480 --> 00:27:47,720 Speaker 2: in the wall at what's happening to somebody else.
When 523 00:27:47,760 --> 00:27:52,160 Speaker 2: they make you make decisions, it introduces this horrible tension 524 00:27:52,280 --> 00:27:55,680 Speaker 2: between what you want to see versus what you think 525 00:27:55,720 --> 00:27:59,600 Speaker 2: you should do. You know, I think there's 526 00:27:59,640 --> 00:28:03,200 Speaker 2: this tension. A great example would be in Bandersnatch: 527 00:28:03,440 --> 00:28:08,320 Speaker 2: I often felt, in a bizarre way, morally compelled to 528 00:28:08,440 --> 00:28:12,480 Speaker 2: choose the tamer, safer options, while at the same time 529 00:28:12,560 --> 00:28:15,479 Speaker 2: I felt more interested in seeing the more, kind of, 530 00:28:15,520 --> 00:28:18,160 Speaker 2: dangerous, disastrous options play out. 531 00:28:18,440 --> 00:28:21,879 Speaker 1: Yeah, this was definitely my experience with my first viewing 532 00:28:21,880 --> 00:28:26,359 Speaker 1: of Bandersnatch, in that when the decisions start 533 00:28:26,400 --> 00:28:28,719 Speaker 1: hitting you, like later on, they become this horrible 534 00:28:28,800 --> 00:28:30,880 Speaker 1: choice or that horrible choice, and it becomes harder to play 535 00:28:30,880 --> 00:28:34,040 Speaker 1: this game. But earlier on there are moments where you're like, 536 00:28:34,119 --> 00:28:36,840 Speaker 1: are you going to do the sensible thing, or the 537 00:28:36,840 --> 00:28:39,680 Speaker 1: more rebellious thing, or even the more dangerous thing? And 538 00:28:39,760 --> 00:28:44,320 Speaker 1: I found myself choosing the safer thing.
Like minor spoiler here, 539 00:28:44,440 --> 00:28:48,400 Speaker 1: but he's offered the choice between producing his 540 00:28:48,560 --> 00:28:52,760 Speaker 1: dream game with this company at their offices, with their support, 541 00:28:53,200 --> 00:28:56,840 Speaker 1: or saying no to them. And so, like, the responsible 542 00:28:56,840 --> 00:28:58,640 Speaker 1: part of me is like, yes, say yes to this. 543 00:28:58,920 --> 00:29:01,000 Speaker 1: It's employment, this is going to be good for you. Like, 544 00:29:01,040 --> 00:29:03,400 Speaker 1: clearly you're stuck in a weird situation at home. 545 00:29:03,440 --> 00:29:06,800 Speaker 1: You need to get out of the house, protagonist. And 546 00:29:07,480 --> 00:29:10,640 Speaker 1: so that's the way I went. But it's ultimately 547 00:29:10,640 --> 00:29:12,800 Speaker 1: not the best choice, and it kind of dead ends 548 00:29:12,840 --> 00:29:13,880 Speaker 1: if you take that choice. 549 00:29:14,120 --> 00:29:16,600 Speaker 2: Well, yeah, it almost kind of gives you a little 550 00:29:16,600 --> 00:29:19,080 Speaker 2: slap on the wrist for making that choice, you know. 551 00:29:19,200 --> 00:29:22,640 Speaker 2: So I don't want to spoil anything, but yeah, 552 00:29:22,800 --> 00:29:26,320 Speaker 2: there's like a slight shaming of the viewer for choosing 553 00:29:26,360 --> 00:29:27,120 Speaker 2: the safe option. 554 00:29:27,240 --> 00:29:29,080 Speaker 1: Yeah, and this is very early on, so we're not 555 00:29:29,120 --> 00:29:32,240 Speaker 1: really, you know, spoiling anything, I think. But yeah, 556 00:29:32,280 --> 00:29:33,720 Speaker 1: I would do that a lot. I would 557 00:29:33,720 --> 00:29:36,640 Speaker 1: make safe choices.
And in fact, it ultimately ended 558 00:29:36,680 --> 00:29:39,480 Speaker 1: up reminding me a little bit of the Spacing Guild 559 00:29:39,480 --> 00:29:41,960 Speaker 1: in Dune, who of course use the spice to see 560 00:29:41,960 --> 00:29:43,680 Speaker 1: into the future, to figure out how to navigate the 561 00:29:43,760 --> 00:29:46,760 Speaker 1: dangers of space, which is helpful if you're navigating the 562 00:29:46,840 --> 00:29:50,000 Speaker 1: dangers of space. But in life and in politics and 563 00:29:50,040 --> 00:29:53,200 Speaker 1: all these other choices, it's this road to stagnation for 564 00:29:53,240 --> 00:29:56,680 Speaker 1: the Spacing Guild, because they always make the safe choice. 565 00:29:57,000 --> 00:30:00,920 Speaker 1: And when we look at the narratives that we love, generally 566 00:30:00,920 --> 00:30:04,080 Speaker 1: they're not about people making the safe choice after safe 567 00:30:04,160 --> 00:30:07,560 Speaker 1: choice after safe choice. They're about people flying off the 568 00:30:07,600 --> 00:30:10,560 Speaker 1: handle or making huge mistakes and having to deal with those. 569 00:30:10,920 --> 00:30:12,840 Speaker 1: And so I think there's a learning curve 570 00:30:12,880 --> 00:30:16,640 Speaker 1: there with Bandersnatch. And so on my second viewing of it, 571 00:30:16,680 --> 00:30:18,200 Speaker 1: I tried to do more of that. I tried to 572 00:30:18,240 --> 00:30:24,320 Speaker 1: make choices that I felt were interesting or more dramatic, 573 00:30:24,960 --> 00:30:27,640 Speaker 1: and that seemed to work really well, and I feel 574 00:30:27,640 --> 00:30:31,800 Speaker 1: like the product rewards you for doing that.
575 00:30:32,200 --> 00:30:35,400 Speaker 2: Yeah. So I think that tension is definitely there with 576 00:30:35,440 --> 00:30:38,080 Speaker 2: the movies, and I wonder if it's more the case 577 00:30:38,120 --> 00:30:39,800 Speaker 2: in a movie than in a book, just because a 578 00:30:39,840 --> 00:30:43,760 Speaker 2: movie is more sensorily visceral. The fact that, you know, 579 00:30:43,880 --> 00:30:47,160 Speaker 2: it's actually visually presented to you in video and 580 00:30:47,240 --> 00:30:52,280 Speaker 2: audio makes it harder to just pursue, you know, your 581 00:30:52,400 --> 00:30:56,200 Speaker 2: sort of lust for drama and weirdness and whatever it 582 00:30:56,240 --> 00:30:59,280 Speaker 2: is you want to see, as opposed to making the 583 00:30:59,280 --> 00:31:02,360 Speaker 2: safer choices. I don't recall feeling compelled to make the 584 00:31:02,400 --> 00:31:05,480 Speaker 2: safer choice the same way with Choose Your Own Adventure books. 585 00:31:06,640 --> 00:31:10,040 Speaker 2: That could just be because of, like, the lower sensory 586 00:31:10,120 --> 00:31:12,479 Speaker 2: salience of books compared to movies. 587 00:31:12,520 --> 00:31:16,200 Speaker 1: I don't know. Yeah, maybe. So I fondly remember 588 00:31:16,200 --> 00:31:18,360 Speaker 1: the Choose Your Own Adventure books as well, in 589 00:31:18,400 --> 00:31:20,040 Speaker 1: part because they had them at the library and I 590 00:31:20,080 --> 00:31:23,520 Speaker 1: could check them out. Yeah. But also another series that 591 00:31:23,600 --> 00:31:28,160 Speaker 1: I fondly remember is the Lone Wolf series. Were you familiar 592 00:31:28,200 --> 00:31:31,240 Speaker 1: with these? There's a whole series of these.
The 593 00:31:31,240 --> 00:31:35,600 Speaker 1: first one was by Joe Dever and Gary Chalk, and 594 00:31:35,680 --> 00:31:38,240 Speaker 1: they're like a Choose Your Own Adventure series, 595 00:31:38,400 --> 00:31:42,080 Speaker 1: very much Dungeons and Dragons style high fantasy, but 596 00:31:42,120 --> 00:31:43,960 Speaker 1: there's more of a role playing element to it. So, 597 00:31:44,000 --> 00:31:47,040 Speaker 1: for instance, when you open the book, it has not 598 00:31:47,080 --> 00:31:50,560 Speaker 1: only a map of the adventuring world you're taking 599 00:31:50,640 --> 00:31:53,000 Speaker 1: part in, but there's also an action chart and a 600 00:31:53,040 --> 00:31:55,920 Speaker 1: combat record, because you're going to end up having to 601 00:31:55,960 --> 00:31:58,560 Speaker 1: pencil in your stats as you go through the story, 602 00:31:59,160 --> 00:32:01,040 Speaker 1: picking spells and so forth. 603 00:32:01,920 --> 00:32:04,160 Speaker 2: It's more like a one player D and D module. 604 00:32:04,280 --> 00:32:07,120 Speaker 1: Yeah, exactly. It's like a Choose Your 605 00:32:07,160 --> 00:32:09,240 Speaker 1: Own Adventure book and a one player D and 606 00:32:09,320 --> 00:32:12,719 Speaker 1: D module come together into this one little tome. So 607 00:32:12,760 --> 00:32:15,880 Speaker 1: I fondly remember those, and I might be misremembering here, 608 00:32:16,120 --> 00:32:18,880 Speaker 1: but I think I did get turned off later on 609 00:32:18,960 --> 00:32:23,240 Speaker 1: when I reached an artificial dead end in one of them, 610 00:32:23,280 --> 00:32:25,720 Speaker 1: like there was something broken and I couldn't go back. 611 00:32:25,840 --> 00:32:29,400 Speaker 1: Oh no, yeah, but again my memory may not be 612 00:32:29,440 --> 00:32:31,960 Speaker 1: perfect on that.
If you're at all interested in this format, 613 00:32:32,000 --> 00:32:34,720 Speaker 1: I do highly recommend picking up one of these fabulous 614 00:32:34,880 --> 00:32:37,880 Speaker 1: old used copies of the Lone Wolf series. And I 615 00:32:37,880 --> 00:32:40,920 Speaker 1: think they've republished them again with new artwork, but I 616 00:32:40,960 --> 00:32:43,960 Speaker 1: don't know, the classic artwork is exactly the kind of 617 00:32:44,000 --> 00:32:44,520 Speaker 1: thing I love. 618 00:32:44,880 --> 00:32:47,000 Speaker 2: The Choose Your Own Adventure book that I brought in 619 00:32:47,080 --> 00:32:49,400 Speaker 2: today for you to look at, Robert, is called You 620 00:32:49,480 --> 00:32:52,880 Speaker 2: Are a Shark, by Edward Packard. It has a kid 621 00:32:52,960 --> 00:32:56,560 Speaker 2: turning into a shark. He's like mid animorph sequence, oh man, 622 00:32:56,720 --> 00:32:59,080 Speaker 2: but he also looks like he's slipping and sliding as he 623 00:32:59,160 --> 00:33:00,240 Speaker 2: turns into a shark. 624 00:33:00,520 --> 00:33:01,080 Speaker 4: It's pretty good. 625 00:33:01,120 --> 00:33:05,600 Speaker 1: That's pretty brilliant too, like channeling something that children, especially 626 00:33:05,600 --> 00:33:08,160 Speaker 1: of that time, would have been familiar with, would have likely done, 627 00:33:08,520 --> 00:33:10,680 Speaker 1: and giving this fantastic spin on it. 628 00:33:10,760 --> 00:33:15,280 Speaker 2: But the story is essentially the Fingal doppling scene from 629 00:33:15,600 --> 00:33:19,080 Speaker 2: Overdrawn at the Memory Bank, where he just gets transformed 630 00:33:19,080 --> 00:33:21,960 Speaker 2: into various different animals. You know, you get turned 631 00:33:22,000 --> 00:33:25,040 Speaker 2: into an elephant or a seagull, or of course a shark.
632 00:33:25,360 --> 00:33:27,480 Speaker 2: I think I recall one death where you get turned 633 00:33:27,480 --> 00:33:30,120 Speaker 2: into a squid and you're being chased by something, maybe 634 00:33:30,120 --> 00:33:31,920 Speaker 2: it's a shark, and you run out of ink 635 00:33:32,040 --> 00:33:34,960 Speaker 2: to disguise yourself with, and you're doomed. 636 00:33:36,520 --> 00:33:39,600 Speaker 1: All right. Well, coming back to Bandersnatch, we mentioned the 637 00:33:39,680 --> 00:33:42,840 Speaker 1: video game aspect, nineteen eighty four, Choose Your Own Adventure books. 638 00:33:43,200 --> 00:33:45,640 Speaker 1: There are a number of other elements and homages in 639 00:33:45,680 --> 00:33:48,840 Speaker 1: there as well. It deals with mental illness, it deals 640 00:33:48,880 --> 00:33:54,440 Speaker 1: with LSD. There are allusions to Philip K. Dick. There's 641 00:33:54,600 --> 00:33:57,840 Speaker 1: mention of alternate timelines, and of course it spends a 642 00:33:57,880 --> 00:34:00,640 Speaker 1: lot of time contemplating this idea of free will 643 00:34:01,000 --> 00:34:02,959 Speaker 1: and the potential illusion of choice. 644 00:34:03,160 --> 00:34:03,360 Speaker 4: Yeah. 645 00:34:03,400 --> 00:34:07,320 Speaker 2: I think that's the main theme of it: interrogating 646 00:34:07,360 --> 00:34:09,680 Speaker 2: the idea of what it means to be in control 647 00:34:09,760 --> 00:34:11,160 Speaker 2: of one's own actions. 648 00:34:11,560 --> 00:34:14,759 Speaker 1: Yeah. And the basic plot is as follows. A young 649 00:34:14,800 --> 00:34:18,000 Speaker 1: programmer named Stefan Butler is obsessed with a choose your 650 00:34:18,000 --> 00:34:22,120 Speaker 1: own adventure style book titled Bandersnatch that was written by 651 00:34:22,480 --> 00:34:25,919 Speaker 1: the late troubled writer Jerome F.
Davies, and he really 652 00:34:26,000 --> 00:34:28,520 Speaker 1: wants to turn this into a computer adventure game, and 653 00:34:28,560 --> 00:34:31,080 Speaker 1: he's begun work on it on his own. So he 654 00:34:31,160 --> 00:34:34,440 Speaker 1: ends up falling in with this video game company called 655 00:34:34,680 --> 00:34:40,400 Speaker 1: Tuckersoft and meets its lead creative, this programmer named Colin Ritman. 656 00:34:40,920 --> 00:34:44,680 Speaker 1: And from there it departs down these various winding paths, 657 00:34:44,760 --> 00:34:49,520 Speaker 1: reality warping, through madness and sometimes horror. And through all 658 00:34:49,520 --> 00:34:51,920 Speaker 1: of it, there's also this feeling that there is a 659 00:34:51,960 --> 00:34:56,400 Speaker 1: minotaur like monster pursuing you, pursuing our protagonist as well. 660 00:34:56,680 --> 00:34:59,960 Speaker 1: And this is the Bandersnatch, but more specifically, 661 00:35:00,120 --> 00:35:02,600 Speaker 1: its name is Pax, and, we 662 00:35:02,680 --> 00:35:04,600 Speaker 1: are told, it is the Thief of Destiny. 663 00:35:05,360 --> 00:35:07,120 Speaker 2: Yeah, there's a great moment where the game appears to 664 00:35:07,120 --> 00:35:10,640 Speaker 2: give you an option to either deny worshiping Pax or 665 00:35:10,680 --> 00:35:12,160 Speaker 2: submit to worshiping Pax. 666 00:35:12,200 --> 00:35:13,799 Speaker 1: Oh yes, this is the game within a game. 667 00:35:14,239 --> 00:35:15,439 Speaker 4: It made me want to play the game. 668 00:35:15,520 --> 00:35:16,760 Speaker 1: Yeah, yeah, it looked really cool. 669 00:35:16,960 --> 00:35:17,759 Speaker 4: So, something that 670 00:35:17,719 --> 00:35:20,640 Speaker 2: made Bandersnatch different from most of the choose your own 671 00:35:20,680 --> 00:35:23,719 Speaker 2: adventure books that I remember reading.
I'm sure there are 672 00:35:23,719 --> 00:35:26,920 Speaker 2: probably exceptions, but in the classic books I remember reading, 673 00:35:27,480 --> 00:35:30,879 Speaker 2: the story is written in the second person. The protagonist 674 00:35:31,080 --> 00:35:34,120 Speaker 2: is an unnamed you. You know, you go down the 675 00:35:34,239 --> 00:35:36,920 Speaker 2: left hall, you get eaten by a swarm of feral pigs. 676 00:35:37,200 --> 00:35:39,880 Speaker 2: You go down the right hall, you get turned into 677 00:35:40,040 --> 00:35:42,120 Speaker 2: a bowl of ice cream by a magic pirate. You know, 678 00:35:43,239 --> 00:35:46,319 Speaker 2: you explore all the different dooms on offer to you, 679 00:35:46,400 --> 00:35:47,080 Speaker 2: but it's you. 680 00:35:47,520 --> 00:35:51,400 Speaker 1: Yeah. Likewise, in the Lone Wolf books, as I recall, 681 00:35:51,400 --> 00:35:54,680 Speaker 1: you kind of make choices regarding how this character is put together. 682 00:35:54,800 --> 00:35:58,080 Speaker 1: You have a fair amount of control. It is your character. 683 00:35:58,880 --> 00:36:02,080 Speaker 2: But Bandersnatch challenges this formula a little bit 684 00:36:02,320 --> 00:36:06,400 Speaker 2: by making the protagonist a third person character with a 685 00:36:06,560 --> 00:36:11,000 Speaker 2: name and pre existing individualized circumstances. You've got Stefan, right? 686 00:36:11,800 --> 00:36:14,680 Speaker 2: But then this is where it starts getting even weirder. 687 00:36:15,040 --> 00:36:17,960 Speaker 2: So not only is it this definite third person character 688 00:36:18,120 --> 00:36:21,759 Speaker 2: with their own characteristics and not just a second person protagonist, 689 00:36:22,280 --> 00:36:25,960 Speaker 2: there are moments where the options you choose are not 690 00:36:26,200 --> 00:36:29,040 Speaker 2: what Stefan does. That's how it mostly is, you know: 691 00:36:29,080 --> 00:36:31,759 Speaker 2: what does Stefan pick?
You know what, what does he 692 00:36:31,840 --> 00:36:34,920 Speaker 2: listen to? What does he answer to somebody who poses 693 00:36:34,920 --> 00:36:37,400 Speaker 2: a question to him? It then changes and gives you 694 00:36:37,440 --> 00:36:41,000 Speaker 2: the option to dictate what happens to him from the 695 00:36:41,080 --> 00:36:45,279 Speaker 2: outside. The specific example I recall is what messages he 696 00:36:45,400 --> 00:36:49,600 Speaker 2: believes he is receiving on his computer screen. Oh yeah. Now, 697 00:36:49,640 --> 00:36:52,000 Speaker 2: of course, if you go with the most straightforward interpretation 698 00:36:52,120 --> 00:36:54,840 Speaker 2: of the story, which is that Stefan is experiencing symptoms 699 00:36:54,880 --> 00:36:58,239 Speaker 2: of psychosis, in a way, you're still dictating the activity 700 00:36:58,280 --> 00:37:02,239 Speaker 2: of his brain, activities of his brain that he as 701 00:37:02,280 --> 00:37:05,960 Speaker 2: a character does not perceive as coming from himself. They're 702 00:37:06,280 --> 00:37:09,480 Speaker 2: hallucinations that he believes to be coming from the outside. 703 00:37:10,000 --> 00:37:12,560 Speaker 2: And you know, this makes me wonder about the framing 704 00:37:12,600 --> 00:37:16,160 Speaker 2: of how we should think about hallucinations that are generated 705 00:37:16,239 --> 00:37:19,480 Speaker 2: internally by the brain but perceived to come from an 706 00:37:19,480 --> 00:37:25,040 Speaker 2: external source. Are those hallucinations best understood as you or not? 707 00:37:26,080 --> 00:37:29,359 Speaker 2: Are there processes within your own brain that are, in 708 00:37:29,400 --> 00:37:32,960 Speaker 2: some legitimate sense, not you, even though they are your brain, 709 00:37:33,000 --> 00:37:34,120 Speaker 2: they're not anybody else? 710 00:37:34,640 --> 00:37:38,000 Speaker 1: Yeah, it's not really the voice of God.
It is, 711 00:37:38,000 --> 00:37:40,960 Speaker 1: it is something coming from inside your brain that 712 00:37:41,040 --> 00:37:46,040 Speaker 1: you are perhaps imagining or interpreting as the voice of God. 713 00:37:46,400 --> 00:37:50,000 Speaker 2: But is you more synonymous with your whole brain and 714 00:37:50,040 --> 00:37:53,600 Speaker 2: everything it does, or is you more synonymous with the 715 00:37:53,640 --> 00:37:57,200 Speaker 2: part of your brain that you identify as yourself? 716 00:37:57,040 --> 00:37:59,279 Speaker 1: That's going to be very key to some interpretations of 717 00:37:59,320 --> 00:38:03,040 Speaker 1: the base theme explored in Bandersnatch, right, and they do 718 00:38:03,120 --> 00:38:07,439 Speaker 1: explore this theme amazingly well, I felt. The second time 719 00:38:07,520 --> 00:38:11,799 Speaker 1: I watched it, I found all these additional layers. You know, again, 720 00:38:11,920 --> 00:38:14,920 Speaker 1: I'm tempted to make the best choices for a protagonist, 721 00:38:15,040 --> 00:38:17,279 Speaker 1: or at least there's still that inclination that I want 722 00:38:17,320 --> 00:38:19,479 Speaker 1: to do that. And at one point there's this song 723 00:38:19,560 --> 00:38:22,279 Speaker 1: playing with the lyrics doing what's best for Nigel, and 724 00:38:22,320 --> 00:38:26,360 Speaker 1: it's by XTC, or I think so. Yes, okay, 725 00:38:26,640 --> 00:38:28,360 Speaker 1: I was not familiar with that group or this 726 00:38:28,440 --> 00:38:31,040 Speaker 1: song before, but yeah, it's playing, and the whole scene 727 00:38:31,080 --> 00:38:33,919 Speaker 1: is about like how his father is making choices for him, 728 00:38:34,320 --> 00:38:36,960 Speaker 1: or other times it's, you know, it's his therapist that 729 00:38:37,040 --> 00:38:39,359 Speaker 1: is giving him advice about how to make 730 00:38:39,440 --> 00:38:42,319 Speaker 1: choices in his life.
And so you have all these 731 00:38:42,360 --> 00:38:45,840 Speaker 1: forces that help him make his choices or make choices 732 00:38:45,880 --> 00:38:49,040 Speaker 1: for him. And then that's also what we are doing 733 00:38:49,080 --> 00:38:50,960 Speaker 1: as we interact with the product. 734 00:38:51,239 --> 00:38:52,880 Speaker 2: Well, yes, and in a weird way, it kind of 735 00:38:52,920 --> 00:38:56,080 Speaker 2: brings you back around to this question of wait a minute, 736 00:38:56,200 --> 00:38:58,879 Speaker 2: is he is he a third person narrator or are 737 00:38:58,880 --> 00:39:02,240 Speaker 2: you supposed to identify as him. So when these choices 738 00:39:02,560 --> 00:39:06,040 Speaker 2: are in some cases things coming to him apparently from 739 00:39:06,080 --> 00:39:09,200 Speaker 2: the outside, you know, they might be messages he's receiving 740 00:39:09,280 --> 00:39:14,040 Speaker 2: from some kind of otherworldly source or hallucinations, are you 741 00:39:14,160 --> 00:39:17,840 Speaker 2: still making choices for him or not? And it leads 742 00:39:17,880 --> 00:39:21,480 Speaker 2: back into this theme of whether or not you are 743 00:39:21,520 --> 00:39:24,040 Speaker 2: really in control of your own actions and what does 744 00:39:24,040 --> 00:39:26,440 Speaker 2: it mean to be in control of your own actions? 745 00:39:27,000 --> 00:39:29,600 Speaker 1: And in this we come to the subject of free will, 746 00:39:30,400 --> 00:39:33,439 Speaker 1: which is a huge topic that we return to time 747 00:39:33,480 --> 00:39:35,719 Speaker 1: and time again on stuff to blow your mind. And 748 00:39:35,760 --> 00:39:39,360 Speaker 1: we're not going to try to encapsulate everything about that here. 
749 00:39:39,960 --> 00:39:42,120 Speaker 1: You know, we've talked about it in the past, we're talking 750 00:39:42,160 --> 00:39:43,560 Speaker 1: about it today, we're going to talk about it in 751 00:39:43,560 --> 00:39:48,560 Speaker 1: the future. But suffice it to say, philosophies vary, scientific 752 00:39:48,560 --> 00:39:52,759 Speaker 1: interpretations vary, and then it drags in 753 00:39:52,840 --> 00:39:55,279 Speaker 1: just about everything about the human condition, right, I mean, 754 00:39:55,600 --> 00:39:59,160 Speaker 1: moral responsibility, theological quandaries, etc. 755 00:39:59,600 --> 00:40:03,280 Speaker 2: Yeah, it's a problem that it is such a huge 756 00:40:03,360 --> 00:40:07,520 Speaker 2: topic and that almost all discussions about free will that 757 00:40:07,560 --> 00:40:10,240 Speaker 2: I encounter in the wild are an absolute mess. 758 00:40:10,760 --> 00:40:11,879 Speaker 4: This is my personal take. 759 00:40:12,080 --> 00:40:15,520 Speaker 2: Do you ever notice how conversations about free 760 00:40:15,520 --> 00:40:20,239 Speaker 2: will almost never seem to clarify anything? They almost never 761 00:40:20,239 --> 00:40:23,400 Speaker 2: seem to provide any more focus or clarity than you 762 00:40:23,480 --> 00:40:24,319 Speaker 2: had to begin with. 763 00:40:24,960 --> 00:40:29,040 Speaker 1: Yeah, at times it feels like having 764 00:40:29,040 --> 00:40:33,680 Speaker 1: a conversation with somebody in a swimming pool about whether 765 00:40:33,760 --> 00:40:36,759 Speaker 1: water is wet. Yeah, because it does get down to, like, 766 00:40:36,760 --> 00:40:39,239 Speaker 1: it seems wet to me, I am in it.
767 00:40:39,239 --> 00:40:41,400 Speaker 1: It seems like free will to me because I am 768 00:40:41,480 --> 00:40:44,800 Speaker 1: immersed in it, and it's difficult for me to remove 769 00:40:44,880 --> 00:40:48,440 Speaker 1: myself from the experience that I'm having and all of 770 00:40:48,640 --> 00:40:50,799 Speaker 1: everything in my life that supports, everything in the 771 00:40:50,840 --> 00:40:54,200 Speaker 1: culture at large that supports, the idea that I am 772 00:40:54,320 --> 00:40:57,840 Speaker 1: making choices and informed choices about my life. 773 00:40:58,120 --> 00:40:59,840 Speaker 2: I mean, I feel like some dilemmas having to do 774 00:40:59,880 --> 00:41:03,160 Speaker 2: with free will are like they force you to choose 775 00:41:03,200 --> 00:41:04,440 Speaker 2: between two options that 776 00:41:04,440 --> 00:41:07,480 Speaker 4: are both tautologies or both absurdities. 777 00:41:07,520 --> 00:41:10,759 Speaker 2: And any time you encounter a problem like that, I 778 00:41:10,760 --> 00:41:13,160 Speaker 2: think there's a pretty good chance that the underlying disease 779 00:41:13,280 --> 00:41:15,920 Speaker 2: causing that is poorly defined terms. 780 00:41:16,040 --> 00:41:19,640 Speaker 1: Right, yeah, and to your point, the extreme versions 781 00:41:19,680 --> 00:41:22,040 Speaker 1: of this tend to come off as kind 782 00:41:22,040 --> 00:41:23,960 Speaker 1: of loony. Like if someone is just like, I am 783 00:41:23,960 --> 00:41:26,800 Speaker 1: a completely free moving soul, like, no, you're not, dummy. 784 00:41:27,480 --> 00:41:27,680 Speaker 2: You know. 785 00:41:27,760 --> 00:41:30,400 Speaker 1: It's like when we discussed in the Thankfulness episode that 786 00:41:30,520 --> 00:41:33,360 Speaker 1: we put out, you know, like everybody's life is shaped 787 00:41:33,400 --> 00:41:35,560 Speaker 1: by these other factors.
There are other individuals in their 788 00:41:35,600 --> 00:41:38,360 Speaker 1: life to some extent, and I feel like to argue 789 00:41:38,480 --> 00:41:41,040 Speaker 1: against that is just lunacy. On the other hand, if 790 00:41:41,080 --> 00:41:43,839 Speaker 1: someone is saying, I am just a pure automaton, 791 00:41:44,480 --> 00:41:46,960 Speaker 1: I mean, there you can back that up with 792 00:41:47,000 --> 00:41:50,279 Speaker 1: some very intriguing arguments, and we'll get into some of those. 793 00:41:50,600 --> 00:41:53,479 Speaker 1: But at the end of the day, does that match 794 00:41:53,560 --> 00:41:55,440 Speaker 1: up with your experience of reality? 795 00:41:55,719 --> 00:41:58,040 Speaker 2: I totally agree. But I think even talking about it 796 00:41:58,080 --> 00:42:01,239 Speaker 2: at that level, that's already like a level up, like 797 00:42:01,320 --> 00:42:05,480 Speaker 2: having accepted some terms as unproblematic more than I think 798 00:42:05,520 --> 00:42:07,600 Speaker 2: they should be. So, like, anyway, I mean, I think 799 00:42:07,640 --> 00:42:10,080 Speaker 2: the main problem with free will is people aren't being 800 00:42:10,200 --> 00:42:13,400 Speaker 2: clear what they're talking about before they start talking. And 801 00:42:13,440 --> 00:42:16,279 Speaker 2: I'm totally guilty of this as well. This is usually 802 00:42:16,360 --> 00:42:18,160 Speaker 2: the case when it comes to free will, and this 803 00:42:18,239 --> 00:42:20,920 Speaker 2: happens even when we're not aware that we're being unclear. 804 00:42:21,000 --> 00:42:23,640 Speaker 2: So we can't do it full justice in the short segment. 805 00:42:23,880 --> 00:42:25,680 Speaker 2: I think we will try to do better than an 806 00:42:25,719 --> 00:42:30,440 Speaker 2: absolute mess. So, to try to understand what our terms 807 00:42:30,480 --> 00:42:31,040 Speaker 2: actually mean: 808 00:42:31,360 --> 00:42:32,200 Speaker 4: What is free
809 00:42:32,280 --> 00:42:36,279 Speaker 2: will? A common understanding is: I am in control of 810 00:42:36,320 --> 00:42:39,479 Speaker 2: my own actions. And I think most of the time, 811 00:42:39,600 --> 00:42:43,680 Speaker 2: for most people, this feels true, though curiously, of course, 812 00:42:43,719 --> 00:42:45,640 Speaker 2: not all of the time and not for all people. 813 00:42:45,719 --> 00:42:48,440 Speaker 2: We can come back to that. But I would argue 814 00:42:48,440 --> 00:42:52,000 Speaker 2: that it only feels true in a general way, and 815 00:42:52,080 --> 00:42:55,200 Speaker 2: it gets stickier and thornier the more you try to 816 00:42:55,200 --> 00:42:58,000 Speaker 2: think about it, and the more precisely you try to 817 00:42:58,040 --> 00:43:00,960 Speaker 2: define those terms. So, if I'm in control of my 818 00:43:01,080 --> 00:43:02,479 Speaker 2: own actions, who is I? 819 00:43:03,640 --> 00:43:05,480 Speaker 4: We brought this up a minute ago. Is I my 820 00:43:05,680 --> 00:43:06,480 Speaker 4: whole brain? 821 00:43:07,600 --> 00:43:09,440 Speaker 2: I mean, also, I think there's a good case to 822 00:43:09,480 --> 00:43:11,640 Speaker 2: be made that other parts of your body get some 823 00:43:11,760 --> 00:43:14,319 Speaker 2: kind of vote in your decision making. So is it 824 00:43:14,360 --> 00:43:17,600 Speaker 2: my whole body? Is it everything with my genome? Even then, 825 00:43:17,600 --> 00:43:20,080 Speaker 2: I would say your microbiota sort of gets a vote. 826 00:43:20,880 --> 00:43:23,200 Speaker 2: I think there are questions about what the I is. 827 00:43:23,360 --> 00:43:26,719 Speaker 2: But then also, what counts as control? If I am 828 00:43:26,760 --> 00:43:29,560 Speaker 2: in control of my own actions, does it mean that 829 00:43:29,560 --> 00:43:33,560 Speaker 2: I make my decisions with no outside influences?
I mean, 830 00:43:33,560 --> 00:43:35,880 Speaker 2: that's obviously not true, as you were alluding to a 831 00:43:35,920 --> 00:43:39,520 Speaker 2: minute ago. But once you accept that outside factors have 832 00:43:39,640 --> 00:43:43,040 Speaker 2: some influence over whatever it is you're talking about controlling, 833 00:43:43,560 --> 00:43:46,719 Speaker 2: what's to stop you from assuming that they have total influence? 834 00:43:46,760 --> 00:43:49,640 Speaker 2: I mean, what part of your decision making is not 835 00:43:49,920 --> 00:43:53,879 Speaker 2: influenced by pre existing factors like your memory and your 836 00:43:53,880 --> 00:43:57,160 Speaker 2: physical circumstances and so forth, Like what part of you 837 00:43:57,560 --> 00:44:01,480 Speaker 2: can you identify that stands outside of the world. And 838 00:44:01,520 --> 00:44:03,600 Speaker 2: then from the other end, paradoxically, if you were to 839 00:44:03,640 --> 00:44:06,919 Speaker 2: suddenly act in a way that made no sense given 840 00:44:06,960 --> 00:44:10,120 Speaker 2: your own history and memory and all of the inputs 841 00:44:10,160 --> 00:44:12,520 Speaker 2: coming in that you think of as influences on you, 842 00:44:12,960 --> 00:44:16,840 Speaker 2: wouldn't that action actually feel less like something that comes 843 00:44:16,880 --> 00:44:20,880 Speaker 2: genuinely from you, whatever you are. Wouldn't by this metric, 844 00:44:20,960 --> 00:44:24,560 Speaker 2: the most objectively free action seem like something coming from 845 00:44:24,600 --> 00:44:25,440 Speaker 2: the outside. 
846 00:44:26,560 --> 00:44:29,879 Speaker 1: You mean, like, if you go to a restaurant where there's, 847 00:44:29,920 --> 00:44:32,600 Speaker 1: say, a drink menu, and you always tend to 848 00:44:32,719 --> 00:44:35,040 Speaker 1: order something that is made with a base of, say, 849 00:44:35,120 --> 00:44:38,919 Speaker 1: rum or bourbon or whiskey, and instead you throw caution 850 00:44:39,000 --> 00:44:40,560 Speaker 1: to the wind one day and you get a mezcal 851 00:44:40,680 --> 00:44:41,560 Speaker 1: or a vodka drink. 852 00:44:41,760 --> 00:44:46,320 Speaker 2: Uh, does that actually make you feel more free, or 853 00:44:46,520 --> 00:44:49,040 Speaker 2: does it seem like, you know, something 854 00:44:48,680 --> 00:44:50,959 Speaker 4: got into you? Where does that phrase come from? 855 00:44:51,320 --> 00:44:53,160 Speaker 1: I don't know. When I do things like that, I 856 00:44:53,760 --> 00:44:55,839 Speaker 1: think it does make me feel more free, because I'm like, no, 857 00:44:56,440 --> 00:44:59,040 Speaker 1: I'm not gonna be the same person I've been every 858 00:44:59,040 --> 00:45:01,920 Speaker 1: time, I'm gonna head a different direction. You know, 859 00:45:02,000 --> 00:45:04,200 Speaker 1: I'm going to get a different type 860 00:45:04,200 --> 00:45:06,080 Speaker 1: of drink, I'm going to try a different type of food, 861 00:45:06,120 --> 00:45:08,600 Speaker 1: I'm going to walk a different way to the train station, 862 00:45:08,719 --> 00:45:09,160 Speaker 1: et cetera.
863 00:45:09,480 --> 00:45:11,600 Speaker 2: Well, I mean, I would say that this just highlights 864 00:45:11,680 --> 00:45:16,520 Speaker 2: that neither branch, either acting in character, where your character 865 00:45:16,600 --> 00:45:18,480 Speaker 2: has been shaped by everything that ever happened to you, 866 00:45:18,560 --> 00:45:21,040 Speaker 2: nor by acting out of character, where, you know, 867 00:45:21,120 --> 00:45:25,600 Speaker 2: something got into you, neither way really cites the origin 868 00:45:25,920 --> 00:45:29,799 Speaker 2: of decisions or the origin of actions in something that's 869 00:45:29,840 --> 00:45:33,480 Speaker 2: without outside influence. So a lot of the arguments 870 00:45:33,480 --> 00:45:36,040 Speaker 2: about whether we have free will actually seem to me 871 00:45:36,160 --> 00:45:39,160 Speaker 2: to reduce to the question of whether we feel we 872 00:45:39,239 --> 00:45:42,080 Speaker 2: have free will. But what would it actually mean to 873 00:45:42,120 --> 00:45:45,880 Speaker 2: settle the question of whether we are, like, physically, objectively free? 874 00:45:46,640 --> 00:45:48,640 Speaker 2: So maybe we should look at, like, a more thought 875 00:45:48,680 --> 00:45:52,120 Speaker 2: out dictionary definition. So one that I found is, quote, 876 00:45:52,239 --> 00:45:57,160 Speaker 2: the power of acting without the constraint of necessity or fate. 877 00:45:57,360 --> 00:45:59,840 Speaker 1: Oh, or fate. Now that brings me back to 878 00:45:59,840 --> 00:46:03,480 Speaker 1: this demon Pax, the thief of destiny. I found myself, 879 00:46:03,560 --> 00:46:06,600 Speaker 1: with the second viewing of Bandersnatch, returning 880 00:46:06,640 --> 00:46:09,120 Speaker 1: to that title and trying to figure out exactly what 881 00:46:09,160 --> 00:46:14,600 Speaker 1: it means.
Because destiny, on one hand, means, like, you're predestined, right? 882 00:46:14,640 --> 00:46:17,400 Speaker 1: There is a destiny in place for you, and you 883 00:46:17,440 --> 00:46:19,879 Speaker 1: perhaps don't have any real control. It is the thing 884 00:46:19,960 --> 00:46:22,560 Speaker 1: that the gods have laid out for you. That's like 885 00:46:22,600 --> 00:46:24,160 Speaker 1: one way of looking at it. But another way of looking 886 00:46:24,200 --> 00:46:28,400 Speaker 1: at destiny is that destiny is the thing you aspire to. Like, 887 00:46:28,480 --> 00:46:32,680 Speaker 1: you choose your own destiny, you choose your own adventure. Right? So, 888 00:46:32,800 --> 00:46:36,799 Speaker 1: which of the two is the demon Pax stealing from you? 889 00:46:36,920 --> 00:46:40,080 Speaker 1: Is he stealing from you the power to make your 890 00:46:40,080 --> 00:46:44,040 Speaker 1: own decisions? Or is he stealing from you a pre 891 00:46:44,320 --> 00:46:47,960 Speaker 1: destined path? Is he liberating you from the 892 00:46:48,000 --> 00:46:50,880 Speaker 1: same tired walk to the train station and the same 893 00:46:50,960 --> 00:46:52,600 Speaker 1: tired choices on the menu? 894 00:46:52,719 --> 00:46:54,640 Speaker 2: Well, the funny thing about a choose your own adventure 895 00:46:54,800 --> 00:46:57,880 Speaker 2: is that even though you are making the choices on 896 00:46:57,920 --> 00:47:00,360 Speaker 2: each page about which page to turn to next, somebody 897 00:47:00,360 --> 00:47:01,359 Speaker 2: else wrote the whole thing. 898 00:47:01,440 --> 00:47:05,239 Speaker 1: That's true, and I mean, to a certain extent, 899 00:47:05,280 --> 00:47:07,680 Speaker 1: you can apply that to life. Like, as 900 00:47:07,800 --> 00:47:11,000 Speaker 1: rebellious as you might seem, ordering something else on the 901 00:47:11,040 --> 00:47:13,200 Speaker 1: menu that you normally don't get, it's still on the menu.
902 00:47:13,560 --> 00:47:15,520 Speaker 1: And other things in life are like that too. Like, 903 00:47:15,560 --> 00:47:19,759 Speaker 1: by and large, you are constrained by the possibilities of 904 00:47:20,760 --> 00:47:23,600 Speaker 1: your culture, of your station in life, of, you know, 905 00:47:23,640 --> 00:47:25,320 Speaker 1: political realities, et cetera. 906 00:47:25,680 --> 00:47:30,480 Speaker 2: But even then, is the unpredictability of a behavior at 907 00:47:30,480 --> 00:47:35,680 Speaker 2: all evidence of your control or your personal volition 908 00:47:35,560 --> 00:47:37,759 Speaker 4: of that behavior? I don't know. 909 00:47:37,800 --> 00:47:40,160 Speaker 2: I mean, those things seem perhaps unrelated to me. 910 00:47:40,320 --> 00:47:43,520 Speaker 1: Actually, yeah, I mean, you can also be predictably unpredictable. 911 00:47:43,640 --> 00:47:47,120 Speaker 2: Yeah. But anyway, coming back to this definition, the one 912 00:47:47,160 --> 00:47:50,920 Speaker 2: that's, you know, acting without constraint of necessity or fate. 913 00:47:51,160 --> 00:47:52,200 Speaker 4: So I think it can be 914 00:47:52,120 --> 00:47:54,280 Speaker 2: hard to pin this down to a concrete claim, but 915 00:47:54,360 --> 00:47:57,040 Speaker 2: I think what it comes closest to is saying that 916 00:47:57,160 --> 00:48:00,400 Speaker 2: for any given action or moment in my life history, 917 00:48:00,440 --> 00:48:04,040 Speaker 2: anything I do or think or say, given the exact 918 00:48:04,160 --> 00:48:08,239 Speaker 2: same inputs, I could have produced different output than I 919 00:48:08,280 --> 00:48:12,200 Speaker 2: actually did. And this would be, I think, some way 920 00:48:12,280 --> 00:48:17,279 Speaker 2: of making free will a kind of physical proposition.
Right, 921 00:48:17,600 --> 00:48:20,719 Speaker 2: if exactly the same inputs went into you, everything was 922 00:48:20,760 --> 00:48:23,399 Speaker 2: exactly the same, you could have done something different than 923 00:48:23,440 --> 00:48:26,319 Speaker 2: what you did. Unfortunately, I think this is just a 924 00:48:26,400 --> 00:48:30,840 Speaker 2: completely untestable assumption. You know, given the complexity of brains, 925 00:48:31,000 --> 00:48:34,279 Speaker 2: you can never have all of exactly the same inputs 926 00:48:34,600 --> 00:48:36,720 Speaker 2: that somebody had at a given moment, so you can't 927 00:48:36,760 --> 00:48:38,759 Speaker 2: experiment on this to find out what's true. 928 00:48:39,160 --> 00:48:42,600 Speaker 1: Though we certainly love ruminating on this in our fiction. Yeah, 929 00:48:42,640 --> 00:48:45,560 Speaker 1: like any kind of time travel film, any kind of 930 00:48:45,600 --> 00:48:49,560 Speaker 1: Groundhog Day scenario is exploring this subject. 931 00:48:49,719 --> 00:48:50,279 Speaker 4: Yeah. 932 00:48:50,320 --> 00:48:52,480 Speaker 2: Though, even with most of those time travel things where 933 00:48:52,480 --> 00:48:54,640 Speaker 2: people want to go back and relive it, what they 934 00:48:54,640 --> 00:48:57,000 Speaker 2: actually are imagining is they want to go back and 935 00:48:57,040 --> 00:48:59,640 Speaker 2: relive a moment with the wisdom and knowledge that they 936 00:48:59,680 --> 00:49:02,759 Speaker 2: have now that they didn't have then. So it would 937 00:49:02,760 --> 00:49:05,959 Speaker 2: be funny to just, like, replay the same instance over 938 00:49:06,040 --> 00:49:09,040 Speaker 2: and over again with exactly the same physics involved, and 939 00:49:09,080 --> 00:49:11,920 Speaker 2: see if anything different happens, without having 940 00:49:11,680 --> 00:49:12,960 Speaker 4: any new knowledge or whatever.
941 00:49:13,120 --> 00:49:17,120 Speaker 2: Oh yeah, but even then, I mean, imagine maybe you 942 00:49:17,160 --> 00:49:19,280 Speaker 2: could do that somehow. You know, you could just watch 943 00:49:19,320 --> 00:49:21,799 Speaker 2: the same period of time play out over and over 944 00:49:21,880 --> 00:49:25,160 Speaker 2: again and see if anything different happens. Even if it 945 00:49:25,200 --> 00:49:27,879 Speaker 2: were true that you could have produced different outputs given 946 00:49:27,920 --> 00:49:31,680 Speaker 2: the exact same inputs, would this really mean you were free? 947 00:49:31,719 --> 00:49:34,520 Speaker 2: Would this be what people mean when they say free will, 948 00:49:34,560 --> 00:49:37,440 Speaker 2: like they're in control of their own actions? You know, 949 00:49:37,520 --> 00:49:40,880 Speaker 2: imagine there's some random dice rolling machine inside your head, 950 00:49:41,000 --> 00:49:43,520 Speaker 2: or a ghost or a spirit in your brain which 951 00:49:43,600 --> 00:49:46,960 Speaker 2: pushes you in different directions even if every single iota 952 00:49:46,960 --> 00:49:50,120 Speaker 2: of input is the same. Is that actually freedom? That 953 00:49:50,200 --> 00:49:53,520 Speaker 2: just sounds like a different kind of impetus or control. 954 00:49:54,000 --> 00:49:57,520 Speaker 1: That's interesting. You bring up randomization via some sort 955 00:49:57,560 --> 00:50:01,400 Speaker 1: of technology, like dice or a casting of bones, because 956 00:50:01,400 --> 00:50:03,399 Speaker 1: we've discussed that in the past on the show.
How 957 00:50:03,440 --> 00:50:06,279 Speaker 1: that is sometimes brought up as being, like, 958 00:50:06,320 --> 00:50:11,960 Speaker 1: the purpose of these early divination techniques: a 959 00:50:12,040 --> 00:50:16,360 Speaker 1: way to randomize choice and to sometimes force us towards 960 00:50:16,400 --> 00:50:19,320 Speaker 1: a decision that we otherwise wouldn't make, like a 961 00:50:19,360 --> 00:50:23,319 Speaker 1: way to free us from these predestined paths that are 962 00:50:23,360 --> 00:50:26,640 Speaker 1: before us, or at least, you know, lean us over 963 00:50:26,719 --> 00:50:29,360 Speaker 1: towards a different path that is available 964 00:50:29,400 --> 00:50:31,439 Speaker 1: but we normally wouldn't go for. 965 00:50:31,320 --> 00:50:31,760 Speaker 4: Well, it's funny. 966 00:50:31,800 --> 00:50:33,879 Speaker 2: I mean, either way you go there. So, yeah, say 967 00:50:33,880 --> 00:50:37,840 Speaker 2: you're doing the I Ching or throwing bones. Doing that or 968 00:50:37,960 --> 00:50:41,080 Speaker 2: not doing that, either way, is one making you more 969 00:50:41,200 --> 00:50:44,680 Speaker 2: the author of your own destiny than the other? I'm not sure. 970 00:50:44,719 --> 00:50:48,480 Speaker 2: I mean, they have differences in outcomes, right, but does 971 00:50:48,520 --> 00:50:51,480 Speaker 2: that actually change what people mean when 972 00:50:51,480 --> 00:50:52,400 Speaker 2: they say free will? 973 00:50:52,600 --> 00:50:54,720 Speaker 1: I don't know. I mean, even if you randomize your choices, 974 00:50:54,840 --> 00:50:57,160 Speaker 1: you are the one that will then enact that choice. 975 00:50:57,560 --> 00:51:00,280 Speaker 1: Like, you're still the actor in your narrative. 976 00:51:00,040 --> 00:51:02,040 Speaker 2: And the randomization is still an input on you. 977 00:51:03,120 --> 00:51:03,640 Speaker 4: So I don't know.
978 00:51:03,760 --> 00:51:07,359 Speaker 2: So anyway, to sort of sum it all up, I've 979 00:51:07,400 --> 00:51:10,560 Speaker 2: got a theory here, and it is that I think 980 00:51:10,600 --> 00:51:13,200 Speaker 2: what a lot of us are actually circling around when 981 00:51:13,239 --> 00:51:16,240 Speaker 2: we're trying to figure out how to articulate our concept 982 00:51:16,239 --> 00:51:19,000 Speaker 2: of free will is this claim. And the claim is: 983 00:51:19,760 --> 00:51:24,440 Speaker 2: our consciousness dictates our choices of how we act. Or, 984 00:51:24,440 --> 00:51:28,359 Speaker 2: in other words, we're conscious of the process by which 985 00:51:28,400 --> 00:51:31,560 Speaker 2: our choices are made or by which our actions are generated. 986 00:51:31,719 --> 00:51:37,080 Speaker 2: Right? That when we act, we are able to consciously 987 00:51:37,239 --> 00:51:41,080 Speaker 2: be a part of the impetus to act, or consciously 988 00:51:41,400 --> 00:51:44,600 Speaker 2: cause the impetus to act. And I think this one 989 00:51:44,840 --> 00:51:47,719 Speaker 2: is actually testable, and we can come back to that 990 00:51:47,800 --> 00:51:48,320 Speaker 2: in a minute. 991 00:51:48,920 --> 00:51:50,880 Speaker 1: So this is, of course, one of the 992 00:51:50,880 --> 00:51:53,680 Speaker 1: big riddles of the human experience, and so people have 993 00:51:53,719 --> 00:51:57,160 Speaker 1: been thinking about this and, you know, essentially banging their 994 00:51:57,160 --> 00:52:01,400 Speaker 1: heads against the wall about this for thousands of years. 995 00:52:01,960 --> 00:52:06,960 Speaker 1: The philosophers Democritus and Leucippus saw the universe as wholly 996 00:52:07,000 --> 00:52:11,160 Speaker 1: governed by natural laws and composed of, you know, essentially 997 00:52:11,160 --> 00:52:15,520 Speaker 1: indivisible atoms.
They took the determinist view of life, 998 00:52:15,640 --> 00:52:20,120 Speaker 1: one propelled down a flowing stream of events. Aristotle, on 999 00:52:20,160 --> 00:52:21,799 Speaker 1: the other hand, is a great example of someone who 1000 00:52:21,840 --> 00:52:26,000 Speaker 1: stressed the individual's responsibility for their actions. The indeterminist view 1001 00:52:26,000 --> 00:52:29,760 Speaker 1: of life is a boat propelling itself through a body 1002 00:52:29,800 --> 00:52:33,040 Speaker 1: of water. So, yeah, to what extent are you just 1003 00:52:33,120 --> 00:52:36,080 Speaker 1: sailing down the river, you know, with no 1004 00:52:36,160 --> 00:52:38,160 Speaker 1: power over where you're going, or are you in a 1005 00:52:38,160 --> 00:52:40,560 Speaker 1: boat in which you have the power to move 1006 00:52:40,600 --> 00:52:43,360 Speaker 1: about and even move upstream if you need to? On 1007 00:52:43,440 --> 00:52:45,160 Speaker 1: that note, we're going to take one quick break, but 1008 00:52:45,160 --> 00:52:47,560 Speaker 1: when we come back, we will start rolling through some 1009 00:52:47,800 --> 00:52:50,280 Speaker 1: arguments for and against free will, and then we will 1010 00:52:50,520 --> 00:52:58,040 Speaker 1: return to Bandersnatch. All right, we're back. So let's start 1011 00:52:58,080 --> 00:53:00,960 Speaker 1: with some arguments against free will, because ultimately I think 1012 00:53:00,960 --> 00:53:02,840 Speaker 1: these are often easier to discuss. 1013 00:53:03,080 --> 00:53:05,879 Speaker 2: Sure. I would say the most basic one, right, is 1014 00:53:06,040 --> 00:53:08,279 Speaker 2: just the science of physics, right? 1015 00:53:08,600 --> 00:53:10,120 Speaker 4: Physics is very predictable.
1016 00:53:10,840 --> 00:53:14,440 Speaker 2: You can, you know, given the inputs of forces 1017 00:53:14,480 --> 00:53:17,279 Speaker 2: and energy and all that, you can determine what's going 1018 00:53:17,280 --> 00:53:20,480 Speaker 2: to happen as an output of that action. And if 1019 00:53:20,520 --> 00:53:23,319 Speaker 2: we assume that applies to everything, then why doesn't it 1020 00:53:23,320 --> 00:53:24,040 Speaker 2: apply to us? 1021 00:53:24,480 --> 00:53:24,640 Speaker 4: Right? 1022 00:53:24,760 --> 00:53:28,359 Speaker 1: And it basically comes back to Democritus and Leucippus, right, 1023 00:53:28,440 --> 00:53:31,359 Speaker 1: the idea that there are natural laws that are 1024 00:53:31,400 --> 00:53:33,600 Speaker 1: in place, and we are not above those laws. 1025 00:53:33,800 --> 00:53:34,120 Speaker 4: Sure. 1026 00:53:34,560 --> 00:53:37,440 Speaker 2: Yeah. So we're acting on the inputs that come in, 1027 00:53:37,560 --> 00:53:41,399 Speaker 2: and, you know, being pushed in one way 1028 00:53:41,480 --> 00:53:44,200 Speaker 2: or another by our life history and our brains and 1029 00:53:44,239 --> 00:53:46,960 Speaker 2: all that, we're going to act a certain way as 1030 00:53:47,000 --> 00:53:51,080 Speaker 2: physically reactive objects. Now this is an argument, of course; 1031 00:53:51,080 --> 00:53:53,680 Speaker 2: it's the most common argument, I think, against free will. 1032 00:53:54,080 --> 00:53:58,520 Speaker 2: But one question is, are free will and causal determinism 1033 00:53:58,600 --> 00:54:02,080 Speaker 2: really incompatible? Not that it settles the issue, but I 1034 00:54:02,080 --> 00:54:04,800 Speaker 2: think the majority of philosophers who look at this issue 1035 00:54:04,920 --> 00:54:08,440 Speaker 2: pretty closely actually end up becoming what are called compatibilists.
1036 00:54:08,560 --> 00:54:11,920 Speaker 2: They accept causal determinism, they say, yeah, you know, we're 1037 00:54:11,920 --> 00:54:15,600 Speaker 2: physical objects being pushed around by physical forces, but they 1038 00:54:15,640 --> 00:54:18,560 Speaker 2: define free will in some way that it is compatible 1039 00:54:18,560 --> 00:54:21,680 Speaker 2: with that. That you are a physical object being pushed 1040 00:54:21,719 --> 00:54:24,080 Speaker 2: around by physical forces, and the whole history of your 1041 00:54:24,120 --> 00:54:27,360 Speaker 2: life and everything, and yet somehow free will still applies 1042 00:54:27,400 --> 00:54:30,480 Speaker 2: to you. This often comes down to like an understanding 1043 00:54:30,600 --> 00:54:33,480 Speaker 2: or feeling of free will, like I was talking about earlier, like, 1044 00:54:33,520 --> 00:54:37,800 Speaker 2: even if your actions are causally determined, somehow you feel 1045 00:54:37,800 --> 00:54:41,120 Speaker 2: like you have agency, and that's what we mean by free. 1046 00:54:40,880 --> 00:54:45,320 Speaker 1: Will, right, right. Another take on this that I came across, 1047 00:54:45,560 --> 00:54:47,440 Speaker 1: and this goes back to something we were talking about earlier, 1048 00:54:47,640 --> 00:54:52,759 Speaker 1: contemporary British analytic philosopher Galen Strawson, and their argument is 1049 00:54:52,760 --> 00:54:56,280 Speaker 1: basically that free will is impossible because we act 1050 00:54:56,320 --> 00:54:59,120 Speaker 1: the way we are. Right? This argument 1051 00:54:59,120 --> 00:55:02,920 Speaker 1: always makes me think of Yeats in the poem No 1052 00:55:03,080 --> 00:55:06,120 Speaker 1: Second Troy. There's that line: what could she have done, 1053 00:55:06,280 --> 00:55:10,240 Speaker 1: being what she is?
And I think about that with myself, 1054 00:55:10,280 --> 00:55:12,399 Speaker 1: like, when I look back on past choices, what 1055 00:55:12,440 --> 00:55:15,319 Speaker 1: else could I have done, being who I am? You know, 1056 00:55:15,560 --> 00:55:18,320 Speaker 1: without, you know, some sort of sci fi foresight 1057 00:55:18,440 --> 00:55:22,719 Speaker 1: brought on by time travel or Groundhog Day shenanigans. Like, 1058 00:55:23,080 --> 00:55:25,680 Speaker 1: I am who I am, I am influenced by all 1059 00:55:25,680 --> 00:55:29,399 Speaker 1: these things in my life, and my mind is this, 1060 00:55:29,880 --> 00:55:32,320 Speaker 1: and then what other choice would that mind have made? 1061 00:55:32,600 --> 00:55:32,799 Speaker 4: Right? 1062 00:55:32,920 --> 00:55:35,600 Speaker 2: I mean, that's a very good way of 1063 00:55:35,640 --> 00:55:39,520 Speaker 2: putting it. It maybe emphasizes the fact 1064 00:55:39,640 --> 00:55:43,000 Speaker 2: that free will is a difficult concept because of some 1065 00:55:43,120 --> 00:55:46,200 Speaker 2: of the baggage brought by the word free. Yeah, to 1066 00:55:46,320 --> 00:55:49,759 Speaker 2: act in accordance with your nature and your circumstances is 1067 00:55:49,800 --> 00:55:51,000 Speaker 2: not necessarily not. 1068 00:55:51,320 --> 00:55:55,239 Speaker 1: Free, right. Like, it was in my nature to responsibly 1069 00:55:55,239 --> 00:55:57,640 Speaker 1: come to work this morning, and so therefore I did. 1070 00:55:58,640 --> 00:56:01,439 Speaker 1: Could I have decided not to come into work? Could 1071 00:56:01,440 --> 00:56:03,640 Speaker 1: I have gone to the local arcade or something, 1072 00:56:03,719 --> 00:56:07,440 Speaker 1: or whatever one does when one skips work? I 1073 00:56:07,480 --> 00:56:08,239 Speaker 1: guess I could have. 1074 00:56:08,680 --> 00:56:10,239 Speaker 4: In theory, there's nothing stopping you.
1075 00:56:10,320 --> 00:56:13,360 Speaker 1: Yeah, nothing at all, except that is not my nature, 1076 00:56:13,520 --> 00:56:15,719 Speaker 1: and that is not what I did, because of my. 1077 00:56:15,760 --> 00:56:18,640 Speaker 2: Nature. Given the circumstances of who you are and who 1078 00:56:18,640 --> 00:56:20,960 Speaker 2: you were this morning and what was going on this morning, 1079 00:56:21,040 --> 00:56:23,440 Speaker 2: you didn't do it, and all we know is 1080 00:56:23,480 --> 00:56:25,839 Speaker 2: that you acted the way you were at 1081 00:56:25,840 --> 00:56:28,400 Speaker 2: that time because that's who you were at that time. 1082 00:56:28,760 --> 00:56:32,120 Speaker 1: Yeah. Now, that being said, yes, events could have been different. 1083 00:56:32,120 --> 00:56:34,520 Speaker 1: We could have had an email from our boss saying 1084 00:56:34,600 --> 00:56:37,000 Speaker 1: that there was going to be like a rock concert 1085 00:56:37,080 --> 00:56:41,240 Speaker 1: in the office today. That would never happen, 1086 00:56:42,320 --> 00:56:44,319 Speaker 1: and it might make me think, well, maybe I don't 1087 00:56:44,320 --> 00:56:46,680 Speaker 1: want to come into work today, and maybe the easiest 1088 00:56:46,680 --> 00:56:50,560 Speaker 1: thing to do would be just to skip. I don't know. Again, 1089 00:56:51,200 --> 00:56:54,560 Speaker 1: you can tease your brain all day thinking about what 1090 00:56:54,600 --> 00:56:56,920 Speaker 1: if, and how this little detail or 1091 00:56:56,960 --> 00:57:00,239 Speaker 1: that other detail would have affected your choices, but ultimately we 1092 00:57:00,280 --> 00:57:02,440 Speaker 1: only have the version of the path behind us to 1093 00:57:02,480 --> 00:57:05,359 Speaker 1: look back on when we think about all of this. Now, 1094 00:57:05,400 --> 00:57:08,719 Speaker 1: two other basic arguments against free will.
This is one 1095 00:57:08,760 --> 00:57:12,400 Speaker 1: I think we'll come back to. Experimentation has pointed to 1096 00:57:12,520 --> 00:57:15,440 Speaker 1: breakdowns between what feels like the moment of choice and 1097 00:57:15,480 --> 00:57:17,560 Speaker 1: what physically signals a choice being made. 1098 00:57:17,640 --> 00:57:20,600 Speaker 2: Yeah, I think this very much complicates the idea that, again, 1099 00:57:20,680 --> 00:57:23,160 Speaker 2: what I think people are actually really getting at with 1100 00:57:23,200 --> 00:57:26,040 Speaker 2: their idea of free will is that they have conscious 1101 00:57:26,200 --> 00:57:28,680 Speaker 2: control over their actions and thoughts. 1102 00:57:29,480 --> 00:57:31,840 Speaker 1: And another one, and again we've been touching 1103 00:57:31,920 --> 00:57:35,760 Speaker 1: on this the whole episode, but myriad causal influences at 1104 00:57:35,880 --> 00:57:39,040 Speaker 1: least guide our decisions, if not make them for us. 1105 00:57:39,400 --> 00:57:41,120 Speaker 4: Yeah, hard to deny. 1106 00:57:41,280 --> 00:57:43,960 Speaker 1: All right. So here are some arguments for free will. 1107 00:57:44,040 --> 00:57:47,240 Speaker 1: The big one, of course, is that subjectively, we tend 1108 00:57:47,240 --> 00:57:50,520 Speaker 1: to feel like we have rational, reflective control over our 1109 00:57:50,600 --> 00:57:51,640 Speaker 1: choices and actions. 1110 00:57:51,840 --> 00:57:54,400 Speaker 2: Sure. I mean, I can decide to do anything that 1111 00:57:54,440 --> 00:57:56,400 Speaker 2: occurs to me to do right now. You know, I 1112 00:57:56,480 --> 00:57:59,600 Speaker 2: could throw my computer across the room if I really 1113 00:57:59,600 --> 00:58:00,080 Speaker 2: wanted to. 1114 00:58:00,440 --> 00:58:03,640 Speaker 1: Yeah.
And the idea, the way that our 1115 00:58:03,640 --> 00:58:07,160 Speaker 1: brains enable us to simulate these possibilities really, I think, 1116 00:58:07,280 --> 00:58:11,000 Speaker 1: allows us to lean into that interpretation, because it's like 1117 00:58:11,080 --> 00:58:13,880 Speaker 1: the Choose Your Own Adventure book. The other choices are 1118 00:58:13,920 --> 00:58:16,280 Speaker 1: in there, and if we want to, we can cheat, 1119 00:58:16,320 --> 00:58:18,360 Speaker 1: and we can check one out and then back up. 1120 00:58:18,720 --> 00:58:20,080 Speaker 1: And in a way, you know, we can't do that 1121 00:58:20,120 --> 00:58:24,640 Speaker 1: in real life except through our ability to simulate possible futures. 1122 00:58:25,240 --> 00:58:28,360 Speaker 1: And of course that has an important evolutionary role; 1123 00:58:28,400 --> 00:58:31,360 Speaker 1: it has an important role for our survival. We can think 1124 00:58:31,400 --> 00:58:34,280 Speaker 1: about the different ways we might try to, say, steal 1125 00:58:34,360 --> 00:58:36,840 Speaker 1: a piece of meat from a hungry lion and escape 1126 00:58:37,040 --> 00:58:39,960 Speaker 1: with food and our lives, and then choose the best 1127 00:58:39,960 --> 00:58:43,960 Speaker 1: course of action. This is important, but it 1128 00:58:44,000 --> 00:58:48,440 Speaker 1: can also lean into these interpretations that, you know, 1129 00:58:48,520 --> 00:58:52,360 Speaker 1: certainly I have more choice than I actually have, 1130 00:58:52,880 --> 00:58:55,360 Speaker 1: or even ultimately an idea that of course is explored 1131 00:58:55,360 --> 00:58:58,680 Speaker 1: in Bandersnatch, the idea that these other alternate choices are 1132 00:58:58,800 --> 00:59:01,440 Speaker 1: kind of alternate timelines,
that they're out there, like 1133 00:59:01,480 --> 00:59:03,720 Speaker 1: I saw it in my head to a certain extent, 1134 00:59:03,920 --> 00:59:06,760 Speaker 1: that reality where I tried to take more meat than 1135 00:59:06,880 --> 00:59:09,120 Speaker 1: was feasible and was killed by the lion, and in a 1136 00:59:09,160 --> 00:59:11,160 Speaker 1: way it exists because I just saw it. 1137 00:59:11,480 --> 00:59:15,240 Speaker 2: Unfortunately, I would say about this argument, it does often 1138 00:59:15,320 --> 00:59:17,640 Speaker 2: feel that way, that, you know, like I could have 1139 00:59:17,720 --> 00:59:21,560 Speaker 2: done anything a minute ago. But you didn't. You did 1140 00:59:21,560 --> 00:59:23,920 Speaker 2: what you did. So again, this comes back to the 1141 00:59:24,440 --> 00:59:28,280 Speaker 2: untestability of this one. Like, there's just never any way 1142 00:59:28,320 --> 00:59:31,360 Speaker 2: to prove that you could have done differently than you 1143 00:59:31,440 --> 00:59:32,480 Speaker 2: did in the moment. 1144 00:59:33,000 --> 00:59:34,880 Speaker 1: Like, if you've ever had a close call 1145 00:59:35,000 --> 00:59:38,840 Speaker 1: where, say, you're almost in a wreck or you almost 1146 00:59:38,920 --> 00:59:42,240 Speaker 1: do something that could have conceivably gotten you killed, and 1147 00:59:42,280 --> 00:59:44,560 Speaker 1: then you have that moment of reflection. Granted, on one level, 1148 00:59:44,600 --> 00:59:48,280 Speaker 1: it may just get you bodily excited, right, 1149 00:59:48,320 --> 00:59:50,400 Speaker 1: because this has happened and your body's on high alert.
1150 00:59:51,080 --> 00:59:53,000 Speaker 1: But on the other hand, part of it is sort 1151 00:59:53,040 --> 00:59:57,440 Speaker 1: of realizing your close proximity to this other possibility, like 1152 00:59:57,520 --> 01:00:00,720 Speaker 1: I was just some minor choice, some minor 1153 01:00:00,800 --> 01:00:04,720 Speaker 1: bit of input data away from something more catastrophic. 1154 01:00:04,920 --> 01:00:08,200 Speaker 2: Yes, it makes you suddenly come face to face 1155 01:00:08,240 --> 01:00:10,880 Speaker 2: with how dependent you are on moment to moment circumstances 1156 01:00:10,920 --> 01:00:13,400 Speaker 2: and awareness. Though I would say a lot of times 1157 01:00:13,440 --> 01:00:15,760 Speaker 2: when I get that, like, oh, you know, 1158 01:00:15,920 --> 01:00:18,120 Speaker 2: catch-your-breath feeling about what could have happened, it wasn't 1159 01:00:18,160 --> 01:00:21,960 Speaker 2: because I narrowly avoided something really bad happening. It's because 1160 01:00:22,000 --> 01:00:25,800 Speaker 2: I suddenly, out of nowhere, imagined something really bad happening. 1161 01:00:26,240 --> 01:00:28,960 Speaker 2: Like, you're going down a flight of stairs and it's 1162 01:00:28,960 --> 01:00:32,120 Speaker 2: going fine, but you just imagine, ooh, I could fall 1163 01:00:32,120 --> 01:00:34,040 Speaker 2: and hit my teeth on that thing, and. 1164 01:00:34,000 --> 01:00:36,640 Speaker 1: Oh yeah, oh yeah, I do that. This is, of course, 1165 01:00:36,840 --> 01:00:40,360 Speaker 1: one of my pitfalls, to almost constantly, 1166 01:00:42,000 --> 01:00:45,880 Speaker 1: essentially, fantasize about bad things that could happen. And I 1167 01:00:45,880 --> 01:00:47,240 Speaker 1: think a lot of us do that, you know, and 1168 01:00:47,280 --> 01:00:51,000 Speaker 1: part of that is your mind exploring possibilities, sure, 1169 01:00:51,040 --> 01:00:54,240 Speaker 1: of what is happening or could happen or has happened.
1170 01:00:55,240 --> 01:00:57,240 Speaker 1: But in doing that, we can lean into the negative 1171 01:00:57,240 --> 01:01:01,480 Speaker 1: possibilities too much, and then our lives become this abysmal 1172 01:01:01,560 --> 01:01:05,880 Speaker 1: choose your own adventure book of mostly terrible ends, even 1173 01:01:05,920 --> 01:01:08,720 Speaker 1: though the path that you're actually on may not be 1174 01:01:08,880 --> 01:01:10,600 Speaker 1: leading to any of them. 1175 01:01:11,080 --> 01:01:13,960 Speaker 2: It seems like the curse of all this confusion about 1176 01:01:13,960 --> 01:01:15,760 Speaker 2: whether we have free will or not and what that 1177 01:01:15,840 --> 01:01:18,880 Speaker 2: actually means could just be rooted in the fact that 1178 01:01:18,920 --> 01:01:22,880 Speaker 2: we can consider hypothetical alternative scenarios. The fact that we're 1179 01:01:22,920 --> 01:01:28,240 Speaker 2: able to imagine counterfactuals is what gives 1180 01:01:28,320 --> 01:01:29,720 Speaker 2: rise to this whole argument. 1181 01:01:31,520 --> 01:01:33,240 Speaker 1: So another thing I have in the list here, and 1182 01:01:33,240 --> 01:01:35,760 Speaker 1: this basically is just an extrapolation of everything we're talking 1183 01:01:35,840 --> 01:01:40,600 Speaker 1: about right now, is that philosophers Stephen Cave and also Bruce 1184 01:01:40,800 --> 01:01:45,200 Speaker 1: Waller have both argued that animals evolved with the capabilities 1185 01:01:45,240 --> 01:01:48,280 Speaker 1: we tend to associate with free will in order to survive, 1186 01:01:48,360 --> 01:01:52,640 Speaker 1: such as option generation, deliberation, willpower to stick to 1187 01:01:52,680 --> 01:01:55,920 Speaker 1: a choice, and the large human brain has all of 1188 01:01:55,960 --> 01:01:58,960 Speaker 1: this in spades.
Cave argues that the level of free 1189 01:01:59,000 --> 01:02:03,080 Speaker 1: will that we have may actually vary from individual to individual, 1190 01:02:03,360 --> 01:02:05,600 Speaker 1: and he argues that we could potentially even put together 1191 01:02:05,640 --> 01:02:09,720 Speaker 1: a method of measuring one's freedom quotient, or FQ, in 1192 01:02:09,800 --> 01:02:14,720 Speaker 1: the same way that we roughly measure one's intelligence, creativity, 1193 01:02:14,760 --> 01:02:16,360 Speaker 1: and other psychological factors. 1194 01:02:16,640 --> 01:02:18,680 Speaker 2: I do think that's possible, but I also think that 1195 01:02:18,680 --> 01:02:20,479 Speaker 2: would be subject to a lot of debate about 1196 01:02:20,480 --> 01:02:21,080 Speaker 2: exactly what. 1197 01:02:21,200 --> 01:02:22,360 Speaker 4: It is you're measuring. 1198 01:02:22,440 --> 01:02:25,920 Speaker 2: There, as a lot of these actual, you know, human 1199 01:02:26,040 --> 01:02:29,280 Speaker 2: or animal quotients are. I mean, when you measure human intelligence, 1200 01:02:29,320 --> 01:02:32,720 Speaker 2: there's debate about what exactly you are measuring. And I 1201 01:02:32,720 --> 01:02:35,000 Speaker 2: think the same thing would be true of freedom, subject 1202 01:02:35,040 --> 01:02:37,200 Speaker 2: to all of these, you know, crazy caveats we've been 1203 01:02:37,200 --> 01:02:38,040 Speaker 2: talking about so far. 1204 01:02:38,080 --> 01:02:40,440 Speaker 4: What do you mean when you say freedom? 1205 01:02:40,720 --> 01:02:42,920 Speaker 1: Yeah. Another take on this that I had read in 1206 01:02:43,000 --> 01:02:46,720 Speaker 1: the past was something that neuroscientist David Eagleman called the 1207 01:02:46,800 --> 01:02:50,440 Speaker 1: principle of sufficient automatism.
And the idea here is that 1208 01:02:51,040 --> 01:02:53,360 Speaker 1: the more that we map the human genome and study 1209 01:02:53,400 --> 01:02:56,880 Speaker 1: the brain's many subconscious machinations, the more it becomes clear 1210 01:02:56,960 --> 01:03:01,080 Speaker 1: that if free will exists, it's only hitching a 1211 01:03:01,160 --> 01:03:06,200 Speaker 1: ride on top of enormous automated machinery. So again, 1212 01:03:06,200 --> 01:03:10,960 Speaker 1: there's plenty of ground in between automaton and self 1213 01:03:11,000 --> 01:03:15,200 Speaker 1: moving soul where you can sort of move the slider 1214 01:03:15,320 --> 01:03:18,680 Speaker 1: towards one direction or the other and still have something 1215 01:03:19,000 --> 01:03:21,360 Speaker 1: that we can at least refer to as free will. 1216 01:03:21,520 --> 01:03:23,480 Speaker 4: But it might only be a very very little bit 1217 01:03:23,520 --> 01:03:24,040 Speaker 4: of something. 1218 01:03:24,200 --> 01:03:26,680 Speaker 1: Yeah, it might. And it's interesting to do that, 1219 01:03:27,080 --> 01:03:29,560 Speaker 1: to do a little self reflection and think about that. Like, yes, 1220 01:03:29,640 --> 01:03:33,000 Speaker 1: I had choice in said situation, but really, how 1221 01:03:33,120 --> 01:03:34,400 Speaker 1: much choice was there? 1222 01:03:35,040 --> 01:03:37,360 Speaker 2: Yeah, and I think for me at least some of 1223 01:03:37,360 --> 01:03:41,240 Speaker 2: the definition problems would still remain. Like, I'm not sure 1224 01:03:41,360 --> 01:03:46,080 Speaker 2: that even then that's clarifying what the concept of freedom means there. 1225 01:03:47,320 --> 01:03:50,080 Speaker 2: So we can't test whether it's possible for a person 1226 01:03:50,080 --> 01:03:53,200 Speaker 2: to produce different outputs given the exact same inputs. That 1227 01:03:53,360 --> 01:03:56,880 Speaker 2: just seems beyond the bounds of empiricism.
You could believe 1228 01:03:56,920 --> 01:03:59,120 Speaker 2: that if you want, but I don't think there's any 1229 01:03:59,160 --> 01:04:01,600 Speaker 2: evidence for it. But this might not be what we 1230 01:04:01,640 --> 01:04:04,680 Speaker 2: really mean by free will. Maybe, as I mentioned earlier, 1231 01:04:05,400 --> 01:04:07,440 Speaker 2: what we mean by free will is that we are 1232 01:04:07,640 --> 01:04:11,320 Speaker 2: conscious of the process by which we make decisions or 1233 01:04:11,360 --> 01:04:14,960 Speaker 2: generate actions. And I think the empirical research is pretty 1234 01:04:15,000 --> 01:04:18,280 Speaker 2: clear that this is not true, at least not in 1235 01:04:18,360 --> 01:04:21,560 Speaker 2: many cases. So just to look at a few studies 1236 01:04:21,640 --> 01:04:26,040 Speaker 2: undercutting traditional notions that our consciousness dictates our decisions or 1237 01:04:26,080 --> 01:04:29,400 Speaker 2: that we're consciously aware of how all our decisions are reached. 1238 01:04:30,480 --> 01:04:32,240 Speaker 2: So first of all, I want to look at one 1239 01:04:32,320 --> 01:04:36,960 Speaker 2: by Soon, Brass, Heinze, and Haynes, published in Nature 1240 01:04:37,000 --> 01:04:40,160 Speaker 2: Neuroscience in two thousand and eight, called Unconscious Determinants of 1241 01:04:40,200 --> 01:04:43,360 Speaker 2: Free Decisions in the Human Brain. In this study, the 1242 01:04:43,480 --> 01:04:47,080 Speaker 2: authors found that they could use brain scanning to detect 1243 01:04:47,120 --> 01:04:52,240 Speaker 2: a person's choice between two options before the person believed 1244 01:04:52,360 --> 01:04:54,840 Speaker 2: that they had made a choice. So you've got a 1245 01:04:54,920 --> 01:04:58,360 Speaker 2: very simple setup. You're supposed to freely choose between pressing 1246 01:04:58,400 --> 01:05:00,640 Speaker 2: two buttons.
You got a left button you press with 1247 01:05:00,720 --> 01:05:03,080 Speaker 2: your left hand, you got a right button you press with 1248 01:05:03,120 --> 01:05:06,160 Speaker 2: your right hand. The two different hands were used because 1249 01:05:06,200 --> 01:05:08,280 Speaker 2: this made it easier to see which hand was about 1250 01:05:08,320 --> 01:05:11,560 Speaker 2: to be engaged through motor control and brain imaging. And 1251 01:05:11,640 --> 01:05:14,520 Speaker 2: so you take your time, you decide which button you 1252 01:05:14,600 --> 01:05:17,520 Speaker 2: want to press, and then you note which letter in 1253 01:05:17,600 --> 01:05:20,360 Speaker 2: a timed sequence is displayed on a screen in front 1254 01:05:20,400 --> 01:05:22,880 Speaker 2: of you at the moment you believe you've made your 1255 01:05:22,880 --> 01:05:26,400 Speaker 2: decision about which button to push. And in some cases, 1256 01:05:26,440 --> 01:05:30,000 Speaker 2: the researchers could detect brain activity of the prefrontal and 1257 01:05:30,040 --> 01:05:33,920 Speaker 2: parietal cortex indicating which choice a person was going to 1258 01:05:34,000 --> 01:05:37,760 Speaker 2: make up to seven to ten seconds before the person 1259 01:05:37,880 --> 01:05:42,120 Speaker 2: believed they had made their choice. So this study indicates 1260 01:05:42,120 --> 01:05:45,120 Speaker 2: that, at least in some cases, at the moment you 1261 01:05:45,200 --> 01:05:48,840 Speaker 2: believe that you have consciously made a choice to do something, 1262 01:05:49,440 --> 01:05:52,560 Speaker 2: machines can look at your brain and show that the 1263 01:05:52,600 --> 01:05:55,760 Speaker 2: brain has made a choice before you believe you have 1264 01:05:55,880 --> 01:05:58,880 Speaker 2: made a choice, and predict with better than chance accuracy 1265 01:05:59,200 --> 01:06:00,000 Speaker 2: what that choice is.
1266 01:06:01,160 --> 01:06:04,200 Speaker 1: This is a study that really intrigued me. I remember 1267 01:06:04,200 --> 01:06:07,680 Speaker 1: when it came out, because it's basically this idea where 1268 01:06:07,680 --> 01:06:09,800 Speaker 1: I think that I'm the lightning, but perhaps I am 1269 01:06:09,840 --> 01:06:12,560 Speaker 1: the thunder, or at least my experience is that of 1270 01:06:12,560 --> 01:06:15,760 Speaker 1: the thunder. But then the other question is, well, does 1271 01:06:15,800 --> 01:06:18,080 Speaker 1: that mean I'm not the lightning? Am I not both? 1272 01:06:18,160 --> 01:06:21,200 Speaker 1: Maybe it's just that I have a thunder-level awareness 1273 01:06:21,720 --> 01:06:24,000 Speaker 1: of what I am, but there is this lightning that 1274 01:06:24,040 --> 01:06:25,840 Speaker 1: precedes this experience of me. 1275 01:06:26,320 --> 01:06:28,800 Speaker 2: Well, I mean, I don't know; the decision 1276 01:06:28,880 --> 01:06:31,680 Speaker 2: is generated by the brain. So again you're back to 1277 01:06:31,720 --> 01:06:34,480 Speaker 2: this question of what free will means. But if it 1278 01:06:34,520 --> 01:06:37,800 Speaker 2: does have something to do with consciously being a participant 1279 01:06:38,720 --> 01:06:42,680 Speaker 2: at the moment that a decision is made, there's pretty 1280 01:06:42,680 --> 01:06:45,120 Speaker 2: good evidence that that's not going on. The brain is 1281 01:06:45,160 --> 01:06:49,480 Speaker 2: making decisions before the person thinks I have made a decision. 1282 01:06:49,760 --> 01:06:51,440 Speaker 2: But okay, that was two thousand and eight. Is there 1283 01:06:51,440 --> 01:06:55,240 Speaker 2: anything since then? Sure?
Here's one study with findings along 1284 01:06:55,280 --> 01:06:58,400 Speaker 2: these lines but applied to voluntary mental imagery, which was 1285 01:06:58,440 --> 01:07:00,959 Speaker 2: published just last year, in twenty nineteen, in the open 1286 01:07:01,000 --> 01:07:05,920 Speaker 2: access journal Scientific Reports. It's by Koenig-Robert and Pearson 1287 01:07:06,240 --> 01:07:09,520 Speaker 2: in, as I said, Scientific Reports, called Decoding the Contents and 1288 01:07:09,560 --> 01:07:14,120 Speaker 2: Strength of Imagery before Volitional Engagement. And again, this was 1289 01:07:14,160 --> 01:07:17,600 Speaker 2: published in twenty nineteen. The short version here is that 1290 01:07:17,640 --> 01:07:20,720 Speaker 2: the researchers exposed people to two different images. You got 1291 01:07:20,720 --> 01:07:24,200 Speaker 2: a red circle with horizontal lines and a green circle 1292 01:07:24,240 --> 01:07:27,280 Speaker 2: with vertical lines. And then the researchers were able to 1293 01:07:27,400 --> 01:07:31,320 Speaker 2: correlate images of brain states with mental representation of the 1294 01:07:31,360 --> 01:07:33,800 Speaker 2: different pictures, so they know what it looks like 1295 01:07:33,840 --> 01:07:37,080 Speaker 2: in your brain when you're thinking about these two images separately.
1296 01:07:38,080 --> 01:07:41,320 Speaker 2: They could use this brain imaging to predict, again above chance, 1297 01:07:41,640 --> 01:07:45,800 Speaker 2: which image subjects would choose to visualize in their head 1298 01:07:46,320 --> 01:07:49,880 Speaker 2: before the subjects believed they had made a choice about 1299 01:07:49,880 --> 01:07:52,479 Speaker 2: which one to imagine in their head, and they could 1300 01:07:52,480 --> 01:07:54,880 Speaker 2: make these predictions at a rate above chance an average 1301 01:07:54,880 --> 01:07:59,360 Speaker 2: of eleven seconds before the person's actual choice about which 1302 01:07:59,360 --> 01:08:02,000 Speaker 2: one they were going to imagine. So, one of the authors, 1303 01:08:02,040 --> 01:08:04,840 Speaker 2: Joel Pearson, was quoted in a statement, I believe to 1304 01:08:04,880 --> 01:08:08,200 Speaker 2: Medical Xpress, quote: We believe that when we are 1305 01:08:08,240 --> 01:08:10,760 Speaker 2: faced with the choice between two or more options of 1306 01:08:10,800 --> 01:08:14,760 Speaker 2: what to think about, non conscious traces of the thoughts 1307 01:08:14,800 --> 01:08:18,800 Speaker 2: are there already, a bit like unconscious hallucinations. 1308 01:08:19,120 --> 01:08:20,839 Speaker 4: That comes back to something we talked about recently. 1309 01:08:20,920 --> 01:08:24,759 Speaker 2: Yeah. As the decision of what to think about is made, 1310 01:08:25,080 --> 01:08:28,320 Speaker 2: executive areas of the brain choose the thought trace which 1311 01:08:28,400 --> 01:08:31,639 Speaker 2: is stronger. In other words, if any pre existing brain 1312 01:08:31,720 --> 01:08:34,960 Speaker 2: activity matches one of your choices, then your brain will 1313 01:08:34,960 --> 01:08:37,439 Speaker 2: be more likely to pick that option as it gets 1314 01:08:37,479 --> 01:08:41,280 Speaker 2: boosted by the pre existing brain activity.
This would explain, 1315 01:08:41,320 --> 01:08:44,559 Speaker 2: for example, why thinking over and over about something leads 1316 01:08:44,560 --> 01:08:46,920 Speaker 2: to ever more thoughts about it, as it occurs in 1317 01:08:46,960 --> 01:08:50,120 Speaker 2: a positive feedback loop. And then to quote from the 1318 01:08:50,160 --> 01:08:53,320 Speaker 2: study abstract, the authors say, our results suggest that the 1319 01:08:53,360 --> 01:08:57,400 Speaker 2: contents and strength of mental imagery are influenced by sensory 1320 01:08:57,560 --> 01:09:03,280 Speaker 2: like neural representations that emerge spontaneously before volition. So there 1321 01:09:03,280 --> 01:09:05,280 Speaker 2: are things going on within the brain that we can 1322 01:09:05,280 --> 01:09:09,400 Speaker 2: detect with machinery from the outside that suggest what you're 1323 01:09:09,439 --> 01:09:12,599 Speaker 2: going to think about before you think about it. Now, 1324 01:09:12,640 --> 01:09:14,679 Speaker 2: I think we should be fair that it's possible this 1325 01:09:14,800 --> 01:09:18,040 Speaker 2: isn't always the case. But there's plenty of evidence that, 1326 01:09:18,120 --> 01:09:20,799 Speaker 2: at least in some cases, 1327 01:09:21,280 --> 01:09:25,360 Speaker 2: when people think they're consciously making a choice, the brain 1328 01:09:25,640 --> 01:09:28,800 Speaker 2: in a measurable way has already made a choice that 1329 01:09:28,840 --> 01:09:32,120 Speaker 2: can be detected from the outside. The brain has already 1330 01:09:32,200 --> 01:09:36,000 Speaker 2: set one course of action in motion before the conscious 1331 01:09:36,040 --> 01:09:38,680 Speaker 2: part of our brain is aware that we're going to 1332 01:09:38,760 --> 01:09:39,759 Speaker 2: choose that course. 1333 01:09:40,320 --> 01:09:43,880 Speaker 1: So again, kind of a thunder and lightning scenario right now.
1334 01:09:43,920 --> 01:09:46,080 Speaker 2: Of course, this stuff we've been talking about is true 1335 01:09:46,080 --> 01:09:48,880 Speaker 2: of typical human brains. Once you start looking at atypical 1336 01:09:48,920 --> 01:09:52,360 Speaker 2: neurological situations, you can find all kinds of evidence of 1337 01:09:52,439 --> 01:09:56,200 Speaker 2: action without the sensation of conscious awareness or choice. A 1338 01:09:56,200 --> 01:09:57,559 Speaker 2: lot of these are things that have come up on 1339 01:09:57,600 --> 01:10:00,920 Speaker 2: the show before, like blindsight, the fact that people 1340 01:10:01,000 --> 01:10:05,960 Speaker 2: can physically react to visual stimuli while believing consciously that 1341 01:10:06,040 --> 01:10:08,240 Speaker 2: they are blind, or that they're blind in some part 1342 01:10:08,240 --> 01:10:11,639 Speaker 2: of their visual field. Like, you can raise 1343 01:10:11,680 --> 01:10:14,120 Speaker 2: your hand to catch a ball without believing that you 1344 01:10:14,160 --> 01:10:17,280 Speaker 2: have seen the ball. Or you got alien limb syndrome, 1345 01:10:17,320 --> 01:10:20,439 Speaker 2: where something like a brain lesion can cause part of 1346 01:10:20,439 --> 01:10:22,200 Speaker 2: the body to act in a way that you do 1347 01:10:22,240 --> 01:10:25,439 Speaker 2: not feel in control of. The hand moves on its own, 1348 01:10:25,479 --> 01:10:28,120 Speaker 2: it moves against your will, it picks up the spoon 1349 01:10:28,240 --> 01:10:31,439 Speaker 2: when you meant to pick up the fork. And of course, 1350 01:10:31,520 --> 01:10:34,639 Speaker 2: there are the experiences of split brain patients, which we did a 1351 01:10:34,680 --> 01:10:38,200 Speaker 2: deep dive on in January of twenty nineteen.
The short 1352 01:10:38,320 --> 01:10:41,120 Speaker 2: version is that some patients who undergo a surgery called 1353 01:10:41,120 --> 01:10:44,600 Speaker 2: a corpus callosotomy, in which the main avenue of information 1354 01:10:44,680 --> 01:10:47,200 Speaker 2: sharing between the two hemispheres of the brain is severed, 1355 01:10:47,600 --> 01:10:51,120 Speaker 2: can seem to show signs of the right hemisphere acting 1356 01:10:51,160 --> 01:10:54,920 Speaker 2: and making choices without the conscious awareness or control of 1357 01:10:54,960 --> 01:10:57,439 Speaker 2: the left hemisphere, which seems to be the part of 1358 01:10:57,439 --> 01:11:00,840 Speaker 2: the brain that can usually talk. Examples like this led to 1359 01:11:00,960 --> 01:11:06,040 Speaker 2: hypotheses like Michael Gazzaniga and Joseph LeDoux's left brain interpreter model, 1360 01:11:06,080 --> 01:11:09,000 Speaker 2: where they argue that part of what the left hemisphere 1361 01:11:09,000 --> 01:11:12,280 Speaker 2: of the brain does is generate an ongoing series of 1362 01:11:12,400 --> 01:11:17,120 Speaker 2: narrative explanations that reconcile past and present and give us 1363 01:11:17,160 --> 01:11:20,680 Speaker 2: the sense that we understand why we do what 1364 01:11:20,760 --> 01:11:23,440 Speaker 2: we do. Now, of course, their model could be incorrect, 1365 01:11:23,479 --> 01:11:25,920 Speaker 2: but I think it's also possible that they're really onto 1366 01:11:25,960 --> 01:11:29,080 Speaker 2: something, that the brain seems to have a major function 1367 01:11:29,200 --> 01:11:34,280 Speaker 2: of trying to convince itself that its behavior is coherent 1368 01:11:34,400 --> 01:11:38,680 Speaker 2: and has rational justifications, and, if possible, to convince the 1369 01:11:38,760 --> 01:11:41,840 Speaker 2: conscious part of the brain that it's in control.
I 1370 01:11:41,880 --> 01:11:44,080 Speaker 2: think this is kind of like at work when you 1371 01:11:44,120 --> 01:11:46,760 Speaker 2: give the boss three options. You know, it's like, here 1372 01:11:46,760 --> 01:11:48,599 Speaker 2: are the three things we came up with, and you've 1373 01:11:48,640 --> 01:11:51,479 Speaker 2: got the one you actually want to go with, and 1374 01:11:51,560 --> 01:11:54,600 Speaker 2: then you've got two, like, terrible options that are designed 1375 01:11:54,640 --> 01:11:57,599 Speaker 2: in order to be ignored and discarded by the boss 1376 01:11:57,800 --> 01:12:00,440 Speaker 2: and flatter the boss and give them a sense of control. 1377 01:12:00,320 --> 01:12:02,360 Speaker 1: Right, which can be a dangerous exercise. 1378 01:12:02,520 --> 01:12:05,800 Speaker 2: Absolutely. I'm not advising that as a good strategy. I'm 1379 01:12:05,800 --> 01:12:08,320 Speaker 2: just saying people do it. Now, to look quickly at one 1380 01:12:08,320 --> 01:12:08,800 Speaker 2: more study 1381 01:12:08,840 --> 01:12:09,160 Speaker 4: I found. 1382 01:12:09,240 --> 01:12:15,559 Speaker 2: This was published in twenty eighteen in PNAS by Darby, Joutsa, Burke, 1383 01:12:15,640 --> 01:12:20,559 Speaker 2: and Fox, called lesion network localization of free will. Very briefly, 1384 01:12:20,600 --> 01:12:25,840 Speaker 2: the authors here define the neurologically relevant parts of free 1385 01:12:25,840 --> 01:12:28,599 Speaker 2: will as having two parts. So first of all, there's 1386 01:12:28,720 --> 01:12:32,400 Speaker 2: the desire to act, that's volition. You've got volitional control, 1387 01:12:32,680 --> 01:12:36,000 Speaker 2: and then a sense of responsibility for that action, which 1388 01:12:36,040 --> 01:12:39,280 Speaker 2: is the feeling of agency. So you've got volition and agency.
1389 01:12:39,880 --> 01:12:42,439 Speaker 2: And then they looked at two neurological conditions, each of which 1390 01:12:42,560 --> 01:12:45,840 Speaker 2: is believed to disrupt one of these functions. They looked 1391 01:12:45,840 --> 01:12:49,720 Speaker 2: at focal brain lesions that disrupt volition, causing 1392 01:12:49,920 --> 01:12:53,479 Speaker 2: akinetic mutism. Akinetic mutism is a condition where 1393 01:12:53,520 --> 01:12:57,120 Speaker 2: patients are unable to voluntarily move or speak. This would 1394 01:12:57,120 --> 01:12:59,400 Speaker 2: of course be a disruption of the volition part of 1395 01:12:59,400 --> 01:13:02,720 Speaker 2: the brain. And then lesions that disrupt agency, and this 1396 01:13:02,760 --> 01:13:06,679 Speaker 2: would of course cause alien limb syndrome. Again, alien limb syndrome, 1397 01:13:06,720 --> 01:13:09,040 Speaker 2: that's where you've got part of your body acting or 1398 01:13:09,080 --> 01:13:11,280 Speaker 2: moving in a way that does not feel voluntary. 1399 01:13:11,320 --> 01:13:13,720 Speaker 4: It moves, but you don't feel like you did it. 1400 01:13:14,520 --> 01:13:17,320 Speaker 2: And then they basically found that brain lesions that disrupt 1401 01:13:17,439 --> 01:13:20,080 Speaker 2: volition occur all over the brain, but they're within a 1402 01:13:20,120 --> 01:13:22,800 Speaker 2: brain network that is connected in some way to the 1403 01:13:22,840 --> 01:13:26,799 Speaker 2: anterior cingulate cortex. And they found that lesions that disrupt 1404 01:13:26,880 --> 01:13:30,200 Speaker 2: agency also occur in different locations around the brain, but 1405 01:13:30,280 --> 01:13:32,920 Speaker 2: they tend to be defined by connectivity to a part 1406 01:13:32,920 --> 01:13:35,840 Speaker 2: of the brain called the precuneus.
Now, again I would 1407 01:13:35,840 --> 01:13:39,479 Speaker 2: note that this acknowledges physical evidence that there are 1408 01:13:39,520 --> 01:13:44,479 Speaker 2: distinct brain processes involved in generating action, you know, volition, 1409 01:13:44,960 --> 01:13:49,519 Speaker 2: versus recognizing personal agency in that action, and typical brains 1410 01:13:49,560 --> 01:13:52,880 Speaker 2: executing typical actions have both of these acting in sync. 1411 01:13:53,200 --> 01:13:55,639 Speaker 2: But brains can have either one without the other. 1412 01:13:56,040 --> 01:13:58,120 Speaker 1: Now, obviously we could keep going here. We could keep 1413 01:13:58,160 --> 01:14:02,439 Speaker 1: discussing free will and what feels like free will and 1414 01:14:02,439 --> 01:14:07,400 Speaker 1: how it matches up with neuroscientific data, etc. But at 1415 01:14:07,400 --> 01:14:09,120 Speaker 1: this point in the podcast, we probably do need to bring 1416 01:14:09,120 --> 01:14:12,639 Speaker 1: it back around to Bandersnatch and the question like, okay, 1417 01:14:12,640 --> 01:14:14,720 Speaker 1: given all this stuff that we've talked about, what does 1418 01:14:14,760 --> 01:14:17,920 Speaker 1: Bandersnatch seem to be saying about all of this? Well, 1419 01:14:18,560 --> 01:14:21,400 Speaker 1: it does seem to be largely a rumination on the 1420 01:14:21,439 --> 01:14:23,760 Speaker 1: idea that we do not seem to have as much 1421 01:14:23,760 --> 01:14:28,040 Speaker 1: free will as we think we do. That we can resist, 1422 01:14:28,120 --> 01:14:31,280 Speaker 1: but it takes considerable effort to run counter to the 1423 01:14:31,720 --> 01:14:32,960 Speaker 1: current that we're stuck in.
1424 01:14:33,240 --> 01:14:35,439 Speaker 2: I would say there's a theme that 1425 01:14:35,520 --> 01:14:38,080 Speaker 2: is hammered home about free will, and it is that the 1426 01:14:38,160 --> 01:14:40,840 Speaker 2: more we look at the concept of free will and 1427 01:14:40,960 --> 01:14:43,600 Speaker 2: think about whether we have control over our actions, the 1428 01:14:43,680 --> 01:14:44,960 Speaker 2: less we feel we have it. 1429 01:14:45,280 --> 01:14:47,559 Speaker 1: Yeah. Like, I was thinking, I'm trying to list, like, 1430 01:14:47,920 --> 01:14:53,240 Speaker 1: all the various factors and agents that are influencing Stefan 1431 01:14:53,439 --> 01:14:57,000 Speaker 1: in the story. I mean, we have his mental health, 1432 01:14:57,080 --> 01:15:02,000 Speaker 1: his past trauma, his father, his therapy, the work, 1433 01:15:02,240 --> 01:15:06,000 Speaker 1: tragic life, and influence of Jerome F. Davies, his boss 1434 01:15:06,040 --> 01:15:11,000 Speaker 1: at Tuckersoft, his mentor slash hero slash friend Colin Ritman, 1435 01:15:11,320 --> 01:15:15,160 Speaker 1: conspiracy theories, music, media, et cetera. And that's not even 1436 01:15:15,200 --> 01:15:17,760 Speaker 1: getting into the speculative element that there is either an 1437 01:15:17,800 --> 01:15:22,080 Speaker 1: actual demon entity that is the literal thief of destiny, 1438 01:15:22,800 --> 01:15:26,000 Speaker 1: or that a power beyond himself is influencing his decisions, 1439 01:15:26,000 --> 01:15:28,880 Speaker 1: some sort of voice from beyond or the machinations of 1440 01:15:28,880 --> 01:15:30,439 Speaker 1: a player in another world.
1441 01:15:30,760 --> 01:15:34,479 Speaker 2: Yeah, the story really brings home this paradox, which is 1442 01:15:34,520 --> 01:15:37,840 Speaker 2: that I think it is the case that the closer 1443 01:15:37,920 --> 01:15:41,840 Speaker 2: we look at free will, and the more we bring 1444 01:15:42,080 --> 01:15:47,680 Speaker 2: our sharpest scientific tools and philosophical instruments to understand it, 1445 01:15:48,000 --> 01:15:50,280 Speaker 2: the less it seems to make sense, and the less 1446 01:15:50,320 --> 01:15:52,800 Speaker 2: it seems to be there. And yet at the same 1447 01:15:52,880 --> 01:15:56,559 Speaker 2: time, we acknowledge that to feel like your actions 1448 01:15:56,600 --> 01:15:59,599 Speaker 2: are not under your own control is not a heightened 1449 01:15:59,600 --> 01:16:02,800 Speaker 2: state of consciousness; that is still a problem. Yeah, and 1450 01:16:03,160 --> 01:16:05,720 Speaker 2: I don't know exactly what that signals. That 1451 01:16:05,760 --> 01:16:09,040 Speaker 2: may be yet another unresolved tension in the issue of 1452 01:16:09,040 --> 01:16:12,000 Speaker 2: free will: that, like, the more closely we examine it, 1453 01:16:12,040 --> 01:16:14,080 Speaker 2: the less we feel like we have it, and yet 1454 01:16:14,600 --> 01:16:17,599 Speaker 2: genuinely feeling like you don't have it, the more you 1455 01:16:17,680 --> 01:16:20,320 Speaker 2: feel that way, the more this is a serious impediment 1456 01:16:20,400 --> 01:16:21,840 Speaker 2: to you living a healthy life.
1457 01:16:22,040 --> 01:16:25,439 Speaker 1: Absolutely. Now, this may seem like a logical 1458 01:16:25,479 --> 01:16:29,080 Speaker 1: place to end the conversation, but one of the things 1459 01:16:29,120 --> 01:16:32,960 Speaker 1: that's really interesting here is that we were 1460 01:16:32,960 --> 01:16:35,519 Speaker 1: talking about an episode of Black Mirror that deals with 1461 01:16:35,600 --> 01:16:38,880 Speaker 1: free will and our choices in life. And certainly, again, 1462 01:16:38,920 --> 01:16:42,000 Speaker 1: Black Mirror frequently comments on our unease regarding new technology, 1463 01:16:42,280 --> 01:16:46,080 Speaker 1: but then Bandersnatch itself, this movie on Netflix, 1464 01:16:46,160 --> 01:16:50,840 Speaker 1: factors into some user 1465 01:16:50,880 --> 01:16:55,200 Speaker 1: concerns about the future of this sort of interactive viewing technology. 1466 01:16:55,600 --> 01:16:59,240 Speaker 2: Yes, you know, I would say one of the things 1467 01:16:59,240 --> 01:17:01,760 Speaker 2: that is a legitimate concern about free will, however you 1468 01:17:01,840 --> 01:17:05,200 Speaker 2: define it, as murky as it is.
At least one 1469 01:17:05,240 --> 01:17:08,120 Speaker 2: thing that we want is to think 1470 01:17:08,160 --> 01:17:12,599 Speaker 2: that we understand the incoming influences on our behavior, right? 1471 01:17:13,080 --> 01:17:15,519 Speaker 2: Like, you'd like to think that if I did X, 1472 01:17:16,000 --> 01:17:18,120 Speaker 2: I can sort of make sense of it: it was 1473 01:17:18,200 --> 01:17:20,799 Speaker 2: because I read this book, or I read this article, 1474 01:17:20,960 --> 01:17:23,760 Speaker 2: or I had a conversation with this person, and I 1475 01:17:23,880 --> 01:17:27,280 Speaker 2: connect the knowledge I gained through that, or the influences 1476 01:17:27,320 --> 01:17:31,559 Speaker 2: of those past experiences, with the decision I just made. 1477 01:17:32,120 --> 01:17:35,800 Speaker 2: Life starts getting more difficult when you have trouble understanding 1478 01:17:35,880 --> 01:17:38,320 Speaker 2: what the influences on yourself are. 1479 01:17:38,520 --> 01:17:39,320 Speaker 4: Does that make sense? 1480 01:17:39,600 --> 01:17:42,280 Speaker 1: Yeah, yeah. And we've discussed a number of these in 1481 01:17:42,280 --> 01:17:44,000 Speaker 1: the past. 1482 01:17:44,200 --> 01:17:47,439 Speaker 1: Technologically speaking, we have discussed advertising and we have 1483 01:17:47,479 --> 01:17:51,800 Speaker 1: discussed social media, which are good things to keep in 1484 01:17:51,800 --> 01:17:54,439 Speaker 1: mind as we continue here. Because there might not be 1485 01:17:54,479 --> 01:17:57,000 Speaker 1: a Bandersnatch or a demon awaiting you in the 1486 01:17:57,040 --> 01:18:00,320 Speaker 1: maze of future interactive media technology. But there might 1487 01:18:00,400 --> 01:18:06,000 Speaker 1: just be some highly targeted advertisements, for example.
So two 1488 01:18:06,240 --> 01:18:09,920 Speaker 1: individuals that I ran across wrote about this topic or 1489 01:18:09,960 --> 01:18:13,040 Speaker 1: touched on this topic. One is Matthew Gault, who wrote 1490 01:18:13,040 --> 01:18:17,040 Speaker 1: about this last year for Vice's Motherboard, and then Tiffany 1491 01:18:17,080 --> 01:18:20,040 Speaker 1: Hsu wrote about it for The New York Times. So 1492 01:18:20,120 --> 01:18:24,439 Speaker 1: Gault wrote about Michael Veale, a technology policy researcher at 1493 01:18:24,520 --> 01:18:29,600 Speaker 1: University College London, who utilized Europe's General Data Protection Regulation, 1494 01:18:29,880 --> 01:18:34,240 Speaker 1: or GDPR, law to request a copy of the data 1495 01:18:34,560 --> 01:18:38,519 Speaker 1: Netflix collected about him and his choices through the use 1496 01:18:38,880 --> 01:18:43,120 Speaker 1: of the Bandersnatch program. Now, they complied, perhaps in part 1497 01:18:43,160 --> 01:18:46,720 Speaker 1: because of Veale's status as a public person, but basically 1498 01:18:47,400 --> 01:18:50,800 Speaker 1: Netflix acquires this information in order to carry out the 1499 01:18:50,800 --> 01:18:54,400 Speaker 1: Bandersnatch experience, which makes sense, right? It has to chart 1500 01:18:54,400 --> 01:18:59,280 Speaker 1: your path through this complex system. But then Netflix also 1501 01:18:59,400 --> 01:19:02,400 Speaker 1: keeps this information, which the company claims is in order 1502 01:19:02,439 --> 01:19:05,920 Speaker 1: to, quote, determine how to improve this model of storytelling 1503 01:19:06,000 --> 01:19:09,840 Speaker 1: in the context of a show or movie.
And I mean, 1504 01:19:09,960 --> 01:19:12,439 Speaker 1: on one level that sounds well and good as well, 1505 01:19:12,479 --> 01:19:16,240 Speaker 1: except that Veale thinks that Netflix, quote, should really be 1506 01:19:16,320 --> 01:19:19,960 Speaker 1: using consent, which you should be able to refuse, or 1507 01:19:20,200 --> 01:19:24,559 Speaker 1: legitimate interest, meaning that you can object to it, instead. Now, 1508 01:19:24,560 --> 01:19:27,640 Speaker 1: in Hsu's article, she points to the early choice we 1509 01:19:27,720 --> 01:19:33,639 Speaker 1: make between Kellogg's Frosties and Quaker Sugar Puffs. Now, 1510 01:19:33,760 --> 01:19:36,200 Speaker 1: both of these are real cereals, though I have to 1511 01:19:36,240 --> 01:19:39,920 Speaker 1: admit I thought Quaker Sugar Puffs was made up because it 1512 01:19:39,920 --> 01:19:43,519 Speaker 1: has this ridiculous Honey Monster mascot that's like super fun, 1513 01:19:43,600 --> 01:19:45,240 Speaker 1: kind of a Cheddar Goblin sort of thing. 1514 01:19:45,400 --> 01:19:45,839 Speaker 2: Nice. 1515 01:19:45,960 --> 01:19:49,280 Speaker 1: But it turns out this was an actual UK product. 1516 01:19:49,280 --> 01:19:52,720 Speaker 1: It was just a UK-only product, so Americans such 1517 01:19:52,720 --> 01:19:55,719 Speaker 1: as ourselves were perhaps not privy to it. But again, 1518 01:19:55,760 --> 01:19:59,320 Speaker 1: both were real products, and neither one was a paid inclusion, 1519 01:19:59,360 --> 01:20:02,560 Speaker 1: so it was not an official product placement or product integration. 1520 01:20:02,960 --> 01:20:07,000 Speaker 1: And Netflix, of course, is like an ad-free system anyway.
1521 01:20:07,800 --> 01:20:11,440 Speaker 1: But Hsu points to some of the words of Reed Hastings, 1522 01:20:11,720 --> 01:20:15,000 Speaker 1: co-founder, chairman, and CEO of Netflix, who pointed out 1523 01:20:15,080 --> 01:20:18,240 Speaker 1: during a webcast tied to an earnings report that seventy 1524 01:20:18,280 --> 01:20:24,120 Speaker 1: three percent of Bandersnatch viewers selected Kellogg's Frosties over the 1525 01:20:25,000 --> 01:20:26,080 Speaker 1: Quaker Sugar Puffs. 1526 01:20:26,080 --> 01:20:28,679 Speaker 4: Oh no, I did too. I feel so vulnerable right now. 1527 01:20:29,200 --> 01:20:31,599 Speaker 1: I don't remember what I did the first time around. 1528 01:20:31,760 --> 01:20:33,439 Speaker 1: The first time I watched it, I also watched with 1529 01:20:33,479 --> 01:20:36,559 Speaker 1: my wife, so we were voting on the choices, you know, 1530 01:20:36,560 --> 01:20:39,080 Speaker 1: having a discussion. I guess I should have mentioned 1531 01:20:39,120 --> 01:20:40,920 Speaker 1: that earlier, because that has a whole other wrinkle as 1532 01:20:40,920 --> 01:20:44,120 Speaker 1: a scenario of making communal choices and voting on something. 1533 01:20:44,400 --> 01:20:47,920 Speaker 1: But on my own, I chose the Quaker thing just 1534 01:20:47,960 --> 01:20:51,080 Speaker 1: because I thought it looked weirder. Okay, but again, I'm 1535 01:20:51,080 --> 01:20:54,400 Speaker 1: in the minority for doing so. So first of all, 1536 01:20:54,800 --> 01:20:56,479 Speaker 1: I think this is a shame, because I think the 1537 01:20:56,479 --> 01:20:59,320 Speaker 1: cover and TV advertisement for Quaker Sugar Puffs is awesome 1538 01:20:59,320 --> 01:21:02,160 Speaker 1: and weird.
Again, but more to the point, as Hsu 1539 01:21:02,160 --> 01:21:06,240 Speaker 1: points out, Spencer Wang, a Netflix vice president, chimed in 1540 01:21:06,360 --> 01:21:08,280 Speaker 1: and joked, and let's be clear, he was 1541 01:21:08,320 --> 01:21:11,519 Speaker 1: apparently joking, that this was the most critical data point 1542 01:21:11,560 --> 01:21:12,120 Speaker 1: of the quarter. 1543 01:21:13,040 --> 01:21:13,240 Speaker 4: Now. 1544 01:21:13,240 --> 01:21:16,080 Speaker 1: She writes that while Netflix doesn't run commercials and has 1545 01:21:16,080 --> 01:21:19,799 Speaker 1: stated that it would not use Bandersnatch information for anything 1546 01:21:19,880 --> 01:21:23,400 Speaker 1: like this, others outside the company do see the potential, namely, 1547 01:21:23,439 --> 01:21:27,160 Speaker 1: in quote, the possibility of inserting brand-name products into 1548 01:21:27,160 --> 01:21:31,880 Speaker 1: streaming shows based on data generated by interactive programming. Now, 1549 01:21:31,880 --> 01:21:34,840 Speaker 1: Hsu stresses that the technology to roll this out isn't 1550 01:21:34,840 --> 01:21:38,280 Speaker 1: here yet. But I suppose we have to consider two 1551 01:21:38,360 --> 01:21:42,680 Speaker 1: key factors in that statement. So, first of all, we 1552 01:21:42,720 --> 01:21:46,519 Speaker 1: are in the early days of truly interactive features like 1553 01:21:46,560 --> 01:21:50,200 Speaker 1: this on major streaming platforms, and that 1554 01:21:50,280 --> 01:21:52,519 Speaker 1: is just assuming that it really catches on at all. 1555 01:21:52,600 --> 01:21:56,680 Speaker 1: As we've discussed, interactive cinema is not new. It's been 1556 01:21:56,720 --> 01:22:01,200 Speaker 1: around for decades, and it has largely failed to catch on. 1557 01:22:02,320 --> 01:22:06,720 Speaker 1: It is not like a driving force in our entertainment.
1558 01:22:07,040 --> 01:22:08,960 Speaker 1: You'll find plenty of examples of it. You also find 1559 01:22:08,960 --> 01:22:12,120 Speaker 1: a lot of computer games that kind of fulfill 1560 01:22:12,120 --> 01:22:14,560 Speaker 1: this niche, right? Yeah. 1561 01:22:14,240 --> 01:22:15,799 Speaker 4: Those also sort of failed. 1562 01:22:16,360 --> 01:22:20,000 Speaker 1: Yeah, you know, there are certainly deeper 1563 01:22:20,000 --> 01:22:22,280 Speaker 1: dives in, say, the history of things like Telltale 1564 01:22:22,320 --> 01:22:24,760 Speaker 1: Games, which I think was the company that did a 1565 01:22:24,840 --> 01:22:27,960 Speaker 1: number of these things that were, again, not really 1566 01:22:28,000 --> 01:22:30,800 Speaker 1: marketed as interactive movies so much as 1567 01:22:30,800 --> 01:22:35,000 Speaker 1: interactive gaming experiences. So that's one thing to consider. 1568 01:22:36,000 --> 01:22:38,360 Speaker 1: Interest in interactive films has essentially gone up and down 1569 01:22:38,360 --> 01:22:40,800 Speaker 1: over the years, and again, it hasn't really, like, ignited. 1570 01:22:41,479 --> 01:22:44,600 Speaker 1: Still, Netflix itself has only released a 1571 01:22:44,640 --> 01:22:48,720 Speaker 1: handful of interactive titles, mostly kids' stuff. Bandersnatch is their 1572 01:22:48,760 --> 01:22:52,920 Speaker 1: only true adult drama release of this product type, 1573 01:22:53,240 --> 01:22:56,160 Speaker 1: though they claimed to be doubling down on interactive content 1574 01:22:56,200 --> 01:22:59,120 Speaker 1: in the future.
Given, you know, how Netflix tends to 1575 01:22:59,120 --> 01:23:02,640 Speaker 1: be a little bit secretive about what's coming out, or 1576 01:23:02,640 --> 01:23:04,559 Speaker 1: at least they don't tell you a lot, I guess 1577 01:23:04,560 --> 01:23:06,639 Speaker 1: we'll just have to know about it when we see 1578 01:23:06,680 --> 01:23:10,400 Speaker 1: it pop up. But it's also worth reminding ourselves 1579 01:23:10,400 --> 01:23:12,840 Speaker 1: that a great deal of work went into creating Bandersnatch 1580 01:23:12,880 --> 01:23:15,880 Speaker 1: as well. I think I've seen it written that, like, 1581 01:23:15,960 --> 01:23:19,559 Speaker 1: three times as much work went into Bandersnatch versus, say, 1582 01:23:20,320 --> 01:23:23,879 Speaker 1: a long episode of the show that was approximately ninety minutes. 1583 01:23:24,479 --> 01:23:28,280 Speaker 1: So is it cost-effective content? Are all the limitations 1584 01:23:28,400 --> 01:23:31,200 Speaker 1: worked out yet? For instance, I don't believe Bandersnatch works 1585 01:23:31,200 --> 01:23:34,479 Speaker 1: on many mobile formats or older models; like, you have 1586 01:23:34,560 --> 01:23:37,400 Speaker 1: to have something more updated. Like, I tried to load 1587 01:23:37,439 --> 01:23:41,160 Speaker 1: it onto my phone, and I have an older iPhone. 1588 01:23:41,479 --> 01:23:43,120 Speaker 1: I tried to load it on there to watch on 1589 01:23:43,160 --> 01:23:45,960 Speaker 1: an airplane and it wouldn't work. I had to watch 1590 01:23:45,960 --> 01:23:49,080 Speaker 1: it through my Xbox One. And another big concern is 1591 01:23:49,320 --> 01:23:51,599 Speaker 1: there would need to be, I guess, enough interactive content 1592 01:23:51,720 --> 01:23:54,960 Speaker 1: out there tuned for this sort of thing to generate 1593 01:23:55,040 --> 01:23:58,280 Speaker 1: the necessary user data to then be employed.
1594 01:23:58,840 --> 01:24:01,120 Speaker 2: I can really see this kind of thing being used 1595 01:24:01,160 --> 01:24:03,680 Speaker 2: as a major data-mining tool. I mean, I 1596 01:24:03,720 --> 01:24:07,479 Speaker 2: don't know, it seems very possible to get psychologically salient 1597 01:24:07,560 --> 01:24:08,400 Speaker 2: information through this. 1598 01:24:08,800 --> 01:24:09,200 Speaker 1: Yeah. 1599 01:24:09,240 --> 01:24:11,720 Speaker 2: Now, of course, they're already getting information through all 1600 01:24:11,840 --> 01:24:13,840 Speaker 2: kinds of things. You know, the tech business can get 1601 01:24:13,840 --> 01:24:17,960 Speaker 2: your information through what you buy online, through what 1602 01:24:18,000 --> 01:24:20,880 Speaker 2: websites you visit, through what you do on Facebook or 1603 01:24:20,920 --> 01:24:21,799 Speaker 2: other social media. 1604 01:24:21,880 --> 01:24:25,000 Speaker 1: Right. Like, a website like Netflix already knows what kind 1605 01:24:25,040 --> 01:24:28,120 Speaker 1: of movies you have watched, what kind you like, what 1606 01:24:28,200 --> 01:24:31,120 Speaker 1: kind of movies you want to like, and then also, 1607 01:24:31,760 --> 01:24:33,880 Speaker 1: you know, how you have rated things as well, and 1608 01:24:33,920 --> 01:24:36,880 Speaker 1: then they can serve you a recommendation of what you 1609 01:24:36,960 --> 01:24:38,240 Speaker 1: might want to watch in the future. 1610 01:24:38,479 --> 01:24:41,080 Speaker 2: Right. Now, of course, we're talking about 1611 01:24:41,080 --> 01:24:44,840 Speaker 2: this possibly going multiple ways.
One is using interactive 1612 01:24:45,000 --> 01:24:49,040 Speaker 2: choices in a film to gather data about you, and 1613 01:24:49,120 --> 01:24:54,000 Speaker 2: the other side would be giving, like, specially user-tailored 1614 01:24:54,200 --> 01:24:58,000 Speaker 2: media experiences, which we already get somewhat, of course, on websites. 1615 01:24:58,040 --> 01:25:00,200 Speaker 2: You know, you get websites loading with the ad of 1616 01:25:00,240 --> 01:25:03,439 Speaker 2: stuff you searched for on Amazon and all that. But yeah, 1617 01:25:03,439 --> 01:25:05,600 Speaker 2: I guess we're being forced to consider: what if that 1618 01:25:05,640 --> 01:25:08,800 Speaker 2: starts happening within the movies and TV shows you watch, 1619 01:25:09,360 --> 01:25:13,280 Speaker 2: so you start seeing product placement for specific products that 1620 01:25:13,320 --> 01:25:17,600 Speaker 2: are aimed at you individually within the shows you watch? 1621 01:25:18,000 --> 01:25:21,840 Speaker 1: Right, right. Like, you know, given the 1622 01:25:21,840 --> 01:25:26,840 Speaker 1: cereal scenario, potentially the master of the content, be 1623 01:25:26,920 --> 01:25:31,519 Speaker 1: it Netflix or some other company, hypothetically, they might know 1624 01:25:31,680 --> 01:25:34,400 Speaker 1: that you are, say, more inclined towards, you know, healthy 1625 01:25:34,400 --> 01:25:39,040 Speaker 1: lifestyle choices, and therefore some sort of granola, you know, 1626 01:25:39,200 --> 01:25:42,839 Speaker 1: health-wrapped content would be ideal for you in that scenario. 1627 01:25:43,080 --> 01:25:47,240 Speaker 1: Or they might know that that's not your ideal cereal, 1628 01:25:47,560 --> 01:25:49,400 Speaker 1: or maybe they know that you have children in the 1629 01:25:49,400 --> 01:25:52,280 Speaker 1: house and therefore a children's cereal would be more appropriate.
1630 01:25:52,600 --> 01:25:55,560 Speaker 1: Like, that's the kind of information that they could conceivably 1631 01:25:55,640 --> 01:25:59,280 Speaker 1: have and then feed into the cereal that appears before 1632 01:25:59,320 --> 01:25:59,920 Speaker 1: you on the screen. 1633 01:26:00,360 --> 01:26:03,680 Speaker 2: Now, that would be, of course, something we're more familiar with, 1634 01:26:03,840 --> 01:26:07,200 Speaker 2: just like inserting ads. And you might imagine a character 1635 01:26:07,320 --> 01:26:10,080 Speaker 2: walking past a billboard or something in a movie, like 1636 01:26:10,280 --> 01:26:13,439 Speaker 2: happens all the time now, except that billboard can be, 1637 01:26:13,640 --> 01:26:16,599 Speaker 2: you know, dynamically inserted with new imagery or something. 1638 01:26:16,800 --> 01:26:17,000 Speaker 1: Right. 1639 01:26:17,880 --> 01:26:20,760 Speaker 2: I think things start getting even creepier when you imagine 1640 01:26:21,560 --> 01:26:24,479 Speaker 2: something more like Bandersnatch itself, where there are 1641 01:26:24,680 --> 01:26:27,600 Speaker 2: alternative versions of a film that are specially tailored to you, 1642 01:26:27,720 --> 01:26:32,519 Speaker 2: that have different narrative content depending on who's watching.
I mean, so, 1643 01:26:32,600 --> 01:26:34,759 Speaker 2: one thing Robert and I were talking about briefly before 1644 01:26:34,760 --> 01:26:37,680 Speaker 2: we came in here is the idea that, you know, 1645 01:26:37,720 --> 01:26:41,759 Speaker 2: we know that movies can embody values, of course. 1646 01:26:41,840 --> 01:26:44,960 Speaker 2: You know, like, sometimes the values of a filmmaker 1647 01:26:45,160 --> 01:26:48,680 Speaker 2: or creator come through in what happens in a story, and 1648 01:26:48,760 --> 01:26:52,599 Speaker 2: then other times there are sort of like cheap attempts 1649 01:26:52,640 --> 01:26:55,800 Speaker 2: to display values, what would often be called, like, pandering, right? 1650 01:26:55,880 --> 01:26:59,360 Speaker 2: You know, like, you know, cheap appeals to patriotism or 1651 01:26:59,400 --> 01:27:01,120 Speaker 2: something like that in a movie. 1652 01:27:01,520 --> 01:27:03,280 Speaker 1: Or, I don't know, I guess you could make an 1653 01:27:03,320 --> 01:27:06,599 Speaker 1: argument for awards-season Academy Awards bait as well. 1654 01:27:06,479 --> 01:27:09,719 Speaker 2: Right, yeah, sure, you know, just like sort of cheap 1655 01:27:09,720 --> 01:27:15,720 Speaker 2: attempts to exploit the specific desires or value interests of 1656 01:27:15,920 --> 01:27:17,400 Speaker 2: a specific target audience. 1657 01:27:17,640 --> 01:27:21,400 Speaker 4: Right. And so you can imagine, 1658 01:27:21,000 --> 01:27:23,800 Speaker 2: okay, well, now if a movie is made and it 1659 01:27:23,880 --> 01:27:27,920 Speaker 2: wants to pander, it needs to at least make a choice, right? 1660 01:27:28,160 --> 01:27:30,439 Speaker 2: It's hard to pander to everybody at the same time.
1661 01:27:31,520 --> 01:27:35,559 Speaker 2: But you can imagine, okay, what if somebody just starts 1662 01:27:35,560 --> 01:27:38,240 Speaker 2: making more like a Bandersnatch kind of thing, where maybe 1663 01:27:38,240 --> 01:27:40,320 Speaker 2: you don't make the choices; the choices are made for 1664 01:27:40,400 --> 01:27:43,160 Speaker 2: you based on what is known about your user profile. 1665 01:27:43,720 --> 01:27:46,920 Speaker 2: And so what I was imagining beforehand was you could 1666 01:27:46,920 --> 01:27:49,840 Speaker 2: have different versions of the movie Independence Day. 1667 01:27:49,880 --> 01:27:52,880 Speaker 4: You remember that speech Bill Pullman gives before they all 1668 01:27:52,880 --> 01:27:55,439 Speaker 4: get in the planes and go fly off and fight? 1669 01:27:55,720 --> 01:27:57,160 Speaker 1: Oh yes. 1670 01:27:57,000 --> 01:27:59,439 Speaker 2: So it's this rousing moment where Bill Pullman gives this kind of 1671 01:27:59,600 --> 01:28:03,400 Speaker 2: innocuous speech that could appeal to basically anybody. But you 1672 01:28:03,439 --> 01:28:07,280 Speaker 2: can make that speech a much more tailored, specific interest 1673 01:28:07,320 --> 01:28:10,120 Speaker 2: group pandering kind of thing, where you could have one 1674 01:28:10,200 --> 01:28:13,080 Speaker 2: version of the film that plays for somebody that's 1675 01:28:13,160 --> 01:28:15,840 Speaker 2: very inclusive. He gives a speech; he's like, humans will 1676 01:28:15,920 --> 01:28:18,320 Speaker 2: join arms together around the world. There will be no 1677 01:28:18,439 --> 01:28:21,120 Speaker 2: more nations and borders and creeds and all, you know; 1678 01:28:21,320 --> 01:28:23,519 Speaker 2: we all unite as one and stand 1679 01:28:23,880 --> 01:28:26,920 Speaker 2: as brothers and sisters against this.
Or you could have 1680 01:28:26,960 --> 01:28:29,880 Speaker 2: a version where he gives a speech about American exceptionalism 1681 01:28:29,920 --> 01:28:32,000 Speaker 2: and how we're the first and we stand up and 1682 01:28:32,040 --> 01:28:34,439 Speaker 2: fight when no one else will. Or, you know, 1683 01:28:34,439 --> 01:28:36,599 Speaker 2: you can imagine a million versions of this 1684 01:28:36,800 --> 01:28:40,120 Speaker 2: depending on what kind of user they think you are, 1685 01:28:40,360 --> 01:28:41,440 Speaker 2: whoever is watching. 1686 01:28:41,439 --> 01:28:45,400 Speaker 1: Right. I mean, that kind of personality profile 1687 01:28:46,000 --> 01:28:49,519 Speaker 1: or worldview profile would be pretty easy to acquire. I mean, 1688 01:28:49,560 --> 01:28:54,280 Speaker 1: basically, websites like Facebook have that information. Like, sure, they're 1689 01:28:54,320 --> 01:28:59,080 Speaker 1: not feeding you Independence Day with a tailored speech in it, 1690 01:28:59,120 --> 01:29:02,280 Speaker 1: but they are giving you a feed that 1691 01:29:02,840 --> 01:29:05,640 Speaker 1: reflects your worldviews and values. 1692 01:29:05,520 --> 01:29:09,120 Speaker 2: And people are very invested in, like, the values of 1693 01:29:09,120 --> 01:29:12,240 Speaker 2: what media they consume these days. I can imagine it 1694 01:29:12,360 --> 01:29:16,760 Speaker 2: being judged a very profitable enterprise by some studios to say, well, 1695 01:29:16,840 --> 01:29:19,320 Speaker 2: let's just cover all the bases. You know, we'll have 1696 01:29:19,400 --> 01:29:21,720 Speaker 2: way less trouble if we make, you know, 1697 01:29:21,840 --> 01:29:24,080 Speaker 2: a version A of the movie for you and a 1698 01:29:24,200 --> 01:29:26,720 Speaker 2: version B of the movie for you. It doesn't have 1699 01:29:26,840 --> 01:29:29,320 Speaker 2: to be a coherent vision or picture of the world.
1700 01:29:29,560 --> 01:29:31,720 Speaker 1: Yeah, this is, you know. I can't help but think 1701 01:29:31,760 --> 01:29:34,200 Speaker 1: of past films. For instance, we've talked about Conan 1702 01:29:34,240 --> 01:29:37,160 Speaker 1: the Barbarian on the show in the past. That 1703 01:29:37,320 --> 01:29:39,760 Speaker 1: is a film that has a very particular 1704 01:29:39,880 --> 01:29:43,439 Speaker 1: view of what strength means and, you know, 1705 01:29:43,520 --> 01:29:48,040 Speaker 1: how power works, et cetera. And it's not everybody's political 1706 01:29:48,320 --> 01:29:50,920 Speaker 1: or philosophical cup of tea. Sure, I mean, 1707 01:29:51,080 --> 01:29:53,439 Speaker 1: I think you can enjoy that film without focusing on 1708 01:29:53,479 --> 01:29:58,000 Speaker 1: all of that. But still, it's definitely there. And that's 1709 01:29:58,040 --> 01:29:59,800 Speaker 1: not a film, I mean, especially at the time it 1710 01:29:59,840 --> 01:30:02,440 Speaker 1: came out, it's not a film where you would necessarily 1711 01:30:02,479 --> 01:30:05,439 Speaker 1: ask for an alternate version of it. But again, it's 1712 01:30:05,560 --> 01:30:07,360 Speaker 1: very clear in what it's saying. But then you have 1713 01:30:07,400 --> 01:30:10,519 Speaker 1: films like, say, Patton. Patton is often brought up as an 1714 01:30:10,600 --> 01:30:13,720 Speaker 1: example of a film that meant one thing to one 1715 01:30:13,800 --> 01:30:16,439 Speaker 1: part of a divided America and another to the other 1716 01:30:16,479 --> 01:30:19,800 Speaker 1: part of a divided America without having to have like 1717 01:30:19,840 --> 01:30:20,960 Speaker 1: an A/B version. 1718 01:30:21,320 --> 01:30:22,840 Speaker 2: Right, Yeah, I think you could say that about a 1719 01:30:22,840 --> 01:30:25,519 Speaker 2: lot of like war movies, especially.
I think that might 1720 01:30:25,560 --> 01:30:28,400 Speaker 2: be sort of true of Platoon, right, yeah. Is that 1721 01:30:28,520 --> 01:30:29,520 Speaker 2: like an anti-war 1722 01:30:29,200 --> 01:30:31,040 Speaker 4: movie or a patriotic movie? 1723 01:30:31,080 --> 01:30:32,920 Speaker 2: You know, you sort of have some elements of each. 1724 01:30:33,080 --> 01:30:34,960 Speaker 2: You can latch onto what you want to see there. 1725 01:30:36,120 --> 01:30:38,800 Speaker 2: But yeah, I don't know. I'm somewhat disturbed by the 1726 01:30:38,840 --> 01:30:42,880 Speaker 2: idea of like media filling up with these like 1727 01:30:43,000 --> 01:30:46,160 Speaker 2: personally tailored options that are designed to make a sort 1728 01:30:46,160 --> 01:30:51,759 Speaker 2: of like generic media template individually palatable to the user, 1729 01:30:52,160 --> 01:30:54,640 Speaker 2: as opposed to standing for something on its own and 1730 01:30:54,680 --> 01:30:55,920 Speaker 2: allowing you to judge it. 1731 01:30:56,080 --> 01:30:59,920 Speaker 1: Yeah, or having some level of ambiguity. Like, does 1732 01:31:00,040 --> 01:31:04,400 Speaker 1: the modern audience, and like the near future audience, 1733 01:31:04,479 --> 01:31:07,400 Speaker 1: do they want ambiguity in their work, or do they 1734 01:31:07,439 --> 01:31:12,280 Speaker 1: want like a clear-cut view, one that expresses the 1735 01:31:12,320 --> 01:31:14,760 Speaker 1: clear-cut values not only of the film, but of like 1736 01:31:14,880 --> 01:31:18,040 Speaker 1: the creator or creators behind it? Like, you know, 1737 01:31:18,320 --> 01:31:21,280 Speaker 1: is there an increased hunger for that?
And 1738 01:31:21,960 --> 01:31:24,760 Speaker 1: if that is the case, you could easily see a 1739 01:31:24,800 --> 01:31:27,879 Speaker 1: way of worming around that by taking this A/B/C approach 1740 01:31:28,000 --> 01:31:31,880 Speaker 1: to film creation, because then nobody can say, well, I 1741 01:31:32,000 --> 01:31:33,600 Speaker 1: like the character of Conan the Barbarian, but I 1742 01:31:33,680 --> 01:31:38,600 Speaker 1: think your view is pro-totalitarianism and, you know, I 1743 01:31:38,640 --> 01:31:43,080 Speaker 1: don't know, celebrates toxic masculinity, or whatever the critique might be. 1744 01:31:43,479 --> 01:31:46,080 Speaker 1: And then they could say, well, that's all well 1745 01:31:46,120 --> 01:31:48,360 Speaker 1: and good, but you're only talking about one version. 1746 01:31:48,680 --> 01:31:50,280 Speaker 1: If you watch Conan the Barbarian, 1747 01:31:51,000 --> 01:31:51,200 Speaker 2: you know, 1748 01:31:51,240 --> 01:31:56,640 Speaker 1: on the twenty twenty-eight relaunch of the platform, then you 1749 01:31:56,680 --> 01:31:59,360 Speaker 1: will get what is tailored to you. Its treatment of 1750 01:31:59,400 --> 01:32:03,679 Speaker 1: masculinity and power will be exactly what you want to see. 1751 01:32:04,320 --> 01:32:07,080 Speaker 1: And I mean that opens the door to just a 1752 01:32:07,080 --> 01:32:10,040 Speaker 1: big question of like what art is and what does 1753 01:32:10,040 --> 01:32:13,599 Speaker 1: that do to, you know, the role of these 1754 01:32:13,680 --> 01:32:15,639 Speaker 1: narratives in our culture. 1755 01:32:15,920 --> 01:32:20,559 Speaker 2: I remember many years ago how much of like the 1756 01:32:20,600 --> 01:32:23,759 Speaker 2: new Internet and the new media landscape was being sold 1757 01:32:23,840 --> 01:32:27,240 Speaker 2: to us so often on the selling point 1758 01:32:27,240 --> 01:32:31,559 Speaker 2: of customization and individualization.
You know, get what's right for you, 1759 01:32:31,680 --> 01:32:35,639 Speaker 2: get an experience that's personally tailored for you. And somehow 1760 01:32:35,720 --> 01:32:38,320 Speaker 2: I just feel like we were not able to anticipate 1761 01:32:38,479 --> 01:32:41,400 Speaker 2: how scary and messed up that would feel when it 1762 01:32:41,439 --> 01:32:42,320 Speaker 2: actually happened. 1763 01:32:42,479 --> 01:32:44,920 Speaker 1: Yeah. I'd like to come back to Bandersnatch. The 1764 01:32:44,960 --> 01:32:48,120 Speaker 1: first time I watched it, I think it probably was 1765 01:32:48,200 --> 01:32:52,120 Speaker 1: over two hours that I spent questing after the happy ending. 1766 01:32:52,200 --> 01:32:54,360 Speaker 1: And I got it, and I have to admit I 1767 01:32:54,360 --> 01:32:57,840 Speaker 1: felt a little empty when I reached it. The second time, 1768 01:32:58,080 --> 01:33:01,639 Speaker 1: I tried to make more dramatic choices, make 1769 01:33:01,680 --> 01:33:03,280 Speaker 1: a choice here and there that was just different from 1770 01:33:03,280 --> 01:33:06,200 Speaker 1: what I did the first time. I got a bleak ending, 1771 01:33:06,640 --> 01:33:13,360 Speaker 1: but it felt more authentic. So yeah, it's interesting to 1772 01:33:13,400 --> 01:33:18,800 Speaker 1: think about how choice potentially impacts our appreciation of a 1773 01:33:18,840 --> 01:33:22,640 Speaker 1: work like this, especially if we're talking about increasingly interactive 1774 01:33:22,760 --> 01:33:25,240 Speaker 1: work in a hypothetical future. 1775 01:33:25,600 --> 01:33:27,320 Speaker 2: Had to find a good bleak note to end on. 1776 01:33:27,439 --> 01:33:28,120 Speaker 2: I think that's it. 1777 01:33:28,320 --> 01:33:31,000 Speaker 1: Yeah, yeah, I mean, if we're talking about Black Mirror, 1778 01:33:31,040 --> 01:33:34,479 Speaker 1: that's where we have to leave it.
All right. Well, 1779 01:33:34,479 --> 01:33:37,160 Speaker 1: we covered a lot of ground in there. I imagine 1780 01:33:37,280 --> 01:33:40,519 Speaker 1: listeners will want to chime in, certainly on Bandersnatch if 1781 01:33:40,520 --> 01:33:42,559 Speaker 1: they have experienced it. I'd love to hear from anyone. 1782 01:33:42,600 --> 01:33:44,919 Speaker 1: Like, how much time did you spend on Bandersnatch? 1783 01:33:44,960 --> 01:33:46,720 Speaker 1: How many viewings have you given it? Did you do like 1784 01:33:46,800 --> 01:33:50,040 Speaker 1: Joe and go in and try and find every golden 1785 01:33:50,040 --> 01:33:50,599 Speaker 1: Easter egg? 1786 01:33:50,720 --> 01:33:52,160 Speaker 4: I didn't get all of them, but I got a 1787 01:33:52,160 --> 01:33:52,599 Speaker 4: lot of them. 1788 01:33:52,760 --> 01:33:54,160 Speaker 1: Or did you do like me? Did you sort of 1789 01:33:54,200 --> 01:33:56,240 Speaker 1: go through it once and maybe go through a second time, 1790 01:33:56,280 --> 01:33:58,400 Speaker 1: and maybe you haven't seen or haven't read about the 1791 01:33:58,439 --> 01:34:02,240 Speaker 1: other endings? And of course, free will: you all have 1792 01:34:02,360 --> 01:34:03,840 Speaker 1: it, or maybe you all don't have it but you 1793 01:34:03,840 --> 01:34:06,160 Speaker 1: think you have it. However you want to look 1794 01:34:06,160 --> 01:34:09,000 Speaker 1: at it, you all have some thoughts about free will. 1795 01:34:09,040 --> 01:34:11,240 Speaker 1: You all have some experience to share about this, and 1796 01:34:11,280 --> 01:34:13,719 Speaker 1: we would love to hear from you. In the meantime, 1797 01:34:13,760 --> 01:34:15,240 Speaker 1: if you want to check out more Stuff to Blow 1798 01:34:15,240 --> 01:34:18,000 Speaker 1: Your Mind, you can find us anywhere you get your podcasts.
1799 01:34:18,200 --> 01:34:20,040 Speaker 1: You can certainly go to Stuff to Blow Your Mind 1800 01:34:20,080 --> 01:34:22,200 Speaker 1: dot com, and that will redirect you to a place 1801 01:34:22,200 --> 01:34:25,240 Speaker 1: where you can find the episodes. And wherever you get 1802 01:34:25,240 --> 01:34:26,960 Speaker 1: the show, we just have to ask that you support 1803 01:34:27,040 --> 01:34:30,360 Speaker 1: us by rating and reviewing and subscribing. And don't forget, 1804 01:34:30,400 --> 01:34:33,879 Speaker 1: we have another show out there titled Invention, and Invention 1805 01:34:34,320 --> 01:34:37,400 Speaker 1: covers human technohistory, one invention at a time. 1806 01:34:37,960 --> 01:34:40,880 Speaker 2: Huge thanks as always to our excellent audio producer Seth 1807 01:34:40,960 --> 01:34:43,840 Speaker 2: Nicholas Johnson, who is doing a heroic quick turnaround on 1808 01:34:43,880 --> 01:34:48,040 Speaker 2: today's episode. So praise him, everyone, praise him. If you 1809 01:34:48,080 --> 01:34:50,639 Speaker 2: would like to get in touch with us with notes 1810 01:34:50,680 --> 01:34:53,080 Speaker 2: on this episode or any other, to suggest a topic 1811 01:34:53,120 --> 01:34:55,559 Speaker 2: for the future, or just to say hi, you can email 1812 01:34:55,680 --> 01:35:05,879 Speaker 2: us at contact at Stuff to Blow Your Mind dot com. 1813 01:35:06,040 --> 01:35:08,160 Speaker 3: Stuff to Blow Your Mind is a production of iHeartRadio. 1814 01:35:08,240 --> 01:35:10,920 Speaker 3: How Stuff Works. For more podcasts from iHeartRadio, visit 1815 01:35:10,920 --> 01:35:13,840 Speaker 3: the iHeartRadio app, Apple Podcasts, or wherever you listen to 1816 01:35:13,880 --> 01:35:27,960 Speaker 3: your favorite shows.