Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and it is not Saturday, but today we're bringing you an episode from the vault. We just had a little bit of time off for the holidays, and to get us through that time we're playing some older episodes. But this was a really good one, and I'm glad we're revisiting it. Or, I don't know if I should say that myself, that it was really good, but I remember it being a lot of fun. This was from January of twenty nineteen, and it's the one we did about Bandersnatch. Yeah, this was a lot of fun to do. This of course concerns the Black Mirror choose-your-own-adventure-style episode that you can find on Netflix, titled Bandersnatch. And yeah, it's a discussion of free will and how we make choices, and then also some very episode-specific stuff about our experiences with Bandersnatch, and I believe we got into information collection a bit as well.
Oh yeah, but I remember one thing about it. I assumed one reason you wanted to replay this one is that Bandersnatch came up in our recent episode about how our choices later inform what our preferences become. Yes, that was one reason. The other reason was that I had put most everything else that we recorded in the last year into the schedule for vault episodes, and I needed something that wasn't there, and Bandersnatch had not been assigned yet. Bam. Well, we hope you enjoy. Oh, and just so you know, we will be back with more brand-new core episodes of Stuff to Blow Your Mind on Tuesday and Thursday of next week. So never fear, more core Stuff to Blow Your Mind is on the way.

And the Banker, inspired with a courage so new it was a matter for general remark, rushed madly ahead and was lost to their view in his zeal to discover the Snark. But while he was seeking with thimbles and care, a Bandersnatch swiftly drew nigh and grabbed at the Banker, who shrieked in despair, for he knew it was useless to fly. He offered large discount.
He offered a cheque drawn to bearer for seven pounds ten, but the Bandersnatch merely extended its neck and grabbed at the Banker again. Without rest or pause, while those frumious jaws went savagely snapping around, he skipped and he hopped, and he floundered and flopped, till fainting he fell to the ground. The Bandersnatch fled as the others appeared, led on by that fear-stricken yell, and the Bellman remarked, "It is just as I feared," and solemnly tolled on his bell.

Welcome to Stuff to Blow Your Mind, a production of iHeartRadio's HowStuffWorks.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. And Robert, are you feeling more frabjous or more frumious today? I guess more frumious would be my answer. That was, of course, the poem The Hunting of the Snark by Lewis Carroll, but I guess a number of people are probably more familiar with the Bandersnatch from another poem by Lewis Carroll, that being the Jabberwocky. Yes, where the Bandersnatch is just alluded to as another monstrous creature that might be running around the woods.
I love it when a poet first names something in a kind of listicle way, you know, the poem as a listicle, and then later it comes through in another poem with more force. I think that sort of happened with the Demogorgon, right? Yeah, yeah, I think so. And in this case, yeah, the Bandersnatch. There's not a lot really said about it in the writings of Lewis Carroll. Lewis Carroll, by the way, was the pen name of Charles Lutwidge Dodgson, who lived eighteen thirty-two through ninety-eight, and he first introduced the Bandersnatch, again just in a list of creatures that might pop up, in his eighteen seventy-two novel Through the Looking-Glass, in that poem the Jabberwocky. And then it pops up again in this eighteen seventy-four poem that we just read, The Hunting of the Snark. Well, we didn't read the whole of the poem; that was just an excerpt from it. And who's the Banker? The Banker is one of these people who goes on a voyage hunting the Snark.
I think I've read that that poem has been interpreted by some as metaphorical, you know, that it's supposed to be an allegory about the search for human happiness and contentment. But then also I think I've heard it alleged that the poem actually has no allegorical meaning, that it's just kind of silly. Yeah, I mean, in that sense it's kind of enigmatic. And the creature itself is enigmatic, scarcely described, but certainly best avoided at all costs. There's no way to outrun it, no way to escape its intensity. And by the way, frumious is a combination of fuming and furious. Carroll just crammed these two words together to make a nice new adjective for a strange monster. It's a perfectly cromulent word. Very frabjous. And I was wondering, do you need a vorpal sword if you go up against a Bandersnatch, or is that only for the Jabberwock? Well, it certainly worked on the Jabberwock. I don't know about the Bandersnatch. There are no tales of slaying it, are there, at least not in Lewis Carroll's original work? Does the vorpal sword show up as an artifact in D and D?
It does, it certainly does. Yeah, pretty good sword. Oh yeah, very good sword. Now, the name Bandersnatch has been invoked many times over the years in works of fantasy and science fiction. I've seen it pop up as a space slug, another such creature. Sometimes it's just kind of an enigmatic name for, like, a government project or something, because it's a great name. In depictions of Lewis Carroll's work, it has often taken on a mammalian character. Nineteenth-century children's illustrator Peter Newell depicted it as kind of a furry, horned beast that might resemble a cat or maybe a wolf-like creature, and this is a very popular image. And then film adaptations have depicted it as both boar-like and cat-like. The two thousand ten Tim Burton adaptation has a very memorable creature design for the Bandersnatch. The Tim Burton Alice in Wonderland, is that what you're talking about? Yeah, yeah. I've seen the first one; I never ventured that far into late Burton. Well, it had some things going for it.
It had a really good cast, and it had some interesting character designs, I'll say that, and a very monstrous-looking Bandersnatch. Now, just a couple of other interesting tidbits about Lewis Carroll. He was a mathematician; he worked in geometry and new ideas in algebra, logic, machines, ciphers. So between this and other details of his life, there's a lot of Black Mirror to the originator of the Bandersnatch. Also, in Hallucinations, the book by the late Oliver Sacks, Sacks points out that Carroll was known to suffer from classical migraines, and that Caro W. Lippman and others have suggested that his migraine experiences may have contributed to the way he envisioned Through the Looking-Glass and Alice in Wonderland, like the skewing of time and space. Also, you have auditory hallucinations, which are not uncommon in migraines, as well as olfactory hallucinations. I've also seen descriptions of this lifting feeling, this feeling of being moved through space. Yeah, I guess the extension of the lightheadedness that comes on with the aura and all that. Yeah.
You know, there's actually an asteroid named Bandersnatch. Oh, I didn't know this. Yeah, I looked this up. Bandersnatch is a main-belt asteroid, so it's out beyond the orbit of Mars, discovered by Japanese astronomers Yoshisada Shimizu and Takeshi Urata at the Nachi-Katsuura Observatory, and it was named, of course, after the frumious Bandersnatch. Awesome. Now, this is just sort of the introductory material on the Bandersnatch, because for the vast majority of this episode we're going to be talking about what is, I guess, the most recent cinematic invocation of the Bandersnatch, and that is the Black Mirror episode. It's not even an episode; it's a Black Mirror film that came out on Netflix in December of twenty eighteen, so a little over a year ago. And there's a lot to unpack here. I actually didn't watch it until this week. I knew you wanted to do an episode about it, and I was like, okay, I'll finally see what all the fuss is about. I was very impressed. Yeah, I'll certainly get more into my thoughts on it later.
I was impressed with it when it came out, and then, since we were going to do the episode, I rewatched it earlier this week for the first time since its original release, and I have to say, I thought it held up. I even got a different ending, and a different dead end at one point, than I had encountered previously. So it was like, well, every time you watch a film that you like again, you find new things, but in this case you can actually get a different ending. Yeah. Now, we're going to be exploring today some of the science and the ideas and philosophy that are alluded to in Bandersnatch. But in doing so, of course, this will involve some spoilers for this strange film. I would say there are a couple of places we're not going to go; we're not going to go through and explore every possible ending or anything like that. But if you're in the case where you haven't seen it yet and you don't want anything at all spoiled, you should probably stop here and go watch it first before you listen to the rest of the episode.
But if you've already seen it, or you haven't seen it and you don't care about minor spoilers that don't go all the way to all the endings, then, you know, forge ahead with us, please. However, some of you may be asking the question: well, what are you talking about? What is Black Mirror? So we should probably take a few minutes to just refresh you on what this is. Is "refresh" the word? Would that be the word? We should shock you to the bone. All right. So Black Mirror is, in essence, a sci-fi anthology television series, in the same vein as The Twilight Zone, The Outer Limits, these various shows we've discussed in the past. I might call it, pretty often, techno horror. Not every episode is the same, but there's essentially no horror movie as scary as the scariest episodes of Black Mirror, especially the ones that manage to take fairly plausible technological scenarios and follow them to their logical conclusions. I mean, it's a show that's very good at conjuring up the worst possible nightmares of, like, the intersection of capitalism and technology.
Yeah, definitely. Episodes tend to have a technological swing to the story, and they tend to deal on some level with contemporary anxiety about current technology and emerging technology. What are these technologies doing to our lives? What may they do to our lives in the future? And sometimes they take varying speculative leaps there, of course, since it is science fiction. But I would say you typically leave an episode of Black Mirror feeling a little worse about the world. I know that Netflix, their current masters, are very into the whole binge model, but I personally find it very difficult to binge Black Mirror, in part because each episode, of course, is a self-contained story with its own characters and plot, etcetera. But then also, they're often like a punch in the gut, and I just can't sit there and take one punch after the other. Apart from one very sweet, very nice episode, there's essentially nothing that makes me feel as bad as Black Mirror.
Well, you know, I was thinking about this, because there are definitely some very bleak episodes of Black Mirror that I admire but that I will never watch again. But then I look back at some of my favorite episodes. My favorite episodes are probably San Junipero, USS Callister, and Metalhead. One of those is still pretty bleak, but two of those are actually pretty upbeat, probably the most upbeat episodes of the show. And maybe that's the reason I would come back to them, because if I'm going to double dip, I want to double dip for optimism's sake. Now, we can't talk about Black Mirror without talking about the creator behind it, the main creative individual behind it, and that is Charlie Brooker, a British writer and humorist. The earliest thing of his that I was familiar with was his work on Chris Morris's excellent news satire Brass Eye.
And then he also created a pretty great zombie movie titled Dead Set in two thousand eight, in which the zombie apocalypse breaks out in and around a Big Brother-style reality TV production. I feel like two thousand seven, two thousand eight was, like, peak zombie satire movie. Yeah. And the thing about this one, though: the premise sounds like a comedy, and so I acquired a copy of it thinking, oh, this is a comedy, and this is the guy that worked on Brass Eye, this is going to be hilarious. And it is not a straight-up comedy; it is a pretty terrifying film. But you see shades of that in Black Mirror. Sometimes there is a premise that could sound like a joke, but then it is taken and considered with such intensity that it works. Yeah, what if, like, a major tech company used eye-tracking software to make sure you were always watching their ads, and if you didn't watch them, they would ring sirens in your brain and deduct money from your bank account until you started watching the ads.
Again, it sounds like a joke, but if you just take that seriously for a bit and explore it, it becomes like a nightmare of techno sci-fi. Absolutely. Now, Black Mirror began in twenty eleven; two seasons and a holiday special came out and ran on Channel 4 in the UK. Then Netflix started carrying it, and Netflix became the owner, or the main publisher of the program, however you want to look at it, starting with season three in October of twenty sixteen. All in all, it has thus far run five seasons, twenty-one episodes, and that's not counting the film Bandersnatch, which again came out in December of twenty eighteen. These bits of publisher information will actually become relevant later on as we discuss the story, because at the end we definitely get into some scenarios where we have to consider the fact that Netflix is the business daddy behind Black Mirror. So Bandersnatch, the Black Mirror film, is actually directed by David Slade, and I was like, where do I know that name from?
He's done several things, but one 268 00:15:05,640 --> 00:15:09,160 Speaker 1: of them was he did one of the Twilight movies. Yes, 269 00:15:09,200 --> 00:15:11,120 Speaker 1: and I've seen that, that particular one. It was the 270 00:15:11,200 --> 00:15:14,280 Speaker 1: second one, I think, and it's that's a I'm not 271 00:15:14,360 --> 00:15:17,320 Speaker 1: a huge Twilight fan, but that is a very watchable 272 00:15:17,320 --> 00:15:19,520 Speaker 1: Twilight movie and it has a great soundtrack. It's got 273 00:15:19,520 --> 00:15:22,880 Speaker 1: Tom York on it. Oh yeah. He also directed the 274 00:15:22,960 --> 00:15:25,760 Speaker 1: Black Mirror episode metal Head that I alluded to earlier, 275 00:15:26,000 --> 00:15:28,560 Speaker 1: and there are some callbacks to metal Head in the 276 00:15:28,600 --> 00:15:31,080 Speaker 1: band or Snatch episode. Now, I guess one thing we 277 00:15:31,120 --> 00:15:33,840 Speaker 1: haven't gotten fully into so far is the fact that 278 00:15:34,480 --> 00:15:38,480 Speaker 1: the Black Mirror movie Bandersnatch is it's an interactive movie, 279 00:15:38,560 --> 00:15:41,080 Speaker 1: which makes it very unique. Right, This was the big 280 00:15:41,120 --> 00:15:43,840 Speaker 1: selling point on it and and and indeed is one 281 00:15:43,840 --> 00:15:46,320 Speaker 1: of the I mean, it's it's a key part of 282 00:15:46,360 --> 00:15:48,240 Speaker 1: the way you consume it, but it is also very 283 00:15:48,280 --> 00:15:52,600 Speaker 1: important thematically, like you know, true to form. The creators 284 00:15:52,640 --> 00:15:56,160 Speaker 1: here really thought long and hard about how to utilize 285 00:15:56,640 --> 00:16:01,480 Speaker 1: um an interactive system, uh within the work and make 286 00:16:01,520 --> 00:16:04,720 Speaker 1: the work comment on that system as well. 
The interactive 287 00:16:04,720 --> 00:16:07,280 Speaker 1: system being Netflix, like the fact that the user can 288 00:16:07,280 --> 00:16:09,880 Speaker 1: make inputs on the movie. Yeah, basically that's what it 289 00:16:09,880 --> 00:16:12,880 Speaker 1: will amounts to. Is you you start off watching it, it 290 00:16:12,800 --> 00:16:17,760 Speaker 1: it seems like a normal Netflix presentation, but then you're 291 00:16:17,800 --> 00:16:22,200 Speaker 1: in my case, my Xbox one controller would suddenly vibrate 292 00:16:22,680 --> 00:16:25,440 Speaker 1: and then this little the screen at the bottom of 293 00:16:25,480 --> 00:16:28,400 Speaker 1: the screen. You're suddenly presented with two choices and a 294 00:16:28,480 --> 00:16:31,760 Speaker 1: timer and you have to choose, Uh, you know what 295 00:16:31,920 --> 00:16:33,760 Speaker 1: is going to happen with the character is going to do, 296 00:16:34,000 --> 00:16:37,520 Speaker 1: et cetera. Now, this is you know, when you just 297 00:16:37,680 --> 00:16:40,360 Speaker 1: watching checking out the film, you you might not realize 298 00:16:40,400 --> 00:16:44,200 Speaker 1: how much work goes into this, but it took apparently 299 00:16:44,320 --> 00:16:46,960 Speaker 1: a huge amount of work to shoot all these various 300 00:16:46,960 --> 00:16:51,080 Speaker 1: branching paths, because it becomes a you know this this, 301 00:16:51,080 --> 00:16:54,800 Speaker 1: this tree, this branching system of possibilities when you start 302 00:16:55,240 --> 00:17:00,320 Speaker 1: presenting the user with these interactive choices. For instance, the 303 00:17:00,320 --> 00:17:02,600 Speaker 1: the previously the longest episode of Black Mirror was an 304 00:17:02,600 --> 00:17:06,120 Speaker 1: episode called Hated in the Nation, which was eighty nine 305 00:17:06,119 --> 00:17:09,600 Speaker 1: minutes long. 
That's that's future length, right, what ninety minutes 306 00:17:09,680 --> 00:17:14,000 Speaker 1: is usually the length you shoot for with the sure film. Well, 307 00:17:14,359 --> 00:17:17,800 Speaker 1: when you're watching the Band or Snatch, depending on your choices, 308 00:17:18,160 --> 00:17:21,680 Speaker 1: the film can run anywhere between ninety minutes and two 309 00:17:21,680 --> 00:17:25,439 Speaker 1: and a half hours. And in order to make this work, 310 00:17:26,160 --> 00:17:30,200 Speaker 1: as pointed out by a Jackie Strausse in the Hollywood Reporter, uh, 311 00:17:30,240 --> 00:17:32,639 Speaker 1: this means they had to shoot like five hours of 312 00:17:32,680 --> 00:17:35,480 Speaker 1: footage so that they could actually cover all of these 313 00:17:35,560 --> 00:17:39,000 Speaker 1: various choices. And you may watch it like, for instance, 314 00:17:39,280 --> 00:17:41,359 Speaker 1: the first time I watched it, there were plenty of 315 00:17:41,400 --> 00:17:43,960 Speaker 1: scenes I did not see, And then when I watched 316 00:17:43,960 --> 00:17:45,960 Speaker 1: it again, there were films there were scenes that I 317 00:17:45,960 --> 00:17:47,879 Speaker 1: saw the first time that I did not see, and 318 00:17:47,960 --> 00:17:50,560 Speaker 1: I got an entirely different ending that I never didn't 319 00:17:50,560 --> 00:17:53,119 Speaker 1: even know about. And then there, of course various Easter 320 00:17:53,200 --> 00:17:56,679 Speaker 1: eggs and even I've read quote golden Easter eggs that 321 00:17:56,720 --> 00:17:59,720 Speaker 1: are spread throughout things that most viewers will not find. 
322 00:17:59,880 --> 00:18:03,000 Speaker 1: Must they spend a great deal of time going through 323 00:18:03,080 --> 00:18:06,439 Speaker 1: and going back through and backing up, etcetera, with this 324 00:18:06,520 --> 00:18:11,040 Speaker 1: interactive piece encouraging unhealthy obsessive behavior exactly, I mean it 325 00:18:11,119 --> 00:18:13,199 Speaker 1: is it is black mirror. Now, as far as the 326 00:18:13,280 --> 00:18:17,120 Speaker 1: choices as you make in the in band or snatch um, 327 00:18:18,119 --> 00:18:22,080 Speaker 1: you start off making very small choices. It seemed very consequential. 328 00:18:22,280 --> 00:18:26,840 Speaker 1: For instance, uh, choosing the main characters breakfast cereal he's 329 00:18:26,840 --> 00:18:30,160 Speaker 1: presented his father shows two boxes and you decide which 330 00:18:30,200 --> 00:18:31,960 Speaker 1: one he's going to have for breakfast. Yeah. I think 331 00:18:31,960 --> 00:18:34,680 Speaker 1: it was what like frosted flakes or sugar puffs or something. 332 00:18:34,720 --> 00:18:37,119 Speaker 1: I remember what I realized after I made that choice 333 00:18:37,720 --> 00:18:39,879 Speaker 1: was I was like, oh, no, I think I chose 334 00:18:39,920 --> 00:18:43,200 Speaker 1: the brand that I was more familiar with. Ah, well, 335 00:18:43,320 --> 00:18:45,320 Speaker 1: we'll come back to that later. That that is an 336 00:18:45,320 --> 00:18:47,919 Speaker 1: important point that will come back to later on in 337 00:18:47,920 --> 00:18:50,920 Speaker 1: the episode. But but at first, it's it's what kind 338 00:18:50,920 --> 00:18:52,880 Speaker 1: of serial does he want all right, it doesn't doesn't 339 00:18:52,880 --> 00:18:54,720 Speaker 1: seem to matter much, and and it also gives you 340 00:18:54,760 --> 00:18:58,639 Speaker 1: a chance to try out the technology low stakes. But uh. 
And then also later on you choose what music he's going to listen to when he's on the bus, which is kind of fun. I think you get to choose between a rhythmic song and something else, I can't remember the other one. I think it's Thompson Twins. That's it. Yeah. And then later on he's in a record store and you get to choose which record he's gonna buy. And this is also pretty great, because one of the choices is Tangerine Dream's excellent nineteen seventy four album Phaedra, which is incredible. Absolutely. In fact, I was listening to that again this morning while I was doing some prep for this episode. Yeah, it's excellent stuff. However, this time around, I forced myself to choose the other album instead, the other album being Isao Tomita's The Bermuda Triangle, which, very strange, yeah, very strange work, but very, very good. I was really not that familiar with this artist or this work, which is apparently kind of hard to come by on streaming unless you just find like a YouTube full album rip. But yeah, this is just a taste of the soundtrack.
Bandersnatch has a wonderful soundtrack, including not only these artists but also Depeche Mode, Laurie Anderson. Great stuff. But let's come back to the choices you make in this interactive system. So again, they start off seeming largely inconsequential. They start off seeming a little bit fun, you know, it's just surface level stuff, like what's his breakfast cereal, what's his musical choice? But then they become increasingly high stakes and even nerve racking to decide on. Like suddenly, when your controller vibrates and you're presented with this choice, sometimes you feel this dread, because sometimes neither choice is all that great. Sometimes the choices are kind of horrible. And there's at least one point where you have no choice. There's something to select, but there's no alternative selection, and that feels maddening as well. And you have a timer, you have like, what, I think it's ten seconds to choose something, and if you don't choose, Netflix chooses for you. But Netflix reports that viewers actively made choices when they watched Bandersnatch.
Now, in my experience, it wasn't that they chose for you at random. It was that whichever one of the two options was highlighted, and it was like a, you know, on off toggle, like you couldn't select neither one. You were just selecting one or the other, and you could go with it, or you could not go with it, and whichever one you had highlighted would just proceed. So there's this horrible sense of helplessness that that imposes on you as the viewer. I have not seen it. Sometimes I'll go into a restaurant or a bar and they'll have Netflix on playing some show. I've never seen them showing Bandersnatch, probably for this reason, letting all the bar patrons vote to decide. Well, yeah, or just going crazy, like, why is nobody clicking the button? Why is nobody interacting with this? Don't let that choice go through.
So eventually, as you interact with Bandersnatch, a warping of time occurs as you find yourself coming back around to past choices like a wanderer lost in a maze. And of course, befitting of a maze, there is a sort of minotaur in all of this. There is the Bandersnatch. Wait, is it the Bandersnatch, or is it the demon Pax? It is indeed Pax, yeah, but it is also the Bandersnatch. Like, its design is roughly based on that illustration of the Bandersnatch we talked about. Okay, cool. All right, we're gonna take a quick break, but when we come back we will get into the themes of Bandersnatch and into the nature of choice and free will. Alright, we're back. So there are a lot of interesting ideas, cool themes, historical tidbits that are thrown together. Well, not thrown together, stitched together, reassembled in Bandersnatch, that give it its unique feel. Here's just a list of some of the things. First of all, video game design circa nineteen eighty four, because that is the setting. Yeah, it takes place in the eighties, with eighties music, eighties fashion and all that stuff.
But they're also programming, you know, old school adventure games for like the Commodore sixty four and stuff. Yeah. Another huge part of it are choose your own adventure books, which are, you know, directly referenced. And then there is a book within Bandersnatch titled Bandersnatch that is this enormous tome that we're told is essentially a choose your own adventure type scenario. Do you have any fond memories of choose your own adventure books? Absolutely, I was obsessed with them. I loved them when I was in elementary school, and I would love them despite the fact that, you know, you die in most of the endings. Like, it imposes a kind of horrible paranoid fatalism on a child, I think, where, you know, oh, this is a book about exploring the Arctic, but almost no matter what you do, you get eaten by a polar bear or fall beneath the ice and you can't get out. I guess my young brain was drawn to that kind of thing, though, you know, I had that like morbid obsession with peril and danger and death and all that.
But also, I'm curious, what is so appealing about the choose your own adventure books? Because one thing we should say is that Bandersnatch is not the first interactive film. Previous attempts at interactive films have generally been very unpopular. I think a lot of times people don't actually enjoy the experience of choosing the outcome of a film, and I think there are reasons for that. I mean, for one thing, it's just like hard to make a story where so many different options of how the story could go would all be equally satisfying. I mean, there's a reason that an author writes a story a certain way, right? For instance, one film that we've talked about on the show before, William Castle's Mr. Sardonicus from nineteen sixty one, was marketed as having an interactive element, in that at the end of this you've got to choose the fate for the villain. Would it be, you know, justice or mercy? And the thing is, audiences never chose mercy for this horrible villain. Of course, they always chose justice.
And so there were even accusations that they never even shot the alternate version, like the idea that it was interactive was just the marketing, but there was no actual interactive element. William Castle, I think, claimed otherwise, saying yes, they did shoot the sequence. I do not know personally if that's true or not, if this footage has ever materialized. But what I did read was that generally people point to nineteen sixty seven's Kinoautomat as the first truly interactive film. But even that, I think there are only like four choices that could be made, and this film was also, I think, largely comedic. Okay, well, I mean, I would say that there are many reasons why this format doesn't always work. For some reason, it worked for me as a kid with the choose your own adventure books. I loved those. But I mean, one problem, I think, is that it's hard to make all the narrative branches as good as each other.
But another one is just that, like, for instance, when you finish it, I don't think there was ever a sense where I'm like, okay, that's the ending I got. No, I want the good ending, or I want the robust ending. You go back and do it again. It's more like a video game, where, oh, I don't want the ending where I randomly die. Like, the story of Super Mario is not that he's killed by a mutant turtle three minutes into the game. You know, I mean, that's not an epic tale. So in some ways, I think the choose your own adventure books are sometimes better thought of as like a puzzle to solve than as like a narrative to be experienced. And another big difference, I will say, is that one of the great pleasures of watching a movie or reading a book, or, you know, engaging in any kind of narrative with an author, storyteller and you as the passive audience, is a surrendering of responsibility for what is about to happen in your own mind.
You give up that responsibility, and suddenly, you know, when bad things continue to happen in the story, when characters make disastrous decisions that unfold and increase the peril and heighten the drama, you're not responsible for what's happening. You're just witnessing it. And, you know, that witnessing is very fun. It's peeking through a hole in the wall at what's happening to somebody else. When they make you make decisions, it introduces this horrible tension between what you want to see versus what you think you should do. You know, I think there's this tension whenever, a great example would be in Bandersnatch, I often felt, in a bizarre way, morally compelled to choose the tamer or safer options, where at the same time I felt more interested in seeing the more kind of like dangerous, disastrous options play out. Yeah, this was definitely my experience with my first viewing of Bandersnatch, that, you know, when the decisions start hitting you, like later on they become like, this horrible choice or this horrible choice, and it becomes harder to play this game.
But earlier on there are moments where you're like, are you going to do the sensible thing, or the more rebellious thing, or even the more dangerous thing? And I found myself choosing the safer thing. Like, minor spoiler here, but he's offered the choice between producing his dream game with this company at their offices, with their support, or saying no to them. And so, like, the responsible part of me is like, yes, say yes to this, this is employment, this is gonna be good for you. Like, clearly you're stuck in a weird situation at home. You need to get out of the house, protagonist. And so that's the way I went. But it's ultimately not the best choice, and it kind of dead ends if you take that choice. Well, yeah, it almost kind of gives you a little slap on the wrist for making that choice, you know. So, I don't want to spoil anything, but yeah, there's like a slight shaming of the viewer for choosing the safe option. And this is very early on, so we're not really spoiling anything, I think.
But yeah, I would do that a lot. I would make safe choices. And in fact, it ultimately ended up reminding me a little bit of the Spacing Guild in Dune, who of course used the spice to see into the future to figure out how to navigate the dangers of space, which is helpful if you're navigating the dangers of space. But in life, and in politics, and all these other choices, it's this road to stagnation for the Spacing Guild, because they always make the safe choice. And when we look at the narratives that we love, generally they're not about people making safe choice after safe choice after safe choice. They're about people flying off the handle or making huge mistakes and having to deal with those. And so I think there's a learning curve there with Bandersnatch. And so my second viewing of it, I tried to do more of that. I tried to make choices that I felt were interesting or more dramatic, and that seemed to work really well, and I feel like the product rewards you for doing that.
Yeah. So I think that tension is definitely there with the movies, and I wonder if it's more the case in a movie than in a book, just because a movie is more sensorily visceral. The fact that, you know, it's actually visually presented to you in video and audio makes it harder to just pursue, you know, your sort of lust for drama and weirdness and whatever it is you want to see, as opposed to making the safer choices. I don't recall feeling compelled to make the safer choice the same way with Choose Your Own Adventure books, and that could just be because of, like, the lower sensory salience of books compared to movies. I don't know. Yeah, maybe so. I fondly remember the Choose Your Own Adventure books as well, in part because they had them at the library and I could check them out. But also another series that I fondly remember, the Lone Wolf series. Were you familiar with these? So there are a series of these.
The first one was by Joe Dever and Gary Chalk. And this is, they're like a choose your own adventure series, very much fantasy, Dungeons and Dragons style high fantasy, but there's more of a role playing element to it. So, for instance, when you open the book, it has not only a map of the adventuring world you're taking part in, but there's also an action chart and a combat record, because you're gonna end up having to pencil in your stats as you go through the story, picking spells and so forth. It's more like a one player D&D module. Yeah, exactly. It's like, imagine a choose your own adventure book and a one player D&D module come together into this one little tome. So I fondly remember those. And I might be misremembering here, but I think I did get turned off later on when I reached an artificial dead end in one of them, like there was something broken and I couldn't go back. Yeah. But again, my memory may not be perfect on that.
If you're at all interested in this format, I do highly recommend picking up one of these old battered used copies of the Lone Wolf series. And I think they've republished them again with new artwork, but, I don't know, the classic artwork is exactly the kind of thing I love. The choose your own adventure book that I brought in today for you to look at, Robert, is called You Are a Shark, by Edward Packard. It has a kid turning into a shark. He's like mid animal sequence, but he also looks like he's slipping and sliding as he turns into a shark. It's pretty good. That's pretty brilliant, too, like channeling something that children, especially at that time, would have been familiar with, would have likely done, and giving it this fantastic spin. But the story is essentially the Fingal doppling scene from Overdrawn at the Memory Bank, where he just gets transformed into various different animals. You know, you get turned into an elephant or a seagull or, of course, a shark.
I think I recall one death where you get turned into a squid and you're being chased by something, maybe it is a shark, and you run out of ink to disguise yourself with, and you're doomed. All right. Well, coming back to Bandersnatch, we mentioned the video game aspect, choose your own adventure books. There are a number of other elements and homages in there as well. It deals with mental illness, it deals with LSD, there are allusions to Philip K. Dick. There's mention of alternate timelines. And of course it spends a lot of time contemplating this idea of free will and the potential illusion of choice. Yeah, I think that's the main theme of it, is interrogating the idea of what it means to be in control of one's own actions. Yeah. And the basic plot is as follows. A young programmer named Stefan Butler is obsessed with a choose your own adventure style book titled Bandersnatch that was written by the late, troubled writer Jerome F. Davies, and he really wants to turn this into a computer adventure game, and he's begun work on it on his own.
So he ends up falling in with this video game company called Tuckersoft and meets its lead creative, this programmer named Colin Ritman. And from there it departs through these varying winding paths, reality warping, through madness and sometimes horror. And through all of it, there's also this feeling that there is a minotaur-like monster pursuing you, pursuing our protagonist as well. And this is the Bandersnatch, but more specifically it is titled Pax. Its name is Pax, and we are told it is the Thief of Destiny. Yeah, there's a great moment where the game appears to give you an option to either deny worshiping Pax or submit to worshiping Pax. Yes, this is the game within a game. It made me want to play the game. Yeah, it looked really cool. So, something that made Bandersnatch different from most of the choose your own adventure books that I remember reading, I'm sure there are probably exceptions, but in the classic books I remember reading, the story is written in the second person. The protagonist is an unnamed you.
You know, go down the left hall, you get eaten by a swarm of feral pigs. You go down the right hall, you get turned into a bowl of ice cream by a magic pirate. You know, that you. You explore all the different dooms on offer to you, but it's you. Yeah. Likewise, in the Lone Wolf books, as I recall, you kind of make choices regarding how this character is put together. You have a fair amount of control. It is your character. But Bandersnatch challenges this formula a little bit by making the protagonist a third person character with a name and pre-existing, individualized circumstances. You've got Stefan. Right. But then this is where it starts getting even weirder. So not only is it this definite third person character with their own characteristics and not just a second person protagonist, there are moments where you choose not what Stefan does. That's how it mostly is, you know, what does Stefan pick, what does he listen to, what does he answer when somebody poses a question to him?
It then changes and gives you the option to dictate what happens to him from the outside. The specific example I recall is what messages he believes he is receiving on his computer screen. Now, of course, if you go with the most straightforward interpretation of the story, which is that Stefan is experiencing symptoms of psychosis, in a way, you're still dictating the activity of his brain, but activities of his brain that he as a character does not perceive as coming from himself. They're hallucinations that he believes to be coming from the outside. And, you know, this makes me wonder about the framing of how we should think about hallucinations that are generated internally by the brain but perceived to come from an external source. Are those hallucinations best understood as you or not? Are there processes within your own brain that are, in some legitimate sense, not you, even though they are your brain? They're not anybody else's. Yeah, it's not really the voice of God.
It is 679 00:37:05,360 --> 00:37:08,480 Speaker 1: something coming from inside your brain that you are perhaps 680 00:37:08,920 --> 00:37:12,960 Speaker 1: imagining or interpreting as the voice of God. But 681 00:37:13,200 --> 00:37:16,920 Speaker 1: is "you" more synonymous with your whole brain and everything 682 00:37:16,960 --> 00:37:20,359 Speaker 1: it does, or is "you" more synonymous with the part 683 00:37:20,400 --> 00:37:23,799 Speaker 1: of your brain that you identify as yourself? That's going 684 00:37:23,840 --> 00:37:26,560 Speaker 1: to be very key to some interpretations of the basic 685 00:37:26,719 --> 00:37:30,080 Speaker 1: theme explored in Bandersnatch, right, and they do explore this 686 00:37:30,200 --> 00:37:34,719 Speaker 1: theme amazingly well, I felt. The second time I watched it, 687 00:37:34,920 --> 00:37:38,360 Speaker 1: I found all these additional layers. You know, again, 688 00:37:38,400 --> 00:37:40,640 Speaker 1: I'm tempted to make the best choices for 689 00:37:40,680 --> 00:37:43,480 Speaker 1: the protagonist, or at least there's still that inclination that 690 00:37:43,520 --> 00:37:45,560 Speaker 1: I want to do that. And at one point there's 691 00:37:45,560 --> 00:37:48,640 Speaker 1: this song playing with the lyrics doing what's best for Nigel, 692 00:37:48,719 --> 00:37:51,640 Speaker 1: and it's by XTC or, I don't know, 693 00:37:51,760 --> 00:37:54,200 Speaker 1: I think so.
Yes, I was not familiar with that 694 00:37:54,239 --> 00:37:56,800 Speaker 1: group or this song before, but yeah, it's playing, 695 00:37:56,800 --> 00:37:58,840 Speaker 1: and the whole scene is about like how his father 696 00:37:58,960 --> 00:38:02,000 Speaker 1: is making choices for him, or other times, you know, 697 00:38:02,040 --> 00:38:04,800 Speaker 1: it's the therapist that is giving him advice about 698 00:38:04,800 --> 00:38:08,040 Speaker 1: how to make choices in his life. And 699 00:38:08,080 --> 00:38:10,120 Speaker 1: so you have all these forces that help him 700 00:38:10,160 --> 00:38:13,640 Speaker 1: make his choices or make choices for him. And then 701 00:38:14,000 --> 00:38:16,839 Speaker 1: that's also what we are doing as we interact with 702 00:38:16,880 --> 00:38:19,080 Speaker 1: the product. Well, yes, and in a weird way, it 703 00:38:19,120 --> 00:38:21,880 Speaker 1: kind of brings you back around to this question 704 00:38:21,920 --> 00:38:24,040 Speaker 1: of, wait a minute, is he a third 705 00:38:24,080 --> 00:38:27,640 Speaker 1: person narrator or are you supposed to identify as him? 706 00:38:27,680 --> 00:38:31,120 Speaker 1: So when these choices are in some cases things 707 00:38:31,200 --> 00:38:33,640 Speaker 1: coming to him apparently from the outside, you know, they 708 00:38:33,719 --> 00:38:37,319 Speaker 1: might be messages he's receiving from some kind of otherworldly 709 00:38:37,400 --> 00:38:42,240 Speaker 1: source, or hallucinations. Um, are you still making choices for 710 00:38:42,360 --> 00:38:44,880 Speaker 1: him or not? And it leads back into this 711 00:38:45,040 --> 00:38:49,160 Speaker 1: theme of whether or not you are really in control 712 00:38:49,239 --> 00:38:51,080 Speaker 1: of your own actions, and what does it mean to 713 00:38:51,200 --> 00:38:53,880 Speaker 1: be in control of your own actions?
And with this 714 00:38:54,080 --> 00:38:57,239 Speaker 1: we come to the subject of free will, which is 715 00:38:57,280 --> 00:39:00,279 Speaker 1: a huge topic that we return to time and time 716 00:39:00,280 --> 00:39:02,440 Speaker 1: again on Stuff to Blow Your Mind. And we're not 717 00:39:02,480 --> 00:39:06,440 Speaker 1: going to try to encapsulate everything about that here. 718 00:39:06,480 --> 00:39:08,640 Speaker 1: You know, we've talked about it in the past, we're talking 719 00:39:08,640 --> 00:39:10,040 Speaker 1: about it today, we're going to talk about it in 720 00:39:10,040 --> 00:39:15,040 Speaker 1: the future. But suffice to say, philosophies vary, scientific 721 00:39:15,080 --> 00:39:19,200 Speaker 1: interpretations vary, and then it drags in 722 00:39:19,280 --> 00:39:21,759 Speaker 1: just about everything about the human condition, right, I mean, 723 00:39:22,040 --> 00:39:27,239 Speaker 1: moral responsibility, theological quandaries, etcetera. Yeah, I mean it's a 724 00:39:27,280 --> 00:39:31,359 Speaker 1: problem that it is such a huge topic and that 725 00:39:31,719 --> 00:39:34,719 Speaker 1: almost all discussions about free will that I encounter in 726 00:39:34,760 --> 00:39:38,359 Speaker 1: the wild are an absolute mess. This isn't just my personal take, 727 00:39:38,560 --> 00:39:42,000 Speaker 1: I've noticed. Do you ever notice how conversations about free 728 00:39:42,000 --> 00:39:46,719 Speaker 1: will almost never seem to clarify anything? They almost never 729 00:39:46,719 --> 00:39:49,920 Speaker 1: seem to provide any more focus or clarity than you 730 00:39:49,960 --> 00:39:54,040 Speaker 1: had to begin with.
Yeah, like at 731 00:39:54,080 --> 00:39:58,120 Speaker 1: times it feels like having a conversation with somebody in 732 00:39:58,160 --> 00:40:00,959 Speaker 1: a swimming pool about whether water is wet, 733 00:40:01,840 --> 00:40:04,040 Speaker 1: because it does get down to, like, it seems 734 00:40:04,040 --> 00:40:06,360 Speaker 1: wet to me, I am in it. It seems like 735 00:40:06,440 --> 00:40:08,680 Speaker 1: free will to me because I am immersed in it, 736 00:40:08,719 --> 00:40:12,640 Speaker 1: and it's difficult for me to remove myself from the 737 00:40:12,680 --> 00:40:15,760 Speaker 1: experience that I'm having, and from everything 738 00:40:15,800 --> 00:40:18,520 Speaker 1: in my life and in the culture at large 739 00:40:18,640 --> 00:40:22,640 Speaker 1: that supports the idea that I am making informed 740 00:40:22,719 --> 00:40:25,440 Speaker 1: choices about my life. I mean, I feel like some 741 00:40:25,480 --> 00:40:28,120 Speaker 1: dilemmas having to do with free will are like 742 00:40:28,520 --> 00:40:31,080 Speaker 1: they force you to choose between two options that are 743 00:40:31,120 --> 00:40:35,400 Speaker 1: both tautologies or both absurdities. And any time you 744 00:40:35,480 --> 00:40:37,919 Speaker 1: encounter a problem like that, I think there's a pretty 745 00:40:37,960 --> 00:40:40,840 Speaker 1: good chance that the underlying disease causing that is poorly 746 00:40:40,920 --> 00:40:44,680 Speaker 1: defined terms, right. And yeah, to your point, the extreme 747 00:40:45,719 --> 00:40:48,200 Speaker 1: versions of this tend to come off 748 00:40:48,239 --> 00:40:50,160 Speaker 1: as kind of loony, like if someone is just like, 749 00:40:50,200 --> 00:40:53,960 Speaker 1: I am a completely free-moving soul, like, no, you're not, dummy.
750 00:40:54,000 --> 00:40:55,520 Speaker 1: You know, it's like when we discussed in 751 00:40:55,560 --> 00:40:58,200 Speaker 1: the Thankfulness episode that we put out, you know, like 752 00:40:58,360 --> 00:41:01,239 Speaker 1: everybody's life is shaped by these other factors, these other 753 00:41:01,280 --> 00:41:04,239 Speaker 1: individuals in their life to some extent, and I feel 754 00:41:04,239 --> 00:41:06,960 Speaker 1: like to argue against that is just lunacy. On the 755 00:41:06,960 --> 00:41:09,120 Speaker 1: other hand, if someone is saying, I am just 756 00:41:09,160 --> 00:41:12,600 Speaker 1: a pure automaton, well, there you can back that 757 00:41:12,760 --> 00:41:16,040 Speaker 1: argument up with some very intriguing arguments, and we'll get 758 00:41:16,080 --> 00:41:18,840 Speaker 1: into some of those. But at the end of the day, 759 00:41:19,160 --> 00:41:22,120 Speaker 1: does that match up with your experience of reality? I 760 00:41:22,360 --> 00:41:24,719 Speaker 1: totally agree. But I think even talking about it at 761 00:41:24,719 --> 00:41:28,040 Speaker 1: that level, that's already like a level up, like having 762 00:41:28,120 --> 00:41:32,120 Speaker 1: accepted some terms as more unproblematic than I think they 763 00:41:32,120 --> 00:41:34,160 Speaker 1: should be. So, like, anyway, I mean, I think the 764 00:41:34,480 --> 00:41:37,040 Speaker 1: main problem with free will is people aren't being clear 765 00:41:37,200 --> 00:41:40,160 Speaker 1: what they're talking about before they start talking. And I'm 766 00:41:40,200 --> 00:41:42,840 Speaker 1: totally guilty of this as well. This is usually 767 00:41:42,880 --> 00:41:44,680 Speaker 1: the case when it comes to free will, and this 768 00:41:44,760 --> 00:41:47,440 Speaker 1: happens even when we're not aware that we're being unclear, 769 00:41:47,480 --> 00:41:50,120 Speaker 1: so we can't do it full justice.
In this short segment, 770 00:41:50,400 --> 00:41:52,200 Speaker 1: I think we will try to do better than an 771 00:41:52,200 --> 00:41:56,360 Speaker 1: absolute mess. So, to try to understand what 772 00:41:56,480 --> 00:42:00,880 Speaker 1: our terms actually mean: what is free will? My understanding 773 00:42:01,000 --> 00:42:04,440 Speaker 1: is, I am in control of my own actions. And 774 00:42:04,480 --> 00:42:07,239 Speaker 1: I think most of the time, for most people, this 775 00:42:07,360 --> 00:42:10,680 Speaker 1: feels true. Though curiously, of course, not all of 776 00:42:10,680 --> 00:42:12,640 Speaker 1: the time and not for all people. We can come 777 00:42:12,640 --> 00:42:15,640 Speaker 1: back to that. But I would argue that it only 778 00:42:15,800 --> 00:42:19,719 Speaker 1: feels true in a general way, and it gets stickier 779 00:42:19,880 --> 00:42:22,239 Speaker 1: and thornier the more you try to think about it, 780 00:42:22,239 --> 00:42:25,480 Speaker 1: and the more precisely you try to define those terms. 781 00:42:25,840 --> 00:42:28,359 Speaker 1: So if I'm in control of my own actions, who 782 00:42:28,520 --> 00:42:31,319 Speaker 1: is "I"? We brought this up a minute ago. 783 00:42:31,440 --> 00:42:35,120 Speaker 1: Is "I" my whole brain? And also, I think 784 00:42:35,120 --> 00:42:37,040 Speaker 1: there's a good case to be made that other parts 785 00:42:37,040 --> 00:42:39,399 Speaker 1: of your body get some kind of vote in your 786 00:42:39,440 --> 00:42:42,520 Speaker 1: decision making. So is it my whole body? Is it 787 00:42:42,560 --> 00:42:44,680 Speaker 1: everything with my genome? Even then, I would say your 788 00:42:44,719 --> 00:42:47,920 Speaker 1: microbiota sort of gets a vote. I think that there 789 00:42:47,920 --> 00:42:50,560 Speaker 1: are questions about what the "I" is. But then also, 790 00:42:50,640 --> 00:42:53,920 Speaker 1: what counts as control?
If I am in control of 791 00:42:53,960 --> 00:42:56,680 Speaker 1: my own actions, does it mean that I make 792 00:42:56,760 --> 00:43:00,960 Speaker 1: my decisions with no outside influences? I mean, that's obviously 793 00:43:01,000 --> 00:43:02,879 Speaker 1: not true, as you were alluding to a minute ago. 794 00:43:03,560 --> 00:43:07,120 Speaker 1: But once you accept that outside factors have some influence 795 00:43:07,160 --> 00:43:10,440 Speaker 1: over whatever it is you're talking about controlling, what's to 796 00:43:10,480 --> 00:43:13,480 Speaker 1: stop you from assuming that they have total influence? I mean, 797 00:43:13,520 --> 00:43:17,239 Speaker 1: what part of your decision making is not influenced by 798 00:43:17,280 --> 00:43:21,800 Speaker 1: pre-existing factors like your memory and your physical circumstances 799 00:43:21,840 --> 00:43:24,319 Speaker 1: and so forth? Like, what part of you can you 800 00:43:24,360 --> 00:43:28,440 Speaker 1: identify that stands outside of the world? And then from 801 00:43:28,440 --> 00:43:30,759 Speaker 1: the other end, paradoxically, if you were to suddenly act 802 00:43:30,840 --> 00:43:33,839 Speaker 1: in a way that made no sense given your own 803 00:43:33,920 --> 00:43:37,040 Speaker 1: history and memory and all of the inputs coming in 804 00:43:37,040 --> 00:43:39,960 Speaker 1: that you think of as influences on you, wouldn't that 805 00:43:40,040 --> 00:43:44,319 Speaker 1: action actually feel less like something that comes genuinely from you, 806 00:43:44,680 --> 00:43:48,560 Speaker 1: whatever you are? Wouldn't, by this metric, the most objectively 807 00:43:48,719 --> 00:43:53,359 Speaker 1: free action seem like something coming from the outside?
You mean, 808 00:43:53,400 --> 00:43:56,000 Speaker 1: like if you go to a restaurant where, 809 00:43:56,160 --> 00:43:59,040 Speaker 1: say, there's a drink menu, and you always tend 810 00:43:59,040 --> 00:44:01,279 Speaker 1: to order something that is made with a base of, 811 00:44:01,320 --> 00:44:05,120 Speaker 1: say, rum or bourbon or whiskey, and instead you throw 812 00:44:05,160 --> 00:44:06,640 Speaker 1: caution to the wind one day and you get a 813 00:44:06,640 --> 00:44:10,200 Speaker 1: mescal or a vodka drink. Does that actually make you 814 00:44:10,239 --> 00:44:14,680 Speaker 1: feel more free, or does it seem like, you know, 815 00:44:14,760 --> 00:44:17,799 Speaker 1: something got into you? Where does that phrase come from? 816 00:44:17,840 --> 00:44:19,640 Speaker 1: I don't know. When I do things like that, I 817 00:44:20,280 --> 00:44:22,279 Speaker 1: think it does make me feel more free, because I'm like, no, 818 00:44:22,920 --> 00:44:25,279 Speaker 1: I'm not going to be the same person I've been 819 00:44:25,360 --> 00:44:28,440 Speaker 1: every time. I'm gonna try a different direction. You know, 820 00:44:28,520 --> 00:44:30,680 Speaker 1: I'm going to get a different type 821 00:44:30,719 --> 00:44:32,560 Speaker 1: of drink, I'm gonna try a different type of food, 822 00:44:32,600 --> 00:44:36,160 Speaker 1: I'm gonna walk a different way to the train station, etcetera. Well, 823 00:44:36,200 --> 00:44:38,239 Speaker 1: I mean, I would say that this just highlights that 824 00:44:39,000 --> 00:44:43,200 Speaker 1: neither branch, either acting in character, where your character has 825 00:44:43,239 --> 00:44:45,200 Speaker 1: been shaped by everything that ever happened to you, or 826 00:44:45,400 --> 00:44:48,279 Speaker 1: acting out of character, where, you know, something got 827 00:44:48,360 --> 00:44:53,239 Speaker 1: into you.
Neither way really cites the origin of decisions 828 00:44:53,320 --> 00:44:57,080 Speaker 1: or the origin of actions in something that's without 829 00:44:57,120 --> 00:45:00,520 Speaker 1: outside influence. So a lot of the arguments about whether 830 00:45:00,560 --> 00:45:03,200 Speaker 1: we have free will actually seem to me to reduce 831 00:45:03,239 --> 00:45:06,760 Speaker 1: to the question of whether we feel we have free will. 832 00:45:07,280 --> 00:45:09,400 Speaker 1: But what would it actually mean to settle the question 833 00:45:09,440 --> 00:45:13,279 Speaker 1: of whether we are, like, physically, objectively free? So 834 00:45:13,600 --> 00:45:15,359 Speaker 1: maybe we should look at a more thought-out 835 00:45:15,400 --> 00:45:18,799 Speaker 1: dictionary definition. So one that I found is, quote, the 836 00:45:18,880 --> 00:45:23,360 Speaker 1: power of acting without the constraint of necessity 837 00:45:24,280 --> 00:45:26,520 Speaker 1: or fate. Now that brings me back to the 838 00:45:26,560 --> 00:45:30,120 Speaker 1: demon Pax, the thief of destiny. I 839 00:45:30,120 --> 00:45:32,200 Speaker 1: found myself, with the second viewing of Bandersnatch, 840 00:45:32,560 --> 00:45:35,480 Speaker 1: returning to that title and trying to figure out exactly 841 00:45:35,520 --> 00:45:39,080 Speaker 1: what it means. Because destiny, on one hand, means, like, 842 00:45:39,160 --> 00:45:42,560 Speaker 1: you're predestined, right: there is a destiny in place 843 00:45:42,640 --> 00:45:45,600 Speaker 1: for you, and you perhaps don't have any real control. 844 00:45:45,640 --> 00:45:47,879 Speaker 1: It is the thing that the gods have laid out 845 00:45:47,920 --> 00:45:50,000 Speaker 1: for you.
That's like one way of looking at it. But 846 00:45:50,000 --> 00:45:53,160 Speaker 1: another way of looking at destiny is that destiny is 847 00:45:53,200 --> 00:45:56,200 Speaker 1: the thing you aspire to. Like, you choose your own destiny, 848 00:45:56,200 --> 00:46:00,120 Speaker 1: you choose your own adventure, right. So, which 849 00:46:00,160 --> 00:46:03,239 Speaker 1: of the two is the demon Pax stealing from you? 850 00:46:03,440 --> 00:46:06,560 Speaker 1: Is he stealing from you the power to make your 851 00:46:06,560 --> 00:46:11,600 Speaker 1: own decisions? Or is he stealing from you a predestined path? 852 00:46:11,800 --> 00:46:15,280 Speaker 1: Is he liberating you from the same tired 853 00:46:15,320 --> 00:46:18,560 Speaker 1: walk to the train station and the same tired choices 854 00:46:18,600 --> 00:46:20,399 Speaker 1: on the menu? Well, the funny thing about Choose 855 00:46:20,400 --> 00:46:23,440 Speaker 1: Your Own Adventure is that even though you are making 856 00:46:23,480 --> 00:46:26,080 Speaker 1: the choices on each page about which page to turn 857 00:46:26,120 --> 00:46:28,719 Speaker 1: to next, somebody else wrote the whole thing. That's true, 858 00:46:29,840 --> 00:46:31,960 Speaker 1: and I mean, to a certain extent, you can 859 00:46:32,120 --> 00:46:35,080 Speaker 1: apply that to life. Like, as rebellious as 860 00:46:35,120 --> 00:46:37,920 Speaker 1: you might seem ordering something else on the menu that 861 00:46:37,960 --> 00:46:40,160 Speaker 1: you normally don't get, it's still on the menu. And 862 00:46:40,200 --> 00:46:42,319 Speaker 1: other things in life are like that too. Like, you are 863 00:46:42,680 --> 00:46:47,840 Speaker 1: largely constrained by the possibilities of your culture, 864 00:46:47,840 --> 00:46:51,920 Speaker 1: of your station in life, of, you know, political realities, etcetera.
865 00:46:52,200 --> 00:46:56,640 Speaker 1: But even then, is the unpredictability of a behavior 866 00:46:56,800 --> 00:47:01,800 Speaker 1: at all evidence of your control or your personal volition 867 00:47:02,040 --> 00:47:04,919 Speaker 1: over that behavior? I don't know. I mean, those 868 00:47:04,920 --> 00:47:08,160 Speaker 1: things seem perhaps unrelated to me, actually. I mean, you 869 00:47:08,160 --> 00:47:12,160 Speaker 1: can also be predictably unpredictable. Yeah. But anyway, coming 870 00:47:12,160 --> 00:47:14,840 Speaker 1: back to this definition, the one that's, you know, acting 871 00:47:14,880 --> 00:47:18,319 Speaker 1: without constraint of necessity or fate. So I think it 872 00:47:18,360 --> 00:47:20,480 Speaker 1: can be hard to pin this down to a concrete claim, 873 00:47:20,520 --> 00:47:23,320 Speaker 1: but I think what it comes closest to is saying 874 00:47:23,360 --> 00:47:26,880 Speaker 1: that for any given action or moment in my life history, 875 00:47:26,920 --> 00:47:30,520 Speaker 1: anything I do, or think, or say, given the exact 876 00:47:30,680 --> 00:47:34,719 Speaker 1: same inputs, I could have produced a different output than I 877 00:47:34,760 --> 00:47:38,359 Speaker 1: actually did. And this would be, I think, 878 00:47:38,400 --> 00:47:41,600 Speaker 1: some way of making free will a kind of 879 00:47:42,160 --> 00:47:46,120 Speaker 1: physical proposition. Right: if exactly the same inputs went 880 00:47:46,160 --> 00:47:48,480 Speaker 1: into you, everything was exactly the same, you could have 881 00:47:48,520 --> 00:47:52,120 Speaker 1: done something different than what you did. Unfortunately, I think 882 00:47:52,239 --> 00:47:55,919 Speaker 1: this is just a completely untestable assumption.
You know, given 883 00:47:55,960 --> 00:47:59,080 Speaker 1: the complexity of brains, you can never have all of 884 00:47:59,160 --> 00:48:02,600 Speaker 1: exactly the same inputs that somebody had at a given moment, 885 00:48:02,680 --> 00:48:05,280 Speaker 1: so you can't experiment on this to find out what's true. 886 00:48:05,680 --> 00:48:08,600 Speaker 1: Though we certainly love ruminating on this in our fiction. 887 00:48:09,120 --> 00:48:12,080 Speaker 1: Like, any kind of time travel film, any kind of 888 00:48:12,080 --> 00:48:17,320 Speaker 1: Groundhog Day scenario is exploring this subject. Yeah, though, even 889 00:48:17,360 --> 00:48:19,399 Speaker 1: with most of those time travel things where people want 890 00:48:19,400 --> 00:48:21,920 Speaker 1: to go back and relive it, what they actually are 891 00:48:22,000 --> 00:48:24,000 Speaker 1: imagining is they want to go back and relive a 892 00:48:24,000 --> 00:48:26,719 Speaker 1: moment with the wisdom and knowledge that they have now 893 00:48:26,840 --> 00:48:29,360 Speaker 1: that they didn't have then. So it would be 894 00:48:29,360 --> 00:48:32,640 Speaker 1: funny to just, like, replay the same instance over and 895 00:48:32,719 --> 00:48:35,759 Speaker 1: over again with exactly the same physics involved, and see 896 00:48:35,800 --> 00:48:39,440 Speaker 1: if anything different happens, without having any new knowledge or whatever. 897 00:48:40,280 --> 00:48:43,919 Speaker 1: But even then, I mean, imagine maybe you could 898 00:48:43,960 --> 00:48:45,920 Speaker 1: do that somehow, you know, you could just watch the 899 00:48:45,960 --> 00:48:48,600 Speaker 1: same period of time play out over and over again 900 00:48:48,600 --> 00:48:51,839 Speaker 1: and see if anything different happens.
Even if it were 901 00:48:51,880 --> 00:48:54,480 Speaker 1: true that you could have produced different outputs given the 902 00:48:54,520 --> 00:48:58,080 Speaker 1: exact same inputs, would this really mean you were free? 903 00:48:58,200 --> 00:49:01,000 Speaker 1: Would this be what people mean when they say free will, 904 00:49:01,040 --> 00:49:03,920 Speaker 1: like they're in control of their own actions? You know, 905 00:49:04,000 --> 00:49:07,359 Speaker 1: imagine there's some random dice-rolling machine inside your head, 906 00:49:07,520 --> 00:49:10,040 Speaker 1: or a ghost or a spirit in your brain which 907 00:49:10,120 --> 00:49:13,440 Speaker 1: pushes you in different directions, even if every single iota 908 00:49:13,480 --> 00:49:16,560 Speaker 1: of input is the same. Is that actually freedom? That 909 00:49:16,680 --> 00:49:19,960 Speaker 1: just sounds like a different kind of impetus or control. 910 00:49:20,520 --> 00:49:24,000 Speaker 1: That's interesting, you bring up randomization via some sort 911 00:49:24,040 --> 00:49:28,120 Speaker 1: of technology, like dice or casting of bones, because we've 912 00:49:28,120 --> 00:49:29,920 Speaker 1: discussed that in the past on the show, how 913 00:49:29,960 --> 00:49:32,799 Speaker 1: that is sometimes brought up as being, like, 914 00:49:32,840 --> 00:49:37,360 Speaker 1: that's the purpose of these early divination technologies, 915 00:49:37,400 --> 00:49:42,000 Speaker 1: techniques: a way to randomize choice and to sometimes force 916 00:49:42,120 --> 00:49:45,359 Speaker 1: us towards a decision that we otherwise wouldn't make, 917 00:49:45,480 --> 00:49:48,919 Speaker 1: in a way to free us from these predestined 918 00:49:48,960 --> 00:49:52,160 Speaker 1: paths that are before us, or at least, you know, 919 00:49:52,560 --> 00:49:55,000 Speaker 1: lean us over towards a different path 920 00:49:55,040 --> 00:49:57,839 Speaker 1:
that is available but we normally wouldn't go for. Well, 921 00:49:57,840 --> 00:50:00,000 Speaker 1: it's funny. I mean, either way you go there. So yeah, 922 00:50:00,160 --> 00:50:04,080 Speaker 1: say you're doing the I Ching or throwing bones. Doing 923 00:50:04,280 --> 00:50:07,080 Speaker 1: that or not doing that, either way, is one making you 924 00:50:07,280 --> 00:50:10,680 Speaker 1: more the author of your own destiny than the other? I'm 925 00:50:10,719 --> 00:50:14,520 Speaker 1: not sure. I mean, they have differences in outcomes. But 926 00:50:14,760 --> 00:50:17,680 Speaker 1: does that actually change what people mean when they say 927 00:50:17,760 --> 00:50:19,719 Speaker 1: free will? I don't know. I mean, 928 00:50:19,719 --> 00:50:21,879 Speaker 1: even if you randomize your choices, you are the one 929 00:50:21,920 --> 00:50:24,680 Speaker 1: that will then enact that choice. Like, you're still the 930 00:50:24,719 --> 00:50:27,799 Speaker 1: actor in your narrative, and the randomization is still an 931 00:50:27,800 --> 00:50:30,719 Speaker 1: input on you. So I don't know. So, 932 00:50:31,160 --> 00:50:33,840 Speaker 1: anyway, to sort of sum it all up, I've 933 00:50:33,920 --> 00:50:37,040 Speaker 1: got a theory here, and it is that I think 934 00:50:37,120 --> 00:50:39,680 Speaker 1: what a lot of us are actually circling around when 935 00:50:39,719 --> 00:50:42,720 Speaker 1: we're trying to figure out how to articulate our concept 936 00:50:42,760 --> 00:50:45,520 Speaker 1: of free will is this claim. And the claim is: 937 00:50:46,200 --> 00:50:50,920 Speaker 1: our consciousness dictates our choices of how we act, or, 938 00:50:50,920 --> 00:50:54,880 Speaker 1: in other words, we're conscious of the process by which 939 00:50:54,880 --> 00:50:58,080 Speaker 1: our choices are made or by which our actions are generated.
940 00:50:58,239 --> 00:51:02,600 Speaker 1: Right, that when we act, we are able to 941 00:51:02,800 --> 00:51:06,799 Speaker 1: consciously be a part of the impetus to act, or 942 00:51:06,880 --> 00:51:10,839 Speaker 1: consciously cause the impetus to act. And I think this 943 00:51:10,880 --> 00:51:14,080 Speaker 1: one is actually testable, and we can come back to 944 00:51:14,120 --> 00:51:17,040 Speaker 1: that in a minute. So this is, of course, 945 00:51:17,080 --> 00:51:19,080 Speaker 1: one of the big riddles of the human experience. 946 00:51:19,120 --> 00:51:21,960 Speaker 1: And so people have been thinking about this and, 947 00:51:22,320 --> 00:51:25,640 Speaker 1: you know, essentially banging their heads against the wall about it, 948 00:51:25,960 --> 00:51:32,200 Speaker 1: for thousands of years. The philosophers Democritus and Leucippus saw 949 00:51:32,280 --> 00:51:36,200 Speaker 1: the universe as wholly governed by natural laws and composed 950 00:51:36,280 --> 00:51:40,680 Speaker 1: of, you know, essentially indivisible atoms. They took the determinist 951 00:51:40,760 --> 00:51:44,240 Speaker 1: view of life, of one propelled down a flowing stream 952 00:51:44,280 --> 00:51:47,600 Speaker 1: of events. Aristotle, on the other hand, is a great 953 00:51:47,600 --> 00:51:51,040 Speaker 1: example of someone who stressed the individual's responsibility for their actions. 954 00:51:51,360 --> 00:51:54,640 Speaker 1: The indeterminist view of life is a boat propelling itself 955 00:51:55,200 --> 00:51:58,480 Speaker 1: through a body of water. So yeah, to what extent 956 00:51:58,560 --> 00:52:01,359 Speaker 1: are you just sailing down the river?
You know, 957 00:52:01,400 --> 00:52:03,719 Speaker 1: with no power over where you're going, 958 00:52:04,040 --> 00:52:06,000 Speaker 1: or are you in a boat in which 959 00:52:06,000 --> 00:52:08,319 Speaker 1: you have the power to move about and even move 960 00:52:08,360 --> 00:52:10,560 Speaker 1: upstream if you need to? On that note, we're going 961 00:52:10,600 --> 00:52:12,439 Speaker 1: to take one quick break, but when we come back, 962 00:52:12,640 --> 00:52:15,480 Speaker 1: we will start rolling through some arguments for and against 963 00:52:15,480 --> 00:52:22,560 Speaker 1: free will, and then we will return to Bandersnatch. Alright, 964 00:52:22,560 --> 00:52:26,399 Speaker 1: we're back. So let's start with some arguments against free will, 965 00:52:26,520 --> 00:52:28,760 Speaker 1: because ultimately I think these are often easier 966 00:52:28,760 --> 00:52:32,040 Speaker 1: to discuss. Sure, I would say the most basic one, right, 967 00:52:32,239 --> 00:52:36,600 Speaker 1: is just the science of physics, right. Physics is very predictable. 968 00:52:37,360 --> 00:52:40,920 Speaker 1: You can, you know, given the inputs of forces 969 00:52:40,960 --> 00:52:43,719 Speaker 1: and energy and all that, you can determine what's going 970 00:52:43,760 --> 00:52:46,920 Speaker 1: to happen as an output of that action. And if 971 00:52:47,040 --> 00:52:49,799 Speaker 1: we assume that applies to everything, then why doesn't it 972 00:52:49,840 --> 00:52:53,799 Speaker 1: apply to us? Right? And it basically comes back to Democritus 973 00:52:53,920 --> 00:52:56,560 Speaker 1: and Leucippus, right, the idea that there are natural laws 974 00:52:56,960 --> 00:52:59,399 Speaker 1: in place, and we're not above 975 00:52:59,480 --> 00:53:03,320 Speaker 1: those laws.
Sure, yeah. So we're acting on the inputs 976 00:53:03,360 --> 00:53:06,680 Speaker 1: that come in, and, you know, being 977 00:53:06,719 --> 00:53:09,120 Speaker 1: pushed in one way or another by our life 978 00:53:09,160 --> 00:53:12,080 Speaker 1: history and our brains and all that, we're going to 979 00:53:12,120 --> 00:53:16,239 Speaker 1: act a certain way as physically reactive objects. Now, 980 00:53:16,320 --> 00:53:18,360 Speaker 1: this is, of course, the most common 981 00:53:18,440 --> 00:53:22,040 Speaker 1: argument I think against free will. But one question: are 982 00:53:22,160 --> 00:53:26,719 Speaker 1: free will and causal determinism really incompatible? Not 983 00:53:26,840 --> 00:53:29,480 Speaker 1: that it settles the issue, but I think the majority 984 00:53:29,480 --> 00:53:32,480 Speaker 1: of philosophers who look at this issue pretty closely actually 985 00:53:32,600 --> 00:53:35,799 Speaker 1: end up becoming what are called compatibilists. They 986 00:53:35,840 --> 00:53:38,840 Speaker 1: accept causal determinism; they say, yeah, you know, we're physical 987 00:53:38,840 --> 00:53:42,600 Speaker 1: objects being pushed around by physical forces, but they define 988 00:53:42,719 --> 00:53:45,479 Speaker 1: free will in some way that it is compatible with that: 989 00:53:45,480 --> 00:53:48,600 Speaker 1: that you are a physical object being pushed around by 990 00:53:48,600 --> 00:53:51,400 Speaker 1: physical forces and the whole history of your life and everything, 991 00:53:51,719 --> 00:53:54,239 Speaker 1: and yet somehow free will still applies to you.
This 992 00:53:54,320 --> 00:53:57,880 Speaker 1: often comes down to, like, an understanding or feeling of 993 00:53:57,960 --> 00:54:00,279 Speaker 1: free will, like I was talking about earlier. Like, even 994 00:54:00,360 --> 00:54:04,400 Speaker 1: if your actions are causally determined, somehow you feel like 995 00:54:04,480 --> 00:54:07,920 Speaker 1: you have agency, and that's what we mean by free will, 996 00:54:08,040 --> 00:54:11,760 Speaker 1: right, right. Another take on this that I came across, 997 00:54:12,080 --> 00:54:13,960 Speaker 1: and this goes back to something we were talking about earlier: 998 00:54:14,160 --> 00:54:19,239 Speaker 1: contemporary British analytic philosopher Galen Strawson. His argument is 999 00:54:19,280 --> 00:54:22,399 Speaker 1: basically that free will is impossible because we 1000 00:54:22,480 --> 00:54:25,279 Speaker 1: act the way we are, right. And this 1001 00:54:25,360 --> 00:54:29,000 Speaker 1: argument always makes me think of Yeats, in the 1002 00:54:29,000 --> 00:54:32,040 Speaker 1: poem No Second Troy. There's that line: what could 1003 00:54:32,080 --> 00:54:35,640 Speaker 1: she have done, being what she is? And I 1004 00:54:35,680 --> 00:54:37,640 Speaker 1: think about that with myself. Like, when I look 1005 00:54:37,680 --> 00:54:40,120 Speaker 1: back on past choices, what else could I have done, 1006 00:54:40,280 --> 00:54:42,840 Speaker 1: being who I am? You know, without, you know, 1007 00:54:42,880 --> 00:54:45,920 Speaker 1: some sort of sci-fi foresight brought on by time 1008 00:54:45,960 --> 00:54:51,000 Speaker 1: travel or Groundhog Day shenanigans. Like, I am who I am, 1009 00:54:51,160 --> 00:54:54,080 Speaker 1: I am influenced by all these things in my life, 1010 00:54:54,440 --> 00:54:57,600 Speaker 1: and my mind is this, and then what other choice 1011 00:54:57,640 --> 00:55:00,480 Speaker 1: would that mind have made? Right?
I mean, this is, uh. 1012 00:55:00,600 --> 00:55:03,120 Speaker 1: That's a very good way of putting it. It's almost 1013 00:55:03,160 --> 00:55:07,279 Speaker 1: like it maybe emphasizes the fact that free will is 1014 00:55:07,320 --> 00:55:10,800 Speaker 1: a difficult concept because of some of the baggage brought 1015 00:55:10,840 --> 00:55:13,840 Speaker 1: by the word free. Yeah, to act in accordance with 1016 00:55:13,880 --> 00:55:18,879 Speaker 1: your nature and your circumstances is not necessarily not free, right? 1017 00:55:19,200 --> 00:55:22,000 Speaker 1: Like, it was in my nature to responsibly come to 1018 00:55:22,040 --> 00:55:25,400 Speaker 1: work this morning, and so therefore I did. Um, could 1019 00:55:25,440 --> 00:55:28,000 Speaker 1: I have decided not to come into work? Could I 1020 00:55:28,000 --> 00:55:29,840 Speaker 1: have gone to the local, I don't know, arcade or 1021 00:55:29,880 --> 00:55:33,880 Speaker 1: something, or whatever one does when one skips work? 1022 00:55:33,920 --> 00:55:37,000 Speaker 1: I guess I could have. In theory, there's nothing stopping you. Yeah, 1023 00:55:37,080 --> 00:55:40,239 Speaker 1: nothing at all, except that is not my nature, and 1024 00:55:40,520 --> 00:55:42,600 Speaker 1: that is not what I did, because of my nature. 1025 00:55:42,760 --> 00:55:45,239 Speaker 1: Given the circumstances of who you are and who you 1026 00:55:45,280 --> 00:55:47,440 Speaker 1: were this morning and what was going on this morning, 1027 00:55:47,560 --> 00:55:49,920 Speaker 1: you didn't do it. And that's all we know, is 1028 00:55:50,000 --> 00:55:52,319 Speaker 1: that, you know, you acted the way you were at 1029 00:55:52,360 --> 00:55:55,560 Speaker 1: that time because that's who you were at that time. Yeah. 1030 00:55:55,920 --> 00:55:58,600 Speaker 1: Now that being said, yes, events could have been different.
1031 00:55:58,640 --> 00:56:01,000 Speaker 1: We could have had an email from our boss saying 1032 00:56:01,080 --> 00:56:03,520 Speaker 1: that there is going to be like a rock concert 1033 00:56:03,560 --> 00:56:07,720 Speaker 1: in the office today. That would never happen, 1034 00:56:08,840 --> 00:56:10,839 Speaker 1: and it might make me think, well, maybe I don't 1035 00:56:10,840 --> 00:56:13,160 Speaker 1: want to come into work today, and maybe the easiest 1036 00:56:13,160 --> 00:56:17,080 Speaker 1: thing to do would be just to skip. I don't know. Again, 1037 00:56:17,719 --> 00:56:20,960 Speaker 1: you can tease your brain all day thinking about what 1038 00:56:21,120 --> 00:56:23,480 Speaker 1: if, and how this little detail or 1039 00:56:23,480 --> 00:56:26,719 Speaker 1: that other detail would have affected your choices, but ultimately we 1040 00:56:26,760 --> 00:56:28,880 Speaker 1: only have the version of the path behind us to 1041 00:56:28,960 --> 00:56:31,879 Speaker 1: look back on when we think about all of this. Now, 1042 00:56:31,880 --> 00:56:35,239 Speaker 1: two other basic arguments against free will. This is one 1043 00:56:35,280 --> 00:56:38,880 Speaker 1: I think we'll come back to: experimentation has pointed to 1044 00:56:39,000 --> 00:56:41,800 Speaker 1: breakdowns between what feels like the moment of choice 1045 00:56:41,840 --> 00:56:44,440 Speaker 1: and what physically signals a choice being made. Yeah, I 1046 00:56:44,440 --> 00:56:47,239 Speaker 1: think this very much complicates the idea that, again, what 1047 00:56:47,360 --> 00:56:49,839 Speaker 1: I think people are actually really getting at with their 1048 00:56:49,880 --> 00:56:53,239 Speaker 1: idea of free will is that they have conscious control 1049 00:56:53,440 --> 00:56:57,319 Speaker 1: over their actions and thoughts.
And another one, and this 1050 00:56:57,360 --> 00:56:59,400 Speaker 1: is, again, we've been touching on this the whole episode, 1051 00:56:59,400 --> 00:57:04,200 Speaker 1: but clearly causal influences at least guide our decisions, if 1052 00:57:04,239 --> 00:57:08,160 Speaker 1: not make them for us. It's hard to deny. All right, 1053 00:57:08,239 --> 00:57:10,640 Speaker 1: So here are some arguments for free will. The 1054 00:57:10,680 --> 00:57:13,799 Speaker 1: big one, of course, is that subjectively, we tend to 1055 00:57:13,840 --> 00:57:17,480 Speaker 1: feel like we have rational, reflective control over our choices 1056 00:57:17,520 --> 00:57:20,240 Speaker 1: and actions. Sure, I mean, I can decide to do 1057 00:57:20,320 --> 00:57:22,760 Speaker 1: anything that occurs to me to do right now. You know, 1058 00:57:22,800 --> 00:57:25,840 Speaker 1: I could throw my computer across the room if I 1059 00:57:25,880 --> 00:57:28,960 Speaker 1: really wanted to. Yeah. And then the idea that 1060 00:57:29,120 --> 00:57:32,040 Speaker 1: the way our brains enable us to simulate these 1061 00:57:32,080 --> 00:57:35,880 Speaker 1: possibilities really, I think, allows us to lean into that interpretation. 1062 00:57:36,280 --> 00:57:39,400 Speaker 1: Because it's like the Choose Your Own Adventure book, the 1063 00:57:39,520 --> 00:57:42,120 Speaker 1: other choices are in there, and if we want to, 1064 00:57:42,200 --> 00:57:44,080 Speaker 1: we can cheat, and we can check one out and 1065 00:57:44,120 --> 00:57:46,040 Speaker 1: then back up. And in a way, you know, we 1066 00:57:46,120 --> 00:57:49,360 Speaker 1: can't do that in real life except through our ability 1067 00:57:49,440 --> 00:57:53,240 Speaker 1: to simulate possible futures, and of course that has 1068 00:57:53,280 --> 00:57:55,800 Speaker 1: an important evolutionary role. It has an important 1069 00:57:55,840 --> 00:57:59,000 Speaker 1: role for our survival.
We can think about the different 1070 00:57:59,000 --> 00:58:01,320 Speaker 1: ways we might try to, say, steal a piece of 1071 00:58:01,360 --> 00:58:04,240 Speaker 1: meat from a hungry lion and escape with food and 1072 00:58:04,360 --> 00:58:08,880 Speaker 1: our lives, and then choose the best course of action. Um, 1073 00:58:09,200 --> 00:58:12,120 Speaker 1: this is important. But it can also lean 1074 00:58:12,200 --> 00:58:16,920 Speaker 1: into these interpretations that, you know, certainly, uh, you know, 1075 00:58:16,960 --> 00:58:19,840 Speaker 1: I have more choice than I actually have, or even 1076 00:58:19,920 --> 00:58:22,520 Speaker 1: ultimately an idea that of course is explored in Bandersnatch, 1077 00:58:22,600 --> 00:58:25,520 Speaker 1: the idea that these other alternate choices are kind of 1078 00:58:25,560 --> 00:58:28,440 Speaker 1: alternate timelines, that they're out there. Like, I saw it 1079 00:58:28,440 --> 00:58:31,439 Speaker 1: in my head, so to a certain extent, that reality where 1080 00:58:31,440 --> 00:58:34,040 Speaker 1: I tried to take more meat than was feasible and 1081 00:58:34,120 --> 00:58:36,600 Speaker 1: was killed by the lion in a way exists, 1082 00:58:36,640 --> 00:58:39,880 Speaker 1: because I just saw it. Unfortunately, I would say about 1083 00:58:39,920 --> 00:58:43,400 Speaker 1: this argument, it does often feel that way, that, you know, 1084 00:58:43,520 --> 00:58:46,560 Speaker 1: like I could have done anything a minute ago. But 1085 00:58:47,200 --> 00:58:49,720 Speaker 1: you didn't. You did what you did. So again this 1086 00:58:49,800 --> 00:58:53,240 Speaker 1: comes back to the untestability of this one, like there's 1087 00:58:53,280 --> 00:58:56,120 Speaker 1: just never any way to prove that you could have 1088 00:58:56,200 --> 00:58:59,720 Speaker 1: done differently than you did in the moment.
Like if 1089 00:58:59,760 --> 00:59:02,280 Speaker 1: you ever had, like, a close call where, say, 1090 00:59:02,280 --> 00:59:05,880 Speaker 1: you're almost in a wreck or you almost do something 1091 00:59:05,880 --> 00:59:08,960 Speaker 1: that could have conceivably gotten you killed, um, and then 1092 00:59:09,000 --> 00:59:11,040 Speaker 1: you have that moment of reflection. Granted, on one level, 1093 00:59:11,080 --> 00:59:13,760 Speaker 1: like, you know, just bodily, 1094 00:59:13,800 --> 00:59:16,120 Speaker 1: you're excited, right, because this has happened and your body 1095 00:59:16,160 --> 00:59:18,720 Speaker 1: is on high alert. But on the other hand, part 1096 00:59:18,760 --> 00:59:22,040 Speaker 1: of it is sort of realizing your close proximity to 1097 00:59:22,120 --> 00:59:26,120 Speaker 1: this other possibility. Like, I was just some minor choice, 1098 00:59:26,200 --> 00:59:31,680 Speaker 1: some minor bit of input data away from something more catastrophic. Yes, 1099 00:59:31,880 --> 00:59:34,800 Speaker 1: it makes you suddenly come face to face with 1100 00:59:34,800 --> 00:59:38,080 Speaker 1: how dependent you are on moment-to-moment circumstances and awareness. 1101 00:59:38,560 --> 00:59:40,240 Speaker 1: Though I would say a lot of times when I 1102 00:59:40,280 --> 00:59:43,040 Speaker 1: get that, like, you know, catch-your-breath 1103 00:59:43,040 --> 00:59:45,680 Speaker 1: moment about what could have happened, it wasn't because I narrowly 1104 00:59:45,720 --> 00:59:49,320 Speaker 1: avoided something really bad happening. It's because I suddenly, out 1105 00:59:49,360 --> 00:59:53,760 Speaker 1: of nowhere, imagined something really bad happening.
Like whenever you're 1106 00:59:53,800 --> 00:59:56,040 Speaker 1: going down a flight of stairs and it's going fine, 1107 00:59:56,400 --> 00:59:58,960 Speaker 1: but you just imagine, oh, I could fall and hit 1108 00:59:59,000 --> 01:00:02,080 Speaker 1: my teeth on that thing. And oh yeah, oh yeah, 1109 01:00:02,080 --> 01:00:04,160 Speaker 1: I do that. This is, of course, one 1110 01:00:04,200 --> 01:00:09,680 Speaker 1: of my pitfalls: to almost constantly, um, essentially fantasize 1111 01:00:09,800 --> 01:00:12,520 Speaker 1: about bad things that could happen. Uh. And I think 1112 01:00:12,560 --> 01:00:13,960 Speaker 1: a lot of us do that, you know. And part 1113 01:00:13,960 --> 01:00:17,640 Speaker 1: of that is your mind exploring possibilities of 1114 01:00:17,680 --> 01:00:21,439 Speaker 1: what is happening or could happen or has happened. 1115 01:00:21,720 --> 01:00:23,720 Speaker 1: But in doing that we can lean into the negative 1116 01:00:23,760 --> 01:00:28,000 Speaker 1: possibilities too much, and then our lives become this, you know, abysmal 1117 01:00:28,080 --> 01:00:31,720 Speaker 1: Choose Your Own Adventure book of mostly terrible ends, 1118 01:00:32,160 --> 01:00:34,520 Speaker 1: even though the path that you're actually on may not 1119 01:00:35,160 --> 01:00:37,960 Speaker 1: be leading to any of them. Um, it seems 1120 01:00:38,000 --> 01:00:40,800 Speaker 1: like the curse of all this confusion about whether we 1121 01:00:40,840 --> 01:00:43,200 Speaker 1: have free will or not and what that actually means, 1122 01:00:43,680 --> 01:00:45,760 Speaker 1: could just be rooted in the fact that we can 1123 01:00:45,800 --> 01:00:49,800 Speaker 1: consider hypothetical alternative scenarios. The fact that we're able to 1124 01:00:49,840 --> 01:00:55,080 Speaker 1: imagine counterfactuals is what gives rise 1125 01:00:55,120 --> 01:00:59,080 Speaker 1: to this whole argument.
Uh. So, another thing I have 1126 01:00:59,080 --> 01:01:00,720 Speaker 1: in the list here, and this is basically just an 1127 01:01:00,720 --> 01:01:05,400 Speaker 1: extrapolation of everything we're talking about right now: philosophers 1128 01:01:05,400 --> 01:01:09,080 Speaker 1: Stephen Cave and also Bruce Waller have both argued that 1129 01:01:09,160 --> 01:01:12,880 Speaker 1: animals evolved with the capabilities we tend to associate with 1130 01:01:12,960 --> 01:01:18,240 Speaker 1: free will in order to survive, such as option generation, deliberation, 1131 01:01:18,280 --> 01:01:21,400 Speaker 1: willpower to stick to a choice, and the large human 1132 01:01:21,440 --> 01:01:24,560 Speaker 1: brain has all of this in spades. Cave argues that 1133 01:01:24,600 --> 01:01:27,320 Speaker 1: the level of free will, uh, that we have may 1134 01:01:27,360 --> 01:01:30,560 Speaker 1: actually vary from individual to individual, and he argues that 1135 01:01:30,600 --> 01:01:33,280 Speaker 1: we could potentially even put together a method of measuring 1136 01:01:33,400 --> 01:01:36,880 Speaker 1: one's freedom quotient or FQ in the same way 1137 01:01:36,920 --> 01:01:41,560 Speaker 1: that we, well, roughly measure one's intelligence, creativity, and other 1138 01:01:41,680 --> 01:01:44,760 Speaker 1: psychological factors. I do think that's possible, but I also 1139 01:01:44,800 --> 01:01:46,360 Speaker 1: think that that would be subject to a lot of 1140 01:01:46,400 --> 01:01:49,960 Speaker 1: debate about exactly what it is you're measuring.
As 1141 01:01:49,960 --> 01:01:52,560 Speaker 1: a lot of these actual, uh, you know, human or 1142 01:01:52,600 --> 01:01:55,800 Speaker 1: animal quotients are. I mean, when you measure human intelligence, 1143 01:01:55,800 --> 01:01:59,080 Speaker 1: there's debate about what exactly you are measuring, uh, and 1144 01:01:59,120 --> 01:02:01,120 Speaker 1: I think the same would be true of freedom, 1145 01:02:01,160 --> 01:02:03,960 Speaker 1: subject to all of these crazy caveats we've been talking 1146 01:02:03,960 --> 01:02:06,080 Speaker 1: about so far. What do you mean when 1147 01:02:06,080 --> 01:02:08,960 Speaker 1: you say freedom? Yeah. Another take on this that I 1148 01:02:09,080 --> 01:02:11,840 Speaker 1: had read in the past was something that neuroscientist David 1149 01:02:11,840 --> 01:02:16,320 Speaker 1: Eagleman called the principle of sufficient automatism. And the idea 1150 01:02:16,400 --> 01:02:18,840 Speaker 1: here is that the more that we map the human 1151 01:02:18,880 --> 01:02:22,720 Speaker 1: genome and study the brain's many subconscious machinations, the more 1152 01:02:22,720 --> 01:02:25,400 Speaker 1: it becomes clear that if free will exists, it's only 1153 01:02:25,520 --> 01:02:31,320 Speaker 1: a factor hitching a ride on top of enormous automated machinery. Um, 1154 01:02:32,200 --> 01:02:35,200 Speaker 1: so again, there's plenty of ground in 1155 01:02:35,240 --> 01:02:39,640 Speaker 1: between automaton and self-moving soul where you 1156 01:02:39,640 --> 01:02:43,160 Speaker 1: can sort of move the slider toward one direction or 1157 01:02:43,200 --> 01:02:46,240 Speaker 1: the other and still have something that we can at 1158 01:02:46,320 --> 01:02:48,760 Speaker 1: least refer to as free will. But it might only 1159 01:02:48,760 --> 01:02:51,280 Speaker 1: be a very, very little bit of something. Yeah, it might.
1160 01:02:51,400 --> 01:02:53,640 Speaker 1: And it's interesting to sort of do that, to 1161 01:02:53,680 --> 01:02:56,080 Speaker 1: do a little self-reflection and think about that. Like, yes, 1162 01:02:56,160 --> 01:02:59,520 Speaker 1: I had choice in a given situation, but really, how 1163 01:02:59,600 --> 01:03:02,920 Speaker 1: much choice was there? Uh? Yeah. And I think for me, 1164 01:03:03,000 --> 01:03:06,480 Speaker 1: at least, some of the definition problems would still remain. Like, 1165 01:03:06,960 --> 01:03:10,560 Speaker 1: I'm not sure that even then, that's clarifying what 1166 01:03:10,600 --> 01:03:14,919 Speaker 1: the concept of freedom means there. Um. So, we can't 1167 01:03:14,960 --> 01:03:17,440 Speaker 1: test whether it's possible for a person to produce different 1168 01:03:17,440 --> 01:03:20,760 Speaker 1: outputs given the exact same inputs. That just seems beyond 1169 01:03:20,920 --> 01:03:24,200 Speaker 1: the bounds of empiricism. You could believe that if you want, 1170 01:03:24,280 --> 01:03:26,880 Speaker 1: but I don't think there's any evidence for it. But 1171 01:03:27,080 --> 01:03:30,040 Speaker 1: this might not be what we really mean by free will. Maybe, 1172 01:03:30,080 --> 01:03:32,919 Speaker 1: as I mentioned earlier, what we mean by free 1173 01:03:32,920 --> 01:03:36,400 Speaker 1: will is that we are conscious of the process by 1174 01:03:36,440 --> 01:03:39,760 Speaker 1: which we make decisions or generate actions. And I think 1175 01:03:39,760 --> 01:03:43,800 Speaker 1: the empirical research is pretty clear that this is not true, 1176 01:03:43,880 --> 01:03:47,080 Speaker 1: at least not in many cases.
Um. So, just to 1177 01:03:47,120 --> 01:03:49,960 Speaker 1: look at a few studies undercutting traditional notions that our 1178 01:03:49,960 --> 01:03:54,000 Speaker 1: consciousness dictates our decisions or that we're consciously aware of 1179 01:03:54,040 --> 01:03:57,400 Speaker 1: how all our decisions are reached. Um. So, first 1180 01:03:57,400 --> 01:03:59,080 Speaker 1: of all, I want to look at one by 1181 01:03:59,320 --> 01:04:04,240 Speaker 1: Soon, Brass, Heinze, and Haynes, published in Nature Neuroscience in 1182 01:04:04,280 --> 01:04:07,560 Speaker 1: two thousand eight, called Unconscious Determinants of Free Decisions in 1183 01:04:07,600 --> 01:04:11,200 Speaker 1: the Human Brain. In this study, the authors found that 1184 01:04:11,280 --> 01:04:14,840 Speaker 1: they could use brain scanning to detect a person's choice 1185 01:04:15,280 --> 01:04:19,440 Speaker 1: between two options before the person believed that they had 1186 01:04:19,600 --> 01:04:22,240 Speaker 1: made a choice. So you've got a very simple setup. 1187 01:04:22,440 --> 01:04:25,760 Speaker 1: You're supposed to freely choose between pressing two buttons. You've 1188 01:04:25,760 --> 01:04:28,360 Speaker 1: got a left button pressed with your left hand, you've 1189 01:04:28,360 --> 01:04:31,000 Speaker 1: got a right button pressed with your right hand. Uh, 1190 01:04:31,040 --> 01:04:33,120 Speaker 1: the two different hands were used because this made it 1191 01:04:33,160 --> 01:04:35,480 Speaker 1: easier to see which hand was about to be engaged 1192 01:04:35,520 --> 01:04:38,640 Speaker 1: through motor control and brain imaging. And so you take 1193 01:04:38,640 --> 01:04:41,640 Speaker 1: your time, you decide which button you want to press, 1194 01:04:41,880 --> 01:04:45,280 Speaker 1: and then you note which letter in a timed sequence 1195 01:04:45,360 --> 01:04:47,600 Speaker 1: is displayed on a screen in front of you.
At 1196 01:04:47,640 --> 01:04:50,440 Speaker 1: the moment you believe you've made your decision about which 1197 01:04:50,440 --> 01:04:53,959 Speaker 1: button to push. And in some cases, the researchers could 1198 01:04:54,000 --> 01:04:58,440 Speaker 1: detect brain activity of the prefrontal and parietal cortex indicating 1199 01:04:58,520 --> 01:05:01,360 Speaker 1: which choice a person was going to make up to 1200 01:05:01,520 --> 01:05:05,439 Speaker 1: seven to ten seconds before the person believed they had 1201 01:05:05,520 --> 01:05:09,200 Speaker 1: made their choice. So this study indicates that, at least 1202 01:05:09,200 --> 01:05:12,920 Speaker 1: in some cases, at the moment you believe that you 1203 01:05:13,000 --> 01:05:16,760 Speaker 1: have consciously made a choice to do something, machines can 1204 01:05:16,800 --> 01:05:19,680 Speaker 1: look at your brain and show that the brain has 1205 01:05:19,760 --> 01:05:23,080 Speaker 1: made a choice before you believe you have made a choice, 1206 01:05:23,120 --> 01:05:27,640 Speaker 1: and predict with better-than-chance accuracy what that choice is. 1207 01:05:27,680 --> 01:05:30,200 Speaker 1: This is a study that really intrigued me. 1208 01:05:30,320 --> 01:05:33,200 Speaker 1: I remember when it came out, because it's basically this 1209 01:05:33,320 --> 01:05:36,000 Speaker 1: idea where I think that I'm the lightning, but perhaps 1210 01:05:36,040 --> 01:05:38,760 Speaker 1: I am the thunder, or at least my experience is 1211 01:05:38,760 --> 01:05:42,120 Speaker 1: that of the thunder. But then the other question is, well, 1212 01:05:42,160 --> 01:05:44,520 Speaker 1: does that mean I'm not the lightning? Am I not both?
1213 01:05:44,640 --> 01:05:47,000 Speaker 1: And maybe it's just, I'm like, I have a thunder-level 1214 01:05:47,040 --> 01:05:50,360 Speaker 1: awareness of what I am, but there is this lightning 1215 01:05:50,400 --> 01:05:53,800 Speaker 1: that precedes this experience of me. Well, I mean, I 1216 01:05:53,840 --> 01:05:56,680 Speaker 1: don't know. I mean, the decision is generated by the brain. 1217 01:05:57,000 --> 01:05:59,400 Speaker 1: So again you're back to this question of what free 1218 01:05:59,440 --> 01:06:01,800 Speaker 1: will means. But if it does have something 1219 01:06:01,840 --> 01:06:05,960 Speaker 1: to do with consciously being a participant at the moment 1220 01:06:06,040 --> 01:06:09,720 Speaker 1: that a decision is made, there's pretty good evidence 1221 01:06:09,760 --> 01:06:12,720 Speaker 1: that that's not going on. The brain is making decisions 1222 01:06:12,760 --> 01:06:16,720 Speaker 1: before the person thinks, I have made a decision. But okay, 1223 01:06:16,720 --> 01:06:20,040 Speaker 1: that was two thousand eight. Is there anything since then? Sure. 1224 01:06:20,080 --> 01:06:22,760 Speaker 1: Here's one study with findings along these lines but applied 1225 01:06:22,760 --> 01:06:26,040 Speaker 1: to voluntary mental imagery, which was published just last year, 1226 01:06:26,080 --> 01:06:29,240 Speaker 1: in twenty nineteen, in the open-access journal Scientific Reports. 1227 01:06:29,640 --> 01:06:33,760 Speaker 1: It's by Koenig-Robert and Pearson in, as I said, 1228 01:06:33,760 --> 01:06:37,000 Speaker 1: Scientific Reports, called Decoding the Contents and Strength of Imagery 1229 01:06:37,040 --> 01:06:41,120 Speaker 1: before Volitional Engagement. Uh, and again, this was published 1230 01:06:41,360 --> 01:06:44,240 Speaker 1: in twenty nineteen. The short version here is that the 1231 01:06:44,280 --> 01:06:47,280 Speaker 1: researchers exposed people to two different images.
You've got a 1232 01:06:47,320 --> 01:06:50,800 Speaker 1: red circle with horizontal lines and a green circle with 1233 01:06:50,960 --> 01:06:54,400 Speaker 1: vertical lines. And then the researchers were able to correlate 1234 01:06:54,800 --> 01:06:58,640 Speaker 1: images of brain states with mental representations of the different pictures, 1235 01:06:58,680 --> 01:07:00,440 Speaker 1: so they know what it looks like in 1236 01:07:00,480 --> 01:07:04,520 Speaker 1: your brain when you're thinking about these two images separately. Uh. 1237 01:07:04,560 --> 01:07:07,840 Speaker 1: They could use this brain imaging to predict, again above chance, 1238 01:07:08,080 --> 01:07:12,280 Speaker 1: which image subjects would choose to visualize in their head 1239 01:07:12,800 --> 01:07:16,360 Speaker 1: before the subjects believed they had made a choice about 1240 01:07:16,400 --> 01:07:18,880 Speaker 1: which one to imagine in their head. And they can 1241 01:07:18,960 --> 01:07:21,360 Speaker 1: make these predictions at a rate above chance an average 1242 01:07:21,360 --> 01:07:25,880 Speaker 1: of eleven seconds before the person's actual choice about which 1243 01:07:25,880 --> 01:07:28,480 Speaker 1: one they were going to imagine. So one of the authors, 1244 01:07:28,560 --> 01:07:31,320 Speaker 1: Joel Pearson, uh, was quoted in a statement, I believe 1245 01:07:31,320 --> 01:07:34,680 Speaker 1: to Medical Xpress, quote: We believe that when we are 1246 01:07:34,720 --> 01:07:37,240 Speaker 1: faced with the choice between two or more options of 1247 01:07:37,320 --> 01:07:41,240 Speaker 1: what to think about, non-conscious traces of the thoughts 1248 01:07:41,280 --> 01:07:46,000 Speaker 1: are there already, a bit like unconscious hallucinations. That comes 1249 01:07:46,000 --> 01:07:49,040 Speaker 1: back to something we talked about recently.
Um. As the 1250 01:07:49,080 --> 01:07:52,480 Speaker 1: decision of what to think about is made, executive areas 1251 01:07:52,520 --> 01:07:55,520 Speaker 1: of the brain choose the thought trace which is stronger. 1252 01:07:55,840 --> 01:07:59,240 Speaker 1: In other words, if any pre-existing brain activity matches 1253 01:07:59,320 --> 01:08:01,560 Speaker 1: one of your choices, then your brain will be 1254 01:08:01,640 --> 01:08:04,520 Speaker 1: more likely to pick that option as it gets boosted 1255 01:08:04,560 --> 01:08:08,400 Speaker 1: by the pre-existing brain activity. This would explain, for example, 1256 01:08:08,440 --> 01:08:11,439 Speaker 1: why thinking over and over about something leads to ever 1257 01:08:11,520 --> 01:08:14,080 Speaker 1: more thoughts about it, as it occurs in a positive 1258 01:08:14,080 --> 01:08:17,480 Speaker 1: feedback loop. And then, to quote from the study abstract, 1259 01:08:17,560 --> 01:08:20,439 Speaker 1: the authors say, our results suggest that the contents and 1260 01:08:20,479 --> 01:08:24,599 Speaker 1: strength of mental imagery are influenced by sensory-like neural 1261 01:08:24,640 --> 01:08:30,160 Speaker 1: representations that emerge spontaneously before volition. So there are things 1262 01:08:30,200 --> 01:08:32,320 Speaker 1: going on within the brain that we can detect with 1263 01:08:32,400 --> 01:08:36,320 Speaker 1: machinery from the outside that suggest what you're going to 1264 01:08:36,400 --> 01:08:39,360 Speaker 1: think about before you think about it. Now, I think 1265 01:08:39,360 --> 01:08:42,000 Speaker 1: we should be fair that it's possible.
This isn't always 1266 01:08:42,040 --> 01:08:45,040 Speaker 1: the case, but there's plenty of evidence that at least 1267 01:08:45,040 --> 01:08:48,040 Speaker 1: in some cases, when 1268 01:08:48,080 --> 01:08:52,200 Speaker 1: people think they're consciously making a choice, the brain in 1269 01:08:52,240 --> 01:08:55,479 Speaker 1: a measurable way has already made a choice that can 1270 01:08:55,520 --> 01:08:59,000 Speaker 1: be detected from the outside. The brain has already set 1271 01:08:59,080 --> 01:09:02,760 Speaker 1: one course of action in motion before the conscious part 1272 01:09:02,760 --> 01:09:05,599 Speaker 1: of our brain is aware that we're going to choose 1273 01:09:05,640 --> 01:09:09,320 Speaker 1: that course. So again, kind of a thunder-and-lightning scenario, 1274 01:09:09,560 --> 01:09:12,000 Speaker 1: right? Now, of course, this stuff we've been talking about 1275 01:09:12,080 --> 01:09:14,400 Speaker 1: is true of typical human brains. Once you start 1276 01:09:14,400 --> 01:09:17,880 Speaker 1: looking at atypical neurological situations, you can find all 1277 01:09:17,960 --> 01:09:21,240 Speaker 1: kinds of evidence of action without the sensation of conscious 1278 01:09:21,280 --> 01:09:23,519 Speaker 1: awareness or choice.
A lot of these are things that 1279 01:09:23,560 --> 01:09:26,320 Speaker 1: have come up on the show before, like blindsight, 1280 01:09:26,560 --> 01:09:30,600 Speaker 1: the fact that people can physically react to visual stimuli 1281 01:09:30,640 --> 01:09:33,800 Speaker 1: while believing consciously that they are blind, or that they're 1282 01:09:33,840 --> 01:09:36,120 Speaker 1: blind in some part of their visual field. Like you 1283 01:09:36,160 --> 01:09:38,840 Speaker 1: can react to, you know, raise your hand to catch 1284 01:09:38,920 --> 01:09:41,479 Speaker 1: a ball without believing that you have seen the ball. 1285 01:09:42,040 --> 01:09:44,840 Speaker 1: Or you've got alien limb syndrome, where something like a 1286 01:09:44,880 --> 01:09:47,800 Speaker 1: brain lesion can cause part of the body to act 1287 01:09:47,800 --> 01:09:49,879 Speaker 1: in a way that you do not feel in control 1288 01:09:50,000 --> 01:09:52,759 Speaker 1: of. The hand moves on its own, it moves against 1289 01:09:52,840 --> 01:09:55,280 Speaker 1: your will, it picks up the spoon when you meant 1290 01:09:55,280 --> 01:09:58,720 Speaker 1: to pick up the fork. Um. Of course, the experiences 1291 01:09:58,760 --> 01:10:01,400 Speaker 1: of split-brain patients, which we did a deep 1292 01:10:01,439 --> 01:10:05,120 Speaker 1: dive on in January of twenty nineteen.
The short version 1293 01:10:05,360 --> 01:10:07,680 Speaker 1: is that some patients who undergo a surgery called a 1294 01:10:07,720 --> 01:10:11,479 Speaker 1: corpus callosotomy, in which the main avenue of information sharing 1295 01:10:11,479 --> 01:10:14,600 Speaker 1: between the two hemispheres of the brain is severed, can 1296 01:10:14,760 --> 01:10:17,760 Speaker 1: seem to show signs of the right hemisphere acting and 1297 01:10:17,800 --> 01:10:21,519 Speaker 1: making choices without the conscious awareness or control of the 1298 01:10:21,640 --> 01:10:24,040 Speaker 1: left hemisphere, which seems to be the part of the 1299 01:10:24,040 --> 01:10:27,200 Speaker 1: brain that can usually talk. And that last example led 1300 01:10:27,240 --> 01:10:31,559 Speaker 1: to hypotheses like Michael Gazzaniga and Joseph LeDoux's left-brain 1301 01:10:31,640 --> 01:10:34,720 Speaker 1: interpreter model, where they argue that part of what the 1302 01:10:34,800 --> 01:10:38,160 Speaker 1: left hemisphere of the brain does is generate an ongoing 1303 01:10:38,280 --> 01:10:43,200 Speaker 1: series of narrative explanations that reconcile past and present and 1304 01:10:43,280 --> 01:10:46,439 Speaker 1: give us the sense that we understand why 1305 01:10:46,520 --> 01:10:49,000 Speaker 1: we do what we do. Now, of course, their model 1306 01:10:49,040 --> 01:10:51,640 Speaker 1: could be incorrect, but I think it's also possible that 1307 01:10:51,680 --> 01:10:54,640 Speaker 1: they're really onto something, that the brain seems to have 1308 01:10:54,680 --> 01:10:58,639 Speaker 1: a major function of trying to convince itself that its 1309 01:10:58,720 --> 01:11:03,960 Speaker 1: behavior is coherent and has rational justifications, uh, and, if 1310 01:11:03,960 --> 01:11:06,840 Speaker 1: possible, to convince the conscious part of the brain that 1311 01:11:07,040 --> 01:11:09,280 Speaker 1: it's in control.
I think this is kind of like 1312 01:11:09,360 --> 01:11:12,640 Speaker 1: at work when you give the boss three options. You know, 1313 01:11:12,760 --> 01:11:14,800 Speaker 1: it's like, here are the three things we came up with, 1314 01:11:14,840 --> 01:11:17,840 Speaker 1: and you've got the one you actually want to go with. 1315 01:11:17,880 --> 01:11:20,479 Speaker 1: And then you've got two terrible options that are 1316 01:11:20,520 --> 01:11:23,680 Speaker 1: designed in order to be ignored and discarded by the 1317 01:11:23,720 --> 01:11:25,960 Speaker 1: boss and flatter the boss and give them a sense 1318 01:11:25,960 --> 01:11:29,560 Speaker 1: of control, right, which can be a dangerous exercise. Absolutely. 1319 01:11:29,600 --> 01:11:32,360 Speaker 1: I'm not advising that as a good strategy. I'm just 1320 01:11:32,400 --> 01:11:34,960 Speaker 1: saying people do it. To look quickly at one more 1321 01:11:35,000 --> 01:11:38,840 Speaker 1: study I found: this was published in PNAS 1322 01:11:39,120 --> 01:11:43,439 Speaker 1: by Darby, Joutsa, Burke, and Fox, called Lesion 1323 01:11:43,479 --> 01:11:47,960 Speaker 1: Network Localization of Free Will. Very briefly, the authors here 1324 01:11:48,600 --> 01:11:52,679 Speaker 1: defined the neurologically relevant parts of free will 1325 01:11:53,080 --> 01:11:55,280 Speaker 1: as having two parts. So first of all, there's the 1326 01:11:55,360 --> 01:11:57,800 Speaker 1: desire to act. That's volition. You know, you've got 1327 01:11:57,880 --> 01:12:02,160 Speaker 1: volitional control. And then a sense of responsibility for that action, 1328 01:12:02,280 --> 01:12:06,320 Speaker 1: which is the feeling of agency. So you have volition and agency. 1329 01:12:06,400 --> 01:12:08,960 Speaker 1: And then they looked at two neurological conditions, one that 1330 01:12:09,080 --> 01:12:12,120 Speaker 1: is believed to disrupt each of these functions.
They 1331 01:12:12,120 --> 01:12:16,040 Speaker 1: looked at focal brain lesions that disrupt volition, causing 1332 01:12:16,200 --> 01:12:19,519 Speaker 1: akinetic mutism, and akinetic mutism is a condition 1333 01:12:19,760 --> 01:12:23,080 Speaker 1: where patients are unable to voluntarily move or speak. This 1334 01:12:23,360 --> 01:12:25,840 Speaker 1: would of course be a disruption of the volition part 1335 01:12:25,840 --> 01:12:29,080 Speaker 1: of the brain. And then lesions that disrupt agency, and 1336 01:12:29,120 --> 01:12:32,200 Speaker 1: this would of course cause alien limb syndrome. Again, alien 1337 01:12:32,320 --> 01:12:34,920 Speaker 1: limb syndrome, that's where you've got part of your body 1338 01:12:34,960 --> 01:12:37,760 Speaker 1: acting or moving in a way that does not feel voluntary. 1339 01:12:37,840 --> 01:12:40,160 Speaker 1: It moves, but you don't feel like you did it. 1340 01:12:41,040 --> 01:12:43,879 Speaker 1: And then they basically found that brain lesions that disrupt 1341 01:12:43,960 --> 01:12:46,600 Speaker 1: volition occur all over the brain, but they're within a 1342 01:12:46,600 --> 01:12:49,320 Speaker 1: brain network that is connected in some way to the 1343 01:12:49,360 --> 01:12:53,320 Speaker 1: anterior cingulate cortex. And they found that lesions that disrupt 1344 01:12:53,400 --> 01:12:56,680 Speaker 1: agency also occur in different locations around the brain, but 1345 01:12:56,760 --> 01:12:59,400 Speaker 1: they tend to be defined by connectivity to a part 1346 01:12:59,439 --> 01:13:02,080 Speaker 1: of the brain called the precuneus.
Now, again, I 1347 01:13:02,080 --> 01:13:05,840 Speaker 1: would note that this acknowledges physical evidence that there 1348 01:13:05,840 --> 01:13:11,080 Speaker 1: are distinct brain processes involved in generating action, you know, volition, 1349 01:13:11,520 --> 01:13:16,040 Speaker 1: versus recognizing personal agency in that action. And typical brains 1350 01:13:16,080 --> 01:13:19,400 Speaker 1: executing typical actions have both of these acting in sync, 1351 01:13:19,720 --> 01:13:22,680 Speaker 1: but brains can have either one without the other. Now, 1352 01:13:22,680 --> 01:13:25,200 Speaker 1: obviously we could keep going here. We could keep discussing 1353 01:13:25,280 --> 01:13:28,920 Speaker 1: free will and what feels like free will and 1354 01:13:28,960 --> 01:13:33,320 Speaker 1: how it matches up with neuroscientific data, etcetera. But 1355 01:13:33,760 --> 01:13:35,439 Speaker 1: at this point in the podcast, we probably do need to 1356 01:13:35,439 --> 01:13:37,840 Speaker 1: bring it back around to Bandersnatch and the 1357 01:13:37,960 --> 01:13:40,719 Speaker 1: question: Okay, given all this stuff that we've talked about, 1358 01:13:40,880 --> 01:13:43,000 Speaker 1: what does Bandersnatch seem to be saying about 1359 01:13:43,040 --> 01:13:46,879 Speaker 1: all of this? Well, it does seem to be largely 1360 01:13:46,880 --> 01:13:49,479 Speaker 1: a rumination on the idea that we do not seem 1361 01:13:49,560 --> 01:13:51,800 Speaker 1: to have as much free will as we think we do. 1362 01:13:52,400 --> 01:13:56,200 Speaker 1: That we can resist, but it takes considerable effort 1363 01:13:56,200 --> 01:13:59,439 Speaker 1: to run counter to the current that we're stuck in.
1364 01:13:59,680 --> 01:14:01,920 Speaker 1: I would say there's a theme that 1365 01:14:02,000 --> 01:14:04,559 Speaker 1: is hammered home about free will, and it is this: the 1366 01:14:04,640 --> 01:14:07,320 Speaker 1: more we look at the concept of free will and 1367 01:14:07,439 --> 01:14:10,120 Speaker 1: think about whether we have control over our actions, the 1368 01:14:10,200 --> 01:14:12,680 Speaker 1: less we feel we have it. Yeah. Like, I was 1369 01:14:12,840 --> 01:14:16,000 Speaker 1: thinking of trying to list all the various factors 1370 01:14:16,479 --> 01:14:22,000 Speaker 1: and agents that are influencing Stefan in the story. I mean, 1371 01:14:22,040 --> 01:14:25,760 Speaker 1: we have his mental health, his past trauma, his father, 1372 01:14:25,960 --> 01:14:29,360 Speaker 1: his therapist, the work and tragic life 1373 01:14:29,400 --> 01:14:33,439 Speaker 1: and influence of Jerome F. Davies, his boss at Tuckersoft, 1374 01:14:33,680 --> 01:14:38,960 Speaker 1: his mentor slash hero slash friend Colin Ritman, conspiracy theories, 1375 01:14:39,200 --> 01:14:42,240 Speaker 1: music, media, etcetera. And that's not even getting into the 1376 01:14:42,280 --> 01:14:46,200 Speaker 1: speculative development that there is either an actual demon entity 1377 01:14:46,240 --> 01:14:49,640 Speaker 1: that is the literal thief of destiny, or that a 1378 01:14:49,680 --> 01:14:52,920 Speaker 1: power beyond himself is influencing his decisions, some sort of 1379 01:14:53,040 --> 01:14:56,120 Speaker 1: voice from beyond, or the machinations of a player in 1380 01:14:56,160 --> 01:15:00,560 Speaker 1: another world.
Yeah, the story really brings home this paradox, 1381 01:15:00,600 --> 01:15:03,400 Speaker 1: which is that I think it is the case that 1382 01:15:03,560 --> 01:15:06,519 Speaker 1: the closer we look at free will, and the more 1383 01:15:06,640 --> 01:15:09,719 Speaker 1: we bring our sharpest scientific 1384 01:15:09,800 --> 01:15:14,559 Speaker 1: tools and philosophical instruments to bear to understand it, the 1385 01:15:14,640 --> 01:15:16,880 Speaker 1: less it seems to make sense and the less it 1386 01:15:16,920 --> 01:15:19,639 Speaker 1: seems to be there. And yet at the same time, 1387 01:15:19,680 --> 01:15:23,240 Speaker 1: we acknowledge that to feel like your actions are 1388 01:15:23,240 --> 01:15:26,400 Speaker 1: not under your own control is not a heightened state 1389 01:15:26,439 --> 01:15:30,439 Speaker 1: of consciousness; that is still a problem. And 1390 01:15:30,439 --> 01:15:32,559 Speaker 1: I don't know exactly what that signals, but maybe 1391 01:15:32,640 --> 01:15:36,080 Speaker 1: it's yet another unresolved tension in the issue of free will: 1392 01:15:36,120 --> 01:15:38,880 Speaker 1: the more closely we examine it, the less 1393 01:15:38,880 --> 01:15:42,160 Speaker 1: we feel like we have it, and yet genuinely feeling 1394 01:15:42,200 --> 01:15:44,880 Speaker 1: like you don't have it, the more you feel that way, 1395 01:15:44,920 --> 01:15:47,479 Speaker 1: the more this is a serious impediment to you living 1396 01:15:47,479 --> 01:15:51,280 Speaker 1: a healthy life.
Now, this may seem like 1397 01:15:51,320 --> 01:15:55,160 Speaker 1: a logical place to end the conversation, but one of 1398 01:15:55,160 --> 01:15:58,880 Speaker 1: the things that's really interesting here is that 1399 01:15:59,160 --> 01:16:01,920 Speaker 1: we were talking about an episode of Black Mirror that deals 1400 01:16:01,960 --> 01:16:04,760 Speaker 1: with free will and our choices in life. And 1401 01:16:04,760 --> 01:16:07,639 Speaker 1: certainly, again, Black Mirror frequently comments on our unease regarding 1402 01:16:07,640 --> 01:16:11,519 Speaker 1: new technology. But then Bandersnatch itself, this show 1403 01:16:11,800 --> 01:16:16,559 Speaker 1: on Netflix, this movie, itself factors into 1404 01:16:16,640 --> 01:16:19,840 Speaker 1: some user concerns about the future of this sort of 1405 01:16:19,920 --> 01:16:25,040 Speaker 1: interactive viewing technology. Yes. You know, I would say 1406 01:16:25,080 --> 01:16:27,280 Speaker 1: one of the things that is a legitimate concern about 1407 01:16:27,320 --> 01:16:30,120 Speaker 1: free will, however you define it, as murky as 1408 01:16:30,120 --> 01:16:33,120 Speaker 1: it is: at least one thing that we want is 1409 01:16:33,200 --> 01:16:37,120 Speaker 1: to think that we understand the incoming 1410 01:16:37,200 --> 01:16:40,599 Speaker 1: influences on our behavior. Right? Like, you'd like to think 1411 01:16:40,640 --> 01:16:43,400 Speaker 1: that if I did X, I can sort of make 1412 01:16:43,439 --> 01:16:46,160 Speaker 1: sense of that.
It was because I read this book, 1413 01:16:46,240 --> 01:16:48,600 Speaker 1: or I read this article, or I had a conversation 1414 01:16:48,640 --> 01:16:52,120 Speaker 1: with this person, and I connect the knowledge I gained 1415 01:16:52,200 --> 01:16:56,559 Speaker 1: through that, or the influences of those past experiences, with 1416 01:16:56,680 --> 01:17:00,000 Speaker 1: the decision I just made. Life starts getting more difficult 1417 01:17:00,040 --> 01:17:03,840 Speaker 1: when you have trouble understanding what the influences on 1418 01:17:03,960 --> 01:17:07,000 Speaker 1: yourself are. Does that make sense? Yeah, yeah. And we've 1419 01:17:07,040 --> 01:17:09,599 Speaker 1: discussed some of these in the past. We've discussed 1420 01:17:09,600 --> 01:17:11,960 Speaker 1: a number of these in the past, but technologically speaking, 1421 01:17:12,000 --> 01:17:15,120 Speaker 1: we have discussed advertising and we have discussed social media, 1422 01:17:16,200 --> 01:17:19,120 Speaker 1: which are good things to keep in mind as 1423 01:17:19,160 --> 01:17:21,320 Speaker 1: we continue here, because there might not be a Bandersnatch, 1424 01:17:21,360 --> 01:17:23,880 Speaker 1: a demon awaiting you in the maze of 1425 01:17:23,960 --> 01:17:27,439 Speaker 1: future interactive media technology, but there might just be some 1426 01:17:27,600 --> 01:17:33,440 Speaker 1: highly targeted advertisements, for example. Right. So, two individuals 1427 01:17:33,800 --> 01:17:35,840 Speaker 1: that I ran across wrote about this topic, or 1428 01:17:36,080 --> 01:17:39,120 Speaker 1: touched on this topic. One is Matthew Gault, 1429 01:17:39,120 --> 01:17:42,840 Speaker 1: who wrote about this last year for Vice's Motherboard, and 1430 01:17:42,880 --> 01:17:45,839 Speaker 1: then Tiffany Hsu wrote about it for The New York Times.
1431 01:17:46,439 --> 01:17:50,520 Speaker 1: So Gault wrote about Michael Veale, a technology policy researcher 1432 01:17:50,760 --> 01:17:55,439 Speaker 1: at University College London, who utilized Europe's General Data Protection 1433 01:17:55,520 --> 01:17:59,480 Speaker 1: Regulation, or GDPR, law to request 1434 01:17:59,520 --> 01:18:03,160 Speaker 1: a copy of the data Netflix collected about him 1435 01:18:03,200 --> 01:18:06,080 Speaker 1: and his choices through the use of the 1436 01:18:06,160 --> 01:18:10,160 Speaker 1: Bandersnatch program. Now, they complied, perhaps in part because of 1437 01:18:10,520 --> 01:18:15,120 Speaker 1: Veale's status as a public person. But basically, Netflix acquires 1438 01:18:15,240 --> 01:18:18,439 Speaker 1: this information in order to carry out the Bandersnatch experience, 1439 01:18:18,479 --> 01:18:21,320 Speaker 1: which makes sense, right? It has to chart your path 1440 01:18:21,680 --> 01:18:26,160 Speaker 1: through this complex system. But then Netflix also keeps 1441 01:18:26,200 --> 01:18:29,000 Speaker 1: this information, which the company claims is in order to 1442 01:18:29,120 --> 01:18:32,559 Speaker 1: quote determine how to improve this model of storytelling in 1443 01:18:32,600 --> 01:18:36,240 Speaker 1: the context of a show or movie. And I mean, 1444 01:18:36,439 --> 01:18:38,920 Speaker 1: on one level that sounds all well and good, 1445 01:18:38,960 --> 01:18:42,479 Speaker 1: except that Veale thinks that Netflix quote should really 1446 01:18:42,560 --> 01:18:46,000 Speaker 1: be using consent, which you should be able to refuse, 1447 01:18:46,360 --> 01:18:49,880 Speaker 1: or legitimate interests, meaning that you can object to it, instead.
1448 01:18:50,880 --> 01:18:54,040 Speaker 1: Now, in Hsu's article, she points to the early choice 1449 01:18:54,080 --> 01:18:58,839 Speaker 1: we make between Kellogg's Frosties and Quaker Sugar 1450 01:18:58,880 --> 01:19:02,439 Speaker 1: Puffs. Both of these are real cereals, so I 1451 01:19:02,439 --> 01:19:05,360 Speaker 1: have to admit I thought Quaker Sugar Puffs was made 1452 01:19:05,439 --> 01:19:09,120 Speaker 1: up, because it has this ridiculous Honey Monster mascot that's 1453 01:19:09,160 --> 01:19:11,720 Speaker 1: like super fun, kind of a Cheddar Goblin sort of thing. 1454 01:19:12,479 --> 01:19:15,719 Speaker 1: But it turns out this was an actual UK product. 1455 01:19:15,800 --> 01:19:19,200 Speaker 1: It was just a UK-only product, so Americans such 1456 01:19:19,240 --> 01:19:22,240 Speaker 1: as ourselves were perhaps not privy to it. But again, 1457 01:19:22,280 --> 01:19:25,800 Speaker 1: both were real products and neither one was a paid inclusion, 1458 01:19:25,840 --> 01:19:29,120 Speaker 1: so it was not official product placement or product integration. 1459 01:19:29,439 --> 01:19:32,640 Speaker 1: And Netflix, of course, is like an ad-free 1460 01:19:32,680 --> 01:19:36,759 Speaker 1: system anyway. But Hsu points to some of the words 1461 01:19:36,800 --> 01:19:40,640 Speaker 1: of Reed Hastings, co-founder, chairman, and CEO of Netflix, 1462 01:19:40,840 --> 01:19:43,519 Speaker 1: who pointed out during a webcast tied to an earnings 1463 01:19:43,520 --> 01:19:49,880 Speaker 1: report that seventy percent of Bandersnatch viewers selected Kellogg's Frosties over 1464 01:19:50,360 --> 01:19:53,519 Speaker 1: the Quaker Sugar Puffs. No, I did too. 1465 01:19:53,560 --> 01:19:56,680 Speaker 1: I feel so vulnerable right now. I don't remember what 1466 01:19:56,760 --> 01:19:58,800 Speaker 1: I did the first time around.
The first time I 1467 01:19:58,880 --> 01:20:01,200 Speaker 1: watched it, I also watched with my wife, so we 1468 01:20:01,200 --> 01:20:04,000 Speaker 1: were voting on the choices, you know, having a discussion, 1469 01:20:04,360 --> 01:20:06,200 Speaker 1: which I guess I should have mentioned earlier, because 1470 01:20:06,200 --> 01:20:08,160 Speaker 1: that adds a whole other wrinkle, the scenario of 1471 01:20:08,200 --> 01:20:12,040 Speaker 1: making communal choices and voting on something. But on my 1472 01:20:12,080 --> 01:20:15,000 Speaker 1: own, I chose the Quaker thing just because I thought 1473 01:20:15,040 --> 01:20:18,439 Speaker 1: it looked weirder. But again, I'm in the minority for 1474 01:20:18,520 --> 01:20:21,920 Speaker 1: doing so. So first of all, I think this is 1475 01:20:21,960 --> 01:20:24,240 Speaker 1: a shame, because I think the cover and TV advertisement 1476 01:20:24,240 --> 01:20:27,040 Speaker 1: for Quaker Sugar Puffs is awesome and weird. But again, 1477 01:20:27,200 --> 01:20:30,479 Speaker 1: more to the point, as Hsu points out, Spencer Wang, 1478 01:20:30,880 --> 01:20:33,960 Speaker 1: a Netflix vice president, chimed in and joked, and let's 1479 01:20:33,960 --> 01:20:36,439 Speaker 1: be clear, he was apparently joking, that this 1480 01:20:36,520 --> 01:20:39,720 Speaker 1: was the most critical data point of the quarter. Now, 1481 01:20:39,760 --> 01:20:42,599 Speaker 1: she writes that while Netflix doesn't run commercials and has 1482 01:20:42,600 --> 01:20:46,320 Speaker 1: stated that it would not use Bandersnatch information for anything 1483 01:20:46,360 --> 01:20:49,880 Speaker 1: like this, others outside the company do see the potential, namely, 1484 01:20:49,920 --> 01:20:53,639 Speaker 1: in quote, the possibility of inserting brand name products into 1485 01:20:53,680 --> 01:20:58,360 Speaker 1: streaming shows based on data generated by interactive programming.
Now, 1486 01:20:58,400 --> 01:21:01,320 Speaker 1: Hsu stresses that the technology to roll this out isn't 1487 01:21:01,360 --> 01:21:04,599 Speaker 1: here yet, but I suppose we have to consider 1488 01:21:04,640 --> 01:21:08,160 Speaker 1: two key factors in that statement. So, first of all, 1489 01:21:09,120 --> 01:21:12,400 Speaker 1: we're in the early days of truly interactive 1490 01:21:12,400 --> 01:21:16,040 Speaker 1: features like this on major streaming platforms, and 1491 01:21:16,080 --> 01:21:18,519 Speaker 1: that's assuming that it really catches on 1492 01:21:18,680 --> 01:21:22,760 Speaker 1: at all. As we've discussed, interactive cinema is not new. 1493 01:21:22,880 --> 01:21:27,080 Speaker 1: It's been around for decades, and it has largely failed 1494 01:21:27,120 --> 01:21:30,840 Speaker 1: to catch on. It is not like a driving 1495 01:21:30,960 --> 01:21:34,920 Speaker 1: force in our entertainment. You'll find plenty of examples of it. 1496 01:21:34,920 --> 01:21:37,120 Speaker 1: You also find a lot of computer games that 1497 01:21:37,280 --> 01:21:40,920 Speaker 1: kind of fulfill this niche, right. Those 1498 01:21:40,960 --> 01:21:44,880 Speaker 1: also sort of failed. Yeah. You know, there 1499 01:21:44,920 --> 01:21:47,439 Speaker 1: are certainly deeper dives into, say, the history 1500 01:21:47,479 --> 01:21:49,680 Speaker 1: of things like Telltale Games, I think that was the 1501 01:21:49,720 --> 01:21:52,720 Speaker 1: company, that did a number of these things that 1502 01:21:52,800 --> 01:21:55,920 Speaker 1: were, again, not really marketed 1503 01:21:55,920 --> 01:22:00,000 Speaker 1: as interactive movies so much as interactive gaming experiences. 1504 01:22:00,320 --> 01:22:03,759 Speaker 1: So that's one thing to consider.
Interest in interactive films 1505 01:22:03,800 --> 01:22:05,400 Speaker 1: has essentially gone up and down over the years, and 1506 01:22:05,439 --> 01:22:09,240 Speaker 1: again it hasn't really, like, ignited. And 1507 01:22:09,360 --> 01:22:12,960 Speaker 1: Netflix itself has only released a handful of interactive titles, 1508 01:22:13,000 --> 01:22:17,080 Speaker 1: mostly kids' stuff. Bandersnatch is their only true adult drama release 1509 01:22:17,439 --> 01:22:20,519 Speaker 1: of this product type, though they claim to 1510 01:22:20,520 --> 01:22:24,400 Speaker 1: be doubling down on interactive content in the future. Given, 1511 01:22:24,560 --> 01:22:26,200 Speaker 1: you know, how Netflix tends to be a little bit 1512 01:22:26,479 --> 01:22:29,479 Speaker 1: secretive about what's coming out, or at least 1513 01:22:29,520 --> 01:22:31,360 Speaker 1: they don't tell you a lot, I guess we'll just 1514 01:22:31,439 --> 01:22:33,840 Speaker 1: have to know about it when we see it pop up. 1515 01:22:34,560 --> 01:22:37,320 Speaker 1: But it's also worth reminding ourselves that a great 1516 01:22:37,320 --> 01:22:40,080 Speaker 1: deal of work went into creating Bandersnatch as well. I think 1517 01:22:40,320 --> 01:22:42,719 Speaker 1: I've seen it written that like three 1518 01:22:42,760 --> 01:22:46,400 Speaker 1: times as much work went into Bandersnatch versus, say, 1519 01:22:46,800 --> 01:22:50,280 Speaker 1: a long episode of the show that was approximately ninety minutes. 1520 01:22:50,960 --> 01:22:54,840 Speaker 1: So is it cost-effective content? Are all the limitations 1521 01:22:54,880 --> 01:22:57,719 Speaker 1: worked out yet?
For instance, I don't believe Bandersnatch works 1522 01:22:57,720 --> 01:23:00,960 Speaker 1: on many mobile formats or older models. Like, you have 1523 01:23:01,040 --> 01:23:03,320 Speaker 1: to have, you know, something more updated. I 1524 01:23:03,360 --> 01:23:05,479 Speaker 1: tried to download it to my phone, and 1525 01:23:05,520 --> 01:23:08,639 Speaker 1: I have an older iPhone. I tried to load 1526 01:23:08,640 --> 01:23:10,680 Speaker 1: it on there to watch on an airplane, and it 1527 01:23:11,080 --> 01:23:13,720 Speaker 1: wouldn't work. I had to watch it through my Xbox One. 1528 01:23:14,280 --> 01:23:16,479 Speaker 1: And another big concern is there would need to be, 1529 01:23:16,520 --> 01:23:19,760 Speaker 1: I guess, enough interactive content out there tuned for this 1530 01:23:19,800 --> 01:23:22,960 Speaker 1: sort of thing to generate the necessary user data 1531 01:23:23,800 --> 01:23:26,720 Speaker 1: to then be employed. I can really see this kind 1532 01:23:26,720 --> 01:23:28,920 Speaker 1: of thing being used as a major data 1533 01:23:28,960 --> 01:23:31,840 Speaker 1: mining thing. I mean, I don't know, it seems very 1534 01:23:32,080 --> 01:23:36,000 Speaker 1: possible to get psychologically salient information through this. Now, of 1535 01:23:36,040 --> 01:23:38,920 Speaker 1: course, they're already getting information through all kinds of things. 1536 01:23:38,920 --> 01:23:42,360 Speaker 1: You know, the tech business can get your information 1537 01:23:42,360 --> 01:23:45,519 Speaker 1: through what you buy online, through what websites you visit, 1538 01:23:45,560 --> 01:23:48,280 Speaker 1: through what you do on Facebook or other social media.
1539 01:23:48,400 --> 01:23:51,479 Speaker 1: Right. Like, a website like Netflix already knows what kind 1540 01:23:51,520 --> 01:23:54,559 Speaker 1: of movies you have watched, what kind you like, what 1541 01:23:54,680 --> 01:23:57,559 Speaker 1: kind of movies you want to like, and then also, 1542 01:23:58,280 --> 01:24:00,360 Speaker 1: you know, how you have rated things as well, and 1543 01:24:00,400 --> 01:24:03,599 Speaker 1: then they can serve you recommendations of what you might 1544 01:24:03,640 --> 01:24:06,360 Speaker 1: want to watch in the future. Right. Now, 1545 01:24:06,479 --> 01:24:09,280 Speaker 1: of course, we're talking about this possibly going multiple ways. 1546 01:24:09,360 --> 01:24:14,040 Speaker 1: One is using interactive choices in a film 1547 01:24:14,120 --> 01:24:16,519 Speaker 1: to gather data about you, and the other side would 1548 01:24:16,520 --> 01:24:22,600 Speaker 1: be giving, like, specially user-tailored media experiences, which we 1549 01:24:22,640 --> 01:24:24,800 Speaker 1: already get somewhat, of course, on websites. You know, you 1550 01:24:24,800 --> 01:24:27,599 Speaker 1: get websites loading with ads for stuff you searched 1551 01:24:27,600 --> 01:24:30,719 Speaker 1: for on Amazon, right?
But yeah, I guess we're 1552 01:24:30,760 --> 01:24:33,280 Speaker 1: being forced to consider: what if that starts happening within 1553 01:24:33,439 --> 01:24:36,599 Speaker 1: the movies and TV shows you watch, so you start 1554 01:24:36,640 --> 01:24:40,599 Speaker 1: seeing product placement for specific products that are aimed at 1555 01:24:40,680 --> 01:24:45,080 Speaker 1: you individually within the show as you watch? Right, right. 1556 01:24:45,200 --> 01:24:49,160 Speaker 1: Like, you know, given the cereal scenario, 1557 01:24:49,280 --> 01:24:54,040 Speaker 1: potentially the master of the content, being Netflix 1558 01:24:54,120 --> 01:24:58,439 Speaker 1: or, hypothetically, some other company, might know that you 1559 01:24:58,479 --> 01:25:01,919 Speaker 1: were, say, more inclined towards, you know, healthy lifestyle choices, 1560 01:25:02,200 --> 01:25:05,960 Speaker 1: and therefore some sort of granola, you know, 1561 01:25:06,080 --> 01:25:09,360 Speaker 1: health-focused content would be ideal for you in that scenario. 1562 01:25:09,600 --> 01:25:12,920 Speaker 1: Or they might know that that's not your 1563 01:25:12,960 --> 01:25:15,639 Speaker 1: ideal cereal, or maybe they know that you have children 1564 01:25:15,720 --> 01:25:17,800 Speaker 1: in the house and therefore a children's cereal would be 1565 01:25:17,800 --> 01:25:21,200 Speaker 1: more appropriate. Like, that's the kind of information that they 1566 01:25:21,240 --> 01:25:25,160 Speaker 1: could conceivably have and then feed into the cereal that 1567 01:25:25,160 --> 01:25:27,840 Speaker 1: appears before you on the screen. Now, that would be, 1568 01:25:27,920 --> 01:25:32,160 Speaker 1: of course, something we're more familiar with, just like inserting ads.
1569 01:25:32,160 --> 01:25:35,320 Speaker 1: And you might imagine a character walking past a billboard 1570 01:25:35,439 --> 01:25:37,920 Speaker 1: or something in a movie, like happens all the time now, 1571 01:25:38,240 --> 01:25:41,800 Speaker 1: except that billboard can be, you know, dynamically inserted with 1572 01:25:41,880 --> 01:25:45,840 Speaker 1: new imagery or something. I think things start getting even 1573 01:25:45,880 --> 01:25:50,360 Speaker 1: creepier when you imagine something more like Bandersnatch itself, 1574 01:25:50,360 --> 01:25:53,080 Speaker 1: where there are alternative versions of a film that are 1575 01:25:53,120 --> 01:25:56,639 Speaker 1: specially tailored to you, that have different narrative content depending 1576 01:25:56,640 --> 01:25:59,800 Speaker 1: on who's watching. I mean, so one thing Robert and 1577 01:25:59,840 --> 01:26:02,200 Speaker 1: I were talking about briefly before we came in here is 1578 01:26:02,240 --> 01:26:05,639 Speaker 1: the idea that, you know, we often know that movies 1579 01:26:06,120 --> 01:26:09,639 Speaker 1: can embody values. Of course, you know, like sometimes 1580 01:26:09,640 --> 01:26:12,840 Speaker 1: the values of a filmmaker or creator come through in 1581 01:26:12,880 --> 01:26:16,040 Speaker 1: what happens in a story. And then other times 1582 01:26:16,080 --> 01:26:20,160 Speaker 1: there are sort of cheap attempts to display values, 1583 01:26:20,200 --> 01:26:22,599 Speaker 1: what would often be called pandering, right, you know, 1584 01:26:22,880 --> 01:26:26,240 Speaker 1: like cheap appeals to patriotism or something 1585 01:26:26,320 --> 01:26:29,040 Speaker 1: like that in a movie, or, I don't know, 1586 01:26:29,120 --> 01:26:31,000 Speaker 1: I guess you can make an argument for awards season 1587 01:26:31,240 --> 01:26:34,479 Speaker 1: Academy Awards bait as well.
Yeah, sure, you know, just 1588 01:26:34,560 --> 01:26:41,000 Speaker 1: like sort of cheap attempts to exploit the specific desires 1589 01:26:41,080 --> 01:26:46,200 Speaker 1: or value interests of a specific target audience. And 1590 01:26:46,200 --> 01:26:49,040 Speaker 1: so you can imagine, okay, well, now, if a 1591 01:26:49,080 --> 01:26:52,400 Speaker 1: movie is made and it wants to pander, it needs 1592 01:26:52,400 --> 01:26:55,200 Speaker 1: to at least make a choice, right? It's hard to 1593 01:26:55,200 --> 01:27:00,320 Speaker 1: pander to everybody at the same time. But you can imagine, okay, 1594 01:27:00,360 --> 01:27:03,640 Speaker 1: what if somebody just starts making more like a Bandersnatch 1595 01:27:03,720 --> 01:27:05,799 Speaker 1: kind of thing, where maybe you don't make the choices; 1596 01:27:05,840 --> 01:27:07,960 Speaker 1: the choices are made for you based on what is 1597 01:27:08,000 --> 01:27:11,519 Speaker 1: known about your user profile. And so what I was 1598 01:27:11,560 --> 01:27:15,000 Speaker 1: imagining beforehand was you could have different versions of the 1599 01:27:15,040 --> 01:27:18,519 Speaker 1: movie Independence Day. You remember that speech Bill Pullman gives 1600 01:27:18,760 --> 01:27:20,920 Speaker 1: before they all get in the planes and go fly 1601 01:27:21,040 --> 01:27:24,639 Speaker 1: off and fight? So it's this rousing moment where 1602 01:27:24,640 --> 01:27:27,719 Speaker 1: Bill Pullman gives this kind of innocuous speech that could 1603 01:27:27,760 --> 01:27:30,960 Speaker 1: appeal to basically anybody. But you could make that speech 1604 01:27:31,280 --> 01:27:35,280 Speaker 1: a much more tailored, specific-interest-group-pandering kind of thing, 1605 01:27:35,680 --> 01:27:37,519 Speaker 1: where you could have one version of the film that 1606 01:27:37,560 --> 01:27:40,719 Speaker 1: plays for somebody that's very inclusive.
He gives 1607 01:27:40,720 --> 01:27:43,759 Speaker 1: a speech. He's like, humans will join arms together around 1608 01:27:43,800 --> 01:27:46,360 Speaker 1: the world, there will be no more nations and borders 1609 01:27:46,360 --> 01:27:48,559 Speaker 1: and creeds, and we all, you know, we all unite 1610 01:27:48,560 --> 01:27:51,400 Speaker 1: as one, standing as brothers and sisters 1611 01:27:51,439 --> 01:27:54,120 Speaker 1: against this. Or you could have a version where he 1612 01:27:54,120 --> 01:27:57,080 Speaker 1: gives a speech about American exceptionalism and how we're the 1613 01:27:57,120 --> 01:27:59,200 Speaker 1: first and we stand up and fight when no one 1614 01:27:59,240 --> 01:28:01,400 Speaker 1: else will. Or, you know, you can 1615 01:28:01,439 --> 01:28:04,559 Speaker 1: imagine a million versions of this, depending on what kind 1616 01:28:04,600 --> 01:28:08,160 Speaker 1: of user they think you are who's watching. Right. 1617 01:28:08,200 --> 01:28:12,600 Speaker 1: I mean, that kind of personality profile or 1618 01:28:12,760 --> 01:28:16,000 Speaker 1: worldview profile would be pretty easy to acquire. I mean, 1619 01:28:16,040 --> 01:28:20,559 Speaker 1: basically, websites like Facebook have that information. Like, 1620 01:28:20,600 --> 01:28:23,719 Speaker 1: they're not feeding you Independence Day with a tailored 1621 01:28:24,479 --> 01:28:27,160 Speaker 1: speech in it, but they are giving 1622 01:28:27,200 --> 01:28:31,679 Speaker 1: you a feed that reflects your worldviews and values. 1623 01:28:32,040 --> 01:28:35,600 Speaker 1: And people are very invested in, like, the values of 1624 01:28:35,640 --> 01:28:38,720 Speaker 1: what media they consume these days.
I can imagine it 1625 01:28:38,840 --> 01:28:43,240 Speaker 1: being judged a very profitable enterprise by some studios to say, well, 1626 01:28:43,320 --> 01:28:45,840 Speaker 1: let's just cover all the bases. You know, we'll have 1627 01:28:45,880 --> 01:28:48,200 Speaker 1: way less trouble if we make, you know, 1628 01:28:48,360 --> 01:28:50,560 Speaker 1: a version A of the movie for you and a 1629 01:28:50,680 --> 01:28:53,200 Speaker 1: version B of the movie for you. It doesn't have 1630 01:28:53,280 --> 01:28:56,640 Speaker 1: to be a coherent vision or picture of the world. Yeah, 1631 01:28:56,640 --> 01:28:58,760 Speaker 1: you know, I can't help but think of 1632 01:28:58,760 --> 01:29:00,760 Speaker 1: past films. Like, we talked about Conan the 1633 01:29:00,800 --> 01:29:03,840 Speaker 1: Barbarian on the show in the past. That is 1634 01:29:03,880 --> 01:29:06,639 Speaker 1: a film that has a very particular view 1635 01:29:06,760 --> 01:29:10,479 Speaker 1: of what strength means and, you know, how 1636 01:29:10,520 --> 01:29:15,800 Speaker 1: power works, etcetera. And it's not everybody's political or philosophical 1637 01:29:15,840 --> 01:29:17,880 Speaker 1: cup of tea. I mean, I think you 1638 01:29:17,920 --> 01:29:20,360 Speaker 1: can enjoy that film without focusing on all of that, 1639 01:29:20,439 --> 01:29:24,600 Speaker 1: but still, it's definitely there. And that's not 1640 01:29:24,680 --> 01:29:26,800 Speaker 1: a film, I mean, especially at the time it came out, 1641 01:29:27,200 --> 01:29:29,400 Speaker 1: it's not a film where you would necessarily ask for 1642 01:29:29,439 --> 01:29:32,439 Speaker 1: an alternate version of it. But again, it's very clear 1643 01:29:32,439 --> 01:29:34,400 Speaker 1: in what it's saying. But then you have films like, 1644 01:29:34,439 --> 01:29:37,559 Speaker 1: say, Patton.
Patton is often brought up as an example of 1645 01:29:37,560 --> 01:29:40,600 Speaker 1: a film that meant one thing to one part of 1646 01:29:40,600 --> 01:29:43,280 Speaker 1: a divided America and another to the other part of 1647 01:29:43,280 --> 01:29:46,600 Speaker 1: the divided America, without having to have, like, an A/B 1648 01:29:46,760 --> 01:29:49,040 Speaker 1: version. Right. Yeah, I think you can say that 1649 01:29:49,080 --> 01:29:51,680 Speaker 1: about a lot of, like, war movies especially. I think 1650 01:29:51,720 --> 01:29:54,800 Speaker 1: that might be sort of true of Platoon, right? Is 1651 01:29:54,840 --> 01:29:57,519 Speaker 1: that, like, an anti-war movie or a patriotic movie? 1652 01:29:57,600 --> 01:29:59,439 Speaker 1: You know, you sort of have some elements of each; 1653 01:29:59,600 --> 01:30:02,559 Speaker 1: you can match onto what you want to see there. 1654 01:30:02,640 --> 01:30:05,200 Speaker 1: But yeah, I don't know. I'm somewhat disturbed by 1655 01:30:05,240 --> 01:30:08,880 Speaker 1: the idea of, like, media filling up with 1656 01:30:08,920 --> 01:30:12,280 Speaker 1: these, like, personally tailored options that are designed to make 1657 01:30:12,280 --> 01:30:17,440 Speaker 1: a sort of, like, generic media template individually palatable 1658 01:30:17,479 --> 01:30:20,600 Speaker 1: to the user, as opposed to standing for something on 1659 01:30:20,640 --> 01:30:23,000 Speaker 1: its own and allowing you to judge it. Yeah, or 1660 01:30:23,040 --> 01:30:27,920 Speaker 1: having some level of ambiguity. Like, does the 1661 01:30:27,000 --> 01:30:30,920 Speaker 1: modern audience, and, like, the near-future audience?
1662 01:30:30,960 --> 01:30:33,880 Speaker 1: Do they want ambiguity in their work or do they 1663 01:30:33,920 --> 01:30:38,800 Speaker 1: want like a clear cut view that expresses clear 1664 01:30:38,840 --> 01:30:41,240 Speaker 1: cut values not only of the film but of like 1665 01:30:41,360 --> 01:30:44,519 Speaker 1: the creator or creators behind it, like they you know, 1666 01:30:44,840 --> 01:30:48,320 Speaker 1: is there is there an increased hunger for that? And 1667 01:30:48,320 --> 01:30:50,919 Speaker 1: and if that is the case, you could easily see 1668 01:30:51,160 --> 01:30:53,759 Speaker 1: a way of worming around that by taking this ABC 1669 01:30:53,920 --> 01:30:57,840 Speaker 1: approach to film creation, because then nobody can say, well, 1670 01:30:58,200 --> 01:31:00,360 Speaker 1: I like the character of Conan the Barbarian, I think 1671 01:31:00,360 --> 01:31:04,679 Speaker 1: your view is is pro totalitarianism and uh and and 1672 01:31:04,800 --> 01:31:08,200 Speaker 1: you know, I don't know, celebrates uh toxic masculinity or 1673 01:31:08,240 --> 01:31:11,000 Speaker 1: whatever the critique might be. And then they could say, well, 1674 01:31:11,080 --> 01:31:13,639 Speaker 1: that's all well and good, but you're you're only 1675 01:31:13,680 --> 01:31:17,479 Speaker 1: talking about one version. If you watch Conan the Barbarian, uh, 1676 01:31:17,520 --> 01:31:23,040 Speaker 1: you know, the the uh relaunch of the platform, then 1677 01:31:23,080 --> 01:31:25,759 Speaker 1: you will get what is tailored to you. Its treatment 1678 01:31:25,800 --> 01:31:29,679 Speaker 1: of masculinity and power will be exactly what you want 1679 01:31:29,720 --> 01:31:33,160 Speaker 1: to see.
And I mean that opens the door to 1680 01:31:33,400 --> 01:31:36,160 Speaker 1: just a big question of like what art is and 1681 01:31:36,240 --> 01:31:39,200 Speaker 1: what does that do to uh you know, to to 1682 01:31:39,280 --> 01:31:42,160 Speaker 1: the role of these narratives in our in our culture. 1683 01:31:42,400 --> 01:31:47,080 Speaker 1: I remember many years ago how much of like the 1684 01:31:47,080 --> 01:31:50,320 Speaker 1: new Internet and the new media landscape was being sold 1685 01:31:50,360 --> 01:31:53,719 Speaker 1: to us. It was so often on the selling point 1686 01:31:53,760 --> 01:31:58,040 Speaker 1: of customization and individualization. You know, get what's right for you, 1687 01:31:58,160 --> 01:32:01,720 Speaker 1: get an experience that's personally tailored for you. And 1688 01:32:01,800 --> 01:32:03,960 Speaker 1: somehow I just feel like we were not able to 1689 01:32:04,080 --> 01:32:07,800 Speaker 1: anticipate how scary and messed up that would feel when 1690 01:32:07,800 --> 01:32:11,320 Speaker 1: it actually happened. I like to come back to Bandersnatch. 1691 01:32:11,400 --> 01:32:14,439 Speaker 1: The first time I watched it, I think it probably 1692 01:32:14,560 --> 01:32:17,880 Speaker 1: was over two hours that I spent questing after the 1693 01:32:17,920 --> 01:32:20,400 Speaker 1: happy ending, and I got it, and I have to 1694 01:32:20,439 --> 01:32:22,559 Speaker 1: admit I felt a little empty when I reached it. 1695 01:32:23,520 --> 01:32:25,960 Speaker 1: The second time, I tried to just again make more 1696 01:32:26,040 --> 01:32:29,000 Speaker 1: dramatic choices, uh, make a choice here and there that 1697 01:32:29,040 --> 01:32:30,760 Speaker 1: were just different from what I did the first time. 1698 01:32:31,120 --> 01:32:34,839 Speaker 1: I got a bleak ending, but it it felt more authentic.
1699 01:32:35,360 --> 01:32:40,519 Speaker 1: Um So, uh yeah, It's interesting to think about, like 1700 01:32:40,600 --> 01:32:45,280 Speaker 1: how how choice potentially impacts our appreciation of of a 1701 01:32:45,320 --> 01:32:49,120 Speaker 1: work like this, especially if we're talking about increasingly interactive 1702 01:32:49,200 --> 01:32:52,640 Speaker 1: work in a hypothetical future. I had to find a 1703 01:32:52,680 --> 01:32:55,200 Speaker 1: good bleak note to end on. I think that's it. Yeah, 1704 01:32:55,200 --> 01:32:58,120 Speaker 1: I mean, if we're talking about Black Mirror, that's that's 1705 01:32:58,120 --> 01:33:01,000 Speaker 1: where we have to leave it. All right, Well, we 1706 01:33:01,080 --> 01:33:03,680 Speaker 1: covered a lot of ground in there. Um. I imagine 1707 01:33:03,800 --> 01:33:06,519 Speaker 1: listeners will want to chime in, certainly on Bandersnatch 1708 01:33:06,520 --> 01:33:08,599 Speaker 1: if they have experienced it. I'd love to hear 1709 01:33:08,640 --> 01:33:10,800 Speaker 1: from anyone who's like, how much time did you spend 1710 01:33:10,800 --> 01:33:12,679 Speaker 1: on Bandersnatch? How many viewings have you given? 1711 01:33:12,680 --> 01:33:14,599 Speaker 1: Did you do like Joe and go in and try 1712 01:33:14,600 --> 01:33:17,880 Speaker 1: and find every golden Easter egg? I didn't get all 1713 01:33:17,920 --> 01:33:19,320 Speaker 1: of them, but I got a lot of them. Or 1714 01:33:19,360 --> 01:33:20,760 Speaker 1: did you do like me? Did you sort of go 1715 01:33:20,800 --> 01:33:22,760 Speaker 1: through it once and maybe go through a second time? 1716 01:33:22,760 --> 01:33:24,920 Speaker 1: And maybe you haven't seen or haven't read about the 1717 01:33:24,960 --> 01:33:28,720 Speaker 1: other endings.
And of course free will you all have 1718 01:33:28,880 --> 01:33:30,320 Speaker 1: it or maybe you all don't have it but you 1719 01:33:30,320 --> 01:33:32,639 Speaker 1: think you have it, which however you want to look 1720 01:33:32,680 --> 01:33:34,920 Speaker 1: at it. You all have some thoughts about free will 1721 01:33:35,200 --> 01:33:37,640 Speaker 1: you all you all have some experience to share about this, 1722 01:33:37,680 --> 01:33:40,200 Speaker 1: and we would love to hear from you in the meantime, 1723 01:33:40,240 --> 01:33:41,719 Speaker 1: if you want to check out more Stuff to Blow 1724 01:33:41,760 --> 01:33:44,520 Speaker 1: Your Mind, you can find us anywhere you get your podcasts. 1725 01:33:44,720 --> 01:33:46,559 Speaker 1: You can certainly go to stuff to Blow your Mind 1726 01:33:46,600 --> 01:33:48,679 Speaker 1: dot com and that will redirect you to a place 1727 01:33:48,680 --> 01:33:51,760 Speaker 1: where you can find the episodes and wherever you get 1728 01:33:51,760 --> 01:33:53,479 Speaker 1: the show. We just have to ask that you support 1729 01:33:53,560 --> 01:33:56,920 Speaker 1: us by rating and reviewing and subscribing. And don't forget 1730 01:33:56,920 --> 01:34:00,480 Speaker 1: we have another show out there titled Invention and Invention 1731 01:34:00,800 --> 01:34:04,719 Speaker 1: covers human techno history, one invention at a time. Huge 1732 01:34:04,720 --> 01:34:08,240 Speaker 1: thanks as always to our excellent audio producer Seth Nicholas Johnson, 1733 01:34:08,240 --> 01:34:11,439 Speaker 1: who's doing a heroic quick turnaround on today's episode, So 1734 01:34:11,560 --> 01:34:14,920 Speaker 1: praise him, everyone, Praise him. Uh. 
If you would like 1735 01:34:14,960 --> 01:34:17,240 Speaker 1: to get in touch with us with with notes on 1736 01:34:17,280 --> 01:34:19,680 Speaker 1: this episode or any other, to suggest a topic for 1737 01:34:19,720 --> 01:34:22,240 Speaker 1: the future, just to say hi, you can email us 1738 01:34:22,320 --> 01:34:32,479 Speaker 1: at contact at stuff to Blow your Mind dot com. 1739 01:34:32,560 --> 01:34:34,400 Speaker 1: Stuff to Blow Your Mind is a production of iHeart 1740 01:34:34,479 --> 01:34:37,120 Speaker 1: Radio's How Stuff Works. For more podcasts from iHeartRadio, 1741 01:34:37,200 --> 01:34:39,960 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you 1742 01:34:40,000 --> 01:35:00,160 Speaker 1: listen to your favorite shows.