Speaker 1: Pushkin.

Speaker 2: Welcome to episode two of Development Hell, our Revisionist History series based on the radical notion that the best Hollywood stories are the ones that never got made into movies. Last time, we kicked off the series with the story of how my book Blink failed to make it onto the big screen. If you haven't heard that one yet, you really, really should, because it explains why failed stories have such meaning for me. Anyway, this episode is a wildly convoluted tale, full of big plot twists and huge moral questions, a story so much about right now that it's going to get a little uncomfortable. Allow me to introduce guest number one, Angus Fletcher. He's a screenwriter.

Speaker 3: I'm not a screenwriter, and I've never really wanted to be a screenwriter. I have no aptitude for screenwriting.

Speaker 2: Don't listen to Angus. Angus is in fact a screenwriter, as you'll hear. He's also an author, researcher, and professor of story science at Ohio State, and a longtime friend of Revisionist History. Some of you will remember him as the person who provided the intellectual scaffolding for our memorable revision of The Little Mermaid. Remember this? So you think we should be able to... we should fix Ursula?

Speaker 3: Well, I think we should just stop her from doing whatever she's doing. We should have a conversation with her about maybe why this isn't helpful.

Speaker 4: Do you, Prince Eric, take Ursula to have and to hold, in sickness and in health, for as long as you both shall live?

Speaker 3: Eric looks deep into Ursula's eyes, hypnotized.

Speaker 1: I do.

Speaker 2: If you haven't heard that Little Mermaid series, listen to it. Anyway, back to today's story.

Speaker 3: I'm just fascinated by this problem that Hollywood has, that it keeps selling the same story over and over and over again. And so, you know, back when I was actually a Shakespeare professor up at Stanford, I placed this phone call to Pixar, and I was basically like: how is it you guys are able to make stuff that's, like, better? This is when they're making movies like Up, which had just these totally berserk narrative structures. And so, anyway, they let me inside, I learned all this stuff, I thought it was totally fascinating. I realized that they were making stories in a totally different way than Hollywood was. So I went down to Hollywood to advise Hollywood, and that led to me basically trying to convince Hollywood by writing a series of screenplays the Pixar way, because I thought that they would then allow me to consult on movies. Instead, no, they just hired me to write screenplays, and that's how I got an agent. And then one day my agent called me up, because he knew I'm just obsessed with the original Total Recall, which had destroyed my mind when I was young, and he was like: how would you like to work with the writer of Total Recall, Gary Goldman? And so that's how we got together.

Speaker 2: Wait, Gary, you wrote Total Recall? Meet guest number two, Gary Goldman. He's also a screenwriter, but unlike Angus, has no trouble admitting it.

Speaker 1: Yes, I was the last writer on Total Recall. On the first Total Recall.

Speaker 2: Total Recall. Amazing movie: Arnold Schwarzenegger, Sharon Stone, directed by Paul Verhoeven.

Speaker 3: Yeah.

Speaker 1: So it was about a milquetoast character who finds out that he's not who he thinks he is, which is that he's kind of a super spy.

Speaker 2: But how real does it seem? As real as any memory in your head? Come on, don't bullshit me.

Speaker 4: No, I'm telling you, Doug, your brain will not know the difference, and that's guaranteed, or your money back.

Speaker 1: What about the guy you lobotomized? Did he get the refund?
Speaker 2: Came out in nineteen ninety. Blockbuster, tons of awards. And Total Recall has a genealogy. I mean, the source material is a short story from Philip K. Dick.

Speaker 1: And then there was a fantastic original screenplay by Dan O'Bannon and Ronald Shusett that everybody loved from the beginning, and this launched the project. In fact, this is sort of the prime example of this idea of development hell, which is that you sell something for a lot of money and everybody's very excited about it, and they tell you it's going to be a movie right away, and then it goes into a process of development where everybody has a hand in revising it and reconceptualizing it and making it in their image. And in the middle, depending on your involvement, you can be in development hell along with the project, or you can be a bystander pushed to the outside, watching it go through the process of development hell with lots of other people, not including yourself. I mean, if you're lucky, it gets made at all. The project had been in development for almost eight or nine years when I got involved.

Speaker 2: Yeah, eight or nine years. So if you go back to the original screenplay, do you think that's a better movie than the final screenplay?

Speaker 1: No. Not in my case.

Speaker 2: So you're saying development hell is a bad thing, except in the case where you're the one involved in fixing it. Which, by the way, is a legitimate position. I'm totally open to agreeing with that.

Speaker 1: I mean, when I was starting out, I read the original Total Recall screenplay as a junior executive for a producer who was considering making it and acquiring it. And basically, the problem that most people felt was that it fell apart in the third act. And apparently it fell apart so disastrously that even though it had many different stars and directors attached, and people always wanted to make it, it didn't cross the finish line. And by the time it got to me, it was eight or nine years later. It had been in pre-production with Bruce Beresford and Patrick Swayze and the producer Dino De Laurentiis in Australia. And in fact, Bruce Beresford asked me to come over and do a rewrite on it, and I said I couldn't, because I was working with Paul Verhoeven on another project. But Dino went bankrupt. Arnold Schwarzenegger had always loved the project and had been watching it. He went to the owners of Carolco, the studio that had done the Rambo movies, and he said, you know, I want to do this. And they bought it from Dino De Laurentiis out of bankruptcy for a humongous amount of money, with the idea that Arnold was attached. Arnold wanted to work with Paul Verhoeven, because Paul had just done RoboCop, and I was working with Paul Verhoeven on a movie about out-of-body travel. And Paul said he didn't think it was ready to shoot, and that he was going to leave our project and go do another project, and he was apologizing to me for that. I said, what's the other project? And he said, Total Recall. I said, well, that's really funny, Paul, because I turned down the chance to rewrite Total Recall because I was working with you. And he said, well, what did you think about it? And I told him my take, and he said, well, you know, that's pretty much how I see it too. Let me see if I can get you the job to rewrite it.

Speaker 2: Gary, Gary, Gary, don't worry, we want to talk about this other project. But now I'm thinking: do we come back to Total Recall, or is this all leading up to the other project?

Speaker 1: Well, they're connected, yeah.

Speaker 2: So what's the connection?

Speaker 1: The connection is, I would say, Philip K. Dick.

Speaker 2: Philip K. Dick, sci-fi master. To name just a few other Philip K. Dick film adaptations: Blade Runner, Minority Report, A Scanner Darkly. Philip K. Dick is probably the most important sci-fi writer ever, or at least of the last hundred years. After working on Minority Report, Gary was becoming a bit of a Philip K. Dick specialist, which is not a bad thing to be in a town that loves sci-fi adaptations. When we get back, Gary and Angus go to work on adapting another of his stories.

Speaker 2: We're back with Angus Fletcher and Gary Goldman, who've just been set up to work together on an adaptation of a Philip K. Dick story.

Speaker 1: I've done four projects based on Philip K. Dick material. This became my calling card in Hollywood: I did Philip K. Dick adaptations.

Speaker 2: So, Angus, wait, did you have a particular affection for Philip K. Dick prior to this project?

Speaker 3: Well, I mean, I loved the movie Total Recall, and I'm going to acknowledge that I was such a philistine that I had no idea that it came from Philip K. Dick. And what happened was, the moment I got the call from my agent, I then went and read all of Philip K. Dick, and it's surreal. I mean, he sort of went from science fiction into religion. And I called Gary up and I was like, well, Gary, what do you think, you know?

Speaker 1: I'd love to work with you.

Speaker 3: And the main reason I wanted to work with Gary is because I believe that screenwriters have the ability to see the future, because I think this is a common power of story. I think that science fiction is always about intuiting what's going to happen next. And so I said to Gary, I said: Gary, what do you think the future is? And he said: Angus, read this story, "The Variable Man." And the premise of "The Variable Man" is very simple. It's basically just about a future in which there are these computers that use statistics to predict everything that's going to happen.
Speaker 3: And I said: you know, Gary, this seems weirdly like the world we're accelerating into now. And he said: yeah, it is. And he said: and I have a premise for a story that I'd like to pitch to you that I think will kind of be as big, if not bigger, than Total Recall. And that's where we got started, was with his pitch to me.

Speaker 2: And so "The Variable Man" story was written when?

Speaker 1: Oh, I don't know, about nineteen fifty, probably.

Speaker 2: Oh, I see. Oh, really? So imagining, in nineteen fifty, a world in which computers predict everything is kind of great. When did you guys work together on this project? What was it, the early two thousands? So were you being faithful to the Philip K. Dick short story, or were you changing it in useful ways?

Speaker 3: We were changing it in useful ways, and there were a couple of big things. So, first of all, Gary's pitch to me was, he was like: imagine a future in which computers don't just predict things like they do in the Philip K. Dick story, which is who wins this war. Imagine that they're able to predict exactly what food you'd like to eat right now. And then he said: and then imagine that they're able to predict a gift that you can give to your best friend for their birthday, and that gift is so good that your friend not only loves it, but it seems perfectly like it came from you. And then he said: imagine the computers can pick your soulmate. Imagine the computers can pick the person that you love so totally and so absolutely that you know that this computer knows you better than yourself. And I said: all right. And then he said: and now imagine, after you've given yourself up totally to these computers, you wake up one morning and they say to you: in twenty-four hours, the world is going to end, because humanity is going to destroy itself. And I've crunched all the variables, and there's no opportunity for you to do anything. That's the starting point for the story.

Speaker 2: Oh my god. So wait, how far have we traveled? We've traveled quite significantly from Philip K. Dick at this point. He was just imagining a world where there were these computers that were predicting the outcome of a war, and the computers run across a problem. That's basically the plot, right?

Speaker 1: Well, right. Hence the title: the variable man.

Speaker 3: Because the other thing that Philip K. Dick did, which I think he intuited in a moment of genius, was: what would happen if you put a person who the computers had no information on into the middle of this statistical world? So these are all these computers that are able to see the future because they have all the data, but there's one person who they have no data on. And so the starting point for the screenplay that Gary and I did was: humanity goes, oh my goodness, we're going to die tomorrow, and our only option is to bring in someone who the computer has no data on. The variable man. The person who is unpredictable, unknown. And so, just like in the original Philip K. Dick story, we reach back into time, back into history before the computers, and pluck this person into the present. And that's basically the beginning of the screenplay, when the variable man arrives in this world where everything has been planned, everything has been predetermined, the computers know everything, we're all going to die.

Speaker 2: Yeah. There's so many, so many wonderful... uh, Gary, tell me a little bit about your thought process in moving from the original Philip K. Dick notion to the one that Angus has described. Tell me about the leaps that you made there, and why you made them.

Speaker 1: All right. So, I suppose I was very interested in AI. You know, everyone's thinking about it now, but it wasn't brand new. And I saw this idea of the computer that can predict everything, and that you have to give up control to the computer because it's working so fast. And I wanted to sort of imagine this trajectory of going from where we are now to the point where the computer is in total control. And I was really, and I remain, concerned about this question of decision making: how do we make decisions? And if the computer can make better decisions than we do, then what is the role of free will? And if you spend a whole life simply doing what the computer tells you to do, what is identity? So I wanted to take this hero who comes from the past, who is still accustomed to having free will, and whose power is to have free will, and to put him in a situation where whenever he listens to the computer, everything goes right, and whenever he trusts his own judgment, it's wrong.

Speaker 2: Oh wait, wait, wait. So the AI is smart enough to realize that the only way to save the world is to bring someone in from outside AI? Yeah... the AI has a little bit of self-knowledge and humility in your world.

Speaker 1: Oh yeah, that was one of our best things. We created this wonderful AI, a wonderful character. We named him Plato, and, uh, he's, you know, he's sort of somewhere between a butler and God.

Speaker 2: Yeah. This is... this is getting better and better.

Speaker 3: I was gonna chime in there and just say that, basically, in this society there are a few people that have tuned out, and they're like: I'm not going to listen to this computer. What does this computer know? And their lives are horrible and all their decisions are bad, and they decide to choose their own soulmates, and they have horrible marriages. And then everyone who listens to exactly what the computer says has this Instagram life, where they are actually so happy, and their kids are so perfect, and everything is so amazing. And so Cole comes into this world, our hero comes into this world, and he of course finds it appalling, as appalling as we would find it, and he's determined to ignore Plato. He's determined to ignore the AI that has brought him into the future to save humanity from itself. So this is another kind of unfolding paradox: humanity doesn't want to save itself if it means listening to the computers on how to save itself.

Speaker 2: Wait, wait, so many questions. First of all, tell me a little bit about... so our variable man is called Cole, as in the Philip K. Dick story. Tell me about your Cole. Where does he come from? What's his personality like? Why is he so capable of standing up to AI, even if AI has all these demonstrable advantages?

Speaker 1: So basically, he's just a classic action hero.

Speaker 3: The idea is he embodies kind of our cowboy American trust in our guts. And so in the opening sequence, he's sent in to stop this hacker, this hacker who's been creating chaos across the world, who's been sort of, you know, getting into all these top secret facilities and creating... we think he's gonna launch World War Three, whatever. And so Cole goes in, and he's being sort of instructed what to do. We already have these computers at this time that are sort of running the percentages and telling him: do this, do this, do this, do this other thing. And he's listening to the computers, he's breaking in, he finally gets to the heart of the lair where this hacker is, and he kicks down the door and he points his gun, and then the hacker turns around... and it's a child. And Cole realizes: oh my goodness, this is the hacker. This is the person who's been creating all this chaos. And he gets this word on his earpiece: kill the hacker. And he says: I can't kill the hacker, this is a child. And there's this long pause, and they say: look, we've run all the numbers. If you don't kill the kid, this is going to happen again. Pull the trigger. And he stands there in the moment: can I pull the trigger? Can I pull the trigger? And then he doesn't do it. He doesn't pull the trigger. He drops the gun, he grabs the kid, and what essentially happens is he dies saving the kid's life. There's a fireball that blows him up, but the kid goes on and the kid lives. And because that fireball incinerates him, he can be brought into the future without changing time, because he was essentially eliminated from the timeline. So if you reach into the exact moment just before he dies and pull him into the future, time isn't changed. And so he's the perfect person to bring back, because he has demonstrated the ability to say no to the computers, and also because taking him doesn't change time.

Speaker 2: Oh, I see. So when we first meet Cole, he's part of the present. He's just a guy who's standing up to the tyranny of AI. He's an action hero whose AI has been corrupted by this hacker. He sets out to solve the problem, he fails, there's an incident. How much time passes between that initial encounter with the hacker and when our story takes place?
Speaker 3: Well, this is part of the twist, because when you get into the future, you think that an enormous amount of time has passed, because the future is radically different from when Cole disappeared, and it's so different that he can't believe it. But we start to learn that actually a very small amount of time, much less time than you might suspect, has passed between when he died and when he was brought back, because these computers, as Gary was saying, have accelerated human decision making to the point that our society has started to run on this utopian cycle, and everything is just going much faster and much better than anyone could have imagined in our time.

Speaker 1: And it's because of this great genius, who was the kid. The kid is the thing that made the difference. Because Cole didn't kill him, he survived, and he became the great genius that ushered us into this accelerated utopia.

Speaker 2: Now, one thing I don't understand: if we are in accelerated utopia, why do we need saving? Why would utopia have a kind of expiry date? Isn't the definition of utopia that it keeps going in perpetuity?

Speaker 3: Well, I mean, this is part of the great riddle of the beginning of the story: why is it, exactly, that we're going to die? What could possibly go wrong? Everything seems like it's perfect. What is going to destroy the world? Well, nobody is able to identify, at the beginning of the movie, exactly what's going to go wrong. All they know is that the computers are quite insistent that it is going to happen in the next twenty-four hours. And what happens over the middle parts of the screenplay is various different alternatives start to emerge. We start to see all of these different moving pieces, and we start to realize: oh, sure, you know, it's a utopia, but there are still humans in it, and there are still other things going on. And I don't know how many of the twists you want me to ruin, Malcolm, but one of the things that starts to become a question that haunts the hero is: is he the one who's going to destroy the world? Is that what the computers saw? That he would get brought back, and that by not listening to the computers, he was going to be the one that blows everything up?

Speaker 2: You know what this is, if I might reduce your whole premise to an axiom? This is the cinematic version of the famous adage: in the land of the blind, the one-eyed man is king. Right? Yep. Your Cole is your one-eyed man who is asked to be king, right? And what you're describing just there is: he knows he only has one eye, right? He's wrestling with the fact that he can't see it all. It's sort of beautiful. Let me give you a little recap of where we are. There's a perfect utopia where AI runs everything. People do what computers tell them to do, and are better for it. For reasons we don't quite understand yet, this utopia is in danger. In order to save it, the AI, named Plato no less, brings back to life a guy who defied its commands in the past. After the break, Gary and Angus try to get the movie made.

Speaker 2: So tell me about your... tell me more about your Cole. Who do you imagine playing your Cole?

Speaker 1: So we actually had... we had quite a lot of interest.

Speaker 3: We had interest from Bradley Cooper's people.

Speaker 1: We had interest from Mark Wahlberg's people. But the one thing that was always common...

Speaker 3: ...was that people became increasingly disturbed about the paradox at the center of Cole's psyche. Because if you're going to play this part, you have to wrestle with all of these sort of profound questions. And one of the most profound questions is that what the computers are doing is they're saying: trust us. And the reason they get us to trust them is by being able to predict our most intimate choices. They're able to predict our heart. They're able to know our heart better than we know our heart. They're able to say: you're going to love this person, and when you meet this person, your entire life is going to be changed. And the computer does this to Cole when he gets into the future: it picks the perfect person for Cole, and they fall in love immediately. In fact, they fall in love so fast that they don't even know that they were set up by the computer. They only figure it out after the fact. So now Cole knows that the computer has told him: this is who you're going to love, this is your soulmate. And the computer has also told him: the world is going to end. And so essentially the existential choice facing Cole is: is my heart wrong, and I don't love this woman, and the world is going to be fine? Or, by following my heart and accepting that I love this woman, is everyone going to die?

Speaker 1: And he has to listen to the computer when the computer gives him a mission. And the computer's mission for him is to kill someone. It's to kill the person who's going to trigger the singularity.

Speaker 2: The same mission as was given him before he was incinerated.

Speaker 1: That's right, but on a much bigger and more advanced scale. And the twist is that the person who he has to kill is, of course, the person that he saved when he was a child. And the further twist is that that person is the father of the girl he's been matched with, who he's fallen in love with, and that he's been set up to be with this girl because she's going to bring him home to Daddy, and that's the only way to get through Daddy's security.

Speaker 2: Oh my god, so good. It's so good. Wait, let's go back. So you had the interest from these actors in playing Cole. But when you said they had trouble with that twist... it sounds to me like this is an extraordinary opportunity for a skilled actor. What do you mean, they had trouble with it?

Speaker 3: Well, so I think that when you think about why Total Recall got made: Arnold Schwarzenegger has the capacity to be himself and also be meta-Arnold at the same time. He has the capacity to kind of laugh at himself at the same time he's being himself. This part requires that ability to be yourself and be outside yourself at the same time, and it forces you also to go against the kind of core fantasy of Hollywood, which is that love saves the world.

Speaker 1: If you listen to yourself, that saves the world.

Speaker 3: And so this caused a lot of cognitive dissonance in actors. And the thing Gary was talking about, the sort of AI billionaire's daughter, also caused cognitive dissonance. And so that was a real emotional sticking point for a lot of actors, and I think they were concerned about the story.

Speaker 1: Yeah, I think you're right, Angus. Ultimately, we didn't want to say: oh yeah, trust your heart and it's all going to work out. A lot of Hollywood endings manufacture that ending. The Hollywood answer is trust your heart, and society is producing a new answer, which is trust the AI.

Speaker 2: Can you think of a Hollywood movie previous to this where, not the narrow question of don't trust your heart, but the kind of broader question: where love and the fulfillment of a real emotional need would have catastrophic consequences outside of it? Is this virgin territory, is what I'm asking, for a Hollywood film?

Speaker 1: One example springs to mind: Casablanca. For Humphrey Bogart and Ingrid Bergman to get together will harm the world, and so Humphrey Bogart, Rick, makes the decision not to follow his heart.
512 00:27:13,556 --> 00:27:16,516 Speaker 2: There's a crucial difference, though, and that is, do we 513 00:27:16,556 --> 00:27:18,916 Speaker 2: ever really believe that Rick is madly in love with 514 00:27:19,756 --> 00:27:22,876 Speaker 2: Ingrid Berkman? Not in what you've described as a situation 515 00:27:22,956 --> 00:27:28,916 Speaker 2: where the two parties are genuinely and have powerful reason 516 00:27:28,956 --> 00:27:32,516 Speaker 2: to believe that they are genuinely in love, they are soulmates. 517 00:27:33,156 --> 00:27:36,716 Speaker 2: Was Ingram Bourkman Humphrey Bogart's soulmon in Casablanca or was 518 00:27:36,756 --> 00:27:40,716 Speaker 2: it was like his heart is so kind of hardened 519 00:27:40,876 --> 00:27:44,676 Speaker 2: and covered in scar tissue, we kind of accept him 520 00:27:44,676 --> 00:27:47,116 Speaker 2: walking away because that's what he does. He walks away, right, 521 00:27:47,476 --> 00:27:49,596 Speaker 2: But you just gotta be something a very different dynamic. 522 00:27:50,516 --> 00:27:54,276 Speaker 2: You're talking about two people who are soulmates having to 523 00:27:54,276 --> 00:27:54,796 Speaker 2: walk away. 524 00:27:56,036 --> 00:27:57,796 Speaker 3: Yeah, and you know, I will say I'm Malcolm my way. 525 00:27:57,796 --> 00:28:00,596 Speaker 3: That's a very iconoclastic reading that you just delivered of Casablanca. 526 00:28:00,636 --> 00:28:03,636 Speaker 1: But also, wait, wait, wait, wait what wait? 527 00:28:03,756 --> 00:28:07,396 Speaker 2: But they know you don't think Humphrey Bogart's heart is 528 00:28:07,436 --> 00:28:09,116 Speaker 2: covered in covered in scar tissues. 529 00:28:09,876 --> 00:28:13,636 Speaker 3: I think the point is that even though it is he, 530 00:28:13,636 --> 00:28:18,196 Speaker 3: he realizes that the romantic thing to do is to leave. 531 00:28:18,516 --> 00:28:21,716 Speaker 3: And so that's the ultimate tragedy of that movie, Malcolm, 532 00:28:21,756 --> 00:28:24,316 Speaker 3: which which you've just desecrated, is that he's a man 533 00:28:24,356 --> 00:28:27,276 Speaker 3: who is reconnected with his heart and in doing so, 534 00:28:27,436 --> 00:28:28,316 Speaker 3: walked away from it. 535 00:28:28,396 --> 00:28:30,676 Speaker 1: You know. Anyway, I think that's the Hollywood schlock reading 536 00:28:30,716 --> 00:28:30,956 Speaker 1: of it. 537 00:28:30,996 --> 00:28:34,036 Speaker 3: But but the point is in that movie he walks away. 538 00:28:34,236 --> 00:28:37,636 Speaker 3: He does the heroic thing, and and you know, you're asking, 539 00:28:37,716 --> 00:28:40,996 Speaker 3: is there a movie where someone has basically willingly destroyed 540 00:28:40,996 --> 00:28:42,356 Speaker 3: the world for their own love. 541 00:28:43,156 --> 00:28:45,116 Speaker 1: Has someone willingly made that choice? 542 00:28:45,276 --> 00:28:45,476 Speaker 3: You know? 543 00:28:46,916 --> 00:28:50,236 Speaker 1: And that one that one did happen in the English patient? 544 00:28:51,836 --> 00:28:54,036 Speaker 2: Oh tell me, tell me, I remember that film is 545 00:28:54,036 --> 00:28:55,636 Speaker 2: not strong enough. Tell me how that happens in the 546 00:28:55,636 --> 00:28:56,276 Speaker 2: English Patient. 547 00:28:56,636 --> 00:29:02,276 Speaker 1: Well, in the end of the hero basically is a 548 00:29:02,316 --> 00:29:07,796 Speaker 1: traitor to the allies out of love for a woman. 
Okay, 549 00:29:08,316 --> 00:29:12,396 Speaker 1: and usually Holly would doesn't put the hero in that position. 550 00:29:12,676 --> 00:29:16,436 Speaker 1: That's they don't decide to. I mean, what they will 551 00:29:16,436 --> 00:29:19,436 Speaker 1: either do is say there's a greater good here. I 552 00:29:19,476 --> 00:29:22,276 Speaker 1: mean a lot of English movies, you know, even brief encounter, 553 00:29:22,356 --> 00:29:24,236 Speaker 1: you know, people say I'm in love, but I'm going 554 00:29:24,316 --> 00:29:25,876 Speaker 1: to do this, I'm going to stay with my wife 555 00:29:25,916 --> 00:29:28,636 Speaker 1: and kids. Yes, so there is. 556 00:29:28,636 --> 00:29:33,116 Speaker 2: There is English annoying kind of aristocratic English notion of 557 00:29:33,596 --> 00:29:37,156 Speaker 2: remembering Tale of Two Cities at the end, it's a 558 00:29:37,196 --> 00:29:39,116 Speaker 2: far far better thing than I've ever done. 559 00:29:39,196 --> 00:29:40,356 Speaker 1: Blah blah blah blah. 560 00:29:40,396 --> 00:29:41,996 Speaker 2: They love playing that role. They never play it in 561 00:29:42,036 --> 00:29:44,676 Speaker 2: real life, but they love playing it in literature. Yeah, yeah, 562 00:29:44,796 --> 00:29:47,076 Speaker 2: they're like, are you kidding me? In real life he's good, No, 563 00:29:47,156 --> 00:29:50,516 Speaker 2: he's not doing that. But no, but wait wait, I'm 564 00:29:50,556 --> 00:29:54,956 Speaker 2: still I'm not satisfied. So it doesn't. Why am I 565 00:29:54,956 --> 00:29:56,676 Speaker 2: am I wrong? Why does it seem to me like 566 00:29:56,756 --> 00:30:00,916 Speaker 2: you have described in this movie a much more complicated 567 00:30:02,356 --> 00:30:06,876 Speaker 2: and interesting as a problematic narrative scenario that exists in 568 00:30:06,956 --> 00:30:10,556 Speaker 2: say Castle Plankt. Bradley Cooper's not having a problem doing 569 00:30:10,596 --> 00:30:13,676 Speaker 2: a remake of Kazablanca. It's fine with that, but he 570 00:30:13,956 --> 00:30:16,716 Speaker 2: clearly had a problem with this. So you described to 571 00:30:16,796 --> 00:30:19,276 Speaker 2: me in your words, well, I can see how they 572 00:30:19,276 --> 00:30:21,836 Speaker 2: are analogous, but they're not. It's not perfect. Why is 573 00:30:21,876 --> 00:30:22,556 Speaker 2: it not perfect? 574 00:30:23,676 --> 00:30:25,556 Speaker 1: It's I don't think it's perfect at all. By the way, 575 00:30:25,596 --> 00:30:26,116 Speaker 1: I do, I do. 576 00:30:26,076 --> 00:30:27,996 Speaker 3: Want to get on record as saying that you've also 577 00:30:28,236 --> 00:30:32,436 Speaker 3: designated Charles Dickens, who's another sotemic figure of English literature. 578 00:30:33,556 --> 00:30:36,196 Speaker 1: No, I mean, it's it's the reason that it's that, 579 00:30:36,316 --> 00:30:39,716 Speaker 1: it's that it's different, is uh. 580 00:30:39,996 --> 00:30:44,796 Speaker 3: I mean, imagine if the character chose his heart and 581 00:30:44,836 --> 00:30:45,676 Speaker 3: the Nazis won. 582 00:30:46,316 --> 00:30:47,796 Speaker 1: Yeah, you you choose. 583 00:30:47,956 --> 00:30:50,676 Speaker 3: I mean, imagine the sound of music right in which 584 00:30:50,676 --> 00:30:53,316 Speaker 3: he marries Julie Andrews, and the result of that is 585 00:30:53,356 --> 00:30:55,636 Speaker 3: the Second World War is lost. 
I mean, you know, 586 00:30:55,796 --> 00:30:57,436 Speaker 3: I mean that's the kind of that's the kind of 587 00:30:57,476 --> 00:31:00,156 Speaker 3: crisis you're setting up. And so in order to establish 588 00:31:00,276 --> 00:31:03,796 Speaker 3: free will, he has to reject Julie Andrews. He has 589 00:31:03,876 --> 00:31:07,636 Speaker 3: to reject you know, his heart to save to to 590 00:31:07,876 --> 00:31:11,716 Speaker 3: to save the world. Accept we haven't even told you 591 00:31:11,756 --> 00:31:13,516 Speaker 3: the end of the movie, so you don't even know 592 00:31:13,596 --> 00:31:17,436 Speaker 3: what's going to happen, right and when and when actors 593 00:31:17,436 --> 00:31:20,036 Speaker 3: got to the end of the movie, the dissonance became 594 00:31:20,116 --> 00:31:23,836 Speaker 3: so intense for them that, you know, I think that 595 00:31:23,916 --> 00:31:26,196 Speaker 3: they started to say, how can I, in good conscience 596 00:31:26,356 --> 00:31:29,356 Speaker 3: play this part? Because I myself don't even understand exactly 597 00:31:30,396 --> 00:31:31,316 Speaker 3: the part that I'm playing. 598 00:31:31,356 --> 00:31:34,316 Speaker 2: So you're in telling the story of why this movie 599 00:31:34,396 --> 00:31:37,156 Speaker 2: doesn't get made. You we're listing, we're going to go 600 00:31:37,196 --> 00:31:39,916 Speaker 2: through the list of culprits, but it's corporate number one 601 00:31:39,956 --> 00:31:42,156 Speaker 2: that the actors themselves couldn't handle it. 602 00:31:42,676 --> 00:31:44,396 Speaker 3: I think that's part of it. But I mean, I 603 00:31:44,396 --> 00:31:47,876 Speaker 3: think the core of it is is is. 604 00:31:48,756 --> 00:31:52,476 Speaker 1: Do we trust ourselves? Yeah, And. 605 00:31:54,596 --> 00:31:58,156 Speaker 3: That's a hard place to go in the movies because 606 00:31:58,196 --> 00:32:00,316 Speaker 3: at the end of the day, the movies basically no comedy, 607 00:32:00,316 --> 00:32:05,916 Speaker 3: and they know tragedy, and in a comedy, you give 608 00:32:05,956 --> 00:32:10,236 Speaker 3: yourself over to a fantasy, and in tragedy. You're like, 609 00:32:10,836 --> 00:32:13,836 Speaker 3: it's all over from the beginning, and it's a chance 610 00:32:13,876 --> 00:32:16,196 Speaker 3: for you to kind of wallow. So another way to 611 00:32:16,236 --> 00:32:20,676 Speaker 3: say this is when we're born, in our brain, there's 612 00:32:20,716 --> 00:32:24,036 Speaker 3: a kind of primordial story, and that story is that 613 00:32:24,116 --> 00:32:27,476 Speaker 3: I'm good and that life is good. And that primordial 614 00:32:27,516 --> 00:32:29,396 Speaker 3: story is so powerful that it allows us to. 615 00:32:29,396 --> 00:32:31,556 Speaker 1: Be open to the world and to explore the world. 
616 00:32:32,156 --> 00:32:34,356 Speaker 3: And then what happens over time is that story runs 617 00:32:34,436 --> 00:32:36,836 Speaker 3: up against our experiences of life, many of which are negative, 618 00:32:37,476 --> 00:32:39,836 Speaker 3: and this other story starts to occur in our brain, 619 00:32:40,076 --> 00:32:43,316 Speaker 3: which is, actually, life is not so good, and I'm 620 00:32:43,356 --> 00:32:46,316 Speaker 3: also not so good. And we start to believe that 621 00:32:46,436 --> 00:32:49,916 Speaker 3: story more, and that story starts to become more primordial 622 00:32:49,996 --> 00:32:52,716 Speaker 3: for us, and the good story starts to become more 623 00:32:52,756 --> 00:32:56,036 Speaker 3: superficial and fantasy for us, and we get into kind 624 00:32:56,036 --> 00:33:00,556 Speaker 3: of a dissociated place. And in that dissociated place, which 625 00:33:00,596 --> 00:33:03,516 Speaker 3: I think most humans exist in most of the time, they 626 00:33:03,556 --> 00:33:05,396 Speaker 3: really believe that life is pretty bad and that they're 627 00:33:05,396 --> 00:33:08,556 Speaker 3: pretty terrible, but they're always trying to actively convince themselves 628 00:33:08,596 --> 00:33:10,276 Speaker 3: that life is good and that they're a nice person. 629 00:33:10,636 --> 00:33:13,356 Speaker 3: And this is a product of, you know, mindfulness, positive 630 00:33:13,356 --> 00:33:15,516 Speaker 3: thinking, all of these things which kind of infest our 631 00:33:16,116 --> 00:33:19,076 Speaker 3: society now. And when people go to the movies, they 632 00:33:19,116 --> 00:33:22,796 Speaker 3: want to indulge either the fantasy life of, no, I'm 633 00:33:22,796 --> 00:33:25,476 Speaker 3: a good person and everything works out, or they want 634 00:33:25,516 --> 00:33:29,996 Speaker 3: to wallow in this negative space of, no, you know what, 635 00:33:30,036 --> 00:33:31,556 Speaker 3: this is a safe space for me to admit that 636 00:33:31,556 --> 00:33:33,796 Speaker 3: everything's terrible and I'm a horrible human being. And that's 637 00:33:33,796 --> 00:33:35,956 Speaker 3: why we have, like, The Dark Knight and, you know, 638 00:33:36,116 --> 00:33:37,756 Speaker 3: Christopher Nolan and all this kind of stuff, you know, 639 00:33:37,916 --> 00:33:41,636 Speaker 3: kind of like the dark, sludgy stuff. And this movie 640 00:33:41,716 --> 00:33:44,516 Speaker 3: is constantly forcing you back and forth across the threshold. 641 00:33:46,116 --> 00:33:48,516 Speaker 3: This movie is not allowing you to say, you know what, yes, 642 00:33:48,596 --> 00:33:51,516 Speaker 3: everything is bad, life is bad, life is awful. Nor 643 00:33:51,596 --> 00:33:55,236 Speaker 3: is it saying there's a kind of simple, easy answer here. 644 00:33:55,716 --> 00:33:58,316 Speaker 3: It's pushing you back and forth, making you reconfront it: 645 00:33:58,836 --> 00:34:00,196 Speaker 3: can you keep going? 646 00:34:00,636 --> 00:34:02,116 Speaker 1: And at the bottom of 647 00:34:01,996 --> 00:34:06,396 Speaker 3: the movie is this idea that only intelligence is going 648 00:34:06,476 --> 00:34:13,476 Speaker 3: to solve things. Not emotion, not wishing, not wanting, but being smart. 649 00:34:13,556 --> 00:34:16,196 Speaker 3: And are you, the audience, smart enough to solve this problem?
650 00:34:16,236 --> 00:34:18,076 Speaker 3: Because if you, the audience, are not smart enough to 651 00:34:18,116 --> 00:34:19,796 Speaker 3: solve this problem, the computers are going to solve it 652 00:34:19,836 --> 00:34:22,156 Speaker 3: for you. And so the existential challenge of the movie, 653 00:34:22,236 --> 00:34:25,196 Speaker 3: essentially, is: are you smart enough to keep ahead of 654 00:34:25,196 --> 00:34:29,076 Speaker 3: the movie? In which case, you know what, you're smart 655 00:34:29,156 --> 00:34:31,396 Speaker 3: enough to live into the future. Or are you going 656 00:34:31,476 --> 00:34:34,716 Speaker 3: to submit and become passive and be drawn into the 657 00:34:34,756 --> 00:34:37,556 Speaker 3: engine of the story? And then that's the future of AI, 658 00:34:37,996 --> 00:34:40,236 Speaker 3: in which you no longer have any autonomy and you've 659 00:34:40,276 --> 00:34:42,476 Speaker 3: given up your will to the plot and the machinery. 660 00:34:42,636 --> 00:34:46,116 Speaker 3: So I think that's why people find the movie unnerving, 661 00:34:46,276 --> 00:34:48,796 Speaker 3: because it really does put you in that existential place 662 00:34:48,836 --> 00:34:52,316 Speaker 3: where you feel that your autonomy is being stripped. But 663 00:34:52,356 --> 00:34:54,996 Speaker 3: I mean, there are also a lot of sort of funny 664 00:34:55,116 --> 00:34:58,356 Speaker 3: and sort of dire studio stories involved, so I 665 00:34:58,356 --> 00:35:02,076 Speaker 3: think ultimately the studios were also alarmed by this story. 666 00:35:02,076 --> 00:35:04,156 Speaker 3: I mean, this is a big budget movie, and the 667 00:35:04,196 --> 00:35:05,796 Speaker 3: thing is, if you're going to do a big budget 668 00:35:05,796 --> 00:35:08,796 Speaker 3: movie like Total Recall, like this, you need to get 669 00:35:08,796 --> 00:35:11,116 Speaker 3: that actor on board to try and secure 670 00:35:12,156 --> 00:35:12,676 Speaker 1: the money. 671 00:35:13,436 --> 00:35:15,076 Speaker 3: And then, do you want to put one hundred and 672 00:35:15,076 --> 00:35:17,676 Speaker 3: twenty million dollars into something that has this premise? 673 00:35:17,796 --> 00:35:21,076 Speaker 1: I think it was too demanding. But I think that 674 00:35:21,276 --> 00:35:23,676 Speaker 1: our main problem here was really that we didn't get 675 00:35:23,676 --> 00:35:27,596 Speaker 1: a director. We didn't get a strong director. A project 676 00:35:27,636 --> 00:35:30,516 Speaker 1: like this requires a director with a very strong 677 00:35:30,596 --> 00:35:35,596 Speaker 1: vision, and one the people with the money trust. And 678 00:35:36,756 --> 00:35:39,236 Speaker 1: I suppose I had the hubris to think that after 679 00:35:39,316 --> 00:35:41,756 Speaker 1: doing these three Philip K. Dick movies, they would trust me. 680 00:35:42,996 --> 00:35:46,756 Speaker 1: But actually that wasn't enough. The movie's fate in the 681 00:35:46,796 --> 00:35:52,396 Speaker 1: real world precisely parallels its narrative on the page. In 682 00:35:52,436 --> 00:35:57,276 Speaker 1: other words, that's actually what happened. So one of
683 00:35:57,276 --> 00:35:58,596 Speaker 3: the things that happened is, after we went out to 684 00:35:58,636 --> 00:36:03,396 Speaker 3: all of these actors, I was then on another screenplay project, 685 00:36:03,516 --> 00:36:06,556 Speaker 3: which was going well, and it was produced by Bob Shaye, 686 00:36:06,916 --> 00:36:08,796 Speaker 3: who did Lord of the Rings, all these big kind 687 00:36:08,796 --> 00:36:11,756 Speaker 3: of movies. And so I showed the script to Bob, 688 00:36:11,756 --> 00:36:13,196 Speaker 3: and Bob was like, oh, the script is amazing. I 689 00:36:13,236 --> 00:36:14,876 Speaker 3: love the script. We've got to get the script produced. 690 00:36:15,076 --> 00:36:17,756 Speaker 3: And so I said, okay, great. And it all culminated 691 00:36:18,556 --> 00:36:21,116 Speaker 3: in Bob calling me up one day and saying, Angus, 692 00:36:21,156 --> 00:36:26,436 Speaker 3: we've got financing for this movie. I'm almost entirely 693 00:36:26,556 --> 00:36:28,196 Speaker 3: mystified by financing for this movie. 694 00:36:28,236 --> 00:36:30,316 Speaker 1: So I said, oh, okay, what's going on? He said, well, 695 00:36:30,356 --> 00:36:31,476 Speaker 1: there's this company in town. 696 00:36:32,916 --> 00:36:36,316 Speaker 3: They have a computer that they feed all of the 697 00:36:36,356 --> 00:36:41,156 Speaker 3: scripts into, and the algorithm tells them how much money 698 00:36:42,036 --> 00:36:43,116 Speaker 3: the script is going to make. 699 00:36:43,996 --> 00:36:46,116 Speaker 1: And we've already given 700 00:36:47,356 --> 00:36:49,476 Speaker 3: the script to the executives there, and they love it, 701 00:36:49,596 --> 00:36:51,396 Speaker 3: and they're going to give it to the algorithm, and 702 00:36:51,476 --> 00:36:53,556 Speaker 3: I just know that this is going to work out. 703 00:36:53,836 --> 00:36:55,476 Speaker 1: So I said, okay, Bob, this sounds great, you know. 704 00:36:55,876 --> 00:37:00,796 Speaker 3: And this company, I should say, they were very wealthy, very successful. 705 00:37:00,836 --> 00:37:02,316 Speaker 1: They'd invested in all these movies. 706 00:37:02,396 --> 00:37:06,036 Speaker 3: You know, they had the sort of top floor 707 00:37:06,516 --> 00:37:10,036 Speaker 3: of the biggest luxury car dealership in town. You, like, walked 708 00:37:10,036 --> 00:37:11,596 Speaker 3: in through all these luxury cars to get to their office, 709 00:37:11,596 --> 00:37:12,836 Speaker 3: sort of like typical Hollywood stuff. 710 00:37:12,916 --> 00:37:13,116 Speaker 1: Right. 711 00:37:14,116 --> 00:37:16,556 Speaker 3: So I get a call a week or two later 712 00:37:16,876 --> 00:37:19,396 Speaker 3: from Bob, and he's like, unfortunately, I've 713 00:37:19,396 --> 00:37:20,956 Speaker 3: got bad news for you. And I was like, oh 714 00:37:20,956 --> 00:37:22,556 Speaker 3: my god. I was like, did the computers read our 715 00:37:22,596 --> 00:37:25,036 Speaker 3: script and not like it? He goes, no, no. He's like, 716 00:37:25,116 --> 00:37:26,796 Speaker 3: that's not the problem. He's like, the problem is the 717 00:37:26,836 --> 00:37:29,836 Speaker 3: company just went bankrupt. And they went bankrupt because the 718 00:37:29,916 --> 00:37:33,436 Speaker 3: last script their computers picked was a bomb, and so 719 00:37:33,516 --> 00:37:33,996 Speaker 3: now they 720 00:37:33,876 --> 00:37:34,476 Speaker 1: had no money.
721 00:37:34,796 --> 00:37:37,836 Speaker 3: And so we were unable to make our script about 722 00:37:37,836 --> 00:37:42,076 Speaker 3: the infinitely intelligent computer because a stupid computer was unable 723 00:37:42,116 --> 00:37:43,516 Speaker 3: to pick the right scripts to make. 724 00:37:45,716 --> 00:37:49,556 Speaker 2: A stupid computer. Is it any wonder Angus has gone 725 00:37:49,556 --> 00:37:53,076 Speaker 2: on to become a pioneer in story science? In fact, 726 00:37:53,396 --> 00:37:56,236 Speaker 2: Angus has gone on to become a leading critic of AI. 727 00:37:56,636 --> 00:37:59,236 Speaker 2: He thinks only humans will ever be able to tell 728 00:37:59,276 --> 00:38:02,756 Speaker 2: good stories. One of the papers he's published is called 729 00:38:03,116 --> 00:38:06,436 Speaker 2: "Why Computer AI Will Never Do What We Imagine It Can." 730 00:38:08,396 --> 00:38:13,836 Speaker 2: I'm reading now from the abstract: computers contain a hardware 731 00:38:13,876 --> 00:38:18,996 Speaker 2: limit that renders them permanently incapable of reading or writing narrative. 732 00:38:19,556 --> 00:38:22,596 Speaker 2: This article draws upon the author's work with deep neural 733 00:38:22,636 --> 00:38:27,876 Speaker 2: networks, Judea Pearl's do-calculus, GPT-3, and other current 734 00:38:27,956 --> 00:38:33,356 Speaker 2: generation AI to logically demonstrate that no computer AI, quantum 735 00:38:33,436 --> 00:38:37,316 Speaker 2: or otherwise, has ever learned, or will ever learn, to 736 00:38:37,436 --> 00:38:40,996 Speaker 2: produce or process novels or any other kind of narrative, 737 00:38:41,036 --> 00:38:46,436 Speaker 2: including scripts, short fiction, political speeches, business plans, scientific hypotheses, 738 00:38:46,756 --> 00:38:51,636 Speaker 2: technology proposals, military strategies, and plots to take over the world. 739 00:38:54,276 --> 00:38:56,036 Speaker 2: So what do we have here? We have a script 740 00:38:56,196 --> 00:38:59,596 Speaker 2: about AI killed by an AI company that went bankrupt, 741 00:38:59,716 --> 00:39:03,156 Speaker 2: whose co-author goes on to write the definitive debunking 742 00:39:03,516 --> 00:39:10,956 Speaker 2: of AI. Who's writing that screenplay? Guys, this has been fantastic. 743 00:39:12,076 --> 00:39:18,156 Speaker 2: If my vote counts at all, to all moguls out 744 00:39:18,156 --> 00:39:21,436 Speaker 2: there in our listening audience: it's clear what the people want. 745 00:39:21,916 --> 00:39:23,956 Speaker 2: What the people want is, what did you call it, The 746 00:39:24,036 --> 00:39:27,236 Speaker 2: Variable Man? Yeah, the people want The Variable Man. 747 00:39:27,996 --> 00:39:30,956 Speaker 1: They do. Fingers crossed. 748 00:39:33,796 --> 00:39:36,516 Speaker 2: Next week, we'll be back with another story from the 749 00:39:36,556 --> 00:39:50,036 Speaker 2: depths of Development Hell. This episode was produced by Nina 750 00:39:50,036 --> 00:39:54,316 Speaker 2: Bird Lawrence with Tali Emlen and Ben Naddaff-Hafrey. Editing by 751 00:39:54,356 --> 00:39:58,516 Speaker 2: Sarah Nics, original scoring by Luis Guerra, engineering by Echo Mountain. 752 00:39:58,836 --> 00:40:03,276 Speaker 2: Our executive producer is Jacob Smith. I'm Malcolm Gladwell.